Hi,
YuviPanda, prtksxna, and I (with help from Tim and Aaron) have been
working on the UrlShortener extension, which is designed to implement the
URL shortener RfC[1] (specifically Tim's implementation suggestion).
I've filed T108557[2] to deploy the extension to Wikimedia wikis. We'd
like to use the "w.wiki" short domain, which the WMF is already in
control of.
A test wiki mimicking what Wikimedia's configuration would look like has
been set up at http://urlshortener.wmflabs.org/, with an accompanying
"short" domain at us.wmflabs.org (e.g. http://us.wmflabs.org/3). Please
play with it and report any bugs you might find :)
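For anyone curious what the wiring looks like, here is a minimal
LocalSettings.php sketch along the lines of the test wiki's setup. The
variable names are the extension's settings as I understand them, and the
values are purely illustrative, not the actual Wikimedia configuration:

    // Load the extension (or require_once the entry point on setups
    // without extension registration).
    wfLoadExtension( 'UrlShortener' );

    // Domain that serves the short URLs (w.wiki in Wikimedia's case,
    // us.wmflabs.org on the test setup).
    $wgUrlShortenerServer = 'us.wmflabs.org';

    // Path template on the short domain; $1 is replaced by the short
    // code, so e.g. http://us.wmflabs.org/3 resolves to the long URL.
    $wgUrlShortenerTemplate = '/$1';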
[1] https://www.mediawiki.org/wiki/Requests_for_comment/URL_shortener
[2] https://phabricator.wikimedia.org/T108557
Thanks,
-- Legoktm
Hi all,
I would like to introduce to the Wikimedia community WikiToLearn, a FOSS
project in which I participate and which has lately been gaining a lot of
contributions and momentum.
It is a KDE project sponsored (among others) by Wikimedia Italy and recently
joined by institutions such as the HEP Software Foundation (CERN, Fermilab,
Princeton...) and universities such as the University of Pisa and Milano-
Bicocca. These institutions are already populating the website with content.
We aim to provide a platform where learners and teachers can complete, refine
and re-assemble lecture notes in order to create free, collaborative and
accessible textbooks, tailored precisely to their needs.
Although the project is quite young (only a few months old), it is already
attracting interest at an unexpected rate. Thanks to this, we now count
nearly 40 developers (including content developers), and the number is still
growing.
We are different from Wikipedia and other WMF projects in several ways, and in
a sense, complementary. Our focus is on creating complete textbooks (and not
encyclopedic articles), drawing from a professor’s or a student’s own notes,
whether they already exist or still have to be written down.
We also have a strong focus on offline use: all the content of WikiToLearn
should be easily printable by any student for serious offline studying.
Besides a good team for content development, we can count on a small but
motivated team of developers, and we would like to improve communication with
upstream (a.k.a. you ;-) ), because we have found ourselves developing a few
features which, with some generalization and polishing, could probably be
made available to the general public. ;-)
Is this the right place to start such a discussion?
We would like to help as much as we can, but we might need some mentoring in
how to best approach MediaWiki development, as many of us are relatively new
to OSS/Web development.
Bye,
-Riccardo
I am writing to announce mediawiki-containers [1], a simple installer for
MediaWiki with VisualEditor, Parsoid, RESTBase and other services, using
Linux containers.
The main goal of this project is to make it really easy to set up and
maintain a fully featured MediaWiki system on a wide range of platforms.
The project is in an early stage, but already supports full installation on
Ubuntu, Debian and other systemd-based distributions, as well as starting
containers on OS X via the Docker toolbox [6].
These are the basic steps involved in setting up your own MediaWiki
instance with VisualEditor:
1) Get a Linux VM in labs or from a hosting provider, and select Debian
(Jessie or newer) or Ubuntu 15.04+ as the distribution. Commercial VMs with
reasonable specifications cost about $5 per month [2].
2) Log into your VM, and run this command [7]:
curl https://raw.githubusercontent.com/wikimedia/mediawiki-containers/master/med… | sudo bash
3) Answer the questions in the installer.
Here is a screencast of an installer run, illustrating steps 2) and 3):
https://people.wikimedia.org/~gwicke/mediawiki-containers-install.ogv
Under the hood, mediawiki-containers uses several Docker containers:
- wikimedia/mediawiki [3] with MediaWiki 1.27-wmf9 and VisualEditor.
- wikimedia/mediawiki-node-services [4] with Parsoid and RESTBase running
in a single process to minimize memory use.
- MariaDB as the database backend [5].
Data and configurations are stored on the host system in
/srv/mediawiki-containers/data, which means that upgrading is as simple as
fetching the latest container images by re-running the installer.
Optionally, the installer can set up automated nightly updates, which helps
to keep your wiki installation up to date.
The project is brand new, so there is a fair chance that you will encounter
bugs. Please report issues at
https://phabricator.wikimedia.org/maniphest/task/create/?projects=mediawiki…
.
Here are some ideas we have for the next steps:
- Forward `/api/rest_v1/` to RESTBase & configure RESTBase updates. Enable
Wikitext / HTML switching in VE.
- Improve security:
- Run each container under a different, unprivileged user.
- Secure the install / update process with signatures.
- Add popular extensions, and streamline the support for custom extensions.
- Add services like mathoid, graphoid.
- Use the HHVM PHP runtime instead of Zend, possibly using ideas from
https://github.com/kasperisager/php-dockerized.
- Support developer use cases:
- Optionally mount code volumes from the host system.
- Improve configuration customization support.
- Support for more distributions.
Let us know what you think & what you would like to see next at
https://phabricator.wikimedia.org/T92826.
Happy holidays,
Gabriel Wicke and the Services team
[1]: https://github.com/wikimedia/mediawiki-containers
[2]:
http://serverbear.com/compare?Sort=BearScore&Order=desc&Server+Type=VPS&Mon…
[3]: https://github.com/wikimedia/mediawiki-docker and
https://hub.docker.com/r/wikimedia/mediawiki
[4]: https://github.com/wikimedia/mediawiki-node-services and
https://hub.docker.com/r/wikimedia/mediawiki-node-services/
[5]: https://hub.docker.com/_/mariadb/
[6]: https://docs.docker.com/mac/step_one/
[7]: We agree that `curl | bash` has its risks, but it is hard to beat for
simplicity. The Chef project has a good discussion of pros, cons &
alternatives at
https://www.chef.io/blog/2015/07/16/5-ways-to-deal-with-the-install-sh-curl…
.
I ran into an issue trying to save MediaWiki:Common.js: I get a "page not
available" error (index.php?title=MediaWiki:Common.js&action=submit).
MediaWiki:Common.css saves fine.
Any idea what I'm doing wrong?
Thank you!
---------- Forwarded message ----------
From: Chen Davidi <chen(a)wikimedia.org.il>
Date: Wed, Dec 23, 2015 at 9:02 AM
Subject: [Wikitech-ambassadors] Wikimedia Hackathon Jerusalem 2016:
registration is now open!
To: wikitech-ambassadors(a)lists.wikimedia.org, wikitech-l(a)lists.wikimedia.org,
wikitech-announce(a)lists.wikimedia.org, engineering(a)lists.wikimedia.org,
wikimedia-l(a)lists.wikimedia.org
Hi everyone,
I'm thrilled to tell you all that the registration for Wikimedia Hackathon
2016 is now open!
The Hackathon will be held in Jerusalem, from March 31st to April 3rd,
2016, by Wikimedia Israel.
Scholarship applications are open until January 22nd.
You know the drill!
Registration here -
https://docs.google.com/forms/d/17WFRHTCX5_dnCD5hFk1gHEQUz6pTrKp85QsX01rJGa…
More info and updates on
https://www.mediawiki.org/wiki/Wikimedia_Hackathon_2016
If you have any questions, please contact us at
hackathon2016(a)wikimedia.org.il
Hope to see you all in Jerusalem!
Chen Davidi-Almog,
Activity & Resources Coordinator.
Wikimedia Israel
Henning,
If we're going to solve the problem of dead links, it needs to involve
automation, at least for the heavy lifting. Obviously, if a human
contributor can add a better source, that's great. But there are more dead
links than people willing to replace them.
On English Wikipedia, there's Category:All articles with dead external
links, and it contains more than 134,000 articles[1] -- and those are just
the pages where somebody's added the Dead link template. There are a lot of
missing references -- not just on English WP, but on all the projects --
and connecting those links to a live archive makes them useful again.
For links that were moved, we may be able to collect and use that
information -- I know that we're looking into what kind of metadata we can
collect when a new link is added to the page. But I think finding
alternative sources has to come from human contributors, and that's hard to
scale.
Danny
PM, Community Tech
[1]:
https://en.wikipedia.org/wiki/Category:All_articles_with_dead_external_links
On Mon, Dec 28, 2015 at 9:51 AM, Henning Schlottmann <h.schlottmann(a)gmx.net>
wrote:
> On 16.12.2015 21:12, Danny Horn wrote:
>
> > #1. Migrate dead links to the Wayback Machine (111 support votes)
>
> I really hope, you don't follow that wish, as it is detrimental to the
> quality of Wikipedia.
>
> Switching dead links to the archive is a move to a dead end, instead of
> looking for
>
> a) the new correct URL, as many links were just moved.
> b) alternative sources for the same fact.
>
> Ciao Henning
>
The XML database dumps are missing all through May, apparently
because of a memory leak that is being worked on, as described
here,
https://phabricator.wikimedia.org/T98585
However, that information doesn't reach the person who wants to
download a fresh dump and looks here,
http://dumps.wikimedia.org/backup-index.html
I think it should be possible to make a regular schedule for
when these dumps should be produced, e.g. once each month or
once every second month, and treat any delay as a bug. The
process to produce them has been halted by errors many times
in the past, and even when it runs as intended the interval
is unpredictable. As things stand now, when there is a bug, all dumps are
halted, i.e. greatly delayed. For a user of the dumps, this is
extremely frustrating. With proper release management, it
should be possible to run the old version of the process
until the new version has been tested, first on some smaller
wikis, and gradually on the larger ones.
--
Lars Aronsson (lars(a)aronsson.se)
Aronsson Datateknik - http://aronsson.se
Hi,
I cannot find any information about $subject.
As the help page <https://www.mediawiki.org/wiki/Help:Range_blocks/IPv6> was
moved from Meta to mediawiki.org, it now gives only general guidance instead
of the actual values. It says: "Like IPv4, IPv6 rangeblocks are limited by
$wgBlockCIDRLimit <https://www.mediawiki.org/wiki/Manual:$wgBlockCIDRLimit>,
which by default allows rangeblocks of up to /64 in size (before MediaWiki
1.20wmf5, which changes the default to /19)."
All right, but what is the actual value for WMF wikis, and where is this
information available?
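For what it's worth, the setting is an array keyed by address family. Here
is a sketch of the stock defaults as I read them from DefaultSettings.php;
the effective Wikimedia value, if it is overridden at all, should be
findable in the operations/mediawiki-config repository
(wmf-config/InitialiseSettings.php):

    // Stock MediaWiki defaults (not necessarily the WMF values).
    $wgBlockCIDRLimit = array(
        'IPv4' => 16, // no range blocks wider than /16
        'IPv6' => 19, // no range blocks wider than /19, per the help
                      // page quoted above (default since 1.20wmf5)
    );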
--
Bináris