I would like to introduce to the Wikimedia community WikiToLearn, a FOSS
project in which I participate and which has lately been gaining a lot of
contributions and momentum.
It is a KDE project sponsored (among others) by Wikimedia Italy and
recently joined by institutions such as the HEP Software Foundation (CERN,
Fermilab, Princeton...) and universities such as the University of Pisa and
Milano-Bicocca. These institutions are already populating the website with content.
We aim to provide a platform where learners and teachers can complete, refine
and re-assemble lecture notes in order to create free, collaborative and
accessible textbooks, tailored precisely to their needs.
Although the project is quite young (only a few months old), it is already
attracting interest at an unexpected rate. Thanks to this we can now count on
nearly 40 developers (including content developers), and the number is still growing.
We are different from Wikipedia and other WMF projects in several ways, and in
a sense, complementary. Our focus is on creating complete textbooks (and not
encyclopedic articles), drawing from a professor’s or a student’s own notes,
whether they already exist or still have to be written down.
We also have a strong focus on offline use: all the content of WikiToLearn
should be easily printable by any student for offline use and serious study.
Besides a good team for content development, we can count on a small but
motivated team of developers, and we would like to improve communication with
upstream (a.k.a. you ;-) ), because we found ourselves developing a few
features which could probably be made available to the general public, with
some generalization and polishing. ;-)
Is this the right place to start such a discussion?
We would like to help as much as we can, but we might need some mentoring in
how to best approach MediaWiki development, as many of us are relatively new
to OSS/Web development.
I ran into an issue trying to save MediaWiki:Common.js: I get a "page not
available" error (index.php?title=MediaWiki:Common.js&action=submit).
MediaWiki:Common.css saves fine.
Any idea what I'm doing wrong?
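For reference, one way to narrow this down is to check whether something in
front of MediaWiki (e.g. a mod_security-style filter) rejects POST bodies that
look like JavaScript, which would explain why the CSS page saves but the JS
page does not. A minimal probe along these lines (the URL is only a
placeholder, and no real edit is attempted since no token is sent):

```python
import requests

# Placeholder URL; point this at your own wiki's index.php.
# This only checks whether the web server rejects a POST whose body looks
# like JavaScript: MediaWiki itself would answer 200 with a "loss of session
# data" edit page, while a server-side filter typically answers 403 or 406
# before MediaWiki is ever reached.
url = "https://example.org/w/index.php"
resp = requests.post(
    url,
    params={"title": "MediaWiki:Common.js", "action": "submit"},
    data={"wpTextbox1": "var el = document.getElementById('x');"},
)
print(resp.status_code)
```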
---------- Forwarded message ----------
From: Chen Davidi <chen(a)wikimedia.org.il>
Date: Wed, Dec 23, 2015 at 9:02 AM
Subject: [Wikitech-ambassadors] Wikimedia Hackathon Jerusalem 2016:
registration is now open!
To: wikitech-ambassadors(a)lists.wikimedia.org, wikitech-l(a)lists.wikimedia.org,
I'm thrilled to tell you all that the registration for Wikimedia Hackathon
2016 is now open!
The Hackathon will be held in Jerusalem, from March 31st to April 3rd,
2016, and is organized by Wikimedia Israel.
Scholarship applications are open until January 22nd.
You know the drill!
Registration here -
More info and updates on
If you have any questions, please contact us at
Hope to see you all in Jerusalem!
Activity & Resources Coordinator.
If we're going to solve the problem of dead links, it needs to involve
automation, at least for the heavy lifting. Obviously, if a human
contributor can add a better source, that's great. But there are more dead
links than people willing to replace them.
On English Wikipedia, there's Category:All articles with dead external
links, and it contains more than 134,000 articles -- and those are just
the pages where somebody's added the Dead link template. There are a lot of
missing references -- not just on English WP, but on all the projects --
and connecting those links to a live archive makes them useful again.
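For the heavy-lifting part, the Wayback Machine's availability API already
makes this fairly mechanical. Just as a rough sketch (the dead URL and
timestamp below are made up):

```python
import requests

def closest_snapshot(url, timestamp=None):
    """Ask the Wayback Machine availability API for the snapshot of `url`
    closest to `timestamp` (YYYYMMDD); return the archive URL or None."""
    params = {"url": url}
    if timestamp:
        params["timestamp"] = timestamp
    data = requests.get("https://archive.org/wayback/available",
                        params=params, timeout=30).json()
    closest = data.get("archived_snapshots", {}).get("closest")
    return closest["url"] if closest and closest.get("available") else None

# Example: look for an archived copy near the date the citation was added.
print(closest_snapshot("http://example.com/some/dead/page", "20130601"))
```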
For links that were moved, we may be able to collect and use that
information -- I know that we're looking into what kind of metadata we can
collect when a new link is added to the page. But I think finding
alternative sources has to come from human contributors, and that's hard to
automate.
PM, Community Tech
On Mon, Dec 28, 2015 at 9:51 AM, Henning Schlottmann <h.schlottmann(a)gmx.net>
> On 16.12.2015 21:12, Danny Horn wrote:
> > #1. Migrate dead links to the Wayback Machine (111 support votes)
> I really hope you don't follow that wish, as it is detrimental to the
> quality of Wikipedia.
> Switching dead links to the archive is a move to a dead end, instead of
> looking for
> a) the new correct URL, as many links were just moved.
> b) alternative sources for the same fact.
> Ciao Henning
The XML database dumps are missing all through May, apparently
because of a memory leak that is being worked on, as described.
However, that information doesn't reach the person who wants to
download a fresh dump and looks here,
I think it should be possible to make a regular schedule for
when these dumps should be produced, e.g. once each month or
once every second month, and treat any delay as a bug. The
process to produce them has been halted by errors many times
in the past, and even when it runs as intended the interval
is unpredictable. Now when there is a bug, all dumps are
halted, i.e. much delayed. For a user of the dumps, this is
extremely frustrating. With proper release management, it
should be possible to run the old version of the process
until the new version has been tested, first on some smaller
wikis, and gradually on the larger ones.
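Just to illustrate what "treat any delay as a bug" could look like in
practice, here is a rough sketch that checks the age of the newest dump run
for a wiki; the dated-directory layout of dumps.wikimedia.org and the 45-day
threshold are only assumptions:

```python
import re
import requests
from datetime import datetime, timedelta

def latest_dump_age(wiki="enwiki"):
    """Return the age of the newest dump run for `wiki`, or None if no
    dated run directories (YYYYMMDD) can be found in the index page."""
    html = requests.get("https://dumps.wikimedia.org/%s/" % wiki, timeout=60).text
    dates = sorted(set(re.findall(r"\b(20\d{6})\b", html)))
    if not dates:
        return None
    newest = datetime.strptime(dates[-1], "%Y%m%d")
    return datetime.utcnow() - newest

age = latest_dump_age("svwiki")
if age is None or age > timedelta(days=45):   # e.g. "once a month, plus slack"
    print("Dump is overdue:", age)
else:
    print("Latest dump run started", age.days, "days ago")
```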
Lars Aronsson (lars(a)aronsson.se)
Aronsson Datateknik - http://aronsson.se
I cannot find a piece of information about $subject.
As the help page <https://www.mediawiki.org/wiki/Help:Range_blocks/IPv6> was
moved from Meta to MediaWiki, it now gives general talk instead of actual
information. It says: "Like IPv4, IPv6 rangeblocks are limited by
$wgBlockCIDRLimit, which by default allows rangeblocks of up to /64 in size (before MediaWiki
1.20wmf5, which changes the default to /19)."
All right, but what is the actual value for WMF wikis, and where is this
configured?
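(For scale, this is what the two limits in that quote mean; the prefixes
below are arbitrary documentation addresses:)

```python
import ipaddress

# How many addresses a /64 and a /19 IPv6 range block would cover.
for prefix in ("2001:db8::/64", "2001:db8::/19"):
    net = ipaddress.ip_network(prefix, strict=False)
    print(net, "->", net.num_addresses, "addresses")
# 2001:db8::/64 -> 2**64  = 18446744073709551616 addresses
# 2000::/19     -> 2**109 = 649037107316853453566312041152512 addresses
```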