The XML database dumps are missing all through May, apparently
because of a memory leak that is being worked on, as described
here,
https://phabricator.wikimedia.org/T98585
However, that information doesn't reach the person who wants to
download a fresh dump and looks here,
http://dumps.wikimedia.org/backup-index.html
I think it should be possible to set a regular schedule for
producing these dumps, e.g. once a month or once every two
months, and to treat any delay as a bug. The process that
produces them has been halted by errors many times in the
past, and even when it runs as intended the interval is
unpredictable. Currently, when there is a bug, all dumps are
halted, i.e. greatly delayed. For a user of the dumps, this is
extremely frustrating. With proper release management, it
should be possible to keep running the old version of the
process until the new version has been tested, first on some
smaller wikis and gradually on the larger ones.
--
Lars Aronsson (lars(a)aronsson.se)
Aronsson Datateknik - http://aronsson.se
This drag and drop from a CSV, and the other mentions of VE neatness, would make a great blog post. I would love to see a post about being productive in VisualEditor. Tips and tricks, as it were.
-Chris K.
Hi, I'd like to present a new RFC for your consideration:
https://www.mediawiki.org/wiki/Requests_for_comment/Minifier
It is about how we can shave 10-15% off the size of JavaScript
delivered to users.
Your comments are highly welcome! :)
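For intuition about where those savings come from, here is a toy sketch of what a minifier does, in Python. This is not the RFC's proposed implementation; a real JavaScript minifier must also handle string literals, regex literals and automatic semicolon insertion, which this deliberately naive version ignores.

```python
import re

def toy_minify(js: str) -> str:
    """Naive minifier sketch: strip comments, collapse whitespace.

    Real minifiers must respect string and regex literals and
    semicolon insertion; this toy version does not.
    """
    js = re.sub(r"/\*.*?\*/", "", js, flags=re.DOTALL)  # block comments
    js = re.sub(r"//[^\n]*", "", js)                    # line comments
    js = re.sub(r"\s+", " ", js)                        # collapse whitespace
    return js.strip()

source = """
/* example */
function add( a, b ) {
    // sum two numbers
    return a + b;
}
"""
minified = toy_minify(source)
print(minified)
print(f"{len(source)} -> {len(minified)} bytes")
```

Even this crude approach shrinks typical commented source noticeably; the RFC's figures come from much more careful tokenization.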
--
Best regards,
Max Semenik ([[User:MaxSem]])
In the next RFC meeting, we will discuss the following RFC:
* Improving extension management
<https://www.mediawiki.org/wiki/Requests_for_comment/Improving_extension_man…>
The meeting will be on the IRC channel #wikimedia-office on
chat.freenode.net at the following time:
* UTC: Wednesday 21:00
* US PDT: Wednesday 14:00
* Europe CEST: Wednesday 23:00
* Australia AEST: Thursday 07:00
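To double-check conversions like the ones above, Python's zoneinfo module works well. The meeting date below is a placeholder (a Wednesday in mid-2015, chosen only for illustration); it reproduces the listed local times.

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+, needs IANA tz data

# Placeholder date for illustration: a Wednesday in June 2015.
utc_meeting = datetime(2015, 6, 10, 21, 0, tzinfo=ZoneInfo("UTC"))

for label, zone in [
    ("US PDT", "America/Los_Angeles"),
    ("Europe CEST", "Europe/Berlin"),
    ("Australia AEST", "Australia/Sydney"),
]:
    local = utc_meeting.astimezone(ZoneInfo(zone))
    print(f"{label}: {local:%A %H:%M}")
```

Note that Sydney lands on Thursday morning because AEST is UTC+10, ten hours ahead of the 21:00 UTC start.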
-- Tim Starling
There's a pretty hilarious American police procedural TV show from 2015
called "CSI: Cyber", dealing mostly with cybercrime. Obviously they have to
dredge up snippets of code from places for screenshots on the show.
Episode 4 happened to include a tidbit from MediaWiki 1.25/wmf3. Supposedly
the code was a hack to make your printer blow up.
Original lulz and screenshots via
http://moviecode.tumblr.com/post/114815574587/this-is-from-csi-cyber-s01e04…
When api.php was basically the only API in MediaWiki, calling it "the API"
worked well. But now we've got a Parsoid API, Gabriel's work on a REST
content API, Gabriel's work on an internal storage API, and more on the
way. So just saying "the API" is getting confusing.
So let's bikeshed a reasonably short name for it that isn't something awful
like "the api.php API". I'm horrible at naming.
--
Brad Jorsch (Anomie)
Software Engineer
Wikimedia Foundation
As part of the ongoing UI standardisation work[0], we have just merged
into MediaWiki master a patch that will allow users of HTMLForm[1] to
easily switch over to use OOUI[2]. This allows you to make your code use
the standard UI styling and components with very little effort, improving
consistency and accessibility.
To convert your code over from the existing styling of MediaWiki's HTMLForm
(or its VForm version), you can replace the instantiation call

    $form = new HTMLForm( … );

or:

    $form = HTMLForm::factory( 'vform', … );

with

    $form = HTMLForm::factory( 'ooui', … );

… and everything Should Just Work™.
This doesn't break anything for existing use of this code, and you can
explicitly retain the existing styling by using $form = HTMLForm::factory(
'table', … ); or 'div' or 'vform' instead if you wish. Feel free to add
your code to the epic task on Phabricator which is the central liaison
point for this work[3].
Finally, please be careful to check your interfaces after converting them –
although we have tested this and believe it works well, there may be some
edge cases where it doesn't quite work. We're still missing a few features
[4] (such as prettier radio buttons), which we're hoping to have ready in a
week or two. If you do find any such issues, please report them so we can
fix them. Also note that OOUI already implements the new "MediaWiki" theme
for all users, so please ensure you advertise the visual changes as they
roll out to minimise disruption.
[0] - https://phabricator.wikimedia.org/T49145
[1] - https://www.mediawiki.org/wiki/HTMLForm
[2] - https://phabricator.wikimedia.org/T85291
[3] - https://phabricator.wikimedia.org/T100270
[4] - https://phabricator.wikimedia.org/T100279
Yours,
--
James D. Forrester
Product Manager, Editing
Wikimedia Foundation, Inc.
jforrester(a)wikimedia.org | @jdforrester
[ cross-posted to MediaWiki-i18n, Wikimedia-L and Wikitech-L ]
Dear Wikimedians,
The 2000th article that was written using the ContentTranslation extension
was published today.
Article #2000 was translated from English to Greek, and it's about Škocjan
Caves, a UNESCO World Heritage site in Slovenia.
Original: https://en.wikipedia.org/wiki/%C5%A0kocjan_Caves
Translated:
https://el.wikipedia.org/wiki/%CE%A3%CF%80%CE%AE%CE%BB%CE%B1%CE%B9%CE%B1_%C…
In case you're wondering what ContentTranslation is, here's a brief
summary: ContentTranslation is an extension that helps Wikipedia editors to
create articles quickly and easily by translating them from other
languages. It's being developed by the Language Engineering team. Its
design started in the summer of 2013 and its coding started in early 2014.
You can find more info at https://www.mediawiki.org/wiki/CX as well as in
the following blog posts:
* http://blog.wikimedia.org/2015/01/10/content-translation-beta-coming-soon/
* http://blog.wikimedia.org/2015/01/20/try-content-translation/
*
http://blog.wikimedia.org/2015/04/06/content-translation-improved-my-edits/
* http://blog.wikimedia.org/2015/04/08/the-new-content-translation-tool/
Some more data about ContentTranslation:
* Our first deployment was in mid-January to Catalan, Spanish, Portuguese,
Esperanto, Norwegian Bokmål, Danish, Indonesian and Malay. Now we support
43 languages, and this number is growing every week as we extend the
deployment (a special thank-you to the Ops and Release Engineering people,
who continuously and tirelessly support our deployment effort).
* In all the Wikipedias in which ContentTranslation is deployed, it is
currently defined as a Beta feature, which means that it is only available
to logged-in users who opted into it in the preferences.
* The 1000th article was written on April 10th, so it took much less time
to get to 2000 than to 1000.
* The language into which the most articles were translated is Catalan:
762. The Catalan Wikipedia community has always had a strong inclination
toward translation; it was the first to volunteer to test the tool in Labs
in the summer of 2014 and provided a lot of useful feedback, and it also
has good machine translation support thanks to the freely licensed Apertium
engine.
* The second most popular target language is Spanish. It started slowly in
the first couple of months, but has been growing quickly since March.
* Other target languages that have been growing quickly lately are French,
Portuguese and Ukrainian.
* The language from which the largest number of articles is translated is
English. It is followed by Spanish, from which a lot of articles are
translated to the closely related Portuguese and Catalan.
* The total number of people who published at least one translated article
into any language is 663.
* Of the more than 2000 articles that were created, about 60 were deleted
(roughly 3%), so we have reason to think that the quality of the created
articles is reasonably good.
* In Catalan we can see that ContentTranslation has some influence on the
number of articles created per day - it was usually between 60 and 90
before 2015, and in January and February it was over 100. It's too early
to say how it influences other languages, but we are optimistic ;)
* A community discussion about enabling the tool in the French Wikipedia
ended with 50 "votes" in support of the tool and 0 "votes" against it ;)
Some of our plans for the coming months are:
* Enabling more languages, including big ones like English, Russian and
Italian, as well as right-to-left languages.
* Improving the support for links.
* Creating support for smart suggestions of articles to translate, as well
as "task lists" for translation projects.
* Starting to get the tool out of beta status :)
I'd like to thank all the Wikimedia volunteers around the planet who are
participating in this effort by translating articles, translating the
extension's user interface, testing the tool, assisting other Wikipedians
to translate, organizing translation workshops, reporting useful bugs,
submitting patches, and generally proving day after day what an incredible
community they are - hard-working, massively-multilingual, helpful,
patient, creative and talented.
Thank you - we have a lot more to achieve together \o/
--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
“We're living in pieces,
I want to live in peace.” – T. Moore
Hi, anybody interested in participating at the
Open Help Conference & Sprints
September 26-30
Cincinnati, Ohio, USA
http://conf.openhelp.cc/
A Wikimedia delegation attended a couple of years ago with a Wikipedia &
user help focus, and they were happy with the event.
http://blog.wikimedia.org/2013/07/03/wikipedians-open-help-conference/
Lately we are focusing on users of our APIs, datasets, tools,
infrastructure... Would it make sense to organize something at that event?
Wikimedia volunteers, remember that we might be able to support your travel
expenses.
https://meta.wikimedia.org/wiki/Grants:TPS
--
Quim Gil
Engineering Community Manager @ Wikimedia Foundation
http://www.mediawiki.org/wiki/User:Qgil
Hi,
I want to build a tool that will generate statistics for my native
wiki. Before starting to build it, I searched for an existing one.
The "Super Counter"[1] has some of the features I want to implement in my
tool.
Is there any way to use the "Super Counter" as an API? If so, can anyone
point me to the documentation?
[1] - https://tools.wmflabs.org/supercount/index.php
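I don't know whether Super Counter exposes a machine-readable endpoint, but the standard MediaWiki action API already provides per-user edit counts (list=users with usprop=editcount), which may cover part of what such a tool needs. A minimal sketch in Python; the wiki URL and username below are placeholders, not taken from the thread:

```python
from urllib.parse import urlencode

def editcount_url(api_base: str, username: str) -> str:
    """Build a MediaWiki action API query URL for a user's edit count."""
    params = {
        "action": "query",
        "list": "users",
        "ususers": username,
        "usprop": "editcount",
        "format": "json",
    }
    return api_base + "?" + urlencode(params)

# Placeholder wiki and username, for illustration only.
url = editcount_url("https://bn.wikipedia.org/w/api.php", "ExampleUser")
print(url)

# To actually query the wiki:
# import json, urllib.request
# data = json.load(urllib.request.urlopen(url))
# print(data["query"]["users"][0].get("editcount"))
```

This only covers raw edit counts; richer statistics would need other API modules or the replica databases on Tool Labs.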
--
*Nasir Khan Saikat*
www.nasirkhn.com