When api.php was basically the only API in MediaWiki, calling it "the API"
worked well. But now we've got a Parsoid API, Gabriel's work on a REST
content API, Gabriel's work on an internal storage API, and more on the
way. So just saying "the API" is getting confusing.
So let's bikeshed a reasonably short name for it that isn't something awful
like "the api.php API". I'm horrible at naming.
Brad Jorsch (Anomie)
The Reading team is currently focusing energy on speeding up the site
for all our users (https://phabricator.wikimedia.org/T98986 is the
tracking bug where this work can be followed).
Off the back of https://phabricator.wikimedia.org/T105361 I had a
quick chat with Ori to document how the performance team is currently
identifying problems with MediaWiki's code. I'm sharing it here so
that anyone who is interested in helping us improve how quickly our
users can load our content can analyse our data, raise tasks, and
submit patches.
I'm hoping this will be useful for anyone who wants to get involved in
an effort to make our site faster for our users (this is not desktop
specific). If you have anything useful to add, please do; after some
discussion or nods I'd love to share some best practices on-wiki.
Tool 1) Use http://webpagetest.org (no credentials necessary)
* Use https://en.wikipedia.org/wiki/Facebook as an example wiki page
* Choose a region of the world and browser
* Select first view only since this is what we are currently
interested in (repeat view is when the page is loaded again - it should
be quicker as it comes from cache).
* Capture video can be turned off - I personally find the screenshots sufficient.
To shout out some of the advanced settings, the more
interesting/useful features include:
* Chrome > capture dev tools timeline
* Setting speed 3G or 2G
* Script can be used to conditionally turn on things which are not yet
available to everyone e.g. VisualEditor
You can do a lot of this locally in your Chrome browser, but different
browsers may have different behaviours and your local machine is in one
fixed location, so those variations do not get captured that way. The
visual screenshots also make it easier to see where things get blocked.
With the timeline from the advanced settings you can match up white
screens with blocking requests.
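If you want to drive the same kind of test from a script rather than the web form, WebPageTest also has an HTTP API (that route does need an API key). A rough sketch in Python using the requests library - the API key and location values below are placeholders:

import requests

# Rough sketch: submit a first-view-only test to WebPageTest over its HTTP API.
# "YOUR_API_KEY" and the location string are placeholders - adjust for your setup.
resp = requests.get("http://www.webpagetest.org/runtest.php", params={
    "url": "https://en.wikipedia.org/wiki/Facebook",
    "k": "YOUR_API_KEY",          # API key (placeholder)
    "location": "Dulles:Chrome",  # agent location and browser
    "fvonly": 1,                  # first view only, as above
    "f": "json",                  # return the result URLs as JSON
})
print(resp.json()["data"]["userUrl"])  # link to the finished test results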
Tool 2) Add http://performance.wikimedia.org to your browser
bookmarks. The Navigation Timing section is probably the most interesting
right now. It points to https://grafana.wikimedia.org (no credentials
needed), which is powered by http://graphite.wikimedia.org (access
graphite with your wikitech credentials). This data is sourced from
our users, so it is a good representation of how we are doing.
If a graph is missing you can create a new one from data in graphite
by clicking "add row" or editing an existing graph.
Clicking edit on an existing graph, you'll be able to see where its data
comes from in graphite.
Note that for graphs, median data is less sensitive to edge cases, so it
is best to use it as a more realistic indicator.
Folders in graphite are populated by scripts that live in:
To create a graph, simply go to an existing dashboard and save it under a
different name (this clones it) - don't worry, you can't mess up and
delete existing dashboards.
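If you want to pull the raw numbers out of graphite from a script rather than through grafana, the graphite render API can return series as JSON. A minimal sketch - the metric path below is made up, so browse graphite for the real navtiming series, and remember graphite itself wants your wikitech credentials:

import requests

# Minimal sketch: fetch a week of a navigation timing series from graphite's
# render API. The metric path and credentials are placeholders.
resp = requests.get(
    "http://graphite.wikimedia.org/render",
    params={
        "target": "frontend.navtiming.firstPaint.desktop.median",  # placeholder path
        "from": "-7d",
        "format": "json",
    },
    auth=("wikitech-user", "wikitech-password"),  # placeholders
)
for series in resp.json():
    points = [value for value, timestamp in series["datapoints"] if value is not None]
    if points:
        print(series["target"], sum(points) / len(points))  # rough weekly average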
Tool 3) Speedcurve requires you to set up an account, but it gives you
an opinionated view of the things you care about and is nicely presented,
so it could be a good source of inspiration for your own grafana dashboards.
To oversimplify what it does: each day it will access a page, store the
result, and allow you to see historic data.
Note the performance team has plans to set up infrastructure to automate this.
Tool 4) is one we are not using - http://sitespeed.io. We might want
to use it for performance regression testing.
In the grand scheme of things it would be great to get to a place
where Jenkins complains if you cause a regression in firstPaint time.
We are a long way from that, but let's work in that direction :-)
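To make the Jenkins idea a bit more concrete, the check itself can be tiny: read whatever firstPaint number the test run produced and exit non-zero when it blows a budget, so the job fails. A toy sketch - the file name, JSON shape, and budget are all made up:

import json
import sys

BUDGET_MS = 1000  # placeholder budget

# Toy sketch: fail the build if firstPaint regresses past the budget.
# "perf-result.json" and its "firstPaint" field are hypothetical output
# from whichever tool (sitespeed.io, WebPageTest, ...) ran the test.
with open("perf-result.json") as f:
    result = json.load(f)
first_paint = result["firstPaint"]

if first_paint > BUDGET_MS:
    sys.exit("firstPaint %dms exceeds the %dms budget" % (first_paint, BUDGET_MS))
print("firstPaint %dms is within budget" % (first_paint,))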
Let's live up to the Hawaiian word after which we are named!
Apologies if this is oversimplified; please take this as an
opportunity to share how you/your team/your company test page
performance. I see this mailing list as a good place to share this
sort of thing!
In the XML dump files, I get <text ...>plaintext</text>.
When I build a mirror using XML dump files, I get:
However, when I then create a new page on my mirror, I get:
When I build a mirror, I would like to compress the <text
...>plaintext</text> to get:
I would like this done for every text revision, so as to save both disk
space and communication bandwidth between web server and browser.
There is little relevant documentation on <https://www.mediawiki.org>. So I
have run a few experiments.
exp1) I pipe the plaintext through gzip, escape for MySQL, and build the mirror.
However, when I browse, I get the message:
``The revision #165770 of the page named "Main Page" does not exist''
When I look in the database, some kind of ciphertext does indeed exist.
Many utilities compress plaintext using LZ77 and Huffman encoding, but each
differs as to the file header and tail. Some versions of deflate have no
header at all. So I try four more experiments:
exp2) gzip, but throw away the 10 byte header (to simulate deflate)
/bin/gzip | tail -c +11
exp3) perl compress
/usr/bin/perl -MCompress::Zlib -e 'undef $/; print compress(<>)'
exp4) python compress, then throw away the single-quotes
/usr/bin/python -c \"import zlib,sys;print
repr(zlib.compress(sys.stdin.read()))\" | /bin/sed 's/^.//; s/.$//'
exp5) zlib-flate from the qpdf DEB package
For all experiments, the browser gives the same error message.
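For reference, the framing differences are easy to see from Python: zlib.compress() gives RFC 1950 framing (a 2-byte header plus a 4-byte Adler-32 trailer), gzip adds its own 10-byte header and 8-byte CRC-32/length trailer, and a negative window size gives the bare RFC 1951 deflate stream with no framing at all, which is what PHP's gzdeflate() emits:

import zlib

data = b"plaintext"

# RFC 1950: 2-byte zlib header + deflate stream + 4-byte Adler-32 trailer
zlib_blob = zlib.compress(data)

# RFC 1951: bare deflate stream, no header or trailer (what gzdeflate() produces)
compressor = zlib.compressobj(9, zlib.DEFLATED, -zlib.MAX_WBITS)
raw_blob = compressor.compress(data) + compressor.flush()

print(repr(zlib_blob))
print(repr(raw_blob))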
4) Reading compressed old_text
It should be possible to read the old_text ciphertext using command-line tools.
I created a user page which mediawiki stored compressed. It is displayed
correctly by the browser. But when I tried to read it directly from the
database, there were problems.
(shell) mysql --host=localhost --user=root --password simplewiki
--skip-column-names --silent --execute 'select old_text from
simplewiki.text where old_id=5146705' | zlib-flate -uncompress
flate: inflate: data: incorrect data check
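One hypothesis: if what is stored is a bare deflate stream (which is what PHP's gzdeflate() would produce), zlib-flate would fail exactly like this, since it expects the zlib header and Adler-32 checksum; the mysql command-line client also escapes some bytes (newlines, tabs, backslashes) in its output, which can corrupt a binary blob in a pipe. A sketch of what I would expect to work in that case, fetching the blob directly and inflating it with a negative window size (pymysql is just one way to run the query):

import zlib
import pymysql  # any MySQL client library will do

conn = pymysql.connect(host="localhost", user="root", passwd="...",
                       db="simplewiki")
cur = conn.cursor()
cur.execute("SELECT old_text FROM text WHERE old_id = 5146705")
(blob,) = cur.fetchone()

# A negative window size tells zlib to expect a bare deflate stream
# (no zlib header, no Adler-32 trailer).
text = zlib.decompress(blob, -zlib.MAX_WBITS)
print(text.decode("utf-8"))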
Please provide documentation as to how mediawiki handles compressed text.
a) How is plaintext compressed?
b) Is the ciphertext escaped for MySQL after compression?
c) How does mediawiki handle old_flags=utf-8,gzip?
d) How are the contents of old_text unescaped and decompressed for display?
e) Where in the mediawiki code should I be looking to understand this?
What if we added extra projects to Phabricator for programming
languages (such as language-php, language-c) which could optionally be
added to tickets when help from people who know these languages is
needed? That way it would be possible, for example, for C++ experts to
filter for open tasks that need a C++ expert to look at them, and so on.
Currently I have a few such tasks that I would like experts in a
particular language to look at, but there isn't really an easy way to do that.
What do you think? Should we add these meta-projects?
I'm writing with plans for the Wikimedia iOS engineering team to move its
workflow to GitHub with Travis CI, much like RESTbase.
The Wikimedia iOS engineers have been maintaining their own CI and build
server and using Gerrit for code review. The more time efficient and
commonplace approach for open source iOS software development leans heavily
on GitHub with Travis CI instead (e.g., WordPress and Firefox).
By using GitHub with Travis CI, the team believes it will work faster,
improve testing, grow developer confidence in making code changes, and,
most importantly, deploy fewer bugs to production.
Builds requiring sensitive information (e.g., prod certs) will
continue to run on WMF's Mac Mini. As is done for Android, when betas are
pushed, the team will notify mobile-l.
Feel free to reply or email me directly with any questions or comments.
EN Wikipedia user page: https://en.wikipedia.org/wiki/User:Brian.gerstle
I'm happy to hear that VE is coming to mobile web.
I'd like to know more about what the plans are for user testing of VE on
mobile. Would that happen in late Q2 at the earliest, and how much emphasis
will there be on user testing mobile VE while it's in beta? My prime
interest is in making sure that the transition out of beta is smooth and
that end users have a good experience with mobile VE from the moment that
it leaves beta.
On Jul 27, 2015 10:12 AM, "Adam Baso" <abaso(a)wikimedia.org> wrote:
> Cross posting to mobile-l.
> ---------- Forwarded message ----------
> From: Adam Baso <abaso(a)wikimedia.org>
> Date: Mon, Jul 27, 2015 at 10:11 AM
> Subject: Editing + Reading Meeting Notes
> To: Wikimedia developers <wikitech-l(a)lists.wikimedia.org>
> Hi all,
> James Forrester, Florian, and I met Wednesday, 22-July-2015 to discuss the
> Editing roadmap, with the backdrop of editing modes available in mobile web
> and apps but both the mobile web and apps teams now being in the Reading
> department. Notes:
> * FY 2015-2016: active Editing development for apps not planned
> * General plan is to replace the current mobile web editing experiences
> with new (replacement) VE and wikitext editor maintained by the VE team for
> mobile web
> * Q1: VE mobile prototyping (doesn't require Reading involvement)
> * Q2: Editing for mobile web replacement *coding* starting, but rollout on
> mobile web would begin in some future quarter _after_ Q2
> * Feature submission practices for volunteers submitting editing related
> stuff for the mobile web for the time being (feature submissions
> discouraged for now as it will end up being replaced; mainline VisualEditor
> / next gen wikitext editing in collaboration with VE team would probably
> make more sense) -
> ** Create task in reading-web Phabricator board and indicate the details
> of what you were thinking to work on and roughly when. Add James Forrester
> and Joaquin Hernandez to card.
> ** Reach out to James_F (Senior Product Manager, VisualEditor) and joakino
> (Reading Web engineering product owner and tech lead) on #wikimedia-mobile
> on Freenode to discuss the idea and to determine who would need to code
> review and test
> * Code review for bugfixes for the existing mobile editing code should be
> done by Reading Web, and code review plus testing should be done by Editing
> as well. Ping joakino and James_F on IRC to figure out who to add to the review.
> * As the Editing team gets into the practice of submitting patches for
> MobileFrontend to swap out the editor, as usual, tasks should be filed well
> ahead of time in the reading-web Phabricator board so there's a heads up
> about potential code review. Also, Editing and Reading should be tracking
> Q2 and subsequent quarter planning together to ensure dependencies are
> clearly defined and agreed upon.
> I also spoke to Roan from Collaboration after the Scrum of Scrums the same
> day. Roan indicated that there isn't an emphasis on rolling out Flow to
> mobile Wikipedias en masse for FY 2015-2016. And generally, when Flow does
> become slated for rollout on the mobile Wikipedias and sister projects in a
> broader sense it shouldn't require work - or anything substantial, anyway -
> from the Reading team.
I'm writing an extension for MediaWiki so users can use second-factor
authentication on their MediaWiki accounts.
I would like to know if there is any project on this topic, or if there is
already someone working on something like this.
I'm planning on working on a MediaWiki extension that uses Latch (
https://latch.elevenpaths.com/www/service.html ) as a second factor.
This is my first time collaborating on a project this size so any advice
will be very welcome.
Thank you for your time.