The call for submissions for Wikimania 2018
<https://wikimania2018.wikimedia.org/wiki/Wikimania> is now open (with
apologies for cross-posting), woohoo!
Full instructions on making a proposal are here:
https://wikimania2018.wikimedia.org/wiki/Submissions (deadline is March
18th, 2018) and additional event information is in the email below.
We're looking forward to seeing you all in Cape Town, South Africa!
Program Manager, Engineering
---------- Forwarded message ----------
From: Liam Wyatt <liamwyatt(a)gmail.com>
Date: Sun, Feb 4, 2018 at 11:21 PM
Subject: [Wikimania-program] Wikimania 2018 Call for Submissions
To: "Wikimania general list (open subscription)" <
wikimania-l(a)lists.wikimedia.org>, Wikimedia Mailing List <
Cc: Program committee list <wikimania-program(a)lists.wikimedia.org>
Dear Wikimedia community,
We are pleased to announce that Wikimania 2018 is now accepting proposals
for workshops, discussions, presentations, or research posters to give
during the conference. To read the full instructions visit the event wiki
and click on the link provided there to make your proposal:
The deadline for submissions is 23:59 UTC on *Sunday March 18, 2018*.
This is approximately 6 weeks away. Whether you are a community member of
one of the Wikimedia projects, or a fellow open content creator or
consumer, we welcome your proposal for a session.
This year, the conference will be taking place in Cape Town, South Africa,
where the organisers are giving this Wikimania a unique flavor — an
explicit theme based in African philosophy:
“Bridging knowledge gaps, the *ubuntu* way forward.”
Read more about this theme, why it was chosen, and what it means for the
conference program at the Wikimedia blog:
Throughout the conference program, this theme will be tightly held but
loosely defined, in order to encourage a diverse range of responses to the
theme. It is our hope that this theme will give us the opportunity to
further our goal of creating the “sum of human knowledge”, by encouraging
greater diversity and inclusion in who participates and what we discuss at
the conference.
To learn more, and to make a proposal for the conference, please visit:
Please forward this announcement to other lists and groups across the
We look forward to reading your submissions. Sincerely,
*Program committee co-chairs Emna Mizouni, Felix Nartey, and Liam Wyatt.*
Wikimania-program mailing list
Since Friday, CI was unable to process changes that received a CR+2 if they
declared a Depends-On on an open change. The issue should be fixed as
of 21:30 UTC, when I hotfixed Zuul.
I would like to thank:
* Kunal Mehta, who took personal time over the weekend to restart Zuul
and restore the service multiple times.
* Paladox, who assisted in pointing out the root cause.
* Tyler Cipriani, who live peer-reviewed the debugging session.
My apologies to all developers whose patches were affected.
I will write an incident report later on and reply on this thread when
it is complete.
Antoine "hashar" Musso
Sorry for cross-posting!
Reminder: Technical Advice IRC meeting again **tomorrow, Wednesday 4-5 pm
UTC** on #wikimedia-tech.
The Technical Advice IRC meeting is open to all volunteer developers,
topics, and questions. These can be anything from "how to get started" and
"who would be the best contact for X" to specific questions about your project.
If you know already what you would like to discuss or ask, please add your
topic to the next meeting: https://www.mediawiki.org/
Hope to see you there!
Michi (for WMDE’s tech team)
Michael F. Schönitzer
Wikimedia Deutschland e.V. | Tempelhofer Ufer 23-24 | 10963 Berlin
Tel. (030) 219 158 26-0
Imagine a world in which every human being can freely share in the sum of
all knowledge. Help us make it happen!
Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e.V.
Registered in the register of associations of the Amtsgericht
Berlin-Charlottenburg under number 23855 B. Recognized as charitable by the
Finanzamt für Körperschaften I Berlin, tax number 27/681/51985.
I made a breaking change last year when I moved Minerva out of
MobileFrontend. I anticipated disruption, so I made sure it broke in an
obvious way (otherwise many mobile sites would have started using the Vector
skin for their mobile web experience, which might be harder for sysadmins to
detect) and provided instructions on how to fix it. We use semver, so I bumped
the version to show it was a breaking change, and I updated mediawiki.org with
lots of docs.
What I didn't realise is that many third parties do not check their server
logs when updating and do not necessarily know what semver means.
I'm getting feedback from impacted third parties that they do check the
release notes and the update script, but as far as I'm aware, extensions and
skins cannot update these in any way. Is my understanding correct? If not,
is there a way we could make them do so, either via a hook or a convention
(e.g. an identically named file in each repo)?
Do we have any guidelines for breaking changes in extensions and skins on
Senior Software Engineer
Google is doing some research into Gerrit and is looking for
participants in a research study. Considering how critical Gerrit is to our
work, I thought people might be interested in participating. They will be
doing video/audio sessions with users remotely (I'm assuming something via
Obviously, this is totally optional, but if you think you would like to
participate then follow the signup link in the forwarded e-mail. The first
sessions are next week :)
The original thread is at
On Thursday, February 1, 2018 at 11:00:35 PM UTC-8, Billy Leet wrote:
> Hi friends!
> I'm Billy from the Google User Research team working on Gerrit. If you're
> interested in shaping the future of Gerrit, sign up for a user study to
> share your thoughts with our team.
> We're currently scheduling 60 minute research sessions in the coming
> months to get feedback from users like you. The signup process is quick –
> please fill out the form below, and we'll be in touch soon to schedule a
> time to chat with you. (If you're a site admin, feel free to sign up too,
> and it'd be great if you could forward this to your developers!)
> *Sign up here! <https://goo.gl/forms/KQRkfRVT69tQ3xZN2>*
> Our current study dates are *Feb 6 - 7*, and *Mar 6 - 7*. Even if you're
> not available on those days, feel free to sign up and we'd be happy to
> reach out to you when future study dates are available.
> Billy Leet
> UX Research Coordinator at Google
> P.S. we offer participants a token of appreciation for taking part in a
> study. We know your time is valuable!
Thanks to Brad for the information regarding the discrepancies between
the actual numbers fetched from the API and the Statistics page, which stem
from a dedicated (but not necessarily in-sync) table.
initSiteStats.php --update --active --use-master
solved most issues. Since I neither own the wiki nor have access to the
system, I politely asked the admin to run it.
What remains is a gap between the number of edits/revisions on the
statistics page and the numbers I get when iterating over the pages and
fetching queries via:
The number of pages and users do match, so I am sure I am not missing any pages.
The statistics report 37660 revisions; querying via the API over all pages
sums to 32659, leaving 5001 revisions unaccounted for.
Running initSiteStats.php also updated the statistics report from
46329 to 37660.
=> Is my assumption correct that the sum of all revisions queried via
api.php?action=query&prop=revisions&pageids= should match the statistics
page's number of revisions?
=> At first I also queried for content, which resulted in 968 cases where
I lacked permission to query. So I omitted the content for test
purposes, but this did not solve the problem.
=> What is the difference between continue=|| and rvcontinue=123456?
Until now I have only been using rvcontinue. Including continue did not
make a difference, but I could not find out what the meaning of
continue=|| is.
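For what it's worth, the continuation pattern the question is about can be sketched as follows. This is an illustrative sketch, not the poster's actual script: `count_revisions` and `api_fetch` are hypothetical names, and a fake in-memory API stands in for api.php so the loop shape is visible. The key point is that the response's entire "continue" object (which is where the continue=|| marker appears) is merged back into the request parameters, rather than copying rvcontinue alone.

```python
def count_revisions(api_fetch, pageids):
    """Sum revisions across pages, following API continuation.

    `api_fetch` is any callable taking a params dict and returning the
    parsed JSON response (e.g. a thin wrapper around an HTTP GET to
    api.php). Hypothetical helper; not part of MediaWiki itself.
    """
    total = 0
    params = {
        "action": "query",
        "prop": "revisions",
        "rvprop": "ids",     # omit content to avoid permission errors
        "rvlimit": "max",
        "pageids": "|".join(str(p) for p in pageids),
        "format": "json",
    }
    while True:
        data = api_fetch(params)
        for page in data.get("query", {}).get("pages", {}).values():
            total += len(page.get("revisions", []))
        if "continue" not in data:
            break
        # Merge *all* continuation keys back in, not just rvcontinue;
        # the server returns e.g. {"rvcontinue": "...", "continue": "||"}.
        params.update(data["continue"])
    return total


# Fake responses standing in for api.php, to show the loop shape:
_responses = [
    {"query": {"pages": {"1": {"revisions": [{"revid": 10}, {"revid": 11}]}}},
     "continue": {"rvcontinue": "20180101|12", "continue": "||"}},
    {"query": {"pages": {"1": {"revisions": [{"revid": 12}]}}}},
]
print(count_revisions(lambda params: _responses.pop(0), [1]))  # 3
```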
Thank you (again) for your time and support.
Exporting pages on en.wikipedia may result in a failure; this is being
investigated right now. For stability reasons, the time allowed to export
pages has been temporarily limited to avoid a worse outage affecting regular
page views and edits. While we do not have any advice right now about
changing behaviour when exporting pages, we advise checking that any exports
are successful until this is resolved, especially if they are done unattended
by bots: a portion of those exports could be failing without anyone noticing.
Right now probably only a single, not-logged-in user is affected on
en.wikipedia, around 1200 times in the last 12 hours, but in the future it
could affect other users on other wikis, too.
This is publicly tracked on:
If you use the wiki export function and it is still working for you, or has
started failing, feel free to give us feedback on the ticket above.
Pregenerated dumps at https://dumps.wikimedia.org/backup-index.html or the
wikireplicas are almost universally a better way to get mass revisions.
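For bots doing unattended exports, one minimal sanity check is to verify that the response still parses as XML with the expected <mediawiki> root element, since a truncated export typically fails partway through the document. A sketch under that assumption; `export_looks_complete` is a hypothetical helper, not part of any MediaWiki client library, and it is a best-effort heuristic, not a guarantee that every requested revision was exported:

```python
import xml.etree.ElementTree as ET


def export_looks_complete(xml_text):
    """Heuristically check that a Special:Export response is complete.

    A cut-off export usually fails XML parsing because the closing
    </mediawiki> tag (and likely other closing tags) never arrived.
    """
    try:
        root = ET.fromstring(xml_text)
    except ET.ParseError:
        return False
    # The export root is <mediawiki> in the MediaWiki export namespace,
    # so the qualified tag looks like "{...export-0.10/}mediawiki".
    return root.tag.endswith("mediawiki")


good = ('<mediawiki xmlns="http://www.mediawiki.org/xml/export-0.10/">'
        '<page/></mediawiki>')
truncated = '<mediawiki><page><revision>'
print(export_looks_complete(good))       # True
print(export_looks_complete(truncated))  # False
```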