A late notice for people at MWDS, and my apologies for everyone else. We
(ArchCom + me) have belatedly added a session called "Future of the
Architecture Committee" after the "How MediaWiki Slows Us Down" plenary,
which will be at 1:45pm PST on Tuesday (tomorrow).
Task associated with this:
Here's the list of questions we currently plan to tackle:
1. Should the ArchCom continue to exist as a committee with regular
meetings?
2. Who takes care of RFCs?
3. Who drives paying down technical debt?
4. How does ArchCom/whatever integrate with quarterly planning at WMF
and in the MediaWiki dev community in general?
I've included my earlier email below titled "No more Architecture
Committee?" as refresher prereading for this (and ignore my rampant
transposition of the middle letters of BDFL, though in my defense, the BDFL
issue is a BFD) :-)
---------- Forwarded message ----------
From: Rob Lanphier <robla(a)wikimedia.org>
Date: Thu, Jan 15, 2015 at 8:04 PM
Subject: Fwd: No more Architecture Committee?
To: Wikimedia developers <wikitech-l(a)lists.wikimedia.org>
(Alright...let's try this again!)
The current MediaWiki Architecture Committee has its roots in a
2013 Amsterdam Hackathon session, where we had a pair of sessions
to try to establish our architectural guidelines. It was there
that we agreed that it would be good to revive our then moribund
process for reviewing RFCs. Since no one there really knew whose
job it was to review these things, I believe I said "how about we
start with everyone with 'architect' in their title at WMF?", which
was met with uncomfortable shrugging that I interpreted as
"consensus!", and no one corrected me. Thus Brion Vibber, Mark
Bergsma, and Tim Starling became the founding members of the
Architecture Committee.

Subsequent to that meeting, I pretended to proceed as though a
decision was made. However, over the past year and a half since then,
there's been much more uncomfortable shrugging. Even Brion, Mark, and
Tim have not seemed entirely comfortable with the idea. It was widely
acknowledged that the group was heavily biased toward the lower parts
of our server software stack. The committee agreed to add Roan
Kattouw and Daniel Kinzler to the group as a means of providing a
wider perspective, with the added bonus of adding at least one person
who isn't a WMF employee.
So, here we are today. I believe no one would dispute the credentials
of any member of the group. Brion, Tim, and Mark have an extremely
long history with the project, being employees #1, #2, and #3 of the
WMF respectively, and all having contributed massively to the success
of Wikipedia and to MediaWiki as general purpose wiki software. In
most open source projects, one of them would probably be BFDL.
Roan and Daniel are more "recent", but only in relative terms, and
also have very significant contributions to their name. All have the
widespread respect of pretty much everyone in the MediaWiki developer
community.

Additionally, I hear quite a bit of relief that the previously
moribund RFC process is doing much better now. Things are moving, and
if you know how to work the process and aren't proposing anything too
wild, you can get an RFC approved pretty quickly. The committee has
made a lap through the entire backlog of RFCs.
Still, the uncomfortable shrugging continues. The group is broader,
but it still lacks breadth, particularly in the front end and in the
development of newer services such as Parsoid and RESTBase. This
aspect is pretty obviously something that can be fixed. Another
problem is that the scope of the group isn't clear to everyone. Is
this group responsible for leading, or merely responsible for
reviewing big ideas from others to ensure continuity and sanity? How
big does an idea need to be before an RFC needs to be written (as
opposed to just dropping a patch in Gerrit)? Defining the scope of
the group is also a fixable problem.
However, I don't sense much of a desire to fix things. The dominant
meme that I hear is that we should go back to the day before
uncomfortable shrugging led to a committee becoming BFDL. What I
fear, though, is that we will develop a system lacking in conceptual
integrity, as individual warring fiefdoms emerge. It's quite easy to
argue that this is already happening.
So, where does that leave us? Do we need a BFDL? If so, who should
pick? Should it be someone in the project? Should the WMF hire
someone to lead this? If not, do we keep the committee? Do we just
let this be consensus based?
On the leadership front, let me throw out a hypothetical: should we
have MediaWiki 2.0, where we start with an empty repository and build
up? If so, who makes that decision? If not, what is our alternative
vision? Who is going to define it? Is what we have good enough?
In general, I feel a sense of urgency that seems lacking in the status
quo. We've made progress over the past couple of years, but it
doesn't feel like our progress is entirely up to the task. We have a
collection of *many* instances of individual or small team excellence
that are sadly the mere sum of their parts. My intuition is that we
lose out on multiplicative effects as we fail to engage the wider
group in our activities, and as we lack engineering-level
orchestration. Team-level pride in fantastic work drifts into
project-level despair, as many excellent engineers fail to grasp how
to make big changes outside of their limited domains.
Perhaps I'm being too hyperbolic. Perhaps the answer is "embrace the
chaos; it's the Wiki Way(tm)." I don't buy it, but I'm probably one of
the easier people to convince of this. I think if this is the way
it's going to be, someone needs to make the case for how this is
actually working now. Step up.
Perhaps I'm also suffering from living inside the WMF echo chamber for
too long. It could be that the general pessimism about the direction
of MediaWiki (or lack thereof) is not shared out here. Perhaps people
who get most of their news from this mailing list are perfectly happy
with the status quo, and appreciate the balance we've struck with our
weekly meetings and a committee whose membership is not entirely WMF.

To be clear here: this is not an announcement about actually
disbanding the committee. Even though I had a hand in creating it,
it'd be dumb for me to unilaterally pressure the committee to disband.
I may nudge and cajole (like I'm doing here), but I'm first and
foremost looking to figure out what the consensus is, and then follow
through on it.
I know most of you hate reinventing the wheel, so I'm sending this here
first, before I launch the project :)
Some of you probably know Kiwix (kiwix.org), the offline Wikipedia
reader. I think the idea of this reader is cool: most of you have
probably wanted to access Wikipedia while offline somewhere, but
couldn't. Kiwix can help with this; however, it has one big problem, and
the solution is so complex that it would basically require a rewrite of
the whole thing.
The problem is that you need to download a pretty huge file (40+ GB) in
order to use it with, for example, English Wikipedia. And if you want to
update the few wiki pages you are interested in to the latest revision,
you need to download that huge file all over again.
That sucks, especially over GPRS and similar connectivity, and also
because mobile phones don't even have space for that much data. My idea
is to create an app similar to Kiwix that would use an SQLite DB and the
MediaWiki API to (slowly, Apache-friendly) download the contents of any
MediaWiki installation based on user selection. You could download just
one page for offline reading, or one category. Or 1000 categories. Or
precompiled sets of pages created by users (books). You could easily
update these to the latest version using the API at any time, fetch the
media files for those pages, etc.
(You could probably even edit the pages offline and push the changes
when you are back online, but that is just an extra feature.)
I think this approach would work much better, and it's sad that Kiwix
doesn't already support it. If it worked, this new code could at some
point be merged back into Kiwix; I am going to use C++ in the end, which
Kiwix uses as well.
What do you think, is it worth working on? Is there actually a community
of "offline Wikipedia readers" that would use it?
I am kind of late to the party, but I have upgraded one of
my throwaway development wikis with the usual
"git remote update && git merge && php maintenance/update.php" process,
and after the above succeeded I was nevertheless greeted by:
Fatal error: Class 'Cdb\Reader' not found
an exception coming out of includes/cache/LocalisationCache.php on line 1263.
It seems that I just forgot to update the "vendor" directory
(I am somewhat reluctant to run composer because it wants allow_url_fopen=1).
Would it be reasonable to add some basic external-library checks to
update.php, to remind users to update those core components prior to
accessing the wiki?
Btw, I think the UPGRADE doc does not (yet) mention the new process.
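For what it's worth, the check wouldn't need to be smart: just look for a few files that only exist after "composer install". A sketch of the idea, in Python for brevity (the real thing would be PHP inside update.php, and the exact paths are my assumption based on a stock checkout):

```python
import os

# Paths whose absence suggests "composer install/update" was skipped.
# These are assumptions from a stock MediaWiki checkout; update.php would
# check for the actual libraries core depends on (e.g. wikimedia/cdb).
REQUIRED = [
    os.path.join("vendor", "autoload.php"),
    os.path.join("vendor", "wikimedia", "cdb"),
]

def missing_vendor_files(install_path):
    """Return the required vendor paths absent under install_path."""
    return [p for p in REQUIRED
            if not os.path.exists(os.path.join(install_path, p))]

def check_or_warn(install_path):
    # Print a reminder for each missing path; True means all present.
    missing = missing_vendor_files(install_path)
    for p in missing:
        print("WARNING: %s not found; did you run 'composer update'?" % p)
    return not missing
```

Failing early in update.php with a message like that would beat the opaque "Class 'Cdb\Reader' not found" fatal at request time.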
Apologies for cross-listing this--I initially (and erroneously) only sent
it to a single list.
For those of you I haven't yet met, I am a new product manager in SF,
working with the mobile web team (I look forward to meeting you)!
I just wanted to let you all know about a new project we are starting
for Wikipedia's mobile website. The project has been dubbed
"Collections", and our pilot will let users create and share collections
of articles. Here are some of the ways this project explores new ways to
move our mission forward:
- *New ways to contribute:* through curation, Wikipedia readers who are
not interested in traditional editing can have meaningful, creative
interactions with our content
- *Personal:* gives users a way to make content more relevant to
themselves: one's own "list of most important Philosophers" is not
subject to consensus or editing by others. For the time being, a list
will only be accessible via a shared link
- *Shareable:* this project will experiment with the ability to use
Wikipedia to share one's perspective with others and, in doing so,
encourage new users to engage
- *[Future] Browseable:* because each list is not exhaustive, there is a
possibility of using popular lists to promote meaningful content. It's
nice to know the full list of statisticians
<https://en.wikipedia.org/wiki/List_of_statisticians> is there, but I
might want to find a subset that have been picked out by a human for one
reason or another.
Though the pilot is targeting and supporting readers, we think there are
also potential use cases for editors that we could explore in the future.
One can easily imagine lists for editors to track or share their
contributions, such as: "Articles that I want to write/expand during 2015"
or "Articles I created in 2013".
A fair amount of thought has gone into why we are launching this
particular project and how we might approach it, but we only started
exploratory development work last week. Here is the team:
[I know that this project overlaps with several existing features in
terms of raw functionality, so if you have any relevant experience with
the editing or technical support of lists, the Collection extension
(books), or watchlists, and are interested in sharing your experience,
please feel free to reach out to me directly. We very much want to learn
from previous efforts.]
Also, please reach out to me or anyone else on the team if you have any
questions or concerns about the feature, team, etc.
Cool. Good idea. /me flags for reading this weekend.
On Tue, Jan 20, 2015 at 10:54 AM, Dario Taraborelli <
> I’ve been discussing with the folks at CrossRef (the largest registry of
> Digital Object Identifiers, think of it as the ICANN of science) how to
> accurately measure the impact of traffic driven from Wikipedia/Wikimedia to
> scholarly resources.
> While digging into their data, we realized that since Wikimedia started
> the HTTPS switchover and an increasing portion of inbound traffic happens
> over SSL, Wikimedia sites may have stopped advertising themselves as
> sources of referred traffic to external sites. While this is a literal
> implication of HTTPS, it means that Wikimedia's impact on traffic directed
> to other sites is becoming largely invisible and Wikimedia might be turning
> into a large source of dark traffic.
> I wrote a proposal reviewing the CrossRef use case and discussing how
> other top web properties deal with this issue by adopting a so-called
> "Referrer Policy":
> Feedback is welcome on the talk page:
On Jan 22, 2015 6:43 PM, "Brian Wolff" <bawolff(a)gmail.com> wrote:
> On Jan 22, 2015 2:08 PM, "Tyler Romeo" <tylerromeo(a)gmail.com> wrote:
> > I think that’s kind of insulting to those of us who don’t work at the
> > WMF. Just because they hire the “best and the brightest” does not mean
> > there are not people out there who are just as intelligent, if not
> > more, but do not or cannot work for the WMF for whatever reason.
> > Restricting ArchCom to WMF employees is just about the stupidest thing
> > you could do for an open source software project. It defeats the
> > entire purpose of MediaWiki being open source.
I apologize, I didn't mean to imply non-WMF employees are any less
bright than WMF employees.
What I meant to say (which I didn't express very well) is that the arch
committee (essentially BDFL-by-committee in my understanding, not just
about architecture but also "vision" for MediaWiki) should be composed
of leaders of the community who have been in the MediaWiki community a
long time and have fairly universal respect due to demonstrating
"wisdom" over the long term.
I don't think the arch committee should be composed solely of WMF'ers;
I think selection should be made entirely independently of affiliation
(so working for the WMF should not disqualify someone). It just happens
that the people who I think are likely candidates all currently work for
the WMF.
This assumes, of course, that the WMF won't force its employees to hold
certain opinions. I don't think they have any intention of doing so.
After all, look at the current dev summit attendance list. How many
people on that list:
* have been fairly regularly active devs for at least 5 years
* have demonstrated "wisdom" (however you define that)
* do not currently work for the WMF?
OTOH, perhaps other people have a different conception of what the arch
committee should "be", or of what the criteria for membership should be.
I just noticed wfRunHooks got deprecated. The hook mechanism is heavily
relied on by extensions, so if it is going away, what will it be
replaced by? There is no hint of this in the method doc.
Jeroen De Dauw - http://www.bn2vs.com
Software craftsmanship advocate
Evil software architect at Wikimedia Germany