There has been a lot of talk over the last year or so of how and when
to move MediaWiki to a service-oriented architecture [0]. So much so
that it is actually one of the marquee topics at the upcoming
Developer Summit.
I think the problem statement for the services RFC is dead on in
describing issues that MediaWiki and the Wikimedia Foundation face
today with the current monolithic MediaWiki implementation. We have
organic entanglements between subsystems that make reasoning about the
code base difficult. We have a growing need for API access to data and
computations that have historically been only presented via generated
HTML. We have cross-team and cross-project communication issues that
lead to siloed implementations and duplication of effort.
The solution to these issues proposed in the RFC is to create
independent services (e.g. Parsoid, RESTBase) to implement features that
were previously handled by the core MediaWiki application. Thus far
Parsoid is only required if a wiki wants to use VisualEditor. There
has been discussion, however, of it being required in some future
version of MediaWiki where HTML is the canonical representation of
articles {{citation needed}}. This particular future may or may not be
far off on the calendar, but there are other services that have been
proposed (storage service, REST content API) that are likely to appear
in production use at least for the Foundation projects within the next
year.
One of the bigger questions I have about the potential shift to
requiring services is the fate of shared hosting deployments of
MediaWiki. What will happen to the numerous MediaWiki installs on
shared hosting providers like 1and1, Dreamhost or GoDaddy when running
MediaWiki requires multiple node.js/java/hack/python stand-alone
processes to function? Is the MediaWiki community making a conscious
decision to abandon these customers? If so, should we start looking for
a suitable replacement that can be recommended, and possibly develop
tools to ease the migration away from MediaWiki to another monolithic
wiki application? If not, how are we going to ensure that pure PHP
alternate implementations get equal testing and feature development if
they are not actively used on the Foundation's project wikis?
[0]: https://www.mediawiki.org/wiki/Requests_for_comment/Services_and_narrow_int…
Bryan
--
Bryan Davis Wikimedia Foundation <bd808(a)wikimedia.org>
[[m:User:BDavis_(WMF)]] Sr Software Engineer Boise, ID USA
irc: bd808 v:415.839.6885 x6855
Greetings,
Please join the Affiliations Committee in congratulating the MediaWiki
Farmers User Group on their official approval as a Wikimedia User Group:
https://meta.wikimedia.org/wiki/Affiliations_Committee/Resolutions/MediaWik…
The MediaWiki Farmers User Group is "A user group of third-party developers
who work on wiki farms. Our mission is to improve and standardize the way
MediaWiki wiki farms are setup and run."
Anyone interested in more information about the group can visit:
https://www.mediawiki.org/wiki/Project:MediaWiki_Farmers_user_group
Again - congratulations on the recognition and best wishes for the group's
future work!
-greg aka varnent
Vice Chair, Wikimedia Affiliations Committee
I'm entirely pleased to announce that Chad Horohoe is joining the
Wikimedia Foundation's Release Engineering Team.
This community needs no introduction to Chad so I thought I’d share a
little bit of my personal experience with him.
I first worked with Chad on the MediaWiki Core Team; in fact, Chad was
one of the people who interviewed me for my initial position at WMF.
That's when I got to see firsthand Chad's relentlessness and strong
(and vocal) commitment to the mission of the WMF. He tells you when he
thinks you're wrong (in a nice way), but most importantly he helps you make
things right. In short, he follows up his critique with patches (both
literally and figuratively).
Chad will be an obvious asset to the Release Engineering team: his
experience with Gerrit, his (already deep) knowledge of Phabricator, his
understanding of how all the pieces of our system work together (which
will help Beta Cluster), and his familiarity with (and annoyances about)
our deployment tooling.
Even though I can't express through email[0] my excitement that Chad is
joining the Release Engineering team, please join me in welcoming him to
his new role.
Greg
[0] !!!!!!!!!!!!!!1!!!111eleven
--
| Greg Grossmeier GPG: B2FA 27B1 F7EB D327 6B8E |
| identi.ca: @greg A18D 1138 8E47 FAC8 1C7D |
(Combining pieces of Jay's thread and pieces of the shared hosting thread.)
Daniel Friesen wrote:
>Parsoid can do Parsoid DOM to WikiText conversions. So I believe the
>suggestion is that storage be switched entirely to the Parsoid DOM and
>WikiText in classic editing just becomes a method of editing the content
>that is stored as Parsoid DOM in the backend.
Tim Starling wrote:
> Parsoid depends on the MediaWiki parser, it calls it via api.php. It's
>not a complete, standalone implementation of wikitext to HTML
>transformation.
>
>
> HTML storage would be a pretty simple feature, and would allow
>third-party users to use VE without Parsoid. It's not so simple to use
>Parsoid without the MediaWiki parser, especially if you want to support
>all existing extensions.
>
>
> So, as currently proposed, HTML storage is actually a way to reduce the
>dependency on services for non-WMF wikis, not to increase it.
>
> Based on recent comments from Gabriel and Subbu, my understanding is
>that there are no plans to drop the MediaWiki parser at the moment.
Yeah... what is this all about? My understanding (and please correct me if
I'm wrong) is that Parsoid is/was intended to be a standalone service
capable of translating wikitext <--> HTML. You seem to be stating that
Parsoid is neither complete nor standalone. Why?
Currently Parsoid is the largest client of the MediaWiki PHP parser, I'm
told. If Parsoid is regularly calling and relying upon the MediaWiki PHP
parser, what exactly is the point of Parsoid?
How much parity is there between Parsoid running without the MediaWiki
parser and the MediaWiki parser itself? That is, if you selected a random sample
of pages from a Wikimedia wiki, how many of them could Parsoid correctly
parse on its own? And from this question flows another: why is Parsoid
calling MediaWiki's api.php so regularly?
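(To make the question concrete: my rough understanding is that the
dependency looks something like the call below. This is a simplification
and an assumption on my part, not Parsoid's actual request code, which is
batched and carries many more parameters.)

// Ask the PHP parser, via the core API, to expand a transclusion the way
// it normally would; Parsoid then works with the expanded wikitext.
$api = 'https://en.wikipedia.org/w/api.php';
$params = array(
	'action' => 'expandtemplates',
	'text'   => '{{convert|1|km|mi}}',
	'format' => 'json',
);
$expanded = file_get_contents( $api . '?' . http_build_query( $params ) );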
I'm also interested in Parsoid's development as it relates to the broader
push for services. If Parsoid is going to be the model of future services
development, I'd like a clearer evaluation of what kind of model it is.
Again, please correct me if I'm wrong, mistaken, misinformed, etc., but
from my place of limited knowledge, it sounds very unappealing to create
large Node.js applications ("services") that closely tie in and require(!)
PHP counterparts. This seems like the opposite of moving toward a more
flexible, modular architecture. From my perspective, it would seem to only
saddle us with additional technical debt moving forward, as we double
complexity indefinitely.
MZMcBride
Quick update:
I've had a great experience working on our mobile apps, but it's time to
get back to core MediaWiki and help "clean my own house"... now that we've
got Mobile Apps fully staffed I'm leaving the mobile department and will be
reporting directly to Damon within WMF Engineering.
First -- huge thanks to Monte and Dan and Kristen and Dmitry and Bernd and
of course Tomasz!! and everybody else who's been awesome in Mobile Apps --
and also to the rest of the mobile team, who have become too many to list
in a concise email. :)
For the moment I'm going to get back up to speed with the Architecture
Committee and push at general MediaWiki issues. As we determine the fate of
committees and narrow down what our priority projects are, my focus may
narrow a bit to getting some particular things done over the coming months.
A few general notes:
* Working in mobile apps reminded me how important our APIs are -- our
ability to make flexible interfaces that work in different form factors and
different technology stacks is dependent on maintaining a good API. This
needs work. :)
This doesn't just mean interacting with "api.php" -- we need clean
configuration, data, etc. interfaces as well, especially if we want people
to contribute in ways other than raw text editing. There's a lot to clean
up... (A trivial example of the kind of api.php interaction I mean is
below, after this list.)
* Mobile mobile mobile! I've heard some folks complain that while there's a
lot of talk about "mobile-first" and similar there aren't always concrete
explanations yet of what that means. I hope to bring some of the excitement
we've seen in Mobile about Wikidata, better queries, better visual/user
interaction design, and generally making things *work for users*.
* Breaking or working around the "PHP barrier" for third-party MediaWiki
users: I hope to get the question of services resolved one way or another
-- either by us officially dropping "shared PHP hosting" support or by
making sure we have "pure PHP" implementations of things that are required
to operate -- which is mostly dependent on having good interfaces and APIs
so that multiple implementations can be written and maintained compatibly...
* Future stuff -- new media types, narrow-field editing, natural language
queries? WHO KNOWS! I'll be researching more crazy stuff in my additional
time.
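To make the api.php point above concrete (nothing new here, just the kind
of vanilla core API query that mobile and third-party clients depend on
today):

// Fetch the current wikitext of a page as JSON via the core API.
$api = 'https://en.wikipedia.org/w/api.php';
$params = array(
	'action' => 'query',
	'titles' => 'San Francisco',
	'prop'   => 'revisions',
	'rvprop' => 'content',
	'format' => 'json',
);
$data = json_decode( file_get_contents( $api . '?' . http_build_query( $params ) ), true );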
I'll see many of you at the Dev Summit soon enough -- don't be shy about
pestering me with concerns and ideas about priorities. :)
-- brion
[Moving threads for on-topic-ness.]
On 16 January 2015 at 07:01, Brian Wolff <bawolff(a)gmail.com> wrote:
> Does anyone actually have
> anything they want that is difficult to do currently and requires a mass
> compat break?
Sure.
Three quick examples of things on the horizon (I'm not particularly saying
we'd actually do these for Wikimedia's use, but if you're going to ask for
straw man arguments… :-)):
- Get rid of wikitext on the server side.
  - HTML storage only. Remove MWParser from the codebase. All extensions
    that hook into wikitext (so, almost all of them?) will need to be
    re-written.
- Real-time collaborative editing.
  - Huge semantic change to the concept of a 'revision'; we'd probably
    need to re-structure the table from scratch. Breaking change for many
    tools in core and elsewhere.
- Replace local PHP hooks with proper services interfaces instead.
  - Loads of opportunities for improvements here (anti-spam tools 'as a
    service', Wordpress style; pre-flighting saves), but again, pretty
    much everything will need re-writing; this would likely be
    "progressive", happening one at a time to areas where it's
    useful/wanted/needed, but it's still a huge breaking change for many
    extensions (see the sketch after this list).
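To make that last straw man slightly less abstract, here is a very rough
sketch of the shape of change I mean. The handler name, service endpoint,
and payload below are invented for illustration; this is a sketch, not a
proposal:

// Today: spam checking runs as local PHP code, wired up through a hook.
$wgHooks['EditFilterMergedContent'][] = 'MyAntiSpam::onEditFilter';

// One possible future: the same check as an HTTP call to a standalone
// service (Akismet-style), so its implementation language and deployment
// no longer matter to MediaWiki core. Endpoint and payload are made up.
function checkEditWithSpamService( $wikitext, $username ) {
	$ch = curl_init( 'https://antispam.example.org/v1/check' );
	curl_setopt_array( $ch, array(
		CURLOPT_POST => true,
		CURLOPT_POSTFIELDS => json_encode( array(
			'text' => $wikitext,
			'user' => $username,
		) ),
		CURLOPT_HTTPHEADER => array( 'Content-Type: application/json' ),
		CURLOPT_RETURNTRANSFER => true,
	) );
	$response = curl_exec( $ch );
	curl_close( $ch );
	$verdict = $response === false ? null : json_decode( $response, true );
	return is_array( $verdict ) && !empty( $verdict['allowed'] );
}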
> Proposing to rewrite mediawiki because we can without even a
> notion of what we would want to do differently seems silly.
>
Oh, absolutely. I think RobLa's point was that it's unclear who feels
empowered to make that decision (rather than the pitch). I don't. I don't
think RobLa does. Clearly the Architecture Committee don't.
J.
--
James D. Forrester
Product Manager, Editing
Wikimedia Foundation, Inc.
jforrester(a)wikimedia.org | @jdforrester
Hey,
On my local wiki I have a page with the name "File:Blue marker.png". The
following code returns false:
$file = 'Blue marker.png';
$title = Title::makeTitle( NS_FILE, $file );
$title->exists();
That used to return true in the past. Not sure what is broken - my wiki or
MediaWiki itself.
What I want to do is go from string page name, such as "File:Blue
marker.png", to full URL, such as "
http://localhost/phase3/images/6/6f/Blue_marker.png". What is the
recommended way of doing this now (that works on MW 1.19 and later)?
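For concreteness, this is the sort of thing I'd expect to work (just my
guess, not something I know to be the blessed approach across 1.19 and
master):

$title = Title::newFromText( 'File:Blue marker.png' );
$file = $title === null ? false : wfFindFile( $title );
if ( $file !== false ) {
	// e.g. http://localhost/phase3/images/6/6f/Blue_marker.png
	$url = $file->getFullUrl();
}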
Cheers
--
Jeroen De Dauw - http://www.bn2vs.com
Software craftsmanship advocate
Evil software architect at Wikimedia Germany
~=[,,_,,]:3
After 18 months of MediaWiki release management handled externally by
Markus Glaser and Mark Hershberger, we have agreed to bring the
MediaWiki release work back to the Wikimedia Foundation. From now on,
the Release Engineering team coordinated by Greg Grossmeier, which is
responsible for the WMF weekly releases, will also be in charge of the
MediaWiki releases for third parties. This will be a hit on the WMF
Release Engineering team initially, but we plan to make future
announcements related to that.
About two years ago we explored the outsourcing of MediaWiki releases
under the assumption that an external team not tied to Wikimedia's
priorities would be in a better position to satisfy the needs of
third-party MediaWiki users. Over time, this assumption has been
outweighed by the overhead required to keep both release teams in
sync, and the difficulties that Mark and Markus have run into when trying
to follow the weekly release train. We believe that the situation would
be similar with any other external team, so we have decided to go back
to in-house releases.
Another lesson learned by all of us is that managing MediaWiki releases
and promoting a MediaWiki ecosystem are very different specialized
tasks, and it is hard for the same individuals to be in charge of both. Both Mark
and Markus were active in the MediaWiki community well before managing
MediaWiki releases, and will continue to be active now in the context of
the MediaWiki Stakeholders Group, as volunteers wearing their own
stakeholders' hats. The Engineering Community team, coordinated by Quim
Gil, will offer its help in consolidating the Group and aligning the
Group's interests with the WMF's plans.
The Wikimedia Foundation thanks Mark, Markus, and Alexis for the work
done releasing MediaWiki and coordinating several community activities
in the context of the MediaWiki Stakeholders Group. While we are aware
of the disruption that this change of strategy might bring, we believe
that the new situation puts us all in a better position to support the
needs of the MediaWiki community.
This change of direction is also a good chance to discuss what the best
mid-term scenario for MediaWiki outside of Wikimedia is, and what we
should all do in order to get there. The MediaWiki Developer Summit in
San Francisco next week is a good place to have this conversation, in
addition to our usual online channels, of course.
Markus Glaser, Mark Hershberger, Alexis T. Hershberger, Greg Grossmeier,
Quim Gil
I felt I should respond to this...
I run about a dozen MediaWiki wikis singlehandedly. I've been using
MediaWiki since 2005, and I think only once did I run into a substantial
problem doing an upgrade. Upgrading has only gotten easier over the
years, too... and I use a lot of extensions, including some I've
customized or written from scratch.
Mind you, I generally only do upgrades when either (a) I'm migrating a site
to a new server, (b) I want to install an extension not supported by the
current version, or (c) I want new features -- but this is only because
the process takes a fair amount of time, not because it is especially
difficult. I could probably write up a guide to safe MediaWiki upgrading
in a printed page or so.
There is some truth to what Ryan says, but the impression he left on me
-- that maintaining MediaWiki is a hopeless mess for most admins -- is
rather at odds with my own experience.
(Side note, with regard to preserving old MediaWiki content when
maintenance becomes too much of a chore: Dkosopedia has apparently
abandoned MediaWiki; their content is now static HTML. I'd be interested
in finding out both why and how they did this.)
Woozle
On 01/19/2015 07:01 AM, wikitech-l-request(a)lists.wikimedia.org wrote:
> What you're forgetting is that WMF abandoned MediaWiki as an Open
> Source project quite a while ago (at least 2 years ago). There's a
> separate org that gets a grant from WMF to handle third party use, and
> it's funded just well enough to keep the lights on. Take a look at the
> current state of MediaWiki on the internet. I'd be surprised if less
> than 99% of the MediaWiki wikis in existence are out of date. Most are
> probably running a version from years ago. The level of effort
> required to upgrade MediaWiki and its extensions that don't list
> compatibility with core versions is past the skill level of most
> people that use the software. Even places with a dedicated ops team
> find MediaWiki difficult to keep up to date. Hell, I find it difficult
> and I worked for WMF on the ops team and have been a MediaWiki dev
> since 1.3. I don't think adding a couple more services is going to
> drastically alter the current situation. - Ryan