https://www.mediawiki.org/wiki/Wikimedia_Engineering/2013-14_Goals#Wikimedi…
The Engineering Community Team has some draft goals for what we'd like
to achieve in the next 12 months. We'll still be running Bugzilla,
putting out the monthly report, running GSoC and OPW, and doing those
other continuous tasks (as you can follow at
https://www.mediawiki.org/wiki/Wikimedia_Platform_Engineering#Engineering_C…
). But what else should we be concentrating on? This is a draft of
what we'd like to focus on, quarter by quarter.
Some highlights:
* Getting more volunteers trained in writing automated tests, especially
so that we can fix problems more quickly across more Wikimedia
functionality (including important gadgets)
* Growing Tech Ambassadors membership, to improve two-way communication
between developers and users
* Training more volunteers in JavaScript and security-related
development and code review, to reduce bottlenecks
I welcome your comments here or on the talk page.
--
Sumana Harihareswara
Engineering Community Manager
Wikimedia Foundation
I attempted to install Wikibase the other day and made a fun discovery.
Installing it properly requires the following (12) extensions:
WikibaseClient
Wikibase DataModel
WikibaseLib
Wikibase Repository
DataValues
DataTypes
ValueParsers
ValueView
ValueValidators
ValueFormatters
Diff
Scribunto
And the following optional (2) extensions:
Universal Language Selector
Babel
Soon it will also require at least the following (4) extensions:
Ask
Wikibase Query
Wikibase Database
Wikibase QueryEngine
To fully deploy Wikibase in a way that will work like Wikidata, it will
take at least 18 extensions, all of which are versioned differently and
are broken out in this manner so they can be used as libraries.
What this is subtly doing is adding another dependency chain into
MediaWiki: extension libraries. Since these extensions are meant to be
used as libraries, other extensions will eventually do so, and admins
will have to worry not only about extension compatibility with MediaWiki
(an already nearly impossible task), but also about extension
dependencies on extension libraries. The compatibility matrix for this
is going to be terrible, and it exacerbates one of MediaWiki's biggest
problems for admins.
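The combinatorics Ryan is describing can be sketched in a few lines. A
toy compatibility check follows; the extension names and version ranges
in it are invented for illustration and are not Wikibase's real
constraints:

```python
# Toy model of the extension-dependency problem: each extension pins a
# version range for MediaWiki core and for any library extensions it
# uses. All names and ranges below are made up for illustration.

def compatible(installed, requirements):
    """installed: {name: version_tuple};
    requirements: {name: {dependency: (min_version, max_version)}}.
    Returns the list of unsatisfied (extension, dependency) pairs."""
    problems = []
    for ext, deps in requirements.items():
        for dep, (lo, hi) in deps.items():
            have = installed.get(dep)
            if have is None or not (lo <= have <= hi):
                problems.append((ext, dep))
    return problems

installed = {"MediaWiki": (1, 21), "Diff": (0, 9)}
requirements = {
    # WikibaseLib is happy with this MediaWiki, but wants a newer Diff.
    "WikibaseLib": {"MediaWiki": ((1, 21), (1, 22)),
                    "Diff": ((1, 0), (1, 9))},
    "Diff": {"MediaWiki": ((1, 20), (1, 22))},
}
print(compatible(installed, requirements))  # -> [('WikibaseLib', 'Diff')]
```

With 18 extensions each pinning ranges on core and on each other, the
set of version combinations an admin must reason about grows roughly
with the product of the per-extension choices, which is the matrix
being objected to above.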
Quite a few of these should be core functionality, or should be removed
if they can't properly pass review. For legitimate library-like
extensions I have no constructive alternative to offer, but there must
be some saner approach than this.
- Ryan
(The bot is temporarily being called lolrrit-wm instead of gerrit-wm
until we get some issues fixed. lolrrit-wm is not a permanent nick for
the bot!)
Hello! The old gerrit-wm was a bunch of python scripts running as
hooks in the production cluster, which were hacky and not too well
maintained. Chad asked me if I could write a replacement, and since I
already had some infrastructure for similar tools running on toollabs,
I wrote one.
The source is at https://github.com/yuvipanda/lolrrit-wm, and will be
moved to Gerrit soon. It's currently running on toollabs, with me and
^demon as co-maintainers.
The new one is now active wherever the old one is supposed to be. If
it is not reporting changes somewhere it should be, please ping me so
I can fix that. It has a cleaner message format that also packs more
info into it, and hopefully can be more useful and less spammy.
Feature requests and pitchforks welcome! And requests/ideas for other
similar tools are also welcome :)
--
Yuvi Panda T
http://yuvi.in/blog
A friend of mine asked me what's the best way to identify actually executed
MediaWiki extensions, not just the ones that are installed.
Any hints? I thought that enabling debug logs for a day, then grepping for
the text of filenames in the directory 'extensions' (subtracting out the
stuff that adds hooks to the registry on page load only) may do the trick.
But I imagine there are other wrinkles, and was hoping someone had a canned
technique.
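Lacking a canned technique, the grep idea can be sketched directly.
This is a rough, hypothetical version (the paths and log contents are
illustrative, not from a real wiki) that intersects the extension
directory names on disk with the names that actually appear in a day's
debug log:

```python
# Sketch of the debug-log approach: take the set of extension directory
# names and keep only those whose name shows up in the debug log.
import os

def executed_extensions(extensions_dir, debug_log_path):
    """Return the sorted extension directory names mentioned in the log."""
    with open(debug_log_path, encoding="utf-8", errors="replace") as f:
        log = f.read()
    names = [d for d in os.listdir(extensions_dir)
             if os.path.isdir(os.path.join(extensions_dir, d))]
    return sorted(n for n in names if n in log)
```

As noted above, this over-counts extensions that merely register hooks
on every page load, so those would still have to be subtracted out by
hand.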
-Adam
Hi Nik,
Just a quick comment on choosing ElasticSearch over Solr:
We use Solr at Wikia, and we have a lot we can offer the Foundation in terms of knowledge sharing. It might be a good idea to consider future opportunities to collaborate while vetting ElasticSearch.
Even if ElasticSearch is your final call, you may still be able to use some of the code from our Search extension (https://github.com/wikia/app/tree/dev/extensions/wikia/Search). It uses the Solarium library for query abstraction, and I'm wondering if adding ElasticSearch support to that library and starting with some of the libraries we've written might get you most of the way there in your CirrusSearch efforts.
And code aside, both solutions have very similar engines behind them. When it comes to generating schemata, analyzing fields, handling language support, scaling, or backend architecture, please feel free to reach out. We'd love to help.
Robert Elwell
On Jul 19, 2013, at 5:14 PM, Rob Lanphier <robla(a)wikimedia.org> wrote:
> Everyone,
>
> I'm reviving this old thread to update everyone on the status of the RFC:
>
> We've continued working on implementation and everything seems to be
> proceeding smoothly. We evaluated Elasticsearch and were super impressed
> and decided it was very likely to be worth switching from Solr4 to it. The
> evaluation and the switch did cost some time but in my opinion doing it was
> time well spent.
>
> Thanks so much for your comments a month ago when I first posted this. If
> you are interested please give the page another look. Just to be helpful,
> here is a link to what I changed:
> http://www.mediawiki.org/w/index.php?title=Requests_for_comment%2FCirrusSea…
>
> Nik Everett
So Chad and I feel like we've gotten far enough in our prototype of our new
search backend for MediaWiki that we're ready to request comments. So here
is our formal RFC:
https://www.mediawiki.org/wiki/Requests_for_comment/CirrusSearch
You'll note that the plugin is called CirrusSearch. SolrSearch seems to
have been taken by an unrelated project so we had to pick a different name.
Please read and comment in whatever way is normal for these things.
Thanks so much for your attention,
Nik Everett
Good day,
Now I finally do have a question that might make some sense. I'm attempting to
replace the editor in MediaWiki for certain pages. Are there any examples of
how to use the AlternateEdit hook beyond looking at the code for MediaWiki
itself? Are there any extensions that make use of this hook? Does
VisualEditor? I just want a second example of how to build an edit page that
does things a little differently so that I can compare the two and learn a bit
more about ways that I can go about this.
Thank you,
Derric Atzrott
Computer Specialist
Alizee Pathology
Hi folks,
I'm constantly getting the following error on https://git.wikimedia.org/:
Proxy Error
The proxy server received an invalid response from an upstream server.
The proxy server could not handle the request GET /.
Reason: Error reading from remote server
/Alexander
Just to add to this, this issue also exists for other pairs of codes (als/gsw, bat-smg/sgs, and probably more).
Hazard-SJ
________________________________
From: Hazard-SJ <hazard_sj(a)yahoo.com>
To: "wikitech-l(a)lists.wikimedia.org" <wikitech-l(a)lists.wikimedia.org>; "wikidata-tech(a)lists.wikimedia.org" <wikidata-tech(a)lists.wikimedia.org>
Sent: Sunday, July 21, 2013 12:46 AM
Subject: [Wikidata-tech] Incorrect language code?
Hello,
As far as I checked, we should be using "nb" as the language code for
"nowiki". As is known (see bugs 46455 and 37459), Wikidata allows both
codes. I've just come upon a bot request that is over 2 months old that
I'm willing to tackle; I just need verification: terms should use the
language code "nb" rather than "no", correct?
As for statistics, there are currently 534741 uses of the "no" code and
only 294725 uses of the "nb" code. As I said, I'm willing to have my bot
make the necessary fixes, but I just need to verify in advance.
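For concreteness, here is a minimal sketch of the kind of fix such a
bot would make. The dict layout is invented for illustration; a real
bot would edit entities through the Wikidata API (e.g. via pywikibot)
rather than rewriting JSON directly:

```python
# Toy version of the bot's fix: move "no"-keyed terms to "nb".
# The entity structure below is made up for illustration only.

def normalize_no_to_nb(entity):
    """Rewrite "no" term keys to "nb", keeping any existing "nb" value."""
    for field in ("labels", "descriptions", "aliases"):
        terms = entity.get(field, {})
        if "no" in terms:
            # If both codes are present, prefer the existing "nb" value.
            terms.setdefault("nb", terms["no"])
            del terms["no"]
    return entity

item = {"labels": {"no": "Norge", "en": "Norway"},
        "descriptions": {"no": "land"}}
print(normalize_no_to_nb(item))
```

The `setdefault` guard matters because some of the 294725 existing "nb"
terms will overlap with "no" terms on the same item, and the bot should
not clobber them.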
Thanks.
Hazard-SJ
_______________________________________________
Wikidata-tech mailing list
Wikidata-tech(a)lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-tech