I try to be pretty responsive to email and I try to be on IRC most days,
but this summer I'll be away a lot. So, in case you need stuff from me
between now and August 15th or so:
This week I'm mostly offline because of the Open Source Bridge
conference, but I'll be checking my email and responding to urgent things.
July 9-16 I'll be offline a lot because of Wikimania (and travel to and
from), but again, I'll be checking email and responding to urgent messages.
July 27-August 15 I am trying to go completely offline for a vacation.
Greg Varnum (varnent) is my backup for Google Summer of Code
administration, and Rob Lanphier (robla) is my backup for everything
else, including coordinating 20% time.
Engineering Community Manager
I posted an update and timeline for the watchlist grouping project on
my blog a few days ago, though I forgot to tell the mailing list.
I've replicated the text of the original post below.
I have uploaded the first alpha changeset for the MediaWiki Watchlist
Grouping project: https://gerrit.wikimedia.org/r/#/c/11587/
Here are some random notes on this first release:
* Database support is lacking at the moment. I've created setup scripts for
MySQL and PostgreSQL, but I am still working on the upgrade scripts.
* Watchlist pages now take user and group inputs as subpages. For example,
Special:Watchlist/Admin/Cities will retrieve the user Admin's watchlist
group entitled "Cities". Special:EditWatchlist works in the same way,
except it will only take the group as a subpage (since users cannot edit
watchlists belonging to other users). Alternatively, these settings can be
passed as URL parameters (?user=<someuser>&group=<somegroup>).
* Filtering options will need to be modified to take user and group settings.
* After addressing these points (and any others that come up during
testing), I'll be working on implementing inline group adding from article
and category pages and group permissions, as well as modifying the raw
watchlist to work with groups.
Eventually I'd like to incorporate jQuery/Ajax to improve the workflow. I
will be working with Eranroz, who has started this part of the project.
July timeline for the project (tasks begin on the specified date):
June 25: Correct errors identified by Jenkins (database table creation) and
clean up code for style/efficiency. Collect feedback from other developers
on how to proceed.
July 2: Continue implementing permissions.
July 9: Modify the raw watchlist.
July 23: Inline group adding from article and category pages.
End of July: Assess progress and plan next steps.
Now that the UI is usable, my goal is to release a changeset after each of
these tasks is completed.
If you have any questions regarding this project, please comment on the
changeset in Gerrit or this post. I appreciate any feedback from the
community.
Hey all wikitech peeps,
In helping organize the upcoming Wikimania DC Hackathon, I wanted
to ask if there are particular categories of work that people with
fairly limited experience could do that would have a meaningful impact.
* Updating extensions to work with the latest version of MediaWiki
* Testing extensions so that we can update mediawiki.org pages about
the extension's compatibility with different MediaWiki revisions
* Converting user scripts into Gadgets
* Convert templates into Lua (but seems lower-impact than some of
the above because Lua scripts aren't deployed very many places yet)
* (Only applicable to attendees who maintain an extension) Teaching
maintainers how to move extensions from the wiki into things that
live in Git and are updated through Gerrit
I'm especially interested in tasks where there's a lot of work to do --
that way, attendees can be given lots of hands-on work that provides
practice with tools like Git and Gerrit, builds comfort with MediaWiki
hooks, or gives people a reason to install MediaWiki on their own machines.
Additionally, it's important the task meaningfully contributes to the
project, so people feel the value of what they're doing.
I expect that we'll get a lot of people with some PHP experience but who
have little experience with, say, Git and Gerrit.
Also, if you'll be at the Wikimania DC 2012 Hackathon and want to help
mentor people through any of these, reply as well.
Other ideas welcome. I'll be collating these over the next few days,
and then trying to pick the ones with the highest probable impact based
on the attendees. One warning: this is intended just as a research
question for now. I can't promise that I'll focus a portion of the
hackathon on your particular suggestion. But I do aim to stay in touch
as the planning progresses.
Is there a way to register a JavaScript configuration variable only
for a certain ResourceLoader module in PHP? There is the
OutputPage::addJsConfigVars function, but I didn't find anything to load
a variable only conditionally when a module is loaded. Right now I am
registering a variable in the ResourceLoaderGetConfigVars hook, but
actually I only want to have it included when the 'wikibase' module is
loaded, since it is rather big and shouldn't be loaded when not necessary.
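For reference, what I am doing now is roughly the following (simplified;
the variable name and helper are placeholders), which exports the variable
into mw.config on every page view rather than only when 'wikibase' loads:

```php
// Current approach (simplified sketch): this hook runs for every page,
// so the variable ships globally, not per-module.
$wgHooks['ResourceLoaderGetConfigVars'][] = function ( array &$vars ) {
	$vars['wbExampleBigVar'] = getBigWikibaseValue(); // placeholder helper
	return true;
};
```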
Wikimedia Deutschland e.V. | NEU: Obentrautstr. 72 | 10963 Berlin
Tel. (030) 219 158 26-0
Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e.V.
Registered in the register of associations of the Amtsgericht
Berlin-Charlottenburg under number 23855 B. Recognized as charitable by the
Finanzamt für Körperschaften I Berlin, tax number 27/681/51985.
One developer recently complained about some freenode policies,
specifically that wiki projects (Wikipedia itself has some kind of
exception) are no longer allowed to be hosted on the freenode network,
which is supposed to host only open-source projects. The fact is that as
the Wikimedia project becomes larger, freenode is getting less and less
suitable. Right now there is a page where other options for IRC are
being discussed. One of the options is to leave freenode and set up our
own Wikimedia IRC network, which has a lot of benefits but also a lot of
issues (moving to another network is complicated given the number of
channels and users).
I would like to propose another idea: instead of leaving freenode, we
could improve relations with the freenode staff and eventually ask them
to change some of the restrictions to fit our needs better. In return, we
could offer them various services; for example, the Wikimedia Foundation
has made a few donations to freenode in the past. Given the amount of
hardware resources we have, it shouldn't be a problem to offer freenode,
for example, a dedicated or virtual server running on our cluster, which
could host one or more of their ircd servers (our technical/operations
community is far larger than freenode's, so there should be absolutely no
problem setting this up and keeping it maintained). This would be a
perfect kind of long-term support for the freenode network, in return for
the services they offer to the Wikimedia project, and it could eventually
improve relations with freenode so that they would agree to relax some of
their policies:
- Wiki projects (which are often related to the MediaWiki software or its
developers; even some other companies/projects are affiliated with
MediaWiki development) should be allowed to be hosted on freenode, so that
the communities of these projects don't find it so hard to reach MediaWiki
technical support (right now they would have to be on multiple networks,
given that #mediawiki is hosted on freenode but wiki projects in general
are not allowed to be hosted there).
- freenode limits each project to a maximum of 4 group contacts, the
people who deal with cloaks and various staff-related issues. The
Wikimedia project currently has 4 group contacts, so it's quite
impossible to enlarge this team. Right now it takes some time for cloak
requests to be processed, and in the future this number of people may not
be sufficient. freenode should make better options possible for large
projects like Wikimedia.
- Technical channels have a lot of services such as Nagios bots. These
bots often get killed for flooding, because they need to send a lot of
text in a short time; it should be possible to define exceptions that
allow these services to send larger amounts of data in channels.
What do you think of this?
I did some investigation on how to compile MediaWiki to LaTeX. In this
email I will discuss only the problems caused by the fact that MediaWiki
uses Unicode, and how to use Unicode with LaTeX.
1) First, Unicode uses the same codepoint for different glyphs in
Chinese, Japanese, and Korean (Han unification). On Wikipedia there are
special templates to work around this problem, but there are many cases
where these templates are not used, so this causes an essentially
unsolvable problem. In LaTeX you have all the needed glyphs available,
but if all you have is the codepoint, you cannot know which one to choose.
2) There are currently three good LaTeX compilers, and I think it is hard
to choose one because each of them has a significant disadvantage. One
point to understand here is microtype: it applies tiny changes to glyphs
to get better margins and better line breaking, which is something very
often done in professionally printed books, but something only the
pdflatex and lualatex compilers can do. The remaining compiler, xelatex,
can't do it. pdflatex basically cannot really do Unicode. I made it do
Unicode by hacking the CJK package, but this requires a specially hacked
font, which is legal under the GPL, but it is still a hack and will surely
never make it into Debian. I had a long discussion with the developer of
the CJK package, and essentially we didn't find any way to make pdflatex
do Unicode in a way acceptable to Debian. The remaining compiler is
lualatex. This does not allow changing fonts in the current version of
Ubuntu, but it does in the current testing version of Debian. However,
there it consumes a little more than one GByte of RAM when changing
fonts, which other users have also reported and which does not seem to be
a memory leak.
So what choices are there?
1) A weird hack -> pdflatex
2) No microtype -> xelatex
3) 1 GByte of memory consumption and Debian testing -> lualatex
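To make option 3 concrete, a minimal lualatex test file of the kind I
mean looks like this (the font names are only examples; any installed
Latin and CJK fonts can be substituted):

```latex
% Compile with: lualatex test.tex
\documentclass{article}
\usepackage{fontspec}   % font selection -- the RAM-hungry step under lualatex
\usepackage{microtype}  % the feature xelatex lacks
\setmainfont{TeX Gyre Termes}           % example Latin font
\newfontfamily\cjkfont{IPAexMincho}     % example CJK font
\begin{document}
Latin text gets microtype; {\cjkfont 直} is set in an explicitly chosen
CJK font, which is the only way to resolve the Han-unification ambiguity
once the language is known.
\end{document}
```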
If you can decide on one of these options, I will work towards an
official Debian package doing that. I personally prefer lualatex.