Hi everyone,
I'm happy to announce that we have promoted Sumana Harihareswara to
manager of the Engineering Community group. Sumana started with us as a
contractor back in February 2011, initially in a targeted engagement
to help out with Google Summer of Code and with the Berlin Hackathon
last year. Later that year, as we interviewed people to bring in as
Volunteer Development Coordinator, not only did Sumana put in a strong
application herself, but she also recruited very worthy competition for
the role. After winning the role, she worked tirelessly to straighten out
many kinks in our processes around volunteer development and
systematically ensured that new volunteer developers get the
recognition and (if needed) help they deserve. She has also applied
focus and organization in many areas outside of her immediate purview,
for example, recently stepping in as project manager for Git, and
occasionally filling in for me when I've been unavailable for the
larger Platform Engineering organization.
The promotion to Engineering Community Manager isn't so much a change
in the way things are done here as an official recognition of
a vital role that she has already played for the past year. Sumana
has been working with Guillaume Paumier and Mark Hershberger under the
somewhat ad hoc group title of "Technical Liaison; Developer Relations
(tl;dr)", serving as lead of that group since last year. Under the
new "Engineering Community" name, this group will continue to serve
many roles: facilitating collaboration and communication between the
Wikimedia Foundation and its employees and the larger Wikimedia
developer community, as well as facilitating collaboration and
communication between the Wikimedia developer community and other
Wikimedia communities.
Thank you, Sumana, for your hard work over the past year. I'm looking
forward to seeing what you and the group accomplish moving forward.
Congratulations!
Rob
I didn't see it at first glance, and wanted to ask: can I make
Extension:Moodbar always visible, and also usable by anonymous users?
Cheers,
Denny
--
Project director Wikidata
Wikimedia Deutschland e.V. | Obentrautstr. 2 | 10963 Berlin
Tel. +49-30-219 158 26-0 | http://wikimedia.de
Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e.V.
Registered in the register of associations of the Amtsgericht
Berlin-Charlottenburg under number 23855 B. Recognized as charitable by
the Finanzamt für Körperschaften I Berlin, tax number 27/681/51985.
Hi all,
For a MediaWiki extension I'm working on (see
http://lists.wikimedia.org/pipermail/wikitech-l/2012-April/060254.html), an
effectively plain-text file will need to be converted into a static image.
I've got a set of scripts that does that, but it takes my medium-grade
consumer laptop about 30 seconds to convert the plain-text file into a
ray-traced static image. Since ray-tracing substantially improves the
visual quality of the images being created here, my impression is that a
moderately expensive transformation like this is worth it, but only
if the operation is done once.
Given that, I assume it'd be best to do this transformation immediately
after the plain-text file has completed uploading. Is that right? If not,
what's a better time/way to do that processing?
I've looked into MediaWiki's 'UploadComplete' event hook to accomplish
this. That handler gives a way to access information about the upload and
the local file. However, I haven't been able to find a way to get the
uploaded file's path on the local file system, which I would need to do the
transformation. Looking through related files, I see references to $srcPath,
which seems like what I would need. Am I just missing some getter method
for file system path data in UploadBase.php or LocalFile.php? How can I
get the information about an uploaded file's location on the file system
while in an onUploadComplete-like object method in my extension?
Thanks,
Eric
What: Meta and mediawiki.org translation tools bug triage
When: Wednesday, May 9, 16:00UTC
Time zone conversion: http://hexm.de/ir
Where: #mediawiki-i18n on freenode
Use http://webchat.freenode.net/ if you don't have an IRC
client
You are invited to a bug triage on Meta and mediawiki.org translation
tools hosted by the Wikimedia Foundation Localisation team. This will
be a one hour meeting. The intended audience is very broad:
translators, translation administrators, and developers. We will
discuss the current state of translation tools on Meta-Wiki and
mediawiki.org, and with your input we will try to map out which
features and fixes would be most helpful for streamlining the translation
process for things like documentation, policies, sitenotices,
fundraiser messaging and appeals, and other non-"primary project
content"* material that benefits from being available in as many
languages as possible.
Please forward this e-mail to anyone who may be interested. They are
most welcome to join in.
* Translating main namespace articles for Wikipedia and other projects
is still out of scope for now.
Cheers!
--
Siebrand Mazeland
Product Manager Localisation
Wikimedia Foundation
M: +31 6 50 69 1239
Skype: siebrand
Support Free Knowledge: http://wikimediafoundation.org/wiki/Donate
(Siebrand, I am unsure whether this will arrive at mediawiki-i18n; feel
free to forward it if you consider it interesting to them.)
OK, I've written a few lines of Python [1] which actually helped me answer
my questions. Sorry for the noise.
And the answers are: yes; yes; no, but close; and I hope so.
There are a small number of wikis which use a different language code than
their site code, namely:
crh -> crh-latn
als -> gsw
be-x-old -> be-tarask
roa-rup -> rup
simple -> en
But, at the same time, the given *site* codes exist as *language* codes as
well, i.e. the languages/messages files exist for them, but they just fall
back to the given language code (e.g. MessagesAls.php just names gsw as
a fallback).
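The mapping and fallback behaviour above can be sketched in a few lines of Python (the data is taken directly from the list above; the function name is made up for illustration):

```python
# The five site codes whose default interface language differs
# from the site code (taken from the list above).
SITE_TO_LANG = {
    "crh": "crh-latn",
    "als": "gsw",
    "be-x-old": "be-tarask",
    "roa-rup": "rup",
    "simple": "en",
}

def site_language(site_code):
    """Return the language code a site ultimately falls back to.

    For the five exceptional wikis the site code maps to a different
    language code; every other site code is its own language code.
    """
    return SITE_TO_LANG.get(site_code, site_code)
```

So site_language("als") yields "gsw", while site_language("de") stays "de".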
I would not be surprised if each of these five examples had an
anecdote explaining why it is the way it is :)
Thanks,
Denny
P.S.: There is one thing I do not understand though. According to
https://simple.wikipedia.org/w/api.php?action=query&meta=siteinfo the
language of simple.wp is "en", but MessagesSimple.php seems to be taken
into account (instead of "edit" it has "change" in the UI, one of only two
changes in MessagesSimple relative to MessagesEn). So it seems that the
language is "simple" -- why does it say "en" in the siteinfo?
[1] available here: <http://pastebin.com/JpApSmNX>
2012/5/9 Siebrand Mazeland <s.mazeland(a)xs4all.nl>
> Forwarded. Please cc Denny, as he is not on this list.
>
> Begin forwarded message:
>
> *From:* Denny Vrandečić <denny.vrandecic(a)wikimedia.de>
> *Date:* 9 May 2012 01:36:01 GMT+02:00
> *To:* MediaWiki Tech list <wikitech-l(a)lists.wikimedia.org>
> *Subject:* *[Wikitech-l] Language codes vs site codes*
> *Reply-To:* Wikimedia developers <wikitech-l(a)lists.wikimedia.org>
>
> Hi,
>
> Wikimedia projects like Wikipedia exist for a large number of languages
> (>280, wow). Let us call them the sites. They are identified by the site
> code used in the subdomain and in language and interwiki links, e.g. "en"
> for en.wikipedia.org, i.e. the English Wikipedia.
>
> MediaWiki has a number of interface languages. These are selected through
> the preferences. They also have short language codes that identify them and
> which are used, e.g. in the localization of the code. "en" is used for
> English.
>
> My four questions:
> * are the language codes a proper superset of the site codes?
> * if the code is the same in both cases, does it always refer to the same
> language?
> * does the Wikimedia project identified by a specific site code also always
> use the same language code as its default interface language?
> * are the answers to the previous three questions accidental or by design,
> i.e. can we expect that this will stay like this in the future?
>
> I hope that the answers are yes, yes, yes, yes :)
>
> Sorry for the nitpicking questions, I hope someone knows the answers.
>
> Cheers,
> Denny
>
> --
> Project director Wikidata
> Wikimedia Deutschland e.V. | Obentrautstr. 2 | 10963 Berlin
> Tel. +49-30-219 158 26-0 | http://wikimedia.de
>
>
--
Project director Wikidata
Wikimedia Deutschland e.V. | Obentrautstr. 2 | 10963 Berlin
Tel. +49-30-219 158 26-0 | http://wikimedia.de
Hi all,
In the last 24 hours I have found two new cases of spaces in log lines where the space is not used as a delimiter.
Case 1:
There are mobile page requests that contain a space in the URL, for example:
ssl1002 2198871 2012-04-06T23:50:24.566 0.002 0.0.0.0 FAKE_CACHE_STATUS/301 1051 GET https://en.m.wikipedia.org/wiki/Extensor_carpi radialis longus NONE/mobilewikipedia - https://www.biodigitalhuman.com/ - Mozilla/5.0%20(Windows%20NT%206.1;%20WOW64)%20AppleWebKit/535.19%20(KHTML,%20like%20Gecko)%20Chrome/18.0.1025.151%20Safari/535.19
Case 2:
The MIME type logged by Varnish often contains additional charset=utf8 information, which results in a MIME type like "application/json; charset=utf8" or "text/xml; charset=utf8".
Instead of continuing to patch our servers to fix these space issues, I strongly suggest that we move away from the space as a delimiter and start using the tab (\t) character. Spaces not used as delimiters have been cropping up in our server logs for many years, and they make the analytics work that much more complex, as we need to check more and more edge cases and/or create patches. I would rather solve the problem at the root, and that is by moving to a new delimiter.
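A minimal Python illustration of the problem, using a simplified version of the sample request above (the field layout is reduced to four fields for the sketch):

```python
# A simplified log record whose URL field itself contains spaces,
# as in the Extensor_carpi example above.
fields = [
    "ssl1002",
    "GET",
    "https://en.m.wikipedia.org/wiki/Extensor_carpi radialis longus",
    "Mozilla/5.0",
]

space_line = " ".join(fields)
tab_line = "\t".join(fields)

# Splitting on spaces shatters the URL into three bogus fields,
# giving six fields where there should be four...
assert len(space_line.split(" ")) == 6

# ...while splitting on tabs recovers the original four fields intact.
assert tab_line.split("\t") == fields
```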
The delimiter is added by nginx/varnish/squid when writing the log file, so this would be a configuration change on those servers.
Please let me know if this is a sane or insane idea. Please let me also know if you are a consumer of these server log files and you would need to make a change on your end to accommodate this change.
Andrew has been working hard on building a test environment in Labs where we have nginx / varnish / squid servers running with production configuration and where we can test these changes extensively.
Best,
Diederik
Dear Sirs, my

function JidanniLessSkinMess( $sktemplate, $nav_urls ) {
    $nav_urls['permalink'] = $nav_urls['print'] = false;
    return true;
}
$wgHooks['SkinTemplateBuildNavUrlsNav_urlsAfterPermalink'][] = 'JidanniLessSkinMess';

no longer works when the user is logged out.
Which others of my hooks
(http://transgender-taiwan.org/index.php?title=Special:Version#Hooks)
are probably affected?
Hi,
Wikimedia projects like Wikipedia exist for a large number of languages
(>280, wow). Let us call them the sites. They are identified by the site
code used in the subdomain and in language and interwiki links, e.g. "en"
for en.wikipedia.org, i.e. the English Wikipedia.
MediaWiki has a number of interface languages. These are selected through
the preferences. They also have short language codes that identify them and
which are used, e.g. in the localization of the code. "en" is used for
English.
My four questions:
* are the language codes a proper superset of the site codes?
* if the code is the same in both cases, does it always refer to the same
language?
* does the Wikimedia project identified by a specific site code also always
use the same language code as its default interface language?
* are the answers to the previous three questions accidental or by design,
i.e. can we expect that this will stay like this in the future?
I hope that the answers are yes, yes, yes, yes :)
Sorry for the nitpicking questions, I hope someone knows the answers.
Cheers,
Denny
--
Project director Wikidata
Wikimedia Deutschland e.V. | Obentrautstr. 2 | 10963 Berlin
Tel. +49-30-219 158 26-0 | http://wikimedia.de
Volunteer Marcin Cieslak (saper) is doing us the great service of leading a
Git and Gerrit tutorial on May 8th at 19:00 UTC. It will last 1-1.5 hours.
If you intend to come, please fill out this poll so saper knows how many
people are coming and what they already know.
http://doodle.com/qnrgibpqxyamqhzb
Time conversion:
http://doodle.com/r?url=http%3A%2F%2Ftimeanddate.com%2Fworldclock%2Ffixedti…
To participate, you'll need to:
* get developer access for Git/Gerrit/Wikimedia Labs
https://www.mediawiki.org/wiki/Developer_access
* install a SIP client ("Blink and CSipSimple for Android are
recommended and tested (with HD audio); I'll provide SIP dial-in")
* install vncviewer
* ensure you can ssh from a terminal
Marcin will cover:
Basics:
setting up Git
submitting a patch
commenting on a patch in Gerrit
merging a patch in Gerrit
enough Git internals not to get lost
Branching:
local branch, making, working & pushing
remote branch & git-review to that remote branch
Troubleshooting:
cherry-picking changes between branches
amending (rebase vs. multiple commits)
squashing work from a branch into a commit and pushing it
resolving merge conflicts
Thanks, Marcin!
--
Sumana Harihareswara
Engineering Community Manager
Wikimedia Foundation
Hello,
I am one of the Google Summer of Code students and I will be working
on improving language support in MediaWiki (in particular with regards
to language names) and improving the usability of Wikimedia Incubator
(following my previous work on the Incubator-specific extension).
More information is available at
https://www.mediawiki.org/wiki/User:SPQRobin/GSoC/application and I
will be posting progress updates at
https://www.mediawiki.org/wiki/User:SPQRobin/GSoC/notes for whoever is
interested.
My first task will be the MediaWiki core part, for language names. I
will implement a language names cache (likely CDB) and I will make
English language names available in MediaWiki by default (as I once
proposed on wikitech), similar to the list currently in the Babel
extension (which would then be no longer needed). This would avoid
duplication and make language names consistent.
The biggest inconsistency currently is the difference between
{{#language:xyz}} and {{#language:xyz|xyz}}, which both give native
names, but the first one only uses MediaWiki names while the second one
also uses additional data like CLDR. See
https://translatewiki.net/wiki/User:SPQRobin/languages for the
difference.
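The consistent lookup being proposed could be sketched like this (the dictionaries are hypothetical excerpts and the function name is invented for illustration; the real data lives in Names.php and the CLDR data set):

```python
# Hypothetical excerpt of MediaWiki's own native language names
# (Names.php), with the proposed lowercase form where that is usual.
MEDIAWIKI_NAMES = {"fr": "français", "en": "English"}

# Hypothetical excerpt of additional names available from CLDR data.
CLDR_NAMES = {"fr": "français", "br": "brezhoneg"}

def native_name(code):
    """Prefer MediaWiki's own name, then fall back to CLDR data.

    One consistent source order, so that both parser-function forms
    would return the same name instead of disagreeing.
    """
    return MEDIAWIKI_NAMES.get(code) or CLDR_NAMES.get(code)
```

With one shared lookup order, "fr" resolves from the MediaWiki names and "br" falls through to the CLDR data.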
To make them consistent, I want to make our MediaWiki language names
(in Names.php) lowercase where this is usual (like: Français ->
français). I don't mind if it is displayed lowercase, but maybe others
do (for example in the interwiki sidebar). Plus, it might cause
problems for pages/templates using {{#language}} that depend on this
difference between MediaWiki and CLDR names.
Any ideas, comments or feedback are welcome :)
Regards,
Robin