Hello all,
Google Summer of Code 2016 [1] and Outreachy '12 are closing in, and this
year we have a few changes in the rules. Starting this year, we are lifting
our limit of one GSoC/Outreachy internship with Wikimedia per student.
This was discussed [2], and repeat participation is expected to ensure
continued work on ongoing projects, adding up to the goal of gathering
more long-term contributors.
Please note that a previous intern who has participated in one Outreachy
round can still participate in two GSoC rounds with Wikimedia, but not
three (which would total four internships). The change came after Google
announced its new limit of a maximum of three GSoC participations per
student in a lifetime.
We need to get all the previous interns and mentors back in action, and
keep the GSoC '16 board [3] closely updated. Found an interesting task that
could be a possible tech project? Please add it to the Possible-Tech-Projects
board [4]. Your thoughts on getting our previous interns and mentors back in
action are also welcome [5].
[1] https://developers.google.com/open-source/gsoc/timeline?hl=en
[2] https://phabricator.wikimedia.org/T124392
[3] https://phabricator.wikimedia.org/tag/google-summer-of-code-2016/
[4] https://phabricator.wikimedia.org/tag/possible-tech-projects/
[5] https://phabricator.wikimedia.org/T125559
Thanks,
Tony Thomas <https://www.mediawiki.org/wiki/User:01tonythomas>
ThinkFOSS <http://www.thinkfoss.com>
*"where there is a WiFi, there is a way"*
Hi everyone (devs),
I just want to share with you my achievements at WMF over the past six
months I have been here. My stay here has been so awesome that even I am
amazed by it. So far I have done the following things in this organisation,
and I really see it as motivation to do more:
- Fixed more than 20 bugs for the org in various extensions and in
MediaWiki core (https://gerrit.wikimedia.org/r/#/q/owner:D3r1ck01+status:merged,n,z).
- Mentored GCI 2016, which ended a few weeks ago.
- Authored one extension (the Mailgun extension) for Wikimedia.
- Helped in one way or another to make sure that small problems newcomers
run into on IRC get solved, doing for other newcomers what the great people
here did for me when I first came in.
I feel very proud of myself, and I want to contribute more and more to the
organisation. I would like you all to guide me so that I can stay on the
right path and help make the Foundation (and the movement) a better one.
Your comments are highly appreciated. Thanks to all members of the WMF dev
team :)
Regards
Alangi Derick Ndimnain
Hi all!
During next Wednesday's RFC meeting on IRC, we are planning to discuss
introducing "Per-language URLs for pages of multilingual wikis". This is an
exploratory discussion, with no expectation of a final decision. We are looking
for concerns and suggestions.
If you care about improved support for multi-lingual wikis, please visit
<https://phabricator.wikimedia.org/T114662> and comment before next week's
"live" meeting on IRC. And of yourse, join that meeting, if you like!
Thanks,
Daniel
PS: Don't want to click the link? Let me copy&paste that for you... but please
comment on phab rather than here.
Context:
Some wikis show page content depending on the user's interface language (for
instance wikidata.org and commons.wikimedia.org, or other wikis using the
Translate extension). All language versions (renderings) of a page are served
from the same URL, which is problematic for web caching. This is the reason we
currently do not allow anonymous users to change their interface language. This
makes it hard to use multilingual wikis without logging in.
See also for context:
* {T114640}
Proposal:
* For multilingual wikis, the user language is part of the request path: e.g. we
would use `/wiki-de/` instead of `/wiki/`.
* The plain `/wiki/` path would act as a 302 redirect to the language-specific
path (based on the user's language, or a best guess or cookie as implemented by
ULS); a minimal sketch of this step follows the list below.
* When viewing a page via a language-specific path, all links on the page (both
content and skin) point to paths specific to that language. Both content and
skin are shown in the user's UI language (as far as possible, using whatever
mechanism for content translation or internationalization is available)
* When viewing a page in a language different from the user's preferred language
(according to user preferences or the cookie set by ULS), a warning bar is shown
at the top, giving the user the option to
## switch to the version in their own language (according to user preference)
## change their user language to the current page's language
## hide the bar for a while (a day, or the browser session, or so).
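To make the redirect step concrete, here is a minimal, purely illustrative
Python sketch of how a plain /wiki/ request could be mapped to a
language-specific path. This is not MediaWiki code: the helper names, the
example language set and the cookie handling are assumptions, and the real
logic would live in MediaWiki's routing and in ULS.

# Illustrative sketch only: map a plain /wiki/ request to /wiki-<lang>/
# via a 302, as described in the proposal above. Names are hypothetical.
SUPPORTED_LANGUAGES = {"en", "de", "fr"}   # hypothetical example set
DEFAULT_LANGUAGE = "en"

def pick_language(accept_language, uls_cookie=None):
    """Best guess of the user's language: ULS cookie first, then Accept-Language."""
    if uls_cookie in SUPPORTED_LANGUAGES:
        return uls_cookie
    for part in (accept_language or "").split(","):
        code = part.split(";")[0].strip().split("-")[0].lower()
        if code in SUPPORTED_LANGUAGES:
            return code
    return DEFAULT_LANGUAGE

def redirect_for(path, accept_language, uls_cookie=None):
    """Return (status, location) for a plain /wiki/ request, or None otherwise."""
    if not path.startswith("/wiki/"):
        return None
    lang = pick_language(accept_language, uls_cookie)
    title = path[len("/wiki/"):]
    return 302, "/wiki-%s/%s" % (lang, title)

print(redirect_for("/wiki/Berlin", "de-CH,de;q=0.9,en;q=0.8"))
# -> (302, '/wiki-de/Berlin')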
Challenges:
* Make the Linker class aware of the target language (probably needs a complete
refactoring), so it generates links to the right path.
* Make all code that generates links in the skin use the Linker class (directly,
or via the Parser), so the path is consistent.
* Allow efficient purging of the entire "bundle" of all the renderings of a
given page when the page's content changes.
* Should translated names for namespaces and special pages be supported on the
language-specific paths? (would be nice, but tricky)
* Provide a way to explicitly link to a specific language rendering from
wikitext, e.g. `{{#link|Foo|lang=de-ch}}`
For some time now I've thought that possibilities for data visualization in
Wikipedia (or in MediaWiki) are pretty bad. So I'll describe one idea below
and would like to get some feedback. Reason is simple: I'd like to give
this task to one university student so that he could solve that issue.
What I'd like to see is some development that would make it possible for
users to create visualizations inside MediaWiki. Something so easy that a
child could do it. Like this <https://infogr.am/>. Workflow example: 1) the
user selects something like "Create Data Visualization", 2) makes some
selections about chart type, colors, etc., 3) gets a place to write down text
(title, axes, description) and 4) a table to fill in with data (values + their
text labels). That could then be saved as one revision. After that, every other
user could edit this graph with the same selections and data tables, just like
users edit articles, and the edit history is saved and easy to compare.
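Just to illustrate what one saved revision of such a chart would have to
capture, here is a hypothetical sketch in Python (not an existing MediaWiki
format; all field names are invented) holding the selections from steps 2-4:

# Hypothetical sketch of one editable chart revision: chart type and colors
# (step 2), title/axes/description (step 3), and the data table (step 4).
chart_revision = {
    "type": "pie",                     # chart type selected by the user
    "colors": ["#3366cc", "#dc3912"],  # optional styling choices
    "title": "Article count by wiki",
    "axes": {"x": "Wiki", "y": "Articles"},
    "description": "Example data, not real figures.",
    "data": [
        {"label": "etwiki", "value": 140000},
        {"label": "enwiki", "value": 5000000},
    ],
}

# Because the definition is plain structured data, each edit could be stored
# and diffed like an ordinary article revision.
print(chart_revision["title"], len(chart_revision["data"]), "rows")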
Image files like this
<https://et.wikipedia.org/wiki/Pilt:VikiArtiklitearv.jpg> or that
<https://commons.wikimedia.org/wiki/File:Tree_map_exports_2010_Estonia.svg> are
ridiculous, and fixes like that
<https://en.wikipedia.org/wiki/Template:Pie_chart> are not as flexible,
pretty and easy to use as what we need. So let's move forward. There are
plenty of GPL-licensed solutions that could be integrated with MediaWiki.
But I can't be the only one thinking about this. So what should I know about
this topic so that the work could really be useful? I.e. how to avoid
reinventing the wheel (like building something already in development), and
how to be sure that it could be easily incorporated later? Who would be the
right people to talk to about this topic?
Also: are there some very specific tasks within this data visualization
topic that would suit well as research projects for IT students?
Regards,
Ivo Kruusamägi
P.S.: I also have some development plans for a web platform that will help
to gamify organizing media files in Wikimedia Commons (coordinates,
categories, descriptions, quality assessment, etc.). Sort of like adding an
additional data layer and, once everything works fine, migrating that
information into Commons. Any great ideas there as well? (Not-so-great
ideas could be sent to the list :P )
P.S. 2: There is a feedback platform named WikiComment
<http://wikicomment.ut.ee/>. Some testing by Wikipedians is needed :)
One of the conclusions from the recent SessionManager rollout failure [0] was:
"we should have recruited and coordinated testing by developers and
users inside and outside of the WMF while the code was only on the
beta testing cluster"
SessionManager is back on the WMF beta cluster [1] now after being
briefly removed for the 1.27.0-wmf.12 release cycle, so an
announcement seems in order. The beta cluster implements a SUL
authenticated wiki farm that is completely separate from the Wikimedia
production SUL system. Helping test SessionManager there would involve
logging in, logging out, creating new user accounts and generally
wandering around the wikis doing things you would normally do in
production while keeping an eye out for session related issues.
If you spot something (or just think you spotted something), file a
Phabricator task with as many details as you can provide and tag it
with the #reading-infrastructure-team project. For session-related
issues, getting traces of the headers and cookies used in the failing
requests is most helpful. You can also poke around in the logging
interface [2] to try to find associated error messages.
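If it helps, here is a minimal Python sketch (using the third-party
"requests" library; the URL is just the beta cluster main page from [1]) of
one way to capture the request headers and server cookies for a request you
are trying to reproduce, so they can be attached to a Phabricator task:

# Minimal sketch: capture headers and cookies for a beta cluster request.
# Requires the third-party "requests" library.
import requests

session = requests.Session()
resp = session.get("http://deployment.wikimedia.beta.wmflabs.org/wiki/Main_Page")

print("Status:", resp.status_code)
print("Request headers sent:")
for name, value in resp.request.headers.items():
    print("  %s: %s" % (name, value))
print("Cookies set by the server:")
for cookie in session.cookies:
    print("  %s=%s" % (cookie.name, cookie.value))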
If you find other bugs, report them in Phabricator too. :)
Also please remember NOT TO USE passwords in the beta cluster that
match the passwords you use anywhere else on the planet!
[0]: https://wikitech.wikimedia.org/wiki/Incident_documentation/20160123-Session…
[1]: http://deployment.wikimedia.beta.wmflabs.org/wiki/Main_Page
[2]: https://logstash-beta.wmflabs.org/#/dashboard/elasticsearch/default
Bryan
--
Bryan Davis Wikimedia Foundation <bd808(a)wikimedia.org>
[[m:User:BDavis_(WMF)]] Sr Software Engineer Boise, ID USA
irc: bd808 v:415.839.6885 x6855
Hello,
To anyone who is a client of stream.wikimedia.org
(https://wikitech.wikimedia.org/wiki/RCStream), i.e. people who run tools
relying on the RC stream:
In about 48 hours, on February 3rd at 20:00 UTC, we will have to reboot
the backend servers of the stream.wikimedia.org service, rcs1001 and rcs1002.
This service is load-balanced and we will reboot the two servers one at a
time, so there should be no service downtime.
Nevertheless, your clients will get disconnected and may need your
intervention to reconnect.
Please be prepared to do so if your client does not automatically reconnect.
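For reference, here is a minimal reconnect-aware client sketch in Python,
based on the example client on the RCStream page linked above and the
third-party socketIO-client library; the retry loop, the example wiki and
the 10-second delay are illustrative assumptions, not an official
recommendation:

import time
import socketIO_client

class RCNamespace(socketIO_client.BaseNamespace):
    def on_connect(self):
        # (Re-)subscribe every time the connection is (re-)established.
        self.emit('subscribe', 'en.wikipedia.org')

    def on_change(self, change):
        print('%(user)s edited %(title)s' % change)

while True:
    try:
        socketIO = socketIO_client.SocketIO('stream.wikimedia.org', 80)
        socketIO.define(RCNamespace, '/rc')
        socketIO.wait()
    except Exception as exc:
        # Connection dropped (e.g. during the reboot): wait and reconnect.
        print('Disconnected (%s), retrying in 10 seconds...' % exc)
        time.sleep(10)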
I will send another mail once this has happened.
Best regards,
Daniel
--
Daniel Zahn <dzahn(a)wikimedia.org>
Operations Engineer
Hi folks,
In the ArchCom meeting earlier today, Daniel, Timo, Tim and I discussed the
way we handle RFC assignments in Phabricator. Previously, the RFC would
frequently be assigned to the person writing the RFC. As we try out the Rust
model (per T123606 <https://phabricator.wikimedia.org/T123606>), and as we
try to increase the speed at which RFCs move through the process, we thought
it would make sense to also assign RFCs to shepherds on the ArchCom.
We didn't discuss all of the implications of this in the meeting today, but
we think this might help us scale our RFC triage process. What do you all
think?
Rob
A security vulnerability has been discovered in MediaWiki setups which
use both Wikibase and MobileFrontend.
All projects in the Wikimedia cluster have since been patched, but if
you use these two extensions please be sure to apply the fix.
The patch file and issue are documented at https://phabricator.wikimedia.org/T125684