Hey,
Back in the old days, when we were still using SVN, we had the SVN auto-props
$Id$ keyword in our API modules, used in the getVersion method. Is there a Git
equivalent we should use now?
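For reference, the kind of thing I mean - SVN expanded $Id$ via keyword
substitution; as far as I understand it, Git's built-in counterpart is the
'ident' attribute, which expands $Id$ to the blob SHA-1 on checkout, configured
roughly like this (a sketch, I may be missing something):

```
# .gitattributes (sketch) - ask Git to expand $Id$ in PHP files
# to the 40-character blob SHA-1 on checkout
*.php ident
```

Note this gives a blob hash rather than a human-readable revision number, so
it may not be a drop-in replacement.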
Cheers
--
Jeroen De Dauw
http://www.bn2vs.com
Don't panic. Don't be evil.
--
Per bug 35842, I've overhauled the persistent cookie handling in the
MobileFrontend extension. I think my changes will work fine on the WMF
architecture where most of our sites have a separate domain for their
mobile version. However, for sites that use a shared domain for both
desktop and mobile views, there is major browser caching-related weirdness
that I have not been able to figure out. Details can be found in the bug:
https://bugzilla.wikimedia.org/show_bug.cgi?id=35842
A little more context about the issue: we need to be able to allow people
to switch between desktop/mobile views. We're currently doing this by
setting a cookie when the user elects to switch their view, in order to
keep that view persistent across requests. On the WMF architecture, we do
some funky stuff at the proxy layer for routing requests, depending on
detected device type and whether or not certain cookies are set for the
user. Generally speaking the sites hosted on our cluster have a separate
domain set up for their mobile versions, even though they're powered by the
same backend. This makes view switching a bit easier, although I think the
long-term hope is to get rid of mobile-specific domains. For sites that do
not have a separate domain set up, we rely solely on cookies to handle
user-selected view toggling. This seemed to generally work OK with the way
we were previously handling these 'persistent cookies', but that approach
has been causing caching problems on our cluster.
The changes I've introduced to hopefully resolve those issues result in
browser-caching issues on single-domain sites using MobileFrontend, where
after toggling the view and browsing to a page that was earlier viewed in
the previous context, you might see a cached copy of the page from the
previous context. No good.
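To illustrate the failure mode, here's a toy sketch (not MobileFrontend code -
the cookie name and cache behavior are simplified stand-ins): if the browser
caches a page without the view cookie in its cache key, toggling the view
replays the stale copy.

```python
# Hypothetical simulation of cross-view cache pollution on a shared domain.

def render(url, cookies):
    """Pretend backend: the served view depends on a cookie."""
    view = "mobile" if cookies.get("mf_useformat") == "mobile" else "desktop"
    return f"{view} version of {url}"

def fetch(url, cookies, cache, vary_on_cookie):
    """Pretend browser cache: the key includes the cookie only if the
    response is marked as varying on it (think 'Vary: Cookie')."""
    key = (url, cookies.get("mf_useformat")) if vary_on_cookie else (url,)
    if key not in cache:
        cache[key] = render(url, cookies)
    return cache[key]

cache = {}
# Without varying on the cookie, the desktop copy is replayed after toggling.
stale = fetch("/wiki/Foo", {}, cache, vary_on_cookie=False)
after_toggle = fetch("/wiki/Foo", {"mf_useformat": "mobile"}, cache,
                     vary_on_cookie=False)
assert after_toggle == "desktop version of /wiki/Foo"  # stale!

cache = {}
# With the cookie in the cache key, each view gets its own cached copy.
first = fetch("/wiki/Foo", {}, cache, vary_on_cookie=True)
after_toggle = fetch("/wiki/Foo", {"mf_useformat": "mobile"}, cache,
                     vary_on_cookie=True)
assert after_toggle == "mobile version of /wiki/Foo"
```

Whether the real fix is a Vary header, cache-busting URLs, or something at the
proxy layer is exactly the open question here.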
I'm stumped and am at a point where it's hard to see the forest for the
trees. I could use some help with this - if anyone has any insight
or suggestions, I'm all ears!
Thanks,
Arthur
--
Arthur Richards
Software Engineer, Mobile
[[User:Awjrichards]]
IRC: awjr
+1-415-839-6885 x6687
Hi everyone,
I'm forking off from the "Development process doesn't work" thread to
highlight a message that Sumana sent the other day.
Last year, Erik asked me to manage the 20% policy time that has frequently
been referenced here. Prior to the Git migration, I made a point of
emphasizing code review, because we had a pretty large backlog, and the
consequences of falling behind there were more severe than letting other
things slide.
The goal of the 20% policy has always been to have a bounded but
appropriately large time set aside for supporting volunteer development and
high priority bugfixing. Now that we're on Git, and more specifically, now
that we're doing pre-commit review, there are a number of activities that
we can afford to bump up the priority on. They include:
* Reviewing extensions for deployment
* Deploying extensions
* Patches in Bugzilla
* Bugfixing problems outside of developers' focus areas
* Shell requests
* (possibly in the glorious future) pull requests issued in mirrors (e.g.
Github, Gitorious, etc)
That's a simplified version. A more complete view is here:
https://www.mediawiki.org/wiki/Wikimedia_engineering_20%25_policy#Scope
I've handed the baton off to Sumana to manage 20% time on behalf of the
community, a point that was included in her message below, but may have
been missed. We need to look at the totality of volunteer development
activity, plus take a look at high priority bug fixing on behalf of the
reader and editor community, and coordinate our approach in addressing
those issues. That's what I've asked Sumana to do, which is reflected in
her message below. If you haven't already read this, please do.
Thanks
Rob
---------- Forwarded message ----------
From: Sumana Harihareswara <sumanah(a)wikimedia.org>
Date: Tue, Apr 10, 2012 at 5:02 AM
Subject: Re: [Wikitech-l] Development process doesn't work (yes this is
another complaint from another community member)
To: Wikimedia developers <wikitech-l(a)lists.wikimedia.org>
On 04/04/2012 09:14 AM, Sumana Harihareswara wrote:
> Petr:
>
> My sympathies on the frustration. First I'm going to talk about the
> problem in general, then about your issue.
I can't tell whether anyone read my message on the 4th. I know it was
long, but that's because I was addressing pretty much all the open
questions at the time. :-) If you have concerns about this issue,
please do read it.
Petr replied to me offlist to straighten out his particular situation.
It sounds like for one of his extensions the ball is in his court, and
for the other (OnlineStatusBar), it's in the WMF's.
I've updated three pages to clarify and document our process:
* https://www.mediawiki.org/wiki/Writing_an_extension_for_deployment now
explains that I'm the point of contact to get extensions authors their
initial technical design reviews, Howie Fung is their contact to get
initial user experience design reviews, and the release manager is their
contact to get reviewed code deployed.
* https://www.mediawiki.org/wiki/Review_queue now has the status of each
extension; about 8 are waiting for more WMF work, and 9 are waiting for
responses from extension authors.
* https://www.mediawiki.org/wiki/Deployment_queue
So I need to get some user experience reviews and some technical/code
reviews going for the 8 extensions that are awaiting more WMF work. Tim
suggested that it might be more efficient and pleasant if WMF engineers
could concentrate on one project each for their 20% community service
time, and RobLa has now decided that I should be the one prioritizing
and allocating 20%-time responsibilities. So I'm going to be asking
some WMF engineers if they could switch from doing patch review (in
Gerrit) to reviewing particular extensions, for their 20% days. I have
a few people in mind.
Another snag, in at least one case, was that WMF engineers are unclear
on who qualifies for deployment privileges and how to get them. That's
something we started talking about in December in
http://lists.wikimedia.org/pipermail/wikitech-l/2011-December/thread.html#5…
and that still needs followup - I believe Ian is going to put some
preliminary notes on mediawiki.org soon, and Platform Engineering
(specifically RobLa & I) will follow up on that.
Hope this helps.
--
Sumana Harihareswara
Volunteer Development Coordinator
Wikimedia Foundation
_______________________________________________
Wikitech-l mailing list
Wikitech-l(a)lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Hi, this is a proposal for a new feature in MediaWiki core or a new
extension (I would myself prefer an extension, but given that the
development process seems to be broken, per my previous email, it's
likely not possible for a non-WMF dev to get it deployed).
I think many editors have had this problem: you are editing a page but
have to leave the computer for some reason before you finish, and you
need to save the work. The only way to handle this is to copy the
source code and save it somewhere - but what if you aren't going to use
the same computer? The workaround is to copy and paste the page's code
to some other page on the wiki, but that isn't really easy for new
users, and it's annoying. It would be nice to have a "save draft to
userspace" feature, which would open a new text box with the name of
the new page you want to save the source to, which would look like:
Special:MyPage/Draft_$ArticleName_$Date
Users could of course change it to some other name before saving, or
they could even define a template for the name in their advanced
preferences. This could make it simpler for newbies to save work in
progress.
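A rough sketch of how the default draft name could be generated (the helper
name and date format here are just my assumption, not a spec):

```python
from datetime import date

def default_draft_title(article_name, on=None):
    """Expand a Special:MyPage/Draft_$ArticleName_$Date template
    into a concrete page title."""
    on = on or date.today()
    safe_name = article_name.replace(" ", "_")  # page titles use underscores
    return f"Special:MyPage/Draft_{safe_name}_{on.isoformat()}"

print(default_draft_title("Wikitech mailing list", date(2012, 4, 13)))
# Special:MyPage/Draft_Wikitech_mailing_list_2012-04-13
```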
Hi,
I submitted a bug here:
https://bugzilla.wikimedia.org/show_bug.cgi?id=34029
This is about the API blocklist. Mark marked it as a duplicate of
https://bugzilla.wikimedia.org/show_bug.cgi?id=24782, which is about recent
changes.
Yesterday I was fighting with usercontribs for a long time before I realized
that it has the same problem. This is already the third module with this problem.
You may think it is not very likely to have many contributions for the same
user in the same second, but this is not necessarily the case with bots. (I
am not speaking about whether it is polite to use bots this way; I am
speaking about working with user contribs once this has happened.) I myself
do such things without a bot when I open several articles in neighbouring
tabs for parallel processing.
So, to cut a hundred words down to one, as we say in Hungary: a time given
in seconds is inappropriate for query-continue in any API service, as it is
not unique.
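To make the failure concrete, here's a toy sketch (not the actual API code) of
what happens when several revisions share one second-resolution timestamp:

```python
# Three bot edits land in the same second, one in the next second.
rows = [  # (timestamp, revision_id)
    ("2012-04-13T10:00:00Z", 101),
    ("2012-04-13T10:00:00Z", 102),
    ("2012-04-13T10:00:00Z", 103),
    ("2012-04-13T10:00:01Z", 104),
]

def page_by_timestamp(rows, start_ts, limit):
    """Continue on the timestamp alone: ambiguous when timestamps repeat."""
    batch = [r for r in rows if r[0] >= start_ts][:limit]
    return batch, batch[-1][0]  # next start = last timestamp seen

batch1, cont = page_by_timestamp(rows, "2012-04-13T10:00:00Z", 2)
batch2, _ = page_by_timestamp(rows, cont, 2)
# Rows 101 and 102 come back again; using '>' instead would skip row 103.
assert [r[1] for r in batch2] == [101, 102]

def page_by_ts_and_id(rows, start, limit):
    """Continue on the (timestamp, id) pair, which IS unique."""
    batch = [r for r in rows if r >= start][:limit]
    return batch, batch[-1]

batch1, cont = page_by_ts_and_id(rows, ("", 0), 2)
batch2, _ = page_by_ts_and_id(rows, (cont[0], cont[1] + 1), 2)
assert [r[1] for r in batch2] == [103, 104]  # nothing repeated or skipped
```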
--
Bináris
The deadline for students to apply to Google Summer of Code has now
passed. We received 41 substantive proposals*, but of course can only
accept a small subset of those. We don't have that many willing
mentors, we certainly won't get that many slots from Google, and I want
to accept only students who are very likely to succeed.
Early next week I'll be in touch with mentors to finish ranking and
grading the proposals. Also next week, several applicants who still
haven't found mentors will be looking for experienced developers to fall
in love with their proposals. :-)
We'll announce the acceptances on April 23rd.
* And 22 that were spam, copy-and-paste from our ideas list, incredibly
vague, etc. In 2011 we only got about 24 consideration-worthy
proposals, so this year's quite an increase. Thanks to everyone who did
outreach and who helped people apply.
--
Sumana Harihareswara
Volunteer Development Coordinator
Wikimedia Foundation
Thought this would be interesting to wikitech-l.
---------- Forwarded message ----------
From: Yuvi Panda <yuvipanda(a)gmail.com>
Date: Tue, Mar 27, 2012 at 1:47 PM
Subject: Chennai Unofficial Wikimedia Hackathon Report
To: "Discussion list on Indian language projects of Wikimedia."
<wikimediaindia-l(a)lists.wikimedia.org>
The Chennai Unofficial Wikimedia Hackathon Report
TL;DR: 13 completed hacks, including 2 core MediaWiki patches, 3
tawiki userscript updates and 2 new deployed tools. It was super
awesome and super productive!
The 'Unofficial' Chennai Wikimedia Hackathon
(http://www.mediawiki.org/wiki/Chennai_Hackathon_March_2012)
happened on Saturday, March 17 2012 at the Thoughtworks office in
Chennai. It was a one day, 8 hour event focusing on getting people
together to hack on stuff related to all Wikimedia projects - not just
Mediawiki patches.
The event started with us sailing past security reasonably easily, and
getting set up with internet without a glitch. People trickled in and
soon enough we had 21 people in there. Since this was a pure
hackathon, there were no explicit tutorials or presentations. As
people came in, we asked them what technologies/fields they were
familiar with, and picked out an idea for them to work on from the
Ideas List (http://www.mediawiki.org/wiki/Chennai_Hackathon_March_2012/Ideas).
This took care of the biggest problem at hackathons with new people:
half the day gets spent figuring out what to work on, and the idea,
once found, is often completely outside the domain of expertise of the
people hacking on it. Working with each person to quickly pick an idea
- within five minutes - that they could complete in the day fixed this
and made sure people could concentrate on coding for the rest of the day.
People started hacking, and just before lunch we made people come up
and tell us what they were working on. We then broke for lunch and
usual socialization happened over McDonalds burgers and Saravana
Bhavan dosas. Hacking started soon after, and people were
concentrating on getting their hacks done before the demo time. And we
did have quite a few demos!
Demos
=====
Here's a short description of each of the demos, written purely in the
order in which they were presented:
1. Wikiquotes via SMS
By: @MadhuVishy and @YesKarthik
What it does:
Send a person's name to a particular number, and you'll keep getting
back quotes from that person. Works in similar semi-automated fashion
as the DYKBot. Built on AppEngine + Python.
Status:
Deployed live! Send SMS '@wikiquote Gandhi' to 9243342000 to test it
out! Has limited data right now, however.
---
2. API to Rotate Images (Mediawiki Core Patch)
By: Vivek
What it does:
Adds an API method that can arbitrarily rotate images. Think of this
as a first step towards being able to rotate any image on Commons with
a single button instantly, without having to wait for a bot. The patch
was attached to https://bugzilla.wikimedia.org/33186.
Status:
It was reviewed that very day (thanks, Reedy!). Vivek is now
figuring out how to modify his patch so that it will be accepted into
MediaWiki core. Vivek is also applying to work with MediaWiki for
GSoC, so we will hopefully get a long-term contributor :)
---
3. Find list of unique Tamil words in tawiki
By: Shrinivasan T
What it does:
It took the entire Tamil Wikipedia dump and extracted all the unique
words out of it - about 1.3 million of them. This has multiple
applications, including a Tamil spell checker.
Status:
Code and the dataset live on github:
https://github.com/tshrinivasan/tamil-wikipedia-word-list
---
4. Program to help record pronunciations for words in tawikt
What it does:
A simple Python program that gives you a word, asks you to pronounce
it, and then uploads the recording to Commons for use in Wiktionary.
It makes the process much more streamlined and faster.
Status:
Code available at:
https://github.com/tshrinivasan/voice-recorder-for-tawictionary.
Preliminary testing with his friends shows that it's easy to record
500 words in half an hour. It is currently blocked on figuring out a
way to properly upload to Commons.
---
5. Translation of Gadgets/UserScripts to tawiki
By: SuryaPrakash [[:ta:பயனர்:Surya_Prakash.S.A.]]
What he did:
Surya spent the day translating two gadgets into Tamil, so they can be
used on tawiki. The first is the 'Prove It' reference addition tool
(http://ta.wikipedia.org/wiki/Mediawiki:Gadget-ProveIt.js). The second
is the 'Speed Reader' extension, which formats content into
multiple columns for faster scanning
(http://ta.wikipedia.org/wiki/Mediawiki:Gadget-TwoColumn.js). Last I
checked, these are available for anyone with only Tamil knowledge to
use, so yay!
(He also tried to localize Twinkle for Tamil, but couldn't because of
issues with the laptop he was using.)
---
6. Structured database search over Wikipedia
By: Ashwanth
What it does:
Built a tool that combined DBPedia and Wikipedia to allow you to
search in a semantic way. We almost descended into madness with people
searching for movies with Kamal and movies with Rajni (both provided
accurate results, btw). Amazing search tool that made it super easy to
query information in a natural way.
Status:
The code is available at
https://github.com/ashwanthkumar/structured-wiki-search. It would
definitely be awesome to see this deployed somewhere, so it would be
great if the community could come up with specific ideas on how to
turn this into a polished tool.
---
7. Photo upload to commons by Email
By: Ganesh
What it does:
Started building a tool that lets you email a particular address with
pictures + metadata in the body of the email, and have them uploaded
to Commons. This is for the benefit of people with older, outdated
phones *cough*Logic*cough* who would like to use their phone's camera
to contribute to Commons, but cannot due to technical limitations.
Status:
He wasn't able to get it working during the hackathon - too many
technical issues cropped up. However, he's *very* definitely
interested in setting it up, and has made progress towards it. I
hope someone from the community (perhaps people doing WLM?) will get
in touch with him to see if this tool could be developed further with
a specific goal in mind.
---
8. Lightweight offline Wiki reader
By: Feroze
What it does:
There is a project called qvido
(http://projects.qi-hardware.com/index.php/p/qvido/) which was a
'lightweight' offline Wiki reader (compared to Kiwix, which is
heavier). It has been abandoned for a while, however. Feroze took the
time to revive the project, figured out how to build it (and wrote
build instructions!), and also fixed a bug so that it could be demoed.
He was able to demo it showing the Odiya Wikipedia dump offline, with
working link navigation.
Status:
There exists a git repo (https://github.com/feroze/qvido) with the
code + the build instructions. I hope that people interested in
offline projects check this out and see if it can be made useful, and
take this forward.
---
9. Patches to AssessmentBar
By: gsathya
What it does:
AssessmentBar is a small widget/tool I'm building to make WP India
assessments easier (at the request of User:AshLin. Stay tuned for an
announcement in the next few days). Sathya spent time making the
backend for it more scalable, so the same server can support multiple
projects and concurrent users in a better way. Before that he was
contemplating setting up a hidden Tor node for Wikipedia (he's a Tor
core contributor) and then playing with data visualizations with WP
data.
Status:
There is a pull request (https://github.com/yuvipanda/MadamHut/pull/2)
that I need to merge :)
---
10. Parsing Movie data into a database
By: Arunmozhi (Tecoholic) and Lavanya
What it does:
It scrapes the infoboxes of all movies from whatever category you give
it and stores this into a database. This is harder than it sounds
because parsing wikitext is similar to beating yourself up repeatedly
in the head with a large trout. They managed to figure out a nice way
to extract information from all Indian movie pages, and put it in a
database for easy programmatic access later.
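As a toy illustration of why this is hard (a simplified, invented example -
real infoboxes have nested templates, HTML comments and multi-line values
that defeat naive regexes):

```python
import re

# Invented sample wikitext, not from a real article.
wikitext = """{{Infobox film
| name     = Example Film
| director = Some Director
| released = {{Film date|2012|03|17}}
}}"""

def infobox_fields(text):
    """Naively extract 'field = value' lines from an infobox."""
    fields = {}
    for m in re.finditer(r"^\|\s*(\w+)\s*=\s*(.*)$", text, re.MULTILINE):
        fields[m.group(1)] = m.group(2).strip()
    return fields

fields = infobox_fields(wikitext)
assert fields["director"] == "Some Director"
# Nested templates survive as raw text - one reason real parsing is painful:
assert fields["released"] == "{{Film date|2012|03|17}}"
```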
Status:
I've asked them to put the code up publicly somewhere, and since I
believe Tecoholic is on this mailing list, he'll reply with the link
:) This kind of data scraping can be used to build very nice tools
that show off how much information Wikipedia has, and perhaps also
help people contribute back by editing information for their favorite
movies. I hope the community comes up with a nice idea to utilize
this, and takes this project forward to its ultimate destiny: a super
sexy IMDB-type site for Indian movies with data sourced from Wikipedia
(I can dream :D)
---
11. Random Good WP India article tool
By: Shakti and Sharath
What it does:
It is a simple tool that shows you one B, A, GA or FA article every
time you go there. The idea is to provide a usable service for people
who want to accumulate lots of knowledge by randomly reading stuff,
but only want good stuff (so stubs, etc. are filtered out, unlike
Special:Random). I'll also note that neither of them had worked with
any web service, with JSON, or with the MediaWiki API before the
hackathon, yet they were able to build and deploy this tool within the
day. /me gives a virtual imaginary barnstar to both of them
Status:
It is currently deployed at http://srik.me/WPIndia. Every time you hit
that link, you'll get an article about India that the community has
deemed 'good'. The source code is available
(https://github.com/saki92/category-based-search). They are eager to
do more hacks like these, and I'm hoping that the community will
find enough cool technical things for these enthusiastic volunteers to
work on
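For the curious, the general shape of such a tool against the MediaWiki API
might look like this (a sketch; the category name, endpoint usage and helper
names are my guesses, not the actual srik.me implementation):

```python
import random
from urllib.parse import urlencode

API = "https://en.wikipedia.org/w/api.php"

def categorymembers_url(category, limit=500):
    """Build a list=categorymembers query for an assessment category."""
    params = {
        "action": "query",
        "list": "categorymembers",
        "cmtitle": f"Category:{category}",
        "cmlimit": limit,
        "format": "json",
    }
    return API + "?" + urlencode(params)

def pick_random(titles):
    """Once the member titles are fetched, redirect to a random one."""
    return random.choice(titles)

url = categorymembers_url("GA-Class India articles")  # assumed category name
assert "list=categorymembers" in url
assert "cmtitle=Category%3AGA-Class+India+articles" in url
```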
---
12. Fix bugs on tawiki ShortURL gadget
By: Bharath
What it does:
The short URL service used on tawiki (tawp.in) is exposed in the wiki
via a gadget. It is not the most user-friendly gadget - you need to
right-click and select copy. Bharath looked for a solution where
you could click it and it would copy to the clipboard, but did not
find any that would work without Flash. He therefore abandoned that and
started figuring out easier ways of making it happen. He also fixed
several bugs in the gadget's implementation, and I expect it to
get deployed soonish. Thanks to SrikanthLogic for helping him through
the process.
Status:
Code is available at
http://ta.wikipedia.org/wiki/%E0%AE%AA%E0%AE%AF%E0%AE%A9%E0%AE%B0%E0%AF%8D:….
He's still fixing things on the script. If the community needs people
to come fix up their user scripts/gadgets, Bharath would be a willing
(and awesome!) candidate!
---
13. Add 'My Uploads' to top bar along with My Contributions, etc
(Mediawiki Core Patch)
By: SrikanthLogic
What it does:
Not satisfied with being the organizer of the hackathon, Srikanth
wanted to flex his programming muscles and spent time fixing a bug in
core mediawiki (https://bugzilla.wikimedia.org/show_bug.cgi?id=30915).
He spent a while digging around for the proper way to do this, and
managed to make a proper patch!
Status:
It has been committed in gerrit (currently unable to find a link).
Should be merged in soon. Yay!
Honorable Mentions
===================
1. WikiPronouncer
By: Russel Nickson
What it was supposed to do:
Exactly like Shrini's tool to record word pronunciations and upload
them to Commons, but written for Android so people could add
pronunciations on the go.
Status:
Code is available at https://github.com/russelnickson/pronouncer. He
ran into technical issues with the Android setup (it stops working
completely if you look at it the wrong way), and was unable to
complete this. I think this would still be a very useful tool, and I
hope someone from the community steps up to work with Russel to get
this finished.
---
2. Wiktionary cross lingual statistics
By: PranavRC
What it was supposed to do:
It was a tool that generated statistics about how many words overlap
between all the Indic languages in Wiktionary (as measured by
interwiki links).
Status:
The code has been written (I've requested the author to put it up
publicly; I will update the list when it is). However, it requires a
lot of time to run. So validation from the community that such stats
would be useful would, IMO, definitely give Pranav the impetus to
finish it up and show us the pretty graphs :)
So, in all, 13 demos were produced (+ 2 near misses). I think we can
call this one a success, no? :)
Next Steps
==========
Where do we go from here? Random thoughts:
1. Geek retention - this is reasonably easy. If we keep feeding
hackers interesting problems that affect a lot of people, they'll keep
helping us out. Is it possible to have some sort of 'tools required'
or 'hacks required' or 'gadgets required' page/queue someplace to
which we can always direct hackers looking for interesting problems?
IMO Wikipedia is full of interesting technical problems, so this
*should* be feasible.
2. Follow ups - this time, I am able to do this personally (small
enough group). Clearly this will not scale. Do we have ideas/methods
for following up with these people so that they stay with us?
3. More of these? This was pretty much a 'zero cost' event - stickers
were the only 'cost'. A lot of places around the country would love to
have their space used for a hackathon of sorts. Should we do more of
these kind of 'Unofficial' hackathons?
Thanks due (in random order)
============================
1. Thoughtworks/BalajiDamodaran: He graciously hosted us at
Thoughtworks. The biggest challenge for any hackathon is finding a
nice place that understands what hackathons are and provides what is
considered the lifeblood of a hackathon - working WiFi. Balaji
(@openbala) was incredibly awesome, and this entire thing wouldn't
have been possible at all without him and ThoughtWorks.
2. Dorai Thodla: He helped popularize the hackathon among the Chennai
Geeks crowd, and acted as a sounding board at various important times.
He also connected us with @openbala and enabled us to get the venue.
Thanks!
3. Srikanth Lakshmanan: The hackathon was his idea, and he made sure
it was executed in a nice way. He was the official 'organizer', and
made sure that all logistics were taken care of. Once the event
started, he was very helpful in assisting people technically and in
picking out ideas, while also hacking on his own patch. This event was,
in essence, organized and run by him. He took an overnight trip from
Hyderabad straight out of office just for this. Thanks for making this
possible!
4. Shrini (aka the relentless forwarder): This event wouldn't have
been as much of a success without him either. Evangelizing across
multiple lists, adding a lot of ideas that could be done, helping the
people there technically at all times and writing two really good
hacks - thank you! I'm glad we get to keep you :)
5. Subhashish Panighrahi: For sending us stickers :D (And everyone
else involved in that logistical process too!)
Most of all, this event was a success because of the quality and
dedication of the people who turned up, giving up their Saturdays.
Hope everyone who turned up had a nice time :) I am personally in
touch with most of them, and I also have their email addresses, phone
numbers *and* permission to contact them again. If anyone here
liked one of the hacks and wants to take it further, please
contact me (User:Yuvipanda on mediawiki.org or yuvipanda(a)gmail.com)
and I'll put you in touch. If there is a more accepted,
standard way of handling this type of private information, please let
me know as well!
Thanks!
-
Yuvi Panda T
http://yuvi.in
--
Yuvi Panda T
http://yuvi.in/blog
I have created a patch for the gallery tag and have been given the
following review.
https://gerrit.wikimedia.org/r/4609
* JavaScript injection: you can inject javascript: URIs which execute
code when clicked
* plain links ("link=Firefox") are taken as relative URLs which will
randomly work or not work depending on where they're viewed from
* need parser test cases to demo it working
So my questions are:
What would be the recommended way of stripping javascript: from
URIs? Are there any shared functions that do exactly this?
And how would I solve the plain links problem? Do a regex check for an
absolute URI, e.g. http://example.org/foo/bar?
And what are "parser test cases" - PHPUnit tests, or some other form of testing?
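For what it's worth, here's roughly what I imagine both checks look like, as a
Python sketch (illustrative only - in MediaWiki itself the allowed schemes
come from the $wgUrlProtocols setting, and there may well be an existing
helper for this):

```python
import re

# Illustrative allowlist; the real one would come from configuration.
ALLOWED_SCHEMES = ("http://", "https://", "ftp://")

def is_safe_link_target(href):
    """Reject script-scheme URIs and require an absolute URL."""
    # Normalize tricks like "JaVaScRiPt:" or embedded whitespace/control chars.
    cleaned = re.sub(r"[\x00-\x20]", "", href).lower()
    if cleaned.startswith("javascript:") or cleaned.startswith("vbscript:"):
        return False
    # Plain values like "Firefox" are relative URLs that work or break
    # depending on where they're viewed from - require an absolute URL.
    return cleaned.startswith(ALLOWED_SCHEMES)

assert is_safe_link_target("http://example.org/foo/bar")
assert not is_safe_link_target("javascript:alert(1)")
assert not is_safe_link_target(" JaVaScRiPt:alert(1)")
assert not is_safe_link_target("Firefox")  # relative, rejected
```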
Thank you!
Kim Eik.
If anyone is interested in improving watchlists with grouping and usability
enhancements, please let me know. I have submitted a clear and practical
project proposal for the 2012 Google Summer of Code and I am seeking a
mentor. I will, of course, do all of the heavy lifting (coding) to make it
happen. All I need from a mentor is knowledgeable guidance and occasional
assistance with debugging. Please let me know as soon as possible as the
applications will be reviewed in the next few days. I really want to see
this project move forward with full support from the MediaWiki development
community, because the Watchlist could certainly benefit from a retrofit.
My commitment to this project will extend beyond the GSoC timeframe, so
support in finding a mentor will certainly pay dividends for the
MediaWiki project.
My proposal is available here:
https://www.mediawiki.org/wiki/User:Blackjack48/GSOC_proposal_for_watchlist…
Thanks,
Aaron Pramana