Hi all,
I am Basil George, a research scholar at IIIT Hyderabad (http://iiit.ac.in/),
India. I have submitted an IEG proposal to build a tool for easy
visualization of the revision histories of Wikipedia pages, along with
relevant statistics, by rendering them onto a map. The tool will also
suggest which geographical areas lack editors on a particular topic. Free
and open-source geospatial technologies will be used for this project,
which, we hope, will encourage more technology developers to pitch in and
contribute to developing Wikimedia.
Please go through the proposal here
<http://meta.wikimedia.org/wiki/Grants:IEG/Mapping_History:_Revision_History…>
and do endorse it if you find it interesting.
Looking forward to a good discussion.
Thanks and regards,
--
Basil.
http://researchweb.iiit.ac.in/~basil.george/
Chance favors the prepared mind.
I'd like to introduce LinqToWiki: a new library for accessing the
MediaWiki API from .Net languages (e.g. C#).
Its main advantage is that it knows the API and is strongly typed,
which means autocompletion works on API modules, module parameters,
and result properties, and correctness is checked at compile time.
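For contrast, here is the kind of stringly-typed request such a wrapper hides. This is an illustrative Python sketch of building a raw MediaWiki API query URL by hand (the parameter names are standard MediaWiki API; the helper function is hypothetical):

```python
from urllib.parse import urlencode

def build_query_url(base, **params):
    # 'action', 'list', etc. are ordinary strings here; a typo in a
    # parameter name is only caught at runtime, not at compile time.
    params.setdefault("format", "json")
    return base + "?" + urlencode(sorted(params.items()))

url = build_query_url(
    "https://en.wikipedia.org/w/api.php",
    action="query",
    list="allpages",
    aplimit="10",
)
print(url)
```

A strongly-typed library moves these mistakes from runtime to compile time, which is exactly the advantage described above.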
More information is at http://en.wikipedia.org/wiki/User:Svick/LinqToWiki.
Any comments are welcome.
Petr Onderka
[[en:User:Svick]]
Because of US "President's Day," many Wikimedia Foundation staffers in
the US will be unavailable Monday 18 February. If you have an
emergency, I believe #wikimedia-tech will be responsive, just as it is
on weekends.
--
Sumana Harihareswara
Engineering Community Manager
Wikimedia Foundation
All,
I am pleased to announce that Marc-Andre will be joining us as a Technical
Operations Engineer (contractor) starting February 25, 2013. His primary
focus will be to build up the Wikimedia Labs infrastructure and to assist
community developers in migrating their tools to that infrastructure,
especially those residing on the Toolserver today.
Marc-Andre is an active Wikipedian, better known as 'Coren' on the English
Wikipedia, where he has been a volunteer editor since 2006 and has served
as an administrator and arbitrator. He has also kept himself involved with
the technical and procedural aspects of automated editing (so-called
bots), having written and operated a copyright-violation detection bot for
several years.
Marc has been a Unix system administrator and occasional computer science
instructor for 20+ years, in fields ranging from telecommunication to game
development. He studied IT at École de Technologie Supérieure (Canada).
Please join me in welcoming him. You can find him on IRC (freenode.net)
under the nick 'Coren'.
Thanks.
CT Woo
Hello all,
With the upgrade of Gerrit, I have also taken the time to improve the
Reviewer bot. For those who do not know it: the Reviewer bot is a tool
that adds potential reviewers to new changesets, based on
subscriptions [1].
One of the problems we encountered is the use of the SSH-based Gerrit
change feed, which effectively requires 100% uptime to avoid missing any
changesets. This has now been solved: we read the mediawiki-commits
mailing list instead, processing the messages in batches (every five
minutes).
This change also makes development & testing much easier, which is a nice bonus.
On the backend, there are two changes: instead of the JSON-RPC API, we
are now using the REST API, and we have moved hosting from the
Toolserver to translatewiki.net (thank you, Siebrand!).
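The batched approach can be sketched as follows. This is a minimal illustration, not the bot's actual code; the function names and the mail-parsing convention (Gerrit's `Gerrit-Change-Id` footer line) are assumptions for the example:

```python
BATCH_INTERVAL = 300  # seconds: the bot now reads mail every five minutes

def extract_change_ids(messages):
    """Pull Gerrit change IDs out of commit-mail bodies (illustrative parsing)."""
    ids = []
    for msg in messages:
        for line in msg.splitlines():
            if line.startswith("Gerrit-Change-Id:"):
                ids.append(line.split(":", 1)[1].strip())
    return ids

def poll_once(fetch_messages, add_reviewers):
    # Unlike a persistent SSH stream, a missed poll only delays processing:
    # unread mail stays in the mailbox until the next batch run.
    for change_id in extract_change_ids(fetch_messages()):
        add_reviewers(change_id)
```

The key property is that the mailing list acts as a durable queue, so downtime no longer means lost changesets.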
Best,
Merlijn
[1] http://www.mediawiki.org/wiki/Git/Reviewers
+1 with Amir. The Unicode support is very important.
About the list of first wikis: why isn't fr.wikisource in the list? We requested some months ago to be one of the first wikis to test the extension, and the answer seemed positive. https://bugzilla.wikimedia.org/show_bug.cgi?id=39744
Thomas
"Amir E. Aharoni" <amir.aharoni(a)mail.huji.ac.il> a écrit :
Is there any way to handle Unicode strings in the version that is
going to be deployed? For example, things like getting the length of
the string "François" as 8 rather than 9?
If not, is there any plan to have this ability any time soon?
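In Python 3 terms, the distinction being asked about is code points versus UTF-8 bytes; whether the deployed Lua environment exposes the code-point count is exactly the open question, so this sketch only illustrates the two numbers:

```python
s = "François"
# Code-point length -- what a template author usually expects:
print(len(s))                   # 8
# Byte length of the UTF-8 encoding -- 'ç' takes two bytes:
print(len(s.encode("utf-8")))   # 9
```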
--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
“We're living in pieces,
I want to live in peace.” – T. Moore
2013/2/16 Rob Lanphier <robla(a)wikimedia.org>:
> Hi everyone,
>
> We're planning to deploy Lua to a long list of wikis on Monday,
> February 18, 23:00-01:00 UTC (stretching into Tuesday UTC), including
> English Wikipedia.
>
> Details here:
> http://meta.wikimedia.org/wiki/Lua
>
> Jan Kučera (User:Kozuch) has placed notifications on many of the
> wikis. Those notifications and general communications listed here:
> http://en.wikipedia.org/wiki/User:Kozuch/Lua
>
> This is a really exciting deployment for the projects. We're really
> looking forward to seeing the great things that people do with this,
> and looking forward to making editing and previewing more responsive
> for template-heavy pages.
>
> Rob
>
> _______________________________________________
> Wikitech-ambassadors mailing list
> Wikitech-ambassadors(a)lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-ambassadors
(Apologies for cross-posting)
Heya,
The mobile team needs accurate pageview counts for the alpha and beta
mobile sites. Currently, this information is only stored in a cookie, but
we don't want to go down the route of storing this cookie, because of
cache server performance, network performance, and privacy policy issues.
The mobile team also needs to be able to differentiate between initial
and secondary API requests: pages in the beta version of MobileFrontend
are loaded dynamically via the API, meaning that MobileFrontend might
make multiple API requests to load sections of an article as they are
toggled open by the user. At the moment, we have no way of
differentiating between API requests to determine which one should count
as a 'pageview'.
We propose setting two additional custom HTTP headers: one to identify
the alpha/beta/stable version of MobileFrontend, the other to
differentiate between initial and secondary API requests. This would make
logging the necessary information trivial, and we believe it would be
fairly lightweight to implement.
We propose the following two headers with their possible values:
X-MF-Mode: a/b/s (alpha/beta/stable)
X-MF-Req: 1/2 (primary/secondary)
X-MF-Mode would be determined by Varnish based on the presence of the
alpha/beta identifying cookies, while X-MF-Req would be set by
MobileFrontend in the backend response.
These headers would only be set on the Varnish servers; on the
Squid/Nginx servers we will simply log a dash ('-') in these fields.
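A log processor consuming these proposed headers might bucket requests roughly as follows. The header names and values come from the proposal above; the classification function itself is a hypothetical sketch:

```python
MODES = {"a": "alpha", "b": "beta", "s": "stable"}

def classify(headers):
    """Map the proposed X-MF headers to (site mode, counts-as-pageview)."""
    # '-' (or a missing header) is what non-Varnish log lines would carry.
    mode = MODES.get(headers.get("X-MF-Mode", "-"), "unknown")
    # Only the initial request counts as a pageview; secondary API
    # requests (section loads) are excluded from the count.
    is_pageview = headers.get("X-MF-Req") == "1"
    return mode, is_pageview
```

This shows why two headers suffice: mode and request type are independent dimensions, and combining them answers the "which request is the pageview" question directly.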
Questions:
1) Are there objections to the introduction of these two HTTP headers?
2) We would like to aim for a late-February deployment; is that an
acceptable time frame? (We will announce the actual deployment date as
well.)
3) Are we missing anything important?
Thanks for your feedback!
Best
Arthur & Diederik
Good evening,
Hashar and I discussed the Vector extension this morning on
#wikimedia-tech, and we wonder whether the code shouldn't be merged into
core. Rationale: Vector is the main design of our product, and the Vector
extension contains enhancements for this skin.
--
Best Regards,
Sébastien Santoro aka Dereckson
http://www.dereckson.be/