This project is as simple as its name is grandiose: let's categorize the
scripts accumulating on the various wikis and connect those categories with
interwiki links, so we can see what others have scripted. So far I have
been able to do this on the Hungarian, French, and Chinese Wikipedias.
(Although my own scripts don't follow the recommendations at the moment.)
If anyone feels like continuing it, please don't hesitate!
Visit http://meta.wikimedia.org/wiki/International_Pywiki_Project.
--
Bináris
Hi Pywikipedians,
Is there a method for getting the inbound redirects that point to a page?
For example, given the urlname=Death_of_Michael_Jackson, how do I get the
pages that redirect to it?
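One way to express this against the MediaWiki API is list=backlinks with
blfilterredir=redirects, which restricts the backlink list to redirects only;
pywikipedia's Page.getReferences(redirectsOnly=True) may wrap the same call.
A sketch (the helper function is hypothetical, only the API parameters are
standard):

```python
# Sketch: build the MediaWiki API query string that lists only the
# redirects pointing at a given title.
from urllib.parse import urlencode

def backlink_redirect_query(title):
    """Return the query string for fetching inbound redirects of `title`."""
    params = {
        'action': 'query',
        'list': 'backlinks',
        'bltitle': title,
        'blfilterredir': 'redirects',  # only backlinks that are redirects
        'bllimit': 'max',
        'format': 'json',
    }
    return urlencode(params)

# e.g. append the result to https://en.wikipedia.org/w/api.php?
query = backlink_redirect_query('Death_of_Michael_Jackson')
```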
Thanks for any advice!
jrf
Since i18n started during a hackathon, it was implemented for the rewrite branch only, and I ported it to the trunk version. That is why the files are placed in the rewrite branch. You should not add new translation files to the trunk, only to the rewrite release. All new translations should be done at translatewiki.net (twn), and only after the file has been ported.
Regards
xqt
Hi Bináris,
it should be kept as an option and not become the default. For example, on de-wiki there is no copyright violation for minor edits, bot edits, and trivial edits below the so-called "Schöpfungshöhe" (German for "threshold of originality"). In any case, we normally have no copyright violation on categories. On the other hand, we place the version history in one revision in the article namespace rather than in the talk space, and the permanent link is tagged on the corresponding talk page.
Regards
xqt
Hi folks,
please help me understand how to do this stuff.
First, I have to use the i18n.twtranslate() function. That's OK.
Second: there must be a file in /i18n subdirectory with the same name that
contains translations.
As I see here:
https://www.mediawiki.org/wiki/Special:Code/pywikipedia/9882
follow-up translations are written directly into this file and committed?
But what is the role of translatewiki then? I am missing a manual for this.
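For what it's worth, my mental model of how twtranslate() resolves a message
(an assumption about its behaviour, not the real i18n code, and the message
texts below are hypothetical) is a per-script dict keyed by language code
with a fallback to English:

```python
# Simplified model (assumption): translations live in a dict keyed by
# language code; unknown languages or missing keys fall back to English.
msg = {
    'en': {'category-moved': 'Robot: category moved to [[%(newcat)s]]'},
    'hu': {'category-moved': 'Bot: kategoria athelyezve: [[%(newcat)s]]'},
}

def twtranslate(lang, key):
    """Return the message for `key` in `lang`, falling back to English."""
    table = msg.get(lang, msg['en'])
    return table.get(key, msg['en'][key])
```

Under this model, committing a file in /i18n and maintaining it at
translatewiki would just be two routes into the same dict.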
--
Bináris
Pywikipedians,
What is the best way to get the wikitext for every revision of a page?
I've been trying to understand why pywikipedia.fullVersionHistory does not
keep going. It seems to do one, two, or maybe three fetches of
revCount=500 and then it stops -- even if there are many more revisions.
Is there a fix for this?
For example, for Barack_Obama I consistently get 1393 revisions, and the
most recent one in that list is from 2006!
Here's how I am calling it:
h = p.fullVersionHistory(getAll=True, reverseOrder=True, revCount=500)
where 'p' is a Page instance.
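For comparison, the behaviour I would expect is plain API continuation:
keep requesting batches until no continuation token comes back. A sketch
(the `fetch` callable is a hypothetical stand-in for the actual HTTP
request; pywikipedia's internals differ):

```python
def all_revisions(fetch, title, limit=500):
    """Yield every revision of `title`, batch by batch, following the
    API's continuation token until none is returned.

    `fetch(title, limit, cont)` is a hypothetical callable that performs
    one API request and returns a dict with 'revisions' and, while more
    data remains, an 'rvcontinue' token.
    """
    cont = None
    while True:
        data = fetch(title, limit, cont)  # one batch of up to `limit` revisions
        for rev in data['revisions']:
            yield rev
        cont = data.get('rvcontinue')     # token for the next batch
        if cont is None:
            break                         # no token: history exhausted
```

If fullVersionHistory drops the token after a couple of rounds, that would
explain stopping at ~1393 revisions.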
Advice?
Thanks!
John
--
___________________________
John R. Frank <jrf(a)mit.edu>
Forwarding, as it would be interesting to see some of you there!
-------- Original Message --------
Subject: [Toolserver-l] Save the date: Wikimedia & MediaWiki hackathon
in Berlin, 1-3 June 2012
Date: Fri, 10 Feb 2012 13:30:32 -0500
From: Sumana Harihareswara <sumanah(a)wikimedia.org>
Reply-To: toolserver-l(a)lists.wikimedia.org
Organisation: Wikimedia Foundation
To: Wikimedia developers <wikitech-l(a)lists.wikimedia.org>,
mediawiki-l(a)lists.wikimedia.org, mediawiki-api(a)lists.wikimedia.org,
mediawiki-enterprise(a)lists.wikimedia.org,
toolserver-l(a)lists.wikimedia.org, wiki-research-l(a)lists.wikimedia.org,
Nicole Ebber <nicole.ebber(a)wikimedia.de>
I invite you to the yearly Berlin hackathon.
This is the premier event for the MediaWiki and Wikimedia technical
community. We'll be hacking, designing, and socialising.
Our goals for the event are to bring 100-150 people together, with
lots of people who have not attended such events before. User
scripts, gadgets, API use, Toolserver, Wikimedia Labs, mobile,
structured data, templates -- if you are into any of these things, we
want you to come!
Some financial assistance will be available -- more details soon.
This event will be hosted by Wikimedia Germany (WMDE) and supported by
the Wikimedia Foundation. Thank you, WMDE!
Dates: June 1-3 2012. Barely-started wiki page, no registration details
yet: https://www.mediawiki.org/wiki/Berlin_Hackathon_2012 . Organizers:
me and WMDE's Nicole Ebber with assistance from Lydia Pintscher and
Daniel Kinzler.
Mark your calendars!
--
Sumana Harihareswara
Volunteer Development Coordinator
Wikimedia Foundation
_______________________________________________
Toolserver-l mailing list (Toolserver-l(a)lists.wikimedia.org)
https://lists.wikimedia.org/mailman/listinfo/toolserver-l
Posting guidelines for this list: https://wiki.toolserver.org/view/Mailing_list_etiquette
Hi,
I have a headache because of the new script, apispec.py. The site parameter
behaves very strangely.
My intention was to give an optional site parameter to __init__ in class
Blocks. If it is not given explicitly, getSite() has to be run.
Now, my bot's home wiki is huwiki. At this moment it has 2 blocks that will
expire within 1 day (one of them will have expired by the time you read
this, unfortunately, but it is important, and in my experience it will be
displayed until midnight today).
Data for block #37947
Blocked user: n/a (autoblock)
Admin: Pagony (#121148)
Beginning in UTC: 2012-02-04 16:04:22
Expiry in UTC: 2012-02-05 16:04:22
Flags: automatic, nocreate
Reason: The IP address you are using has been
[[Wikipédia:Autoblokk|autoblocked]], because the blocked user
"[[User:Magyarihun|Magyarihun]]" used it earlier. (The reason for
Magyarihun's block: "'''block evasion'''") If you are not Magyarihun, contact
Data for block #37940
Blocked user: 89.149.56.249
Admin: Malatinszky (#55605)
Beginning in UTC: 2012-02-02 14:47:29
Expiry in UTC: 2012-02-06 06:47:29
Flags: anononly, nocreate, allowusertalk
Reason:
I ran the attached sample code. It is very short.
Now, if I leave the parentheses of getSite() empty, or if I write ('hu'),
it works well. For many other codes, such as 'fr', 'de', and 'ar', it also
works well and gives the appropriate list of blocks expiring within 24
hours on that wiki.
But if I write ('en'), the result is the appropriate list from enwiki PLUS
the two Hungarian blocks at the end.
Writing 'ru' is even stranger, because it returns the blocks from ruwiki
plus only the second of huwiki's two blocks at the end (the first one is
not in the list at all; I checked).
If I skip detecting and passing the site, and give the code directly at the
beginning of apispec.py, the result is the same.
Where is the error?
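One classic Python pitfall that produces exactly this kind of cross-site
leakage (a guess from the symptom, not a diagnosis of apispec.py, whose
code I am only sketching here with hypothetical class names): a list stored
as a class attribute, or as a mutable default argument, is shared by every
instance, so blocks collected for the home wiki bleed into the results for
another site.

```python
class BlocksBuggy(object):
    blocks = []                     # class attribute: shared by ALL instances

    def add(self, block):
        self.blocks.append(block)   # mutates the one shared list


class BlocksFixed(object):
    def __init__(self, site=None):
        self.site = site
        self.blocks = []            # fresh list for every instance

    def add(self, block):
        self.blocks.append(block)
```

If Blocks accumulates its results like the buggy version, huwiki entries
from one run would reappear when the class is used again for 'en' or 'ru'.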
TIA,
--
Bináris
Hi folks,
I committed a small extension to category.py. When moving categories with
-hist, it creates a wikitable on the talk page of the new category that
lists the edit history of the old category that is to be deleted.
See the example at
http://hu.wikipedia.org/wiki/Kateg%C3%B3riavita:Akci%C3%B3j%C3%A1t%C3%A9kok
(The table is from getVersionHistoryTable() just as is.)
It is good for respecting copyright, and it is in some ways similar to what
we do when moving a page from Wikipedia to a sister project without a
proper export-import. It is also more verbose than just listing the authors
in the edit summary.
For now, this behaviour requires the explicit command-line parameter -hist,
because I would like to hear your opinions first. If you agree, the extra
parameter could be removed and this behaviour made the default.
(Unfortunately, I am not familiar with i18n support yet, so this has an
old-style translation dict at the moment. I will transform it later.)
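To illustrate what the -hist table contains, here is a sketch (hypothetical
function and column names; the committed code simply uses
getVersionHistoryTable()) of rendering revision tuples as such a wikitable:

```python
def history_wikitable(revisions):
    """Render (timestamp, username, summary) tuples as a wikitable
    suitable for the talk page of the moved category."""
    lines = ['{| class="wikitable"',
             '! Date/time !! User !! Edit summary']
    for ts, user, summary in revisions:
        lines.append('|-')
        lines.append('| %s || [[User:%s|%s]] || %s' % (ts, user, user, summary))
    lines.append('|}')
    return '\n'.join(lines)
```

Keeping one row per revision of the deleted category preserves the full
attribution chain, not just the author names.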
--
Bináris