As the owner of an interwiki bot, here is how I see it now:
Disabling an interwiki bot is a matter of one line in the bot's source code, depending on how often owners update. So within 1-2 days many bots can be disabled; the remaining ones should be blocked for a while.
But there is a problem now: other wikis still use classic interwiki links, and these links still remain on hu.wiki, outdated and in some cases incorrect as well. This causes interwiki conflicts on other wikis, because bots read the links on this wiki but cannot edit them.
And the data on Wikidata are outdated, because many new articles are created (moved and deleted) daily, but the most used platform, pywikipedia, is not ready for Wikidata yet.
Bots should remove the links on non-conflicted pages on hu.wiki and update Wikidata too, but no one has been able (or willing?) to write this feature since October 30th :-(
In the next days more and more wikis will be "locked" for interwiki bots, but these problems will remain for at least one week after this feature exists (one week is needed for granting a bot flag on Wikidata - or should global bots be allowed?)
JAnD
"Bináris" wikiposta@gmail.com wrote:
2013/1/28 Amir Ladsgroup ladsgroup@gmail.com
What is the exact time of the next deployment (it and he)?
If you want to catch it, join #wikimedia-wikidata on IRC. It was great to follow it on D-day!
And what time do you think is best to disable interwiki bots?
Xqt can modify the code, but pywiki is not deployed centrally; it is updated by the bot owners themselves, so there is no chance to concentrate the switch into one hour. For this reason I would say to begin after the deployment of Wikibase, as otherwise one would have to do it at least 1 or 2 days before, which would cause a maintenance pause. Yes, people will try to remove iws, and some of them will be put back by bots.
Would it also make sense to write a bot putting the remaining iws to Wikidata and removing them from the wiki if they can be replaced by them from Wikidata?
Marco
On Tue, Jan 29, 2013 at 4:27 PM, Jan Dudík jan.dudik@gmail.com wrote:
And the data on Wikidata are outdated, because many new articles are created (moved and deleted) daily, but the most used platform, pywikipedia, is not ready for Wikidata yet.
Depending on what you mean with ready it is: http://www.mediawiki.org/wiki/Manual:Pywikipediabot/Wikidata
Cheers Lydia
-- Lydia Pintscher - http://about.me/lydia.pintscher Community Communications for Wikidata
Wikimedia Deutschland e.V. Obentrautstr. 72 10963 Berlin www.wikimedia.de
Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
Registered in the register of associations of the Amtsgericht Berlin-Charlottenburg under number 23855 Nz. Recognised as charitable by the Finanzamt für Körperschaften I Berlin, tax number 27/681/51985.
On Tue, Jan 29, 2013 at 4:27 PM, Jan Dudík jan.dudik@gmail.com wrote:
Bots should remove the links on non-conflicted pages on hu.wiki and update Wikidata too, but no one has been able (or willing?) to write this feature since October 30th :-(
In the next days more and more wikis will be "locked" for interwiki bots, but these problems will remain for at least one week after this feature exists (one week is needed for granting a bot flag on Wikidata - or should global bots be allowed?)
JAnD
I agree with you. I am also waiting for "somebody" who can make pywiki compatible with Wikidata. I have neither the time nor the knowledge for it, but I have a bot (at least on huwiki, not on Wikidata) and I have access to the Hungarian Toolserver, so I could run this bot to clean the wikicode on huwiki and update the interwiki links on Wikidata. But we need a/the "Somebody" first :)
Cheers, Samat
On Tue, Jan 29, 2013 at 7:51 PM, Samat samat78@gmail.com wrote:
I agree with you. I am also waiting for "somebody" who can make pywiki compatible with Wikidata. I have neither the time nor the knowledge for it, but I have a bot (at least on huwiki, not on Wikidata) and I have access to the Hungarian Toolserver, so I could run this bot to clean the wikicode on huwiki and update the interwiki links on Wikidata. But we need a/the "Somebody" first :)
Have you looked at the link I posted? What exactly is missing for you to do what you want to do?
Cheers Lydia
On Tue, Jan 29, 2013 at 7:54 PM, Lydia Pintscher lydia.pintscher@wikimedia.de wrote:
On Tue, Jan 29, 2013 at 7:51 PM, Samat samat78@gmail.com wrote:
I agree with you. I am also waiting for "somebody" who can make pywiki compatible with Wikidata. I have neither the time nor the knowledge for it, but I have a bot (at least on huwiki, not on Wikidata) and I have access to the Hungarian Toolserver, so I could run this bot to clean the wikicode on huwiki and update the interwiki links on Wikidata. But we need a/the "Somebody" first :)
Have you looked at the link I posted? What exactly is missing for you to do what you want to do?
Cheers Lydia
Yes, I have. I mean that interwiki.py should do at least the following:
- delete interwikis from every article where there is no conflict;
- add these interwikis to the relevant page on Wikidata (create this page if it doesn't exist yet, change the page if it already exists).
As far as I know, the Hungarian editors are doing these tasks manually now. If there are conflicts between interwiki links, those can be the next step.
If these features already work, I am sorry and I will go run my bot (or first request bot approval on Wikidata). :) If these features don't work yet, we need them urgently.
Samat
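The two steps requested above (remove the classic interwiki links where there is no conflict, and collect them for the Wikidata item) could be sketched roughly as follows. This is a minimal, self-contained illustration, not the real interwiki.py: the regex, the list of language codes, and the function names are all assumptions for the example, and the actual Wikidata write is left as a comment because the API calls are not shown here.

```python
import re

# Illustrative subset of language codes; a real bot would use the full
# list of known interwiki prefixes from the framework's family files.
LANG_CODES = {"en", "de", "fr", "cs", "hu", "sk"}

# Matches classic interwiki links such as [[de:Budapest]].
IW_RE = re.compile(r"\[\[([a-z][a-z-]*):([^\]\n]+)\]\]")

def interwiki_links(wikitext):
    """Return the interwiki links in the article text as (lang, title) pairs."""
    return [(m.group(1), m.group(2))
            for m in IW_RE.finditer(wikitext)
            if m.group(1) in LANG_CODES]

def is_conflicted(links):
    """A page is 'conflicted' if it links to two different titles
    in the same language; such pages need manual review."""
    seen = {}
    for lang, title in links:
        if lang in seen and seen[lang] != title:
            return True
        seen[lang] = title
    return False

def strip_interwiki(wikitext):
    """Remove all recognised interwiki links; return (new_text, removed_links)."""
    links = interwiki_links(wikitext)
    new_text = IW_RE.sub(
        lambda m: "" if m.group(1) in LANG_CODES else m.group(0),
        wikitext)
    return new_text.strip(), links

text = "Some article text.\n[[en:Budapest]]\n[[de:Budapest]]\n"
links = interwiki_links(text)
if not is_conflicted(links):
    cleaned, removed = strip_interwiki(text)
    # A real bot would now push `removed` as sitelinks to the Wikidata
    # item (creating it if necessary) and save `cleaned` back to hu.wiki.
```

Conflicted pages are simply skipped here, matching the "no conflict" restriction in the list above.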
2013/1/29 Samat samat78@gmail.com:
On Tue, Jan 29, 2013 at 7:54 PM, Lydia Pintscher lydia.pintscher@wikimedia.de wrote:
On Tue, Jan 29, 2013 at 7:51 PM, Samat samat78@gmail.com wrote:
I agree with you. I am also waiting for "somebody" who can make pywiki compatible with Wikidata. I have neither the time nor the knowledge for it, but I have a bot (at least on huwiki, not on Wikidata) and I have access to the Hungarian Toolserver, so I could run this bot to clean the wikicode on huwiki and update the interwiki links on Wikidata. But we need a/the "Somebody" first :)
Have you looked at the link I posted? What exactly is missing for you to do what you want to do?
Cheers Lydia
Yes, I have. I mean that interwiki.py should do at least the following:
- delete interwikis from every article where there is no conflict;
- add these interwikis to the relevant page on Wikidata (create this page if it doesn't exist yet, change the page if it already exists).
As far as I know, the Hungarian editors are doing these tasks manually now. If there are conflicts between interwiki links, those can be the next step.
Well, actually, I don't think it is immediately urgent. I completely understand that this should be done some time soon - probably a couple of weeks from now. But it may be a good idea not to use a bot to remove the links from all the (non-conflicting) articles right away, until the post-deployment dust settles.
And until the Big Links Remove, as long as the bots don't forcibly re-add the removed links, that should be enough.
-- Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי http://aharoni.wordpress.com “We're living in pieces, I want to live in peace.” – T. Moore
On Tue, Jan 29, 2013 at 9:45 PM, Amir E. Aharoni amir.aharoni@mail.huji.ac.il wrote:
Well, actually, I don't think it is immediately urgent. I completely understand that this should be done some time soon - probably a couple of weeks from now. But it may be a good idea not to use a bot to remove the links from all the (non-conflicting) articles right away, until the post-deployment dust settles.
And until the Big Links Remove, as long as the bots don't forcibly re-add the removed links, that should be enough.
OK. I have time (to wait) :)
Samat
On 29/01/13 21:45, Amir E. Aharoni wrote:
And until the Big Links Remove, as long as the bots don't forcibly re-add the removed links, that should be enough.
They should not be doing that. Any well-behaving bot will get the list of links from the API and not by parsing the article text. And the list of links is exactly the same regardless of whether the links come from the article text or from Wikidata.
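The point above is that a well-behaved bot should read language links through `action=query&prop=langlinks`, which returns the same list whether the links sit in the wikitext or come from Wikidata. A rough sketch of such a request, as a hypothetical helper (the parameter names are the standard MediaWiki API ones; only the URL is built here, no request is sent):

```python
from urllib.parse import urlencode

def langlinks_url(api_base, title):
    """Build a MediaWiki API URL that returns a page's language links,
    regardless of whether they live in the wikitext or on Wikidata."""
    params = {
        "action": "query",
        "prop": "langlinks",
        "titles": title,
        "lllimit": "max",  # return all language links at once
        "format": "json",
    }
    return api_base + "?" + urlencode(params)

url = langlinks_url("https://hu.wikipedia.org/w/api.php", "Budapest")
```

A bot that fetches this instead of regex-parsing the article text keeps working unchanged after the links move to Wikidata.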
2013/1/29 Lydia Pintscher lydia.pintscher@wikimedia.de
Have you looked at the link I posted? What exactly is missing for you to do what you want to do?
As far as I can see, these are just code fragments, Lego pieces to build something from, but they are not yet integrated into interwiki.py.