On Thu, 29-09-2011 at 22:08 +0200, Merlijn van Deen
wrote:
Hello to both the wikitech and pywikipedia lists --
please keep both
informed when replying. Thanks.
A few days ago, we - the pywikipedia developers - received alarming
reports of interwiki bots removing content from pages. This does not
seem to happen often, and we have not been able to reproduce the
conditions in which this happens.
However, the common denominator is that it seems to be happening
only on the Wikipedias that run MediaWiki 1.18. As such, I
think this topic might be relevant for wikitech-l, too. In addition,
no-one in the pywikipedia team has a clear idea of why this is
happening, so we would appreciate any ideas.
1. What happens?
Essentially, the interwiki bot does its job: it retrieves the graph and
determines the correct interwiki links. It should then add them to the
page, but instead /only/ the interwiki links are stored. For example:
http://nl.wikipedia.org/w/index.php?title=Blankenbach&diff=next&old…
http://eo.wikipedia.org/w/index.php?title=Anton%C3%ADn_Kl%C3%A1%C5%A1tersk%…
http://simple.wikipedia.org/w/index.php?title=Mettau%2C_Switzerland&act…
2. Why does this happen?
This is unclear. On the one hand, interwiki.py is somewhat black
magic: none of the current developers knows its workings intimately.
On the other hand, the bug is not reproducible: running it on the
exact same page with the exact same page text does not result in a
cleared page. It could very well be something like broken network
error handling - but mainly, we have no idea. Did anything change in
Special:Export (which is still used in interwiki.py) or the API which
might cause something like this? I couldn't find anything in the
release notes.
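To make the suspected failure mode concrete, here is a hypothetical
sketch (not pywikipedia's actual code; all function names and the
simplified regex are invented for illustration). If a fetch from
Special:Export silently returns an empty string and nothing checks for
that, the "updated" page text degenerates to the interwiki links alone:

```python
import re

def replace_interwiki_links(page_text, new_links):
    """Strip existing [[xx:...]] links and append the recomputed set.

    Simplified stand-in for what an interwiki bot does; the real code
    is far more careful about link placement and ordering.
    """
    # Remove existing interwiki links (simplified pattern for the sketch).
    stripped = re.sub(r"\[\[[a-z\-]{2,12}:[^\]]+\]\]\n?", "", page_text)
    return stripped.rstrip() + "\n" + "\n".join(new_links) + "\n"

def save_with_guard(fetched_text, new_links):
    # Guard against the suspected bug: if the network layer handed us an
    # empty page (e.g. a truncated Special:Export response), refuse to
    # save rather than writing out only the interwiki links.
    if not fetched_text.strip():
        raise ValueError("fetched page text is empty -- refusing to save")
    return replace_interwiki_links(fetched_text, new_links)
```

With an empty fetch and no guard, `replace_interwiki_links("", links)`
produces exactly the symptom seen in the diffs above: a page containing
nothing but the interwiki links.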
Out of curiosity... If the new revisions of one of these badly edited
pages are deleted, leaving the top revision as the one just before the
bad iw bot edit, does a rerun of the bot on the page fail?
Ariel