On Sun, 12 Jun 2005 16:01:43 +0000, Daniel Herding wrote:
> Update of /cvsroot/pywikipediabot/pywikipedia
> when setting interwiki links on a disambiguation page,
> and we find a non-disambiguation page, ask the user to
> ignore it. The other way round as well.
Cool; apart from [y/N] I suggest adding 'Never'.
+ 'pl': [u'disambig'],
--tsca
On http://meta.wikimedia.org/wiki/A_new_look_at_the_interwiki_link
someone has posted an article claiming to have found a solution for
maintaining interwiki links.
There are some interesting points in the discussion, but the writer
firmly believes that "everything will arrange itself" once this plan is
realized. I have some doubts myself...
Anton
Dear list members,
There are a few small things in the current release of the interwiki bot
that I have tried to improve in interwiki.py, family.py and date.py.
They are:
- adding language os
- adding correct syntax of years BC on id, ko
- correcting the reported-changes messages for sk (thanks, Palica) and sv
I have uploaded the modified files to
http://meta.wikimedia.org/wiki/Image:Interwikibot-20050323-20050610.zip.ogg
(rename to *.zip)
Regards,
Anton / Quistnix
Hi,
I guess all of you who have used the interwiki bot have run into
conflicts where e.g. [[en:Mars (planet)]] links to [[xy:Mars]], but
[[xy:Mars]] has recently become a disambiguation page.
In such a case it's a complete mess to clean up, because you have to
manually change the 'Planet Mars' page in all languages and fix all the
bad xy: links. This problem is getting worse as the number of active
Wikipedias rises.
I want to propose a rule that there should never be any interwiki links
between disambiguation pages and non-disambiguation pages. If we
implemented this in PyWikipediaBot, all you would have to do is fix the
bad link to xy: on one wiki and let the bot run on all the others.
Yuri has already created code to recognize disambiguation pages by the
{{disambiguation}} template, and we've got nearly all translations. So
implementing what I proposed would be extremely easy.
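To make the proposed rule concrete, here is a minimal sketch. The helper names (`is_disambig`, `keep_link`) and the template table are my own illustrative assumptions, standing in for Yuri's template-based detection; the real code in the bot will look different.

```python
# Sketch of the proposed rule: never propagate an interwiki link
# between a disambiguation page and a non-disambiguation page.
# DISAMBIG_TEMPLATES and is_disambig() are hypothetical stand-ins
# for the template-based detection already in the bot.

DISAMBIG_TEMPLATES = {
    'en': ['disambiguation', 'disambig'],
    'pl': ['disambig'],
}

def is_disambig(lang, templates):
    """True if any of the page's templates marks it as a disambig page."""
    known = DISAMBIG_TEMPLATES.get(lang, [])
    return any(t.lower() in known for t in templates)

def keep_link(source, target):
    """Keep an interwiki link only if both pages are the same kind."""
    return is_disambig(*source) == is_disambig(*target)

# A disambig page linking to an article would be skipped (or the user
# would be asked, as in the CVS commit quoted above):
print(keep_link(('en', ['Disambiguation']), ('pl', ['Infobox'])))   # False
print(keep_link(('en', ['Disambiguation']), ('pl', ['Disambig'])))  # True
```

In practice the check would hook into the interwiki graph traversal, so a mismatched candidate is either dropped automatically or presented to the operator with a [y/N/never] prompt.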
It should also replace the marking code Yuri wrote, which is causing
some trouble because warnfiles no longer work properly with it.
The downside is that in some rare cases interwiki links between disambig
and non-disambig pages are intentional. But I think the positive effect
outweighs this.
What do you think of it?
Daniel
Dear list members,
Last week, a plan for handling interwiki links more efficiently began to
take shape, and I am posting it here to see if any of you can find a
flaw in it, or want to add more ideas to it.
Here's my plan:
In the current situation, a warnfile has to be produced for each
language and run against every other language. This requires n * (n - 1)
runs for n languages per complete interwiki update.
If all warnfiles were merged before splitting, 2 * n runs would be
enough to do the job, and each run would be shorter than before.
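The savings are easy to quantify. A short illustration (the figure of 50 languages is just an example, not a count from the source):

```python
# Number of bot runs for one complete interwiki update with n languages.

def runs_current(n):
    # Each language produces a warnfile that must be run on
    # every other language separately.
    return n * (n - 1)

def runs_merged(n):
    # One run per language to produce its warnfile, plus one run
    # per language to process its slice of the merged result.
    return 2 * n

n = 50  # illustrative number of active Wikipedias
print(runs_current(n))  # 2450
print(runs_merged(n))   # 100
```

The gap grows quadratically: every new Wikipedia adds 2 runs under the merged scheme, but roughly 2n runs under the current one.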
By merging all warnfiles, all required and broken links are gathered for
all languages. After removing inconsistencies and duplicate entries, the
file is ready to be split into n warnfiles. Each warnfile will then
contain practically *all* needed changes, instead of just a subset of
them, and each page has to be opened only once, even when the changes
were collected from different warnfiles.
If the bot operators for each language upload their most recent,
complete warnfile to a fixed location for that language, overwriting the
previous one, that location will always hold the latest warnfile for the
language. Suppose they do this between the 1st and the 15th of each
month, and inform all other bot operators via the interwiki robot page
when they have done so.
One of the operators downloads all warnfiles, merges them, splits them,
and uploads the results to a different location as soon as all warnfiles
for the month are ready, or on the 15th, whichever comes first. This
operator sends an e-mail and makes a note on the interwiki bot page on
Commons as soon as the new warnfiles are ready. After that, all
operators have until the 1st of the next month to process the warnfile
for their own language. Whenever a warnfile has been processed, its
operator reports this on the interwiki bot page. At the end of the
month, other bot operators can process any files that have not yet been
handled for their language.
This ensures maximum effectiveness of interwiki bot operation.
I'll try to write the merger script (if an experienced bot writer wants
to help, please contact me, as I haven't programmed in Python for more
than a year now), and I'll include a filter to eliminate all no/nb
entries. Those will be sorted out anyway as soon as a page has to be
updated.
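The no/nb filter could be as simple as the sketch below. It reuses the same hypothetical (source, target language, target page) entry tuples as an assumed warnfile representation; the real filter would operate on whatever format the merger script settles on:

```python
# Sketch of the no/nb filter: drop warnfile entries whose target is
# Norwegian Bokmål under either language code, since those links get
# sorted out when the page is next updated anyway.
NORWEGIAN_CODES = {'no', 'nb'}

def drop_no_nb(entries):
    """Remove entries targeting the no or nb Wikipedias."""
    return [e for e in entries if e[1] not in NORWEGIAN_CODES]

entries = [
    ('en:Mars (planet)', 'no', 'Mars'),
    ('en:Mars (planet)', 'nb', 'Mars'),
    ('en:Mars (planet)', 'sv', 'Mars'),
]
print(drop_no_nb(entries))  # [('en:Mars (planet)', 'sv', 'Mars')]
```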
Regards
Anton