Yes, so our issues here are probably the lack of coordination among bot owners and the
memory usage. Shouldn't we write some simple script that can automatically free old
memory used by the interwiki script? DaB's idea was okay with me, except that one of its
points was that no one else could run the interwiki script anymore, which seems
ridiculous to me.
Maybe the MMP can be used to ensure that there are no overlapping bots? All interwiki bot
owners would join this project, pick an available wiki that no one has claimed, and ask
for clearance to run their own interwiki bot there.
Regards,
Hydriz
From: valhallasw(a)arctus.nl
Date: Mon, 16 Jan 2012 09:19:19 +0100
To: toolserver-l(a)lists.wikimedia.org
Subject: Re: [Toolserver-l] interwiki.py
2012/1/16 Hydriz Wikipedia <admin(a)wikisorg.tk>:
> Personally, I rather we wait for the Pywikipedia
> devs to fix that script,
This is not going to happen anytime soon. Considering the state of the
code base (two hundred exceptions for three hundred wikis, long
functions and no automated testing - and thus practically untestable),
and the state of the InterLanguage extension ('will be installed
soon'), no one is really willing to invest a lot of time in tracking
memory usage and reducing it.
The only reasonable action we can take to reduce the memory
consumption is to let the OS do its job in freeing memory: using one
process to track pages that have to be corrected (using the database,
if possible), and one process to do the actual fixing (interwiki.py).
This should be reasonably easy to implement (i.e. use a pywikibot page
generator to generate a list of pages, use a database layer to track
interlanguage links, and popen('interwiki.py <page>') if this is a
fixable situation).
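A minimal sketch of the fixing half of that split might look like the following. It assumes the tracker process has already produced a list of page titles and a predicate deciding which ones are fixable; the `is_fixable` callback, the `runner` parameter, and the exact interwiki.py invocation are illustrative assumptions, not the actual Toolserver setup (pywikibot scripts do accept a `-page:` argument, but the command line here is only a guess at how it would be wired up):

```python
import subprocess


def build_fix_command(page_title, script="interwiki.py"):
    """Build the argv for running interwiki.py on a single page.

    Spawning a fresh, short-lived process per page lets the OS reclaim
    all memory when the process exits, instead of one long-running bot
    accumulating it.
    """
    # A separate argv element per argument avoids shell quoting issues.
    return ["python", script, "-page:%s" % page_title]


def fix_pages(page_titles, is_fixable, runner=subprocess.call):
    """Run one interwiki.py process per fixable page.

    `is_fixable` stands in for the tracker process's decision (e.g. a
    database lookup of interlanguage links); `runner` is injectable so
    the spawning logic can be tested without interwiki.py installed.
    """
    handled = []
    for title in page_titles:
        if is_fixable(title):
            runner(build_fix_command(title))
            handled.append(title)
    return handled
```

The key design point is simply that the memory-hungry work lives in a child process whose lifetime is one page, so no explicit memory tracking inside interwiki.py is needed.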
Best,
Merlijn
_______________________________________________
Toolserver-l mailing list (Toolserver-l(a)lists.wikimedia.org)
https://lists.wikimedia.org/mailman/listinfo/toolserver-l
Posting guidelines for this list:
https://wiki.toolserver.org/view/Mailing_list_etiquette