Brad Jorsch, 09/11/2012 17:30:
> On Fri, Nov 9, 2012 at 7:59 AM, Hydriz Wikipedia <admin(a)alphacorp.tk> wrote:
>>
>> You mentioned "a while back" for "apcontinue"; how recent was it? This dump
>> generator is attempting to archive all sorts of versions of MediaWiki,
>> unless we write a backward compatibility handler in the script itself.
>
> July 2012: http://lists.wikimedia.org/pipermail/mediawiki-api-announce/2012-July/00003…
>
> Any wiki running version 1.19, or a 1.20 snapshot from before
> mid-July, would be returning the old parameter. If you do it right,
> though, there's little you have to do. Just use whichever keys are
> given you inside the <query-continue> node. Even with your regular
> expression mess, just capture which key is given as well as the value
> and use it as the key for your params dict.
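For illustration, Brad's suggestion here might look like this in Python (the `params` dict and the sample XML below are assumptions for the sketch, not the script's actual code):

```python
import re

# Hypothetical sketch: capture the continuation key *and* its value,
# then echo both back in the next request's parameters.
xml = '<query-continue><allpages apcontinue="Foo_bar" /></query-continue>'
params = {'action': 'query', 'list': 'allpages'}

m = re.search(r'<allpages (apfrom|apcontinue)="([^"]+)"', xml)
if m:
    key, value = m.group(1), m.group(2)
    # Works on both 1.19 wikis (apfrom) and 1.20+ wikis (apcontinue),
    # because we reuse whatever key the server itself returned.
    params[key] = value
```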
Thank you again for your useful suggestions!
However, as already noted,
https://www.mediawiki.org/wiki/API:Query#Continuing_queries doesn't give
any info about supported releases.
Nemo
P.s.: Small, supposedly "temporary" things in MediaWiki, like the
"powered by MediaWiki" sentence we grep for, usually turn out to be the
most permanent ones, much as I dislike it.
Hydriz Wikipedia, 09/11/2012 16:59:
> You mentioned "a while back" for "apcontinue"; how recent was it? This
> dump generator is attempting to archive all sorts of versions of
> MediaWiki, unless we write a backward compatibility handler in the
> script itself.
+1
https://www.mediawiki.org/wiki/API:Allpages ,
https://www.mediawiki.org/wiki/API:Lists and
https://www.mediawiki.org/wiki/API:Query#Continuing_queries don't really
shed any light.
> ...and I agree, the code is in a total mess. We need to get someone to
> rewrite the whole thing, soon.
Well, that's in an ideal world. In this one, the best we can hope for is
probably suggestions for simple libraries to use for solving such small
problems? (Which can become very big if one doesn't follow API evolution
very closely or know its history from the beginning of time.)
Nemo
> On Fri, Nov 9, 2012 at 11:50 PM, Brad Jorsch wrote:
>
> You're searching for the continue parameter as "apfrom", but this was
> changed to "apcontinue" a while back. Changing line 162 to something
> like this should probably do it:
>
> m = re.findall(r'<allpages (?:apfrom|apcontinue)="([^>]+)" />', xml)
>
> Note that for full correctness, you probably should omit both apfrom
> and apcontinue entirely from params the first time around, and send
> back whichever of the two is found by the above line in subsequent
> queries.
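The omit-then-echo pattern Brad describes could be sketched as a loop like this (the `api_get` helper is hypothetical; only the continuation handling is the point):

```python
import re

def fetch_all_pages(api_get):
    """Hypothetical loop: api_get(params) returns the api.php XML response."""
    params = {'action': 'query', 'list': 'allpages',
              'aplimit': '500', 'format': 'xml'}
    # First request: send neither apfrom nor apcontinue.
    while True:
        xml = api_get(params)
        yield xml
        m = re.search(r'<allpages (apfrom|apcontinue)="([^"]+)"', xml)
        if not m:
            break  # no continuation node: we have everything
        # Drop any stale continuation key, then echo back whichever one
        # the server returned (apfrom on 1.19, apcontinue on 1.20+).
        params.pop('apfrom', None)
        params.pop('apcontinue', None)
        params[m.group(1)] = m.group(2)
```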
>
> Also, why in the world aren't you using an XML parser (or a JSON
> parser with format=json) to process the API response instead of trying
> to parse the XML using regular expressions?!
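As a rough illustration of the parser-based approach Brad recommends, Python's bundled `xml.etree.ElementTree` module would be enough; the sample response below is made up for the sketch:

```python
import xml.etree.ElementTree as ET

# A minimal fake api.php response (format=xml).
response = ('<api><query-continue>'
            '<allpages apcontinue="Foo_bar" />'
            '</query-continue></api>')
root = ET.fromstring(response)

params = {}
node = root.find('./query-continue/allpages')
if node is not None:
    # Copy whatever continuation attributes the server sent,
    # whether apfrom (1.19) or apcontinue (1.20+).
    params.update(node.attrib)
```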
>
> On Fri, Nov 9, 2012 at 2:27 AM, Federico Leva (Nemo)
> <nemowiki(a)gmail.com> wrote:
> > It's completely broken:
> > https://code.google.com/p/wikiteam/issues/detail?id=56
> > It will download only a fraction of the wiki, 500 pages at most per
> > namespace.
>
> _______________________________________________
> Mediawiki-api mailing list
> Mediawiki-api(a)lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/mediawiki-api
>
> --
> Regards,
> Hydriz
>
> We've created the greatest collection of shared knowledge in history.
> Help protect Wikipedia. Donate now: http://donate.wikimedia.org
>
>
It's completely broken:
https://code.google.com/p/wikiteam/issues/detail?id=56
It will download only a fraction of the wiki, 500 pages at most per
namespace.
Let me reiterate that
https://code.google.com/p/wikiteam/issues/detail?id=44 is a very urgent
bug and we've seen no work on it in many months. We need an actual
programmer with some knowledge of Python to fix it and make the script
work properly; I know there are several on this list (and elsewhere), so
please, please help. The last time I, as a non-coder, tried to fix a
bug, I made things worse
(https://code.google.com/p/wikiteam/issues/detail?id=26).
Only after the API support is implemented/fixed will I be able to
re-archive the 4-5 thousand wikis we've recently archived on archive.org
(https://archive.org/details/wikiteam), and possibly many more. Many of
those dumps contain errors and/or are only partial because of the
script's unreliability, and wikis die on a daily basis. (So, quoting
emijrp, there IS a deadline.)
Nemo
P.s.: Cc'ing some lists out of desperation; sorry for cross-posting.