Hi all,
I am beginning work on a port to PHP due to some issues regarding unit
testing for another project of mine (if you follow me on GitHub, you will
know). I hope to help out with fixing the script, but it is a good idea to
get someone who knows Python (pywikipedia-l people) and the MediaWiki API
(mediawiki-api people) to help.
On Fri, Nov 9, 2012 at 6:27 PM, Federico Leva (Nemo) <nemowiki(a)gmail.com> wrote:
It's completely broken:
https://code.google.com/p/wikiteam/issues/detail?id=56
It will download only a fraction of the wiki, 500 pages at most per
namespace.
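(For anyone picking this up: the symptom matches a missing continuation loop.
api.php caps list=allpages at 500 titles per request, 5000 for bots, so the
query has to be repeated with the continuation value until the list runs out.
A minimal sketch of what the fix might look like, assuming a JSON-speaking
api.php endpoint, and using the requests library for brevity where the real
script uses urllib:

    import requests

    def all_page_titles(api_url, namespace=0):
        """Yield every title in a namespace, following API continuation."""
        params = {
            'action': 'query',
            'list': 'allpages',
            'apnamespace': namespace,
            'aplimit': 500,  # server-side cap per request
            'format': 'json',
        }
        while True:
            data = requests.get(api_url, params=params).json()
            for page in data['query']['allpages']:
                yield page['title']
            # Older MediaWiki returns continuation under 'query-continue';
            # newer releases use a top-level 'continue' object instead.
            cont = (data.get('continue')
                    or data.get('query-continue', {}).get('allpages'))
            if not cont:
                break
            params.update(cont)  # apcontinue/apfrom for the next batch
)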
Let me reiterate that
https://code.google.com/p/wikiteam/issues/detail?id=44
is a very urgent bug and we've seen no work on it in many months. We need
an actual programmer with some knowledge of Python to fix it and make the
script work properly; I know there are several on this list (and
elsewhere), please please help. The last time I, as a non-coder, tried to
fix a bug, I made things worse (
https://code.google.com/p/wikiteam/issues/detail?id=26).
Only after the API support is implemented/fixed will I be able to re-archive
the 4-5 thousand wikis we've recently archived on archive.org
(https://archive.org/details/wikiteam)
and possibly many more. Many of those dumps contain errors and/or are just
partial because of the script's unreliability, and wikis die on a daily
basis. (So, quoting emijrp, there IS a deadline.)
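(If the API work amounts to replacing the Special:Export scraping with
api.php calls, note that action=query already takes an export parameter
that emits the same XML schema as the dumps. A minimal sketch; the endpoint
and title below are placeholders, requests again stands in for urllib, and
this returns only the latest revision of each page, so full-history dumps
would still need Special:Export's history mode:

    import requests

    def export_pages_xml(api_url, titles):
        """Fetch an XML export (current revisions) of a batch of titles."""
        params = {
            'action': 'query',
            'titles': '|'.join(titles),  # roughly 50 titles per request
            'export': 1,
            'exportnowrap': 1,  # bare <mediawiki> XML, no API wrapper
        }
        resp = requests.get(api_url, params=params)
        resp.raise_for_status()
        return resp.text

    # Hypothetical endpoint, for illustration only:
    # print(export_pages_xml('https://example.org/w/api.php', ['Main Page']))
)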
Nemo
P.S.: Cc'ing some lists out of desperation; sorry for cross-posting.
--
Regards,
Hydriz