There seems to be a problem with the current jawiki dump: the complete-history dump is only 4.3 GB, while the previous one was 19 GB.
Another issue: according to http://wikitech.wikimedia.org/view/Dumps#Worker_nodes there should be 3 threads for the large dumps, but for the past few days only 2 threads have been running.
For folks who have not been following the saga:
we were able to get the raid array back in service last night on the XML
data dumps server, and we are now busily copying data off of it to
another host. There's about 11T of dumps to copy over; once that's done
we will start serving these dumps read-only to the public again.
Because the state of the server hardware is still uncertain, we don't
want to do anything that might put the data at risk until that copy has
completed.
The replacement server is on order and we are watching that closely.
We have also been working on deploying a server to run one round of
dumps in the interim.
Thanks for your patience (which is a way of saying, I know you are all
out of patience, as am I, but hang on just a little longer).
I have some questions for you, and I don't think they will be hard to answer.
What do you know about the new Wikipedia dumps?
When will they be refreshed and available for download?
I've heard that you can be given the opportunity to order your own dump, but I don't know how to do that. Can you help me?
And lastly:
I think they are searching for volunteers and mirrors for their projects. Could you support them and start mirroring some languages?