On 3/24/06, Stefan F. Keller sfkeller@hsr.ch wrote:
On March 24, 2006 2:57 AM Christopher Beland wrote:
I think you may be looking in the wrong place; the dumps moved sometime in the past few months. dewiki dumps from March are linked from: http://download.wikimedia.org/dewiki/20060320/
You can check http://download.wikimedia.org for periodic updates.
Thank you for the kind hint. I'm new here, but I assume you are all aware of what it means for us to have to read in the dump instead of having programmatic access to dewiki? It means setting up a rather non-repeatable process (unstable pathnames, MediaWiki versions, etc.) that takes more than 30 hours of processing time once it gets running...
Write software which scans the dump directly; don't import it into MediaWiki. If you're looking for particular data, this works fairly well.
On March 23, 2006 10:09 PM Gregory Maxwell wrote:
The compression really isn't a problem; if text access is ever restored, I will help you write code to read the compressed data... it is easy.
Who is responsible for restoring text access? When is this estimated to happen?
Your guess is as good as mine: I asked (posted to both this list and the developers list) a month ago and didn't even receive a reply.
If someone is in charge here at all, then they are asleep at the wheel.
I've yet to even hear an update on why it's gone. The most recent excuse I've seen thrown out is that MySQL can't replicate from multiple masters... but that should be a non-issue: we could run multiple instances of MySQL on separate ports.
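For what it's worth, running a second mysqld on its own port is mostly a matter of giving it a separate datadir, socket, port, and server-id. A sketch of the idea — all paths and values here are hypothetical, not taken from the Wikimedia setup:

```ini
# /etc/my2.cnf -- hypothetical config for a second MySQL instance
[mysqld]
port      = 3307
socket    = /var/run/mysqld/mysqld2.sock
datadir   = /var/lib/mysql2
server-id = 2
```

The second instance would then be started with something like `mysqld_safe --defaults-file=/etc/my2.cnf &`, and each instance can replicate from a different master, sidestepping the single-master limitation.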