Hello everybody,
I work at the computer science department of the University of Konstanz in Germany. We are developing a versioned native XML database. Wikipedia is therefore an ideal playground when it comes to huge amounts of data, since the XML dump is a perfect fit for our application.
At the moment I am looking for a recent dump of the enwiki that contains all revisions. I know that this XML file must be really huge, but that's exactly why we want to use it. Unfortunately, I couldn't find any file called "pages-meta-history" in the enwiki download section. Could you help me with a dump, an idea of how to get the data, ...?
Greetings,
Sebastian
--------------------------------------------------
Sebastian Graf
Distributed Systems Lab
University of Konstanz
Phone: +49 7531 88 4319
Mail: sebastian.graf@uni-konstanz.de