Jakob, thanks for the suggestions.
Do you use XML::SAX in Perl, or how do you parse the dump?
Home-grown parser code, fast and simple. It should survive most potential updates to the XML schema, but it can break, so XML purists will object to it, probably rightly so.
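For illustration only, a minimal sketch of that kind of line-based scan (not the actual wikistats code) could look like the following; it assumes <title> and <revision> tags each land on their own line in the dump, which the current export format happens to do but the schema does not guarantee:

#!/usr/bin/perl
# Minimal sketch of a line-based dump scan, NOT the wikistats parser:
# counts pages and revisions by matching tags with simple regexes.
use strict;
use warnings;

my $file = shift @ARGV or die "usage: $0 pages-meta-history.xml\n";
open my $in, '<', $file or die "cannot open $file: $!\n";

my ( $pages, $revisions ) = ( 0, 0 );

while ( my $line = <$in> ) {
    # a new page starts with its title on one line
    if ( $line =~ m{<title>(.*?)</title>} ) {
        $pages++;
    }
    # each revision element marks one stored edit
    elsif ( $line =~ m{<revision>} ) {
        $revisions++;
    }
}
close $in;

print "$pages pages, $revisions revisions\n";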
Will there also be new CSV files? The old ones have not changed: http://www.wikipedia.org/wikistats/csv/csv.zip
Fixed for the Wikipedias; for the other projects after the next run.
Another thing: every time I want to visit the stats I have to search because the base directories are not readable: http://www.wikipedia.org/wikistats/ http://www.wikipedia.org/wikistats/EN/
I added an index.html file on both levels.
Your home page http://eza.gemm.nl/ still points to http://members.chello.nl/epzachte/Wikipedia/Statistics/
It now points to http://www.wikipedia.org/wikistats/
So how about creating an HTTP redirect or something like that from http://meta.wikipedia.org/wiki/Wikistats, so everybody can document the scripts there?
I added a link at the bottom of each page.
Cheers, Erik