I have written a Perl script that parses the SQL dumps (cur & old) and generates an HTML file containing lots of info about Wikipedians, articles, and the database for each month since the project started.
Please note that the script produces historical growth figures per Wikipedia based on the *new (link) counting system*, right from the first month. A report for the English Wikipedia is now available as well.
The script parses 6 GB of data in 33 minutes on my 1.2 GHz PC, which imho is not too bad.
http://members.chello.nl/epzachte/Wikipedia/Statistics
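
For those curious about the link-based counting rule, here is a minimal sketch in Perl of the kind of test involved. The tab-separated toy input, the field layout, and the exact rule shown (main namespace, not a redirect, at least one internal link) are my simplifications for illustration; they are not the actual code of the script, which works on the full cur & old dumps rather than a toy stream.

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Illustration only: a page is treated as a countable article when it
    # is in the main namespace, is not a redirect, and contains at least
    # one internal link ("[[...]]").
    sub counts_as_article {
        my ($namespace, $text) = @_;
        return 0 unless $namespace == 0;        # main namespace only
        return 0 if $text =~ /^#REDIRECT/i;     # redirects never count
        return 0 unless $text =~ /\[\[/;        # must contain an internal link
        return 1;
    }

    # Toy input: one record per line, "timestamp<TAB>namespace<TAB>text",
    # with MediaWiki-style timestamps (YYYYMMDDHHMMSS).
    my %per_month;
    while (my $line = <STDIN>) {
        chomp $line;
        my ($timestamp, $ns, $text) = split /\t/, $line, 3;
        next unless defined $text;
        next unless counts_as_article($ns, $text);
        my $month = substr($timestamp, 0, 6);   # YYYYMM
        $per_month{$month}++;
    }

    print "$_\t$per_month{$_}\n" for sort keys %per_month;
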
I propose to run this script weekly on the new SQL dumps for all Wikipedias and put the resulting HTML files in a public folder.
ToDo:
- Unicode support
- prepare a consolidated report for all Wikipedias
- prepare a CSV file, e.g. for import into Excel (for graphics) (see the sketch below)
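
As an illustration of the last ToDo item, the CSV export could be as simple as one row per month. The column names and the dummy figures below are made up purely to show the kind of file Excel would import:

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Hypothetical sketch of the planned CSV export: one row per month.
    # The contents of %stats are dummy figures, only there to show the format.
    my %stats = (
        '200101' => { wikipedians => 10, articles => 25,  db_mb => 1 },
        '200102' => { wikipedians => 60, articles => 600, db_mb => 4 },
    );

    open my $csv, '>', 'wikistats.csv' or die "cannot write wikistats.csv: $!";
    print {$csv} "month,wikipedians,articles,database_mb\n";
    for my $month (sort keys %stats) {
        my $s = $stats{$month};
        print {$csv} join(',', $month, $s->{wikipedians}, $s->{articles}, $s->{db_mb}), "\n";
    }
    close $csv;
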
Erik Zachte