There's still room for optimization, and it would be nice to get some
of the Special Pages back. (Erik Moeller)
Special pages like longest articles, orphans, etc. could easily be extracted from an SQL dump; I'll write a Perl script if there is capacity to run it and daily dumps to feed into it.
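To make that concrete, here is a rough sketch of the kind of thing I have in mind, assuming the dump has been loaded back into a local MySQL database; the database name, credentials and the cur column names are placeholders for whatever the real schema and setup turn out to be:

#!/usr/bin/perl
# Rough sketch only: list the longest main-namespace articles from a dump
# loaded into a local MySQL database.  Database name, credentials and
# column names are assumptions; adjust to the actual schema of the dump.
use strict;
use warnings;
use DBI;

my $dbh = DBI->connect('DBI:mysql:database=wikidb;host=localhost',
                       'wikiuser', 'secret', { RaiseError => 1 });

my $rows = $dbh->selectall_arrayref(q{
    SELECT cur_title, LENGTH(cur_text) AS len
    FROM cur
    WHERE cur_namespace = 0
      AND cur_is_redirect = 0
    ORDER BY len DESC
    LIMIT 50
});

print "== Longest articles ==\n";
printf "%8d  %s\n", $_->[1], $_->[0] for @$rows;

$dbh->disconnect;

Orphans and the other list pages would work the same way, just with different queries against the loaded dump.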
All discussions so far center on speed; what about robustness? In the current situation a disk crash is disastrous. Remember that just recently no dump was made for 18 days. So will this be cured when a replicated server is introduced? Also, Brion mentioned: "unless we lock the wiki the cur and old databases will be inconsistent in the backup. (Which is in fact the present state of the backups for the English wiki.)"
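For what it's worth, the locking Brion mentions could be done by the dump itself: mysqldump's --lock-tables option takes a read lock on every table it is about to dump, so cur and old come out of the same moment, at the cost of stalling writes (effectively locking the wiki) while the dump runs. A minimal sketch, with database and user names as placeholders:

#!/usr/bin/perl
# Sketch only: dump cur and old under one read lock so they are
# consistent with each other.  Names below are placeholders.
use strict;
use warnings;

my $out = "wikidb-consistent-" . time() . ".sql";
my $cmd = "mysqldump --lock-tables --user=wikiuser --password=secret "
        . "wikidb cur old > $out";
system($cmd) == 0 or die "mysqldump failed: $?\n";
print "wrote $out\n";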
One of the advantages of RAID (the 'I' stands for 'inexpensive') is the possibility of introducing redundancy, so that a crashed disk can be replaced on the fly and its content rebuilt from the other disks.
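To illustrate the rebuild: parity-based RAID levels such as RAID 5 keep one XOR parity block per stripe, so any single lost block is just the XOR of all the surviving ones. A toy sketch (data strings stand in for disk blocks):

#!/usr/bin/perl
# Toy illustration of XOR parity as used by RAID 5: one parity block per
# stripe means any single lost block can be rebuilt by XOR-ing the
# surviving blocks -- which is how a swapped-in disk gets refilled.
use strict;
use warnings;

my @blocks = ('data-on-disk-0', 'data-on-disk-1', 'data-on-disk-2');

# Compute the parity block (bitwise XOR over all data blocks).
my $parity = "\0" x length $blocks[0];
$parity ^= $_ for @blocks;

# Pretend disk 1 died; rebuild its block from parity plus the survivors.
my $lost    = 1;
my $rebuilt = $parity;
$rebuilt ^= $blocks[$_] for grep { $_ != $lost } 0 .. $#blocks;

print "rebuilt block: $rebuilt\n";   # prints "data-on-disk-1"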
Erik Zachte
wikitech-l@lists.wikimedia.org