On 11/1/06, Travis Derouin travis@wikihow.com wrote:
You're right: looking at the logs, it is more like 1.5 minutes. Funny! I still hear complaints even about that small amount of downtime, though. The effective downtime is probably longer, because incoming connections get backed up while the dump runs, and there's probably a bottleneck working through them once the dump has finished.
Our dump, uncompressed, is 1.2 GB for one wiki.

_______________________________________________
Wikitech-l mailing list
Wikitech-l@wikimedia.org
http://mail.wikipedia.org/mailman/listinfo/wikitech-l
That does sound sort of slow.
As Tim points out, the ultimate solution to this is a slave DB, which you can stop for the dump without affecting the primary that is serving the live wiki.
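A minimal sketch of that approach, assuming replication to a slave is already set up; the host name and database name here (slave.example.org, wikidb) are placeholders, not anything from the thread:

```shell
# Pause the SQL thread on the slave so the dump sees a frozen,
# consistent snapshot; the master keeps serving the live wiki.
mysql -h slave.example.org -e 'STOP SLAVE SQL_THREAD;'

# Dump from the stopped slave (wikidb is an illustrative DB name).
mysqldump -h slave.example.org --quick wikidb > wikidb.sql

# Resume replication; the slave catches up from its relay log.
mysql -h slave.example.org -e 'START SLAVE SQL_THREAD;'
```

Since the slave is the only server locked, readers and editors on the live wiki never see the pause.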
Are you doing the dump-to-/tmp trick others mentioned earlier in the thread? A RAM disk is far faster than a real disk for dump speed, if you have the RAM available... and 1.2 GB isn't all that much RAM these days.
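For reference, a rough sketch of the RAM-disk variant, with illustrative mount point, size, and database name (nothing here is from the thread):

```shell
# Mount a tmpfs (RAM-backed) filesystem big enough for the ~1.2 GB dump.
mkdir -p /mnt/dumptmp
mount -t tmpfs -o size=2g tmpfs /mnt/dumptmp

# Dump to the RAM disk first, so the write phase is as fast as possible,
# then compress onto durable storage afterwards at leisure.
mysqldump --quick wikidb > /mnt/dumptmp/wikidb.sql
gzip -c /mnt/dumptmp/wikidb.sql > /var/backups/wikidb.sql.gz
rm /mnt/dumptmp/wikidb.sql
```

The point is that the lock-holding phase only lasts as long as the dump itself, so writing it to RAM instead of spinning disk shortens the window.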