Hi,
We've been using mysqldump to do daily full database backups in case
the hardware on our DB server fails. This causes a problem: for a
short period of four minutes or so, the site is inaccessible because
mysqldump has the database locked.
I'm not too familiar with the maintenance/dumpPages.xml script, but
that script doesn't back up the whole database, including user
accounts, recent changes, links, etc., does it? And even if it does,
it probably doesn't avoid the problem of having to lock the DB for a
few minutes, right?
If mysqldump is still the answer (I'm using the --quick option), are
there any other ways we can avoid this brief downtime while capturing
a backup? How does Wikipedia do this?
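For reference, the workaround I've read about is mysqldump's
--single-transaction flag, which takes a consistent snapshot inside a
transaction instead of locking tables — but only for transactional
engines like InnoDB; MyISAM tables would still need locks. A sketch of
what I mean (user, password, database name, and paths are placeholders,
not our real setup):

```shell
# Nightly dump sketch, assuming all tables are InnoDB.
# --single-transaction: consistent read via a transaction, no table locks
# --quick: stream rows to the client instead of buffering whole tables
mysqldump --single-transaction --quick \
    -u backup_user -p'secret' wikidb \
    | gzip > /var/backups/wikidb-$(date +%F).sql.gz
```

The other approach I've seen mentioned is dumping from a replication
slave, so the master is never locked at all — but I don't know if
that's overkill for our setup.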
Thanks a lot,
Travis