Eric K wrote:
I have two questions about database issues. My wiki's SQL
database is now 800MB.
1) The wiki is hosted on a shared server, and I have to email support to create a backup
that I can download. I can no longer export it myself from the phpMyAdmin interface
because it's too big and the export times out.
There are probably several solutions that can back up large databases without
draining resources heavily (instead just taking more time to complete, which is fine).
Such a solution would also have to break the work into a number of processes so it
doesn't exceed the execution time limit on shared servers.
So my question is: are there any good, popular solutions for backing up large databases
that you use for backing up your own wiki? For example, I found one via a Google search,
but I don't know if it's reliable.
That solution is not consistent (the data won't be atomically copied),
but the result is probably good enough.
What you can do is launch a mysqldump process, then download the resulting dump file.
Not dumping the cache tables and compressing the dump should help, too.
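The mysqldump approach above could look something like this. It's only a sketch: the database name (wikidb), user (wikiuser), and table prefix (mw_) are assumptions, and the exact cache table names vary between MediaWiki versions.

```shell
# --single-transaction gives a consistent InnoDB snapshot without locking tables,
# which also addresses the atomicity concern; --quick streams rows one at a time
# instead of buffering whole tables in memory. Cache tables are skipped because
# MediaWiki can rebuild them.
mysqldump -u wikiuser -p \
  --single-transaction --quick \
  --ignore-table=wikidb.mw_objectcache \
  --ignore-table=wikidb.mw_querycache \
  wikidb | gzip > wiki-backup-$(date +%F).sql.gz
```

Piping straight into gzip keeps the on-disk footprint small; text-heavy wiki dumps typically compress very well.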
2) Most of the space is taken up by mw_text, which stores the
old revisions. I don't want to delete them, since that would delete the page history,
so I'm looking at other options. One that I saw says we can compress old revisions:
- Note: If the $wgCompressRevisions option is on, new rows (= current revisions) will be
gzipped transparently at save time. Previous revisions can also be compressed by using the
compressOld.php maintenance script.
My question is: will compressing old revisions affect performance or create any other
issues in a shared hosting environment?
I'm guessing it will only have an effect when we
click on a "diff" link. But will that make the script run for, say,
20 seconds and become really slow? Also, will this reduce the database size a
lot or just a little?
I've read somewhere that the 'old_text' blob field could be set to 0 in a
script, to delete the revision text but keep the
historical record of the edit (date and author, though not the actual changes, and that's fine).
I'm debating between these two options.
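For reference, the blanking approach described above might look something like the sketch below. All names are assumptions (the default mw_ table prefix, plus the wikidb/wikiuser placeholders), and the operation is irreversible, so it should only ever be tried against a backup copy.

```shell
# Hypothetical sketch: blank the stored text of every revision that is NOT
# the current revision of some page, keeping the row and its metadata.
# DESTRUCTIVE - run against a copy of the database first.
mysql -u wikiuser -p wikidb <<'SQL'
UPDATE mw_text
SET old_text = ''
WHERE old_id NOT IN (
    -- text rows still referenced by each page's latest revision
    SELECT rev_text_id
    FROM mw_revision
    JOIN mw_page ON page_latest = rev_id
);
SQL
```

The subquery is the important part: without it, the current text of every page would be wiped as well.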
When you compress a revision, it will be decompressed whenever it's needed:
viewing the old revision, generating a diff, etc. Don't worry, decompression
is fast; it won't take 20 seconds.
However, compressing all the old revisions will take a while.
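If the host allows shell access (many shared hosts don't), the compression run is a single maintenance command. The path is standard for MediaWiki; the --type flag is from memory, so check the script's --help output first.

```shell
# Run from the wiki's installation directory on the server.
# Compresses the text of existing old revisions in place.
php maintenance/storage/compressOld.php --type=gzip
```

Because it walks the whole mw_text table, on an 800MB database this can run for a long time; on a shared host it may be safer to run it in batches.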
You can also move the text of old revisions to a different database,
but that may not be available or useful on shared hosting.