I had two questions about database issues. The SQL database for my wiki is now 800MB.

1) On the shared server where the wiki is hosted, I have to email support to create a backup that I can download. I can no longer download it myself from the phpMyAdmin interface because it's too big and it times out. There are probably solutions that back up large databases without draining resources heavily and instead just take more time to complete, which is fine (it would have to be broken up into a number of processes so it doesn't exceed the execution time limit on shared servers). So my question is: are there any popular, reliable solutions for backing up large databases that you are using for your wiki? For example, I found one with a Google search: http://www.phpclasses.org/package/4017-PHP-Backup-large-MySQL-databases-into... But I don't know if it is reliable.

2) Most of the space is taken up by mw_text, for the old revisions. I don't want to delete them, since that deletes the page history, so I'm looking at other options. One option I saw says we can compress old revisions:

"Note: If the $wgCompressRevisions option is on, new rows (= current revisions) will be gzipped transparently at save time. Previous revisions can also be compressed by using the script compressOld.php"
http://www.mediawiki.org/wiki/Manual:Text_table

My question is: will compressing old revisions affect performance or create any other issues in a shared hosting environment? I'm guessing it will only have an effect when we click on a "diff" link, but will that make it execute the script for, say, 20 seconds and make it really slow? Also, will this reduce the database size a lot or just a little? I've also read somewhere that the 'old_text' blob field could be set to 0 by a script to delete the revision text but keep the historical record of the edit (date and author, but not the actual changes, and that's fine). I'm debating between these two options.

thanks
Eric
On Sun, Sep 26, 2010 at 8:26 AM, Eric K ek79501@yahoo.com wrote:
So my question is: are there any popular, reliable solutions for backing up large databases that you are using for your wiki?
mysqldump.
-Chad
Eric K wrote:
I had two questions about database issues. The SQL database for my wiki is now 800MB.
1) On the shared server where the wiki is hosted, I have to email support to create a backup that I can download. I can no longer download it myself from the phpMyAdmin interface because it's too big and it times out.
There are probably solutions that back up large databases without draining resources heavily and instead just take more time to complete, which is fine (it would have to be broken up into a number of processes so it doesn't exceed the execution time limit on shared servers). So my question is: are there any popular, reliable solutions for backing up large databases that you are using for your wiki? For example, I found one with a Google search: http://www.phpclasses.org/package/4017-PHP-Backup-large-MySQL-databases-into... But I don't know if it is reliable.
That solution does not give you a consistent snapshot (the data won't be copied atomically), but the result is probably good enough.
What you can do is launch a mysqldump process and then download the resulting file. Skipping the cache tables and compressing the dump should help, too.
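If you have shell access or a cron job available, something along these lines should work; the database name, user, and table prefix below are just placeholders for whatever your host gives you:

  mysqldump --single-transaction --quick -u wikiuser -p wikidb \
    --ignore-table=wikidb.mw_objectcache | gzip > wiki-backup.sql.gz

--ignore-table skips the object cache (which MediaWiki rebuilds on its own), --quick keeps mysqldump from buffering whole tables in memory, piping through gzip shrinks the file a lot since wikitext compresses well, and --single-transaction gives a consistent snapshot (for InnoDB tables).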
2) Most of the space is taken up by mw_text, for the old revisions. I don't want to delete them, since that deletes the page history, so I'm looking at other options. One option I saw says we can compress old revisions:
"Note: If the $wgCompressRevisions option is on, new rows (= current revisions) will be gzipped transparently at save time. Previous revisions can also be compressed by using the script compressOld.php"
http://www.mediawiki.org/wiki/Manual:Text_table
My question is: will compressing old revisions affect performance or create any other issues in a shared hosting environment? I'm guessing it will only have an effect when we click on a "diff" link, but will that make it execute the script for, say, 20 seconds and make it really slow? Also, will this reduce the database size a lot or just a little? I've read somewhere that the 'old_text' blob field could be set to 0 by a script to delete the revision text but keep the historical record of the edit (date and author, but not the actual changes, and that's fine). I'm debating between these two options.
When a revision is compressed it will be decompressed whenever it's needed: viewing the old revision, generating a diff, etc. Don't worry, decompression is fast; it won't take 20 seconds. However, compressing all the old revisions in the first place will take a while.
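For reference, the setup is just one line in LocalSettings.php (it needs PHP's zlib extension):

  $wgCompressRevisions = true;  # gzip new revisions transparently at save time

and then a one-off run of compressOld.php to compress the existing text rows. From memory the invocation is something like

  php maintenance/storage/compressOld.php -t gzip

but check the script's location and options for your MediaWiki version. Since it's a command-line script, on shared hosting you'd need to run it via cron or ask support to run it for you.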
You can also move the text of old revisions to a different database (external storage), but that may not be available or useful on shared hosting.
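If you do have a second database available, the relevant settings are roughly the following (from memory, so double-check against the documentation; the cluster name and credentials are placeholders, and the server entries follow the same array format as $wgDBservers):

  $wgExternalServers = array(
      'cluster1' => array(
          array( 'host' => 'localhost', 'dbname' => 'wikiblobs',
                 'user' => 'wikiuser', 'password' => 'secret', 'load' => 1 ),
      ),
  );
  $wgDefaultExternalStore = array( 'DB://cluster1' );

New text then goes to the external store, and there is a maintenance script (maintenance/storage/moveToExternal.php) to migrate the existing rows. But as said, on shared hosting this is usually more trouble than it's worth.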
So my question is: are there any popular, reliable solutions for backing up large databases that you are using for your wiki?
Try MySQLDumper (http://www.mysqldumper.net).
"MySQLDumper is a PHP and Perl based tool for backing up MySQL databases. You can easily dump your data into a backup file and - if needed - restore it. It is especially suited for shared hosting webspaces, where you don't have shell access. MySQLDumper is an open source project and released under the GNU-license."
A PHP script has a maximum execution time that is usually set to 30 seconds on most server installations. A script running longer than this limit will simply stop working. This behavior makes backing up large databases impossible.
MySQLDumper uses a proprietary technique to avoid this problem."
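The general idea is to export a limited number of rows per request and then reload the page to continue, so no single PHP run hits max_execution_time. A minimal sketch of that chunking approach (not MySQLDumper's actual code; credentials, table name and batch size are placeholders):

  <?php
  // Sketch only: dump one table in batches, reloading between batches.
  // Real tools also handle binary columns (e.g. old_text needs hex encoding),
  // table structure, locking, and restores.
  $db     = new mysqli( 'localhost', 'wikiuser', 'secret', 'wikidb' );
  $table  = 'mw_page';
  $batch  = 500;                                        // rows per request
  $offset = isset( $_GET['offset'] ) ? (int)$_GET['offset'] : 0;

  $out = fopen( "backup-$table.sql", $offset ? 'a' : 'w' );
  $res = $db->query( "SELECT * FROM $table LIMIT $offset, $batch" );

  while ( $row = $res->fetch_assoc() ) {
      $vals = array();
      foreach ( $row as $v ) {
          $vals[] = ( $v === null ) ? 'NULL' : "'" . $db->real_escape_string( $v ) . "'";
      }
      fwrite( $out, "INSERT INTO $table VALUES (" . implode( ',', $vals ) . ");\n" );
  }
  fclose( $out );

  if ( $res->num_rows == $batch ) {
      // Not done yet: reload with the next offset before the time limit hits.
      header( 'Refresh: 2; url=' . $_SERVER['PHP_SELF'] . '?offset=' . ( $offset + $batch ) );
      echo 'Dumped up to row ' . ( $offset + $batch ) . ', continuing...';
  } else {
      echo 'Done.';
  }

MySQLDumper's real mechanism is more elaborate, but this is essentially why it survives the 30-second limit on shared hosts.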
It's a very reliable tool which is provided by many hosting providers here in Germany.
hth Frank