On Tue, Jun 2, 2009 at 4:46 PM, Henny Savenije webmaster@henny-savenije.pe.kr wrote:
No, but if you have two servers and the wiki is under the same domain name on both, the only thing you have to do is regularly copy all the files and databases to the standby server. If the active server goes down, you change the name servers at the registrar.
Technically it would be possible to have two identical wikis on two different domains on two different servers, but that would be a similar story.
You could use this program to back up the database and FTP the files to the other server automatically: http://www.mysqldumper.de/en/
And of course you will have to set up some cron jobs to copy files from one server to the other (wget, pack, unpack, etc.: back up on one server and restore on the other).
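The cron-job pipeline described above might look roughly like this. This is only a sketch: the hostnames, paths, database name, and credentials are all placeholders, and you would adapt each step to your own layout.

```shell
#!/bin/sh
# Nightly backup of a MediaWiki install, pushed to the standby server.
# All hostnames, paths, and credentials are placeholders.
set -e
STAMP=$(date +%Y%m%d)
BACKUP_DIR=/var/backups/wiki
mkdir -p "$BACKUP_DIR"

# 1. Dump the wiki database in one consistent snapshot.
mysqldump --single-transaction -u wikiuser -p'secret' wikidb \
  | gzip > "$BACKUP_DIR/wikidb-$STAMP.sql.gz"

# 2. Pack the wiki files (LocalSettings.php, images/, extensions/).
tar czf "$BACKUP_DIR/wikifiles-$STAMP.tar.gz" -C /var/www mediawiki

# 3. Copy both archives to the standby server.
rsync -a "$BACKUP_DIR/" standby@wiki2.example.org:/var/backups/wiki/
```

Scheduled from crontab, e.g. `0 3 * * * /usr/local/bin/wiki-backup.sh`, with a matching restore job on the standby side.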
At 08:33 AM 6/3/2009, you wrote:
Is it possible to make MediaWiki highly available?
I have two servers and I would like the wiki to stay available in case one server goes down.
Has anyone done this before?
MediaWiki is essentially a specialized web application layer, sitting between the raw web server and a SQL database on the back end that handles the long-term storage.
There are three things which have to be running:
1. Web servers + MediaWiki software - trivially parallelizable, plus a web load balancer on the front end.
2. SQL server(s) - MySQL is easily clusterable.
3. Media storage - easily replicated or externalized (NFS mounts, optionally from a clustered NFS server).
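For point 2, MySQL's built-in asynchronous replication is the usual starting point: the standby server follows the primary's binary log. On the replica it is configured roughly as below; the host, credentials, and log coordinates are placeholders you would take from `SHOW MASTER STATUS` on the primary.

```shell
# On the replica, point it at the primary (all values are placeholders):
mysql -u root -p <<'SQL'
CHANGE MASTER TO
  MASTER_HOST='wiki1.example.org',
  MASTER_USER='repl',
  MASTER_PASSWORD='secret',
  MASTER_LOG_FILE='mysql-bin.000001',
  MASTER_LOG_POS=4;
START SLAVE;
SQL
```

MediaWiki itself can then be told about the replica (for read traffic) via `$wgDBservers` in LocalSettings.php.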
What degree of HA do you desire? Simple avoidance of single points of failure? Multiple-failure tolerance?
You can go all the way from "two web/MediaWiki servers plus a separate MySQL/NFS server on the back end" to a redundant network with clustered everything, geographically distributed to tolerate the failure of a single datacenter or internet connection. The cost multiplier for the first is very small; for the latter it is a whole lot.
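At the cheap end of that range, failover can be as simple as a probe that picks the first reachable server. A minimal sketch, assuming hypothetical hostnames and a pluggable probe command (in production the probe would be something like `curl -fs "http://$1/index.php"`):

```shell
#!/bin/sh
# pick_server PROBE_CMD HOST... -> prints the first host the probe accepts.
pick_server() {
  probe="$1"; shift
  for host in "$@"; do
    if "$probe" "$host" >/dev/null 2>&1; then
      echo "$host"
      return 0
    fi
  done
  return 1
}

# Stub probe for illustration: pretends wiki1 is down, wiki2 is up.
probe_stub() { [ "$1" != "wiki1.example.org" ]; }

pick_server probe_stub wiki1.example.org wiki2.example.org
```

The same shape works whether the "switch" is updating a load-balancer pool, a DNS record, or the registrar's name servers as suggested earlier in the thread.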