Is there a way to replicate an installation of MediaWiki, including the database? We would like to build an additional system for disaster recovery purposes, so the backup system should be able to replicate data back to the primary system.
If anyone has any working installation I would love to hear how you are doing it.
Thanks, -Ed
Hello,
I am back, still struggling.
I installed a trial wiki site behind the firewall and had a hard time with "export/import" of pages. Someone told me that I needed to have both sites "seeing" each other in order to export/import. Now I did the same thing to install a MediaWiki site connected to the internet directly (in other words, not walled off by the firewall). At the last step of the installation I need to open "http://localhost" in order to set up the MediaWiki. However, the Red Hat Enterprise Linux Test Page keeps coming up instead of the expected MediaWiki setup page.
Has anybody experienced this problem before? What did I do wrong? How do I avoid this test page?
Nelson
Is the wiki installed to a subfolder of your public_html folder (or equivalent) rather than the main folder? If so, you need to go to http://localhost/subfolder.
Hello,
That is exactly the problem. Thank you all for the great help. I really appreciate all of you experts. Have a nice weekend.
Nelson
"Thomas Dalton" thomas.dalton@gmail.com Sent by: mediawiki-l-bounces@lists.wikimedia.org 09/20/2007 05:48 PM Please respond to MediaWiki announcements and site admin list mediawiki-l@lists.wikimedia.org
To "MediaWiki announcements and site admin list" mediawiki-l@lists.wikimedia.org cc
Subject Re: [Mediawiki-l] How to complete installation (was How to copy a page)
I installed a trial wikisite behind the firewall and had hard time to 'export/import" pages. Someone told me that I needed to have both sites "seeing" each other in order to "export/import". Now I did the
samething
to install a mediawiki site connected to internet directly, in other
words
not walled by the firewall). At the last step of installation I need to run "http://localhost" in order to setup the mediawiki. However, the
Red
Hat Enterprise Linux Test Page keep on coming up instead of the expected mediawiki setup page.
Has anybody experienced this problem before? What did I do wrong? How
do
I avoid this test page?
Is the wiki installed to a subfolder of your public_html folder (or equivalent) rather than the main folder? If so, you need to go to http://localhost/subfolder.
_______________________________________________ MediaWiki-l mailing list MediaWiki-l@lists.wikimedia.org http://lists.wikimedia.org/mailman/listinfo/mediawiki-l
Hi Ed,
I use a different approach to disaster recovery: my wiki runs on a virtual server (based on VMware Server 1.0.3) which hosts only the wiki and nothing else. As a backup I keep several copies (one on another hard disk of the host server and two on the two hard disks of another backup host server) of the directory which "makes up" the virtual server. At night I shut down the VM via script, synchronize the four directories via rsync and restart the VM. This way, I lose at most one day of changes.

I'm not sure whether the VMware snapshot functionality could be used to back up a running virtual server, because I haven't investigated that possibility (yet). But I guess it should be possible to take a snapshot and back that up via rsync at a much higher frequency...

As my wiki is an internal wiki with under 30 users (most of them not actively adding to the wiki, to boot) and no need for 24/7 availability, I didn't need to investigate the performance and availability issues that may come with running on a virtual server. But I guess there are others reading this list who can add their experience in that field. ;-)
In short: I've moved the task of data-backup to server-backup via VMware. ;-)
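A minimal sketch of such a nightly script might look like the following. All paths, host names and the .vmx file name are placeholders, and the vmware-cmd calls assume VMware Server's command-line tool is installed; adapt to your own setup:

  #!/bin/bash
  # Nightly wiki-VM backup: stop the VM, mirror its directory, start it again.
  # Paths, hosts and the .vmx file below are placeholders for this sketch.

  VMX=/var/lib/vmware/wiki-vm/wiki.vmx
  SRC=/var/lib/vmware/wiki-vm/
  DESTS="/backup/wiki-vm/ backuphost1:/backup/wiki-vm/ backuphost2:/backup/wiki-vm/"

  vmware-cmd "$VMX" stop soft            # shut the guest down cleanly

  for DEST in $DESTS; do
      rsync -a --delete "$SRC" "$DEST"   # mirror the whole VM directory
  done

  vmware-cmd "$VMX" start                # bring the wiki back up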
Greetings Kate
I am not as subtle as Kate in my backup yet: I just shut down my VMware Player virtual machine for a few minutes, back up the whole thing (less than 4.5 GB) to a USB hard drive, and then boot up again.
MySQL's binlog functionality is one way of replicating the database continually to another server. The idea is that you have one master database server and any number of slave database servers that continually read the binlog the master writes (which contains all the changes that occur in the master database) over a TCP/IP connection to the master server. It requires root access ("admin access" on Windows) to the server where the wiki is running, but if you have that it should be fairly easy to set up.
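To make that a bit more concrete, a rough sketch of the usual master/slave setup is below. The server ids, the replication user/password, the master host name and the log coordinates are just placeholders, and you should check the MySQL replication documentation for the full procedure:

  # On the master, in my.cnf:
  #   [mysqld]
  #   log-bin=mysql-bin
  #   server-id=1
  #
  # On the slave, in my.cnf:
  #   [mysqld]
  #   server-id=2

  # Create a replication account on the master (hypothetical user/password):
  mysql -u root -p -e "GRANT REPLICATION SLAVE ON *.* TO 'repl'@'%' IDENTIFIED BY 'secret';"

  # Load a consistent dump of the wiki database into the slave first, then
  # point the slave at the master (file/position from SHOW MASTER STATUS at
  # the time of the dump) and start replicating:
  mysql -u root -p -e "CHANGE MASTER TO MASTER_HOST='master.example.com',
      MASTER_USER='repl', MASTER_PASSWORD='secret',
      MASTER_LOG_FILE='mysql-bin.000001', MASTER_LOG_POS=4;
    START SLAVE;"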
Regards, Samuel
... and then one requires an rsync-like process to handle the file uploads. jld.
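For MediaWiki that essentially means mirroring the images/ directory of the installation, where the uploads live. A minimal example, assuming the wiki sits under /var/www/html/wiki on both machines and is pulled from a hypothetical host primary.example.com, run from cron on the standby box:

  # Pull the upload directory from the live wiki to the standby machine.
  rsync -az --delete primary.example.com:/var/www/html/wiki/images/ /var/www/html/wiki/images/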