On Friday 18 August 2006 10:03, Carsten Marx wrote:
Hello to all,
I have a problem making an offline copy of my own local wiki.
The scenario: all the admin-related material and the whole documentation of our infrastructure is stored on a webserver, so the document describing how to restore that webserver if it crashes lives on the very same webserver... Therefore I need an offline copy of my local wiki. (I know this is not a perfect solution, and there is a backup of the wiki, but I need a simple way to keep the important documents on my local computer.)
What about mirroring the wiki? Dump the database and upload it to another machine at regular intervals (once a day, say)?
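Something along these lines could work; a rough sketch only, and the database name, user, password and hostnames are placeholders you would have to adapt to your own setup:

  #!/bin/sh
  # dump the MediaWiki database and ship the compressed dump to a second machine
  mysqldump --user=wikiuser --password=secret wikidb | gzip > /var/backups/wikidb.sql.gz
  scp /var/backups/wikidb.sql.gz backupbox:/var/backups/

  # /etc/crontab entry: run the script every night at 03:00
  0 3 * * *  root  /usr/local/bin/wiki-dump.sh

Restoring is then just a matter of creating an empty database and piping the dump back in with the mysql client.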
On the other hand, if your entire datacenter goes down, then a hard copy would be a happy thing.
<snip>
o Several alternative parsers: see http://meta.wikimedia.org/wiki/Alternative_parsers for more information. I tried the HTML2FPDF and MediaWiki article (http://meta.wikimedia.org/wiki/HTML2FPDF_and_Mediawiki) but I did not get it working. It is also not good that you have to change some core files of the MediaWiki installation. The other projects are IMHO not 'ready', or their aim is something different.
We are working on something with a table-of-contents file (a parts list, essentially) that contains the articles we want to load, in the order we want/need them printed. By adding printable=yes to the URL we strip out all of the junk, parse the file a little more and then create a PDF.
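Roughly like this; only a sketch, and it assumes htmldoc is installed and that articles.txt lists one article title per line in the desired print order (the base URL here is made up):

  #!/bin/sh
  # fetch each listed article in its printable form, then bind them into one PDF
  BASE="http://mydomain.com/mywikidirectory/index.php"
  n=0
  while read TITLE; do
      n=$(expr $n + 1)
      wget -q -O "$(printf '%03d' $n).html" "$BASE?title=$TITLE&printable=yes"
  done < articles.txt
  htmldoc --webpage -f wiki.pdf [0-9][0-9][0-9].html

The zero-padded filenames keep the pages in list order when htmldoc picks them up.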
o wget to mirror the wiki site: I also tried mirroring the wiki with 'wget -m http://mydomain.com/mywikidirectory/' (I also tried the URL http://mydomain.com/mywikidirectory/index.php/MainSite), but it does not only mirror the wiki, it also mirrors the whole site at 'http://mydomain.com/'. Why? Can I customize the wget command so that it only mirrors pages under 'http://mydomain.com/mywikidirectory/'?
That'll just get you the articles as files on your local system. Although it is probably useful in an emergency, there are better alternatives.
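Still, to answer the wget question: by default wget follows every link it finds, including the absolute links to skins, images and the rest of the site, which is why the whole domain gets pulled in. The --no-parent option keeps the recursion below the starting directory, and --convert-links rewrites the links so the copy is browsable offline. Untested against your installation, so treat the exact options as a starting point:

  wget --mirror --no-parent --convert-links --page-requisites \
       http://mydomain.com/mywikidirectory/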