[Mediawiki-l] Making an offline [HTML|PDF] copy of my local Wiki

Rob Church robchur at gmail.com
Fri Aug 18 15:49:07 UTC 2006


On 18/08/06, Carsten Marx <carsten at merkste-was.de> wrote:
> o Script 'dumpHtml.php' in the 'maintenance' folder:
> The script works and stores all the wiki pages as static HTML pages
> in a given folder ('php dumpHtml.php -d /my/folder/')
> BUT
> the images are lost if you copy the directory to your local PC (in
> Apache), and the relative links inside the pages are broken (why is
> there '../../' in front of all links?). What is the intention of this
> script? Am I doing something wrong?

The purpose of the script is to dump static HTML versions of all
pages. Maintain the relative relationship between the "images"
directory and the dump directory to ensure that images remain
accessible, i.e. dump to /path/to/wiki/html, then copy the
../wiki/images and ../wiki/html directories to wherever you're
keeping the documentation locally.
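
As a rough sketch (the paths here are only placeholders for your own
wiki root and local destination), that could look like:

    # dump the static HTML next to the wiki's existing images directory
    php maintenance/dumpHtml.php -d /path/to/wiki/html

    # copy both directories together so the relative image links
    # inside the dumped pages still resolve
    cp -r /path/to/wiki/html /path/to/wiki/images ~/wiki-offline/

The point is simply that html and images end up as siblings in the
destination, just as they were on the server.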

> The other projects are imho not 'ready', or their intention is
> something different.

Both.

> Why? Can I customize the wget command so that it only mirrors pages
> from 'http://mydomain.com/mywikidirectory/'?

man wget
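
As an illustration only (check the man page for the options your
version supports), something along these lines keeps the mirror
confined to that directory:

    wget --mirror --no-parent --convert-links --page-requisites \
         http://mydomain.com/mywikidirectory/

--no-parent stops wget from ascending above the given directory, and
--convert-links rewrites the links for local browsing.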


Rob Church

