Hello,
I recently discovered Kiwix and the ZIM file format. It seems like a really great project for offline use, even if there are still a lot of challenges.
For now, as a traveller, I would like to have a recent and usable dump of some wikis (Wikipedia, Wikitravel, wiki.couchsurfing, Wikivoyage, ...) and I find it really hard to achieve.
As a reference, I found:
- http://en.wikipedia.org/wiki/Wikipedia:Database_download
- http://www.kiwix.org/index.php/Main_Page
- http://wikitravel.org/en/Wikitravel:Offline_Reader_Expedition
- http://www.wikivoyage.org/tech/Database_dumps
But in most cases, the best I can get is:
- a MediaWiki XML archive
- as a last resort, mirroring the website with httrack or wget (making sure to copy only one language edition: only en, fr, de, ...), roughly along the lines of the command sketched below
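For example, to mirror only the English part of Wikitravel, I imagine something roughly like this would do (just a sketch, I have not checked that these are the ideal options):

  # placeholder target; restrict the mirror to the /en/ language path
  wget --mirror --convert-links --adjust-extension --page-requisites \
       --no-parent --include-directories=/en/ \
       http://wikitravel.org/en/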
In this case, are there any available scripts to build a ZIM file? I have checked http://www.kiwix.org/index.php/Tools but, as I'm travelling, I don't have time to perform each atomic operation and would really prefer a batch.
Anyway, thanks a lot for your work. I can't wait to have it available on different devices (computer, smartphone, ...) with all the wikis :)
Cheers
Julien
Hi Julien,
On 02/03/2011 10:12 AM, Julien T wrote:
I recently discovered Kiwix and the ZIM file format. It seems like a really great project for offline use, even if there are still a lot of challenges.
For now, as a traveller, I would like to have a recent and usable dump of some wikis (Wikipedia, Wikitravel, wiki.couchsurfing, Wikivoyage, ...) and I find it really hard to achieve.
Getting fresh ZIM files easily from such MediaWiki web sites is one of our current challenges. It is currently not easy to do... even for the developers involved in openZIM.
For the Wikimedia Foundation web sites, I have a way which is not too complicated... for me. For the other ones, this has to be investigated on a case-by-case basis.
As a reference, I found:
- http://en.wikipedia.org/wiki/Wikipedia:Database_download
- http://www.kiwix.org/index.php/Main_Page
- http://wikitravel.org/en/Wikitravel:Offline_Reader_Expedition
- http://www.wikivoyage.org/tech/Database_dumps
But in most cases, the best I can get is:
- a MediaWiki XML archive
- as a last resort, mirroring the website with httrack or wget (making sure to copy only one language edition: only en, fr, de, ...)
If you have the static HTML pages, I can easily make a ZIM file from them.
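For reference, once a static HTML tree exists, the ZIM creation itself is roughly a single command with the openZIM tooling; something along these lines (tool name, options and all metadata values here are only placeholders and depend on the version at hand, so take this purely as a sketch):

  # all paths and metadata below are placeholders to adapt
  zimwriterfs --welcome=index.html --favicon=favicon.png \
              --language=eng --title="Wikitravel (en)" \
              --description="Offline copy of Wikitravel" \
              --creator="Wikitravel" --publisher="Kiwix" \
              ./wikitravel.org/en/ wikitravel_en.zim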
In this case, are there any available scripts to build a ZIM file? I have checked http://www.kiwix.org/index.php/Tools but, as I'm travelling, I don't have time to perform each atomic operation and would really prefer a batch.
I think there is no ready-made solution for you... and in any case, a batch that goes over thousands of web pages does not seem to me to be the best solution.
Anyway, thanks a lot for your work. I can't wait to have it available on different devices (computer, smartphone, ...) with all the wikis :)
If you really want a ZIM file, and you have time, you may make a feature request to the Kiwix team here: http://requestafeature.kiwix.org
Thank you for your feedback.

Emmanuel