Hi,
I created an openZIM file from the German Wikipedia dump I received from Josch
last year. The file contains all articles from the German Wikipedia, without
images, and its size is 1.3 GB (or more precisely 1,302,052,315 bytes). Generation
took only about 1:10 on our server with the new zimwriter. It could be improved
further by parallelizing the compression phase, since that is CPU-bound and
takes most of the time, but I feel it is not necessary; there are more
important tasks to do.
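As an aside, the parallelization would be straightforward because compressed
clusters are independent of each other. A minimal sketch of the idea in Python
(the real zimwriter is C++; names like compress_cluster are illustrative, not
actual zimwriter functions):

```python
import lzma
from multiprocessing import Pool

def compress_cluster(data: bytes) -> bytes:
    # CPU-bound step: each cluster compresses independently,
    # so the work can be spread across cores.
    return lzma.compress(data)

def compress_all(clusters, workers=4):
    # Pool.map preserves input order, so the compressed clusters
    # can be written out in the same sequence as the originals.
    with Pool(workers) as pool:
        return pool.map(compress_cluster, clusters)

if __name__ == "__main__":
    clusters = [("article %d " % i).encode() * 1000 for i in range(8)]
    out = compress_all(clusters)
    # Round-trip check: decompressing gives back the original data
    assert all(lzma.decompress(c) == d for c, d in zip(out, clusters))
```

The same pattern (a worker pool over independent clusters) would apply to a
threaded C++ implementation; the sequential write of the index and cluster
offsets would stay single-threaded.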
You can download the file from
http://www.openzim.org/download/dewiki.zim.
The zimreader (the tntnet-based web application) almost works with that
file. There are some bugs left to fix, but that will be done soon.
Emmanuel: the file is updated. I fixed some bugs: zimDump crashed when
reading redirects, and the writer failed to generate redirects correctly.
Josch: do you have an updated dump?
Tommi