Stian Haklev wrote:
> The developer himself suggested packing the Wikipedia dump file with
> something like this: 7z a -mx -ms=32M -m0d=32M archive *
The static HTML dumps are already packed with -ms=8m, a chunk size which I found during testing to give a good tradeoff between compression ratio and random access speed. But perhaps my testing was biased; you're not the first person to complain about it. Reducing the chunk size to, say, 2-4MB might be a good move, but increasing it to 32MB would be a step in the wrong direction.
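For illustration, a repack along those lines might look like the following (the archive name and input files are placeholders, and 4m is just one point in the 2-4MB range):

  # Solid blocks of 4MB: smaller blocks mean less data to decompress
  # per random access, at some cost in compression ratio.
  7z a -t7z -ms=4m archive.7z *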
I'm not sure what -mx is meant to do; the manual implies that option is meant to be followed by a number. Presumably -m0d=32M is meant to set the dictionary to 32MB. I'm not sure what good that would do when the chunk size is far below the default dictionary size. Less memory usage, perhaps?
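If memory is the concern, the saving would come from shrinking the dictionary to the chunk size rather than raising it above it. A sketch, again with placeholder names:

  # 4MB LZMA dictionary matched to 4MB solid blocks; since each solid
  # block is compressed independently, a dictionary larger than the
  # block cannot see more history anyway, so this saves RAM without
  # hurting the compression ratio.
  7z a -t7z -m0=lzma -m0d=4m -ms=4m archive.7z *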
-- Tim Starling