I am doing a research project for which I needed to download the English Wikipedia HTML dump (about 15 GB) from the link below:

http://static.wikipedia.org/downloads/2008-06/en/wikipedia-en-html.tar.7z

My problem is with extracting it: when I tried, it extracted only about 32 GB, whereas it is supposed to yield around 208 GB of data. Please help me with how to extract it fully.
You probably want wikitech-l for this question. cc'd there.
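In case it helps in the meantime: with archives this large, one thing worth ruling out is the intermediate .tar hitting a tool or filesystem limit (FAT32, for instance, cannot hold files over 4 GB), so streaming the 7z output straight into tar, without ever writing the ~200 GB .tar to disk, is worth a try. A minimal sketch in Python, assuming the 7z and tar command-line tools are installed and the archive filename from your message:

    import subprocess

    # Stream the archive through 7-Zip's stdout mode ("-so") straight
    # into tar, so the intermediate ~200 GB .tar is never written to
    # disk. Assumes `7z` and `tar` are on the PATH and the archive is
    # in the current directory (filename from the original post).
    extractor = subprocess.Popen(
        ["7z", "x", "-so", "wikipedia-en-html.tar.7z"],
        stdout=subprocess.PIPE,
    )
    untar = subprocess.Popen(["tar", "-xf", "-"], stdin=extractor.stdout)
    extractor.stdout.close()  # let tar see EOF when 7z finishes
    untar.communicate()
    if extractor.wait() != 0:
        raise RuntimeError("7z reported an error during extraction")

The shell equivalent is simply: 7z x -so wikipedia-en-html.tar.7z | tar xf -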
- d.
2008/9/8 jay mehta jmenjoy05@yahoo.com:
> [...]