I am doing a research project for which I needed to download the 15 GB English Wikipedia dump from the link below:

http://static.wikipedia.org/downloads/2008-06/en/wikipedia-en-html.tar.7z

My problem is with extracting it. When I tried, the extraction produced only about 32 GB of data, whereas it is supposed to yield around 208 GB. How can I extract the archive fully?
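In case it matters, this is the kind of one-pass extraction I would expect to work, streaming the decompressed tar straight into `tar` so no intermediate 208 GB `.tar` file is written (command names assumed: p7zip's `7z` and GNU `tar`):

```shell
# -so writes the decompressed stream to stdout;
# tar reads the archive from stdin (-) and unpacks it
# into the current directory.
7z x -so wikipedia-en-html.tar.7z | tar xf -
```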