2008/9/8 jay mehta jmenjoy05@yahoo.com:
I am doing a research project for which I needed to download the 15 GB English Wikipedia HTML dump from http://static.wikipedia.org/downloads/2008-06/en/wikipedia-en-html.tar.7z. My problem is with extracting it: when I tried, it only extracted about 32 GB, whereas it is supposed to give me around 208 GB of data. Please help
me with how to extract it.
You're right: it's 208 GB and 321 MB uncompressed. That looks to me like a limitation of the filesystem rather than of the archive. Which filesystem is used on that partition?
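A quick way to check, and a sketch of how you might extract it (assuming a Linux box with p7zip and GNU tar installed; the paths here are placeholders):

```shell
# See which filesystem backs the target directory.
# FAT32, for example, caps individual files at 4 GiB, and the Windows
# format tool only creates FAT32 volumes up to 32 GB, so either limit
# could plausibly stop an extraction around the sizes you describe.
df -T /path/to/target

# Stream the .tar.7z straight into tar so no intermediate 208 GB
# tar file is ever written to disk:
#   -so  tells 7z to write the decompressed tar stream to stdout
#   -C   tells tar where to unpack
7z x -so wikipedia-en-html.tar.7z | tar -xf - -C /path/to/target
```

If `df -T` reports vfat (FAT32), reformatting the partition as ext3/NTFS (or extracting to a different partition) should let the full 208 GB come out.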