Brion Vibber wrote:
> And last but not least - if the dumps don't work, then it is very
> important to be able to dump some articles with their full histories
> in other fashions. I make my plea again: do you know who put in the
> block so that export only allows 100 revisions? Any way to hack that?
> Would it be possible to make an exception to get the data for a
> research study?
That was originally done because buffering would cause longer exports to fail. The export code has since been changed so it should skip buffering, so the limit could possibly be lifted. I'll take a peek.
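(For illustration only, a minimal sketch of the general difference, not the actual MediaWiki export code: a buffered export builds the whole document in memory before sending anything, so memory grows with history length, while a streaming export writes each revision as it is read.)

    # Illustrative sketch only -- not MediaWiki's code.
    import sys

    def export_buffered(revisions):
        # Accumulates the whole document in memory before returning it;
        # memory use grows with the number of revisions, so a very long
        # history can make the export fail.
        chunks = ["<page>"]
        for rev in revisions:
            chunks.append("<revision>%s</revision>" % rev)
        chunks.append("</page>")
        return "".join(chunks)

    def export_streaming(revisions, write):
        # Emits each revision as soon as it is read; memory use stays
        # flat no matter how long the history is.
        write("<page>")
        for rev in revisions:
            write("<revision>%s</revision>" % rev)
        write("</page>")

    export_streaming(("r%d" % i for i in range(3)), sys.stdout.write)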
Currently the limit isn't applied to GET requests -- you either get only the current version or the full history. Interesting. :)
I'm not entirely sure it's supposed to do that; the code for handling input is a little funky at the moment. :)
wget 'http://en.wikipedia.org/wiki/Special:Export/Bay_Area_Rapid_Transit?history=1'
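If you want to check what comes back programmatically, a rough Python sketch along these lines should work. It fetches the same URL as above and counts revisions; the export XML wraps each revision in a <revision> element under a versioned xmlns, so it matches on the local tag name:

    # Rough sketch: fetch the full-history export over GET and count
    # the revisions returned.
    import urllib.request
    import xml.etree.ElementTree as ET

    url = ("http://en.wikipedia.org/wiki/Special:Export/"
           "Bay_Area_Rapid_Transit?history=1")
    with urllib.request.urlopen(url) as resp:
        tree = ET.parse(resp)

    # Tags come back as "{namespace}revision", so match the local name.
    revisions = [el for el in tree.iter() if el.tag.endswith("}revision")]
    print("revisions exported:", len(revisions))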
-- brion