On 18 February 2014 20:41, Antoine Musso <hashar+wmf@free.fr> wrote:
On 18/02/2014 08:18, Philip Neustrom wrote:
The latest details on their technical infrastructure indicated that Snowden used "web crawler" software (love the quotes) to obtain information from their internal wiki:
http://www.nytimes.com/2014/02/09/us/snowden-used-low-cost-tool-to-best-nsa....
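In rough terms, "web crawler" software here just means a program that fetches a page, collects its links, and repeats until it has walked the whole site. A minimal illustrative sketch in Python follows; the seed URL is a placeholder, and this is only a guess at the general technique, not the actual tool described in the article:

    from collections import deque
    from html.parser import HTMLParser
    from urllib.parse import urljoin, urlparse
    from urllib.request import urlopen

    class LinkCollector(HTMLParser):
        """Collect href targets from every <a> tag on a page."""
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        self.links.append(value)

    def crawl(seed, limit=100):
        """Breadth-first crawl of same-host pages starting from `seed`."""
        host = urlparse(seed).netloc
        queue, seen = deque([seed]), {seed}
        while queue and len(seen) <= limit:
            url = queue.popleft()
            try:
                html = urlopen(url).read().decode("utf-8", errors="replace")
            except OSError:
                continue  # unreachable page: skip it
            # A real tool would save or index `html` here.
            parser = LinkCollector()
            parser.feed(html)
            for link in parser.links:
                absolute = urljoin(url, link)
                if urlparse(absolute).netloc == host and absolute not in seen:
                    seen.add(absolute)
                    queue.append(absolute)

    crawl("http://intranet.example/wiki/Main_Page")  # placeholder seed URL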
Hello,
From a usability point of view, I wonder why he had to rely on a web crawler to export the whole wiki as HTML. For those wondering, you could use:
https://www.mediawiki.org/wiki/Extension:DumpHTML, a maintenance script which generates an HTML version of your wiki for archival purposes
https://www.mediawiki.org/wiki/Extension:Pdf_Book to assemble articles into a single document exported as PDF.
MediaWiki also has Special:Export, but it only exports wikitext.
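For instance, a single page's wikitext can be fetched as XML with a few lines of Python (a minimal sketch; the base URL and page title are arbitrary examples):

    from urllib.request import Request, urlopen

    # Special:Export/<Title> returns an XML dump of the page; the <text>
    # element inside it holds raw wikitext, not rendered HTML.
    url = "https://www.mediawiki.org/wiki/Special:Export/MediaWiki"
    req = Request(url, headers={"User-Agent": "export-sketch/0.1"})
    with urlopen(req) as resp:
        xml = resp.read().decode("utf-8")
    print(xml[:500])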
cheers,
Because all the articles are just speculation?