As another suggestion, XOWA (http://gnosygnu.github.io/xowa/) can generate a list of thumbs. It takes about 60 hours to parse the English Wikipedia dump and generate a table of 4.78 million rows with the following columns:
* file name
* file extension
* repo (commons or local)
* file width
* file height
* thumbtime (for video)
* page (for djvu / pdf)
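For illustration, here is a rough sketch of how one might read that generated list with Python's sqlite3 module, assuming the list ends up in a SQLite table. The database path, table name ("thumb"), and column names below are placeholders I made up to match the list above, not XOWA's actual schema, so check the real output format first.

# Sketch only: read a few rows of the generated thumb list.
# The database path, table name, and column names are assumptions for
# illustration; inspect the actual XOWA output before relying on them.
import sqlite3

conn = sqlite3.connect("xowa_thumb_list.sqlite3")
try:
    rows = conn.execute(
        "SELECT file_name, file_ext, repo, file_w, file_h, thumbtime, page "
        "FROM thumb LIMIT 10"
    )
    for row in rows:
        print(row)
finally:
    conn.close()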
There's more information in XOWA at home/wiki/Help:Import/Command-line/Thumbs . I can provide more details online or offline if you're interested.
If you need the actual thumb files, you can download XOWA databases from https://archive.org/details/Xowa_enwiki_latest . They hold about 5 million thumbs stored in SQLite tables. It should be straightforward to write code to pull the blobs from the database and save them to disk, along the lines of the sketch below.
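As a rough sketch only: the table and column names here (file_data, file_name, file_bin) are assumptions for illustration, not XOWA's actual schema. Run .schema in the sqlite3 shell against a downloaded database to get the real names, then adapt something like this:

# Sketch: extract thumb blobs from an XOWA SQLite database and write them to disk.
# NOTE: the table name (file_data) and column names (file_name, file_bin) are
# assumptions for illustration; check the real schema (.schema in the sqlite3
# shell) and adjust before running.
import os
import sqlite3

def dump_thumbs(db_path, out_dir):
    os.makedirs(out_dir, exist_ok=True)
    conn = sqlite3.connect(db_path)
    try:
        cur = conn.execute("SELECT file_name, file_bin FROM file_data")
        for name, blob in cur:
            # Flatten any subdirectories in the stored name to keep the example simple.
            target = os.path.join(out_dir, name.replace("/", "_"))
            with open(target, "wb") as f:
                f.write(blob)
    finally:
        conn.close()

if __name__ == "__main__":
    dump_thumbs("xowa_enwiki_file.sqlite3", "thumbs")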
Otherwise, as others have indicated, I know of no MediaWiki way to get this information (via .sql dump file or by api.php). Since XOWA parses wikitext, it can generate the information easily, though the solution is not officially a MediaWiki one.
Hope this helps.
On Sun, Sep 13, 2015 at 1:43 AM, wp mirror <wpmirrordev@gmail.com> wrote:
- Context
I am currently developing new features for WP-MIRROR (see <https://www.mediawiki.org/wiki/Wp-mirror>).
- Objective
I would like WP-MIRROR to generate all image thumbs during the mirror build process, so that MediaWiki can render pages quickly using precomputed thumbs.
- Dump importation
maintenance/importDump.php - computes thumbs during importation, but is too slow.
mwxml2sql - loads databases quickly, but does not compute thumbs.
- Question
Is there a way to compute all the thumbs after loading databases quickly with mwxml2sql?
Sincerely Yours,
Kent