I need to load up a few hundred MW pages with text and images.
Looking at http://en.wikipedia.org/w/api.php, I find several tools for getting information about images, but nothing for uploading them:
* prop=images (im)
* prop=imageinfo (ii)
* list=allimages (ai)
* list=imageusage (iu)
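The read side seems well covered; for instance, a quick script along these lines (just a sketch, with a placeholder image title) pulls metadata via prop=imageinfo without trouble:

    import json
    from urllib.parse import urlencode
    from urllib.request import urlopen

    API = "http://en.wikipedia.org/w/api.php"

    def image_info(title):
        """Query prop=imageinfo for one image and return the parsed JSON."""
        params = urlencode({
            "action": "query",
            "titles": title,        # e.g. "Image:Example.jpg" (placeholder)
            "prop": "imageinfo",
            "iiprop": "url|size",
            "format": "json",
        })
        with urlopen(f"{API}?{params}") as resp:
            return json.load(resp)

    print(image_info("Image:Example.jpg"))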
Am I missing something?
More generally, are there any tools other than pywikipediabot and the MW API that I should look into?
-r
Rich Morin schreef:
I need to load up a few hundred MW pages with text and images.
Looking at http://en.wikipedia.org/w/api.php, I find several tools for getting information about images, but nothing for uploading them:
- prop=images (im)
- prop=imageinfo (ii)
- list=allimages (ai)
- list=imageusage (iu)
Am I missing something?
In a way, yes. You can export wiki pages (not sure about images, though) to an XML file using Special:Export, and import them into another wiki using Special:Import. You need the import right (sysops only by default) to use Special:Import, though.
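For instance, something like this fetches the export XML for a list of titles in a single request (a rough sketch: the URL and parameter names mirror what the Special:Export form submits, so double-check them against your wiki):

    from urllib.parse import urlencode
    from urllib.request import urlopen

    # Assumed form target of Special:Export; adjust for your wiki's URL layout.
    EXPORT_URL = "http://en.wikipedia.org/w/index.php?title=Special:Export&action=submit"

    def export_pages(titles, outfile="dump.xml"):
        """POST a newline-separated list of page titles to Special:Export
        and save the resulting XML dump."""
        data = urlencode({
            "pages": "\n".join(titles),  # one title per line
            "curonly": "1",              # current revisions only
        }).encode()
        with urlopen(EXPORT_URL, data=data) as resp, open(outfile, "wb") as out:
            out.write(resp.read())

    export_pages(["Main Page", "Wikipedia:Sandbox"])  # placeholder titles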
Roan Kattouw (Catrope)
At 20:29 +0200 9/21/08, Roan Kattouw wrote:
Rich Morin schreef:
Am I missing something?
In a way, yes. You can export wiki pages ...
Thanks for the pointer, but it looks like it still requires me to manually import 200+ pages.
-r
Rich Morin schreef:
At 20:29 +0200 9/21/08, Roan Kattouw wrote:
Rich Morin schreef:
Am I missing something?
In a way, yes. You can export wiki pages ...
Thanks for the pointer, but it looks like it still requires me to manually import 200+ pages.
You can export a virtually unlimited* number of pages into a single .xml file, and import it in one go as well.
Roan Kattouw (Catrope)
* In practice, there are limits on file size and script execution time (typically 30 seconds), of course, so you won't be able to import/export, say, a million pages.
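If you do run into those limits, splitting the title list into batches keeps each request small; a minimal sketch, reusing the hypothetical export_pages helper from the earlier message:

    def export_in_batches(titles, batch_size=50):
        """Export titles in fixed-size batches so no single dump
        runs into the file-size or execution-time limits."""
        for n, start in enumerate(range(0, len(titles), batch_size)):
            export_pages(titles[start:start + batch_size],
                         outfile=f"dump_{n:03d}.xml")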