It's just for search results, so maybe just the intro paragraph? Or maybe you could request either?
Dian mentioned merging it all into a single request, but that would be pretty resource intensive and have a pretty low cache hit rate. I foresee the use case eventually being fairly high-traffic.
You can see what I am using it for (the add_media_wizard) by adding: importScriptURI('http://mvbox2.cse.ucsc.edu/w/extensions/MetavidWiki/skins/external_media_wiz...'); to your monobook.js user page.
Then edit a page, click on the wizard button on the left, then click on the "list" layout option in the search results... and notice all the wikitext... it would look nicer as plain HTML (I will probably have to strip the HTML down to preserve only a few tags so the formatting stays consistent... but that can be done in JavaScript; see the sketch below).
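Something like this is what I have in mind for the stripping step (just a rough, untested sketch; the tag whitelist below is only an example, not what the wizard actually uses):

    // Strip returned HTML down to a small tag whitelist so the
    // formatting stays consistent. The allowed set is only an example.
    function stripToAllowedTags( html ) {
        var allowed = { A: true, B: true, I: true, P: true, BR: true };
        var container = document.createElement( 'div' );
        container.innerHTML = html;
        // Copy the live node list into a static array before modifying the DOM.
        var live = container.getElementsByTagName( '*' );
        var nodes = [];
        for ( var i = 0; i < live.length; i++ ) {
            nodes.push( live[ i ] );
        }
        for ( var j = 0; j < nodes.length; j++ ) {
            var node = nodes[ j ];
            if ( !allowed[ node.nodeName ] ) {
                // Keep the element's contents but drop the tag itself.
                while ( node.firstChild ) {
                    node.parentNode.insertBefore( node.firstChild, node );
                }
                node.parentNode.removeChild( node );
            }
        }
        return container.innerHTML;
    }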
peace, --michael
Roan Kattouw wrote:
Michael Dale wrote:
Is it possible to parse multiple pages at once, or to get your search results as HTML?
For example, I run something like: http://commons.wikimedia.org/w/api.php?format=jsonfm&action=query&ge...
And I get all the results ("revisions": ["*"]) as wikitext... ideally I could get those results as HTML. Is there any way to do that?
You can't get them all at once, currently, no. You could run them through action=parse one by one, but you seem to imply you already knew that.
I'll look into the feasibility of parsing multiple pages at once (do you want to parse all pages in their entirety, or just parts of them?) when I have time, which probably won't be this week or next (busy times).
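For what it's worth, the one-by-one approach would look roughly like this from JavaScript (an untested sketch; the function and its arguments are made up for illustration, and it assumes same-domain requests and the usual action=parse JSON output):

    // Fetch rendered HTML for each title via action=parse, one request per page.
    function parseTitles( apiUrl, titles, done ) {
        var parsed = {};
        var remaining = titles.length;
        for ( var i = 0; i < titles.length; i++ ) {
            ( function ( title ) {
                var xhr = new XMLHttpRequest();
                xhr.open( 'GET', apiUrl + '?action=parse&format=json&page=' +
                    encodeURIComponent( title ), true );
                xhr.onreadystatechange = function () {
                    if ( xhr.readyState !== 4 ) {
                        return;
                    }
                    var result = JSON.parse( xhr.responseText );
                    // The rendered HTML comes back under parse.text['*'].
                    parsed[ title ] = result.parse.text[ '*' ];
                    if ( --remaining === 0 ) {
                        done( parsed );
                    }
                };
                xhr.send( null );
            }( titles[ i ] ) );
        }
    }

    // e.g. parseTitles( '/w/api.php', [ 'Foo', 'Bar' ], function ( htmlByTitle ) { ... } );

That is of course one request per page, which is exactly the overhead you are trying to avoid for a high-traffic use case.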
Roan Kattouw (Catrope)