Is it possible to retrieve content in different Chinese language
variants using the /w/api.php API? There doesn't seem to be a variant or
language parameter that would allow selecting a variant like "zh-tw" or
"zh-hk". Is there some other way to do this?
I checked the auto-generated API docs and the API:* articles but could
not find any hint on how to do this. I'm using an API call like this to
retrieve the article content:
Hopefully someone can give me a hand here...
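For what it's worth, recent MediaWiki versions do accept a variant parameter on action=parse (older installs may ignore it, so treat this as a hedged sketch rather than a guaranteed solution). The helper name and endpoint below are just illustrative:

```python
from urllib.parse import urlencode

# Illustrative helper: build a parse request that asks for a specific
# Chinese variant. The "variant" parameter is accepted by action=parse
# on recent MediaWiki versions; older installs may silently ignore it.
def build_parse_url(endpoint, page, variant):
    params = {
        "action": "parse",
        "page": page,
        "variant": variant,   # e.g. "zh-tw" or "zh-hk"
        "format": "json",
    }
    return endpoint + "?" + urlencode(params)

url = build_parse_url("https://zh.wikipedia.org/w/api.php", "維基百科", "zh-tw")
```

Fetching that URL should return the parsed page text converted to the requested variant, if the wiki's language converter supports it.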
I'm looking for the MediaWiki API, which should be located at
http://wikitravel.org/w/api.php, but I can't seem to find it.
Googling also turned up nothing on the subject. Could it be that Wikitravel
does not have an API? Has it been removed?
Is there an alternate location?
Is there not an "action=query&list=allcategories"? Any plans for
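A list=allcategories query module does exist in the API; a hedged sketch of how a request might be built (the helper name is mine, the "ac"-prefixed parameter names follow the module's convention):

```python
from urllib.parse import urlencode

# Illustrative helper for the allcategories list module. Parameters use
# the module's "ac" prefix (acprefix, aclimit); no network call is made,
# only the request URL is constructed.
def build_allcategories_url(endpoint, prefix=None, limit=10):
    params = {
        "action": "query",
        "list": "allcategories",
        "aclimit": str(limit),
        "format": "json",
    }
    if prefix:
        params["acprefix"] = prefix  # only list categories with this prefix
    return endpoint + "?" + urlencode(params)
```

Paging through the full category list works by repeating the request with the continuation value the server returns.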
As far as I understand, the API should permit creating and editing pages as
described in the proposal found at
In the latest version of the API developed, I did not find this feature.
Is it planned? If yes, when?
Thanks for your help.
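On MediaWiki versions that do ship the edit API, the flow is two requests: fetch a CSRF token, then POST action=edit with it. A hedged sketch (the helper names are mine, and only the request payloads are built, no network calls):

```python
# Hypothetical two-step sketch of page editing on MediaWiki versions
# that include the edit API. Step 1: request a CSRF token.
def token_request_params():
    return {
        "action": "query",
        "meta": "tokens",
        "type": "csrf",
        "format": "json",
    }

# Step 2: POST the edit itself, passing along the token from step 1.
def edit_request_params(title, text, token, summary=""):
    return {
        "action": "edit",
        "title": title,
        "text": text,
        "summary": summary,
        "token": token,   # CSRF token from the first request
        "format": "json",
    }
```

Both requests must share the same session cookies, and the edit must be sent as a POST, not a GET.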
Yuri Astrakhan wrote:
> Congrats :)
Thanks. It took some 5 months to get the whole thing done, and I'm glad
it's more or less finished now. When I have more time (in a few weeks)
I'll look into the apiedit_vodafone branch, which has been inactive for
quite a while if memory serves.
BTW, the apiedit branch has actually been deleted now, in r28187.
Roan Kattouw (Catrope)
I was looking through changes to the apiedit branch and saw a revert that
disabled login tokens. I read the note in SVN explaining why, but I don't
understand the benefit of using just cookies versus using tokens, especially
for robots. I'm not questioning Brion's decision, just wondering if there
was an explanation. Also, I don't understand how to implement his
suggestion: is that just done with cookies now? Thanks.
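As I understand the cookie-based flow being discussed, a robot posts action=login once and then replays the session cookies the server sets on every later request. A hedged sketch of the payload side of that (helper name is mine; later MediaWiki versions reinstated a login token, returned with a "NeedToken" result, so that case is included too):

```python
# Hedged sketch of cookie-based login. Only the payload is built here;
# a real client would POST this and store the Set-Cookie headers (e.g.
# in an http.cookiejar.CookieJar) for all subsequent requests.
def login_params(username, password, token=None):
    params = {
        "action": "login",
        "lgname": username,
        "lgpassword": password,
        "format": "json",
    }
    if token:
        # If the server answers with result "NeedToken", repeat the
        # request including the token it returned.
        params["lgtoken"] = token
    return params
```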
I'm trying to use the MediaWiki API to get information about revisions
to a given page. I understand that it is possible to get the entire
contents of a given revision. However, I would like to know whether it
is possible to get only the diff introduced by a given revision, which
would reduce bandwidth consumption.
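On MediaWiki versions that expose action=compare, a diff between two revision IDs can be requested directly instead of pulling both full texts; a hedged sketch (helper name is mine):

```python
from urllib.parse import urlencode

# Illustrative helper: request the diff between two revisions via the
# compare module, where available. The response contains the diff as
# HTML table rows rather than the full revision texts.
def build_compare_url(endpoint, fromrev, torev):
    params = {
        "action": "compare",
        "fromrev": str(fromrev),
        "torev": str(torev),
        "format": "json",
    }
    return endpoint + "?" + urlencode(params)
```

Note the diff comes back as rendered HTML, so extracting added/removed terms still requires parsing it on the client side.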
My goal is to extract the most frequently used terms in a set of
revisions to a given page. For instance, for the article "Wikipedia",
what were the most common terms across all revisions between 2007-09
and 2007-11?
Thanks in advance for any comments on this issue,