Can I get articles rendered as HTML from the API, and if so, how?
I want to get something similar to what you get when you use the
action=render parameter on a normal MediaWiki URL, like:
http://en.wikipedia.org/wiki/Main_Page?action=render
I want to do this via the API, though, because it lets me log in
automatically. (I'm accessing a wiki which requires login.)
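To make it concrete, here is roughly what I have in mind, sketched in Python. (Everything here is an assumption on my part: the endpoint and credentials are placeholders, and I'm guessing that action=parse returns the rendered HTML much like action=render does.)

```python
import http.cookiejar
import json
import urllib.parse
import urllib.request

API_URL = "https://example.org/w/api.php"  # placeholder endpoint


def build_params(action, **extra):
    """Assemble the POST body for an API call; format=json throughout."""
    params = {"action": action, "format": "json"}
    params.update(extra)
    return params


def api_post(opener, params):
    """POST to api.php and decode the JSON response."""
    data = urllib.parse.urlencode(params).encode()
    with opener.open(API_URL, data) as resp:
        return json.load(resp)


def fetch_rendered_html(username, password, title):
    """Log in, then fetch the rendered HTML of one page."""
    # The cookie jar keeps the session cookies set by action=login.
    jar = http.cookiejar.CookieJar()
    opener = urllib.request.build_opener(
        urllib.request.HTTPCookieProcessor(jar))
    result = api_post(opener, build_params(
        "login", lgname=username, lgpassword=password))
    # Some wikis answer "NeedToken" first and want the token echoed back.
    if result["login"]["result"] == "NeedToken":
        result = api_post(opener, build_params(
            "login", lgname=username, lgpassword=password,
            lgtoken=result["login"]["token"]))
    parsed = api_post(opener, build_params("parse", page=title))
    return parsed["parse"]["text"]["*"]  # the rendered HTML
```

If that's on the right track, fetch_rendered_html("MyBot", "secret", "Main_Page") would give me the same markup as action=render, give or take the wrapper markup.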
Best Regards
// Samuel Lampa
---- Nicolas Vervelle <nvervelle(a)numericable.fr> writes:
> Hi,
>
> The <error> tag is documented at http://www.mediawiki.org/wiki/API:Errors
> I am looking for the same kind of documentation for the <warnings> tag.
> Is there any documentation for it?
No, but there should be.
> I have modified my tool to log messages in case of errors reported by
> the API, and I would like to do the same for warnings, but I'd like to
> be sure about the format.
>
> For example, I received the following warning:
> <api>
> <warnings>
> <ApiPageSet>
> Too many values supplied for parameter 'titles': the limit is 50
> </ApiPageSet>
> </warnings>
> ...
>
> Is it always the same format: a code in a tag, with the description as
> text?
Yes. The code in the tag is the name of the module that threw the warning (so your example is a bug: the tag should be <query>), and the warnings themselves are inside the tag, separated by newlines.
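So a client can treat each child of <warnings> as a module name wrapping one or more newline-separated messages. A minimal parsing sketch in Python (the sample XML uses the corrected <query> tag):

```python
import xml.etree.ElementTree as ET

SAMPLE = """<api>
  <warnings>
    <query>Too many values supplied for parameter 'titles': the limit is 50</query>
  </warnings>
</api>"""


def extract_warnings(xml_text):
    """Return (module, message) pairs from an API response:
    one pair per newline-separated message inside each module's tag."""
    root = ET.fromstring(xml_text)
    warnings = root.find("warnings")
    if warnings is None:
        return []
    return [(module.tag, line.strip())
            for module in warnings
            for line in (module.text or "").strip().split("\n")
            if line.strip()]
```

A response without a <warnings> element simply yields an empty list.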
Roan Kattouw (Catrope)
Hi,
I have been using api.php for a long time (with a bot account), but for the
last 2-3 weeks, instead of the usual login response, my bot has been getting
the following message:
<?xml version="1.0" encoding="utf-8"?><api><error
code="internal_api_error_MWException" info="Exception Caught: Internal error
in ApiLogin::execute: Unhandled case value">
#0 /usr/local/apache/common-local/php-1.5/includes/api/ApiBase.php(686):
wfDebugDieBacktrace('Internal error ...')
#1 /usr/local/apache/common-local/php-1.5/includes/api/ApiLogin.php(143):
ApiBase::dieDebug('ApiLogin::execu...', 'Unhandled case ...')
#2 /usr/local/apache/common-local/php-1.5/includes/api/ApiMain.php(412):
ApiLogin->execute()
#3 /usr/local/apache/common-local/php-1.5/includes/api/ApiMain.php(253):
ApiMain->executeAction()
#4 /usr/local/apache/common-local/php-1.5/includes/api/ApiMain.php(237):
ApiMain->executeActionWithErrorHandling()
#5 /usr/local/apache/common-local/php-1.5/api.php(77): ApiMain->execute()
#6 /usr/local/apache/common-local/live-1.5/api.php(3):
require('/usr/local/apac...')
#7 {main}
</error></api>
(ru.wikipedia.org, login:Secretary, user account groups: autoconfirmed,
bots)
In about 20% of cases the login succeeds. Can anyone help me resolve this issue?
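In the meantime I am working around it by retrying the login a few times. A rough Python sketch (the backoff numbers are arbitrary, and attempt_login stands in for whatever your bot already uses to call action=login):

```python
import time


def login_with_retry(attempt_login, max_tries=5, delay=2.0):
    """Call attempt_login() until it returns a response without an error,
    sleeping with exponential backoff between failures.

    attempt_login is any zero-argument callable returning the decoded
    API response as a dict.
    """
    last = None
    for i in range(max_tries):
        last = attempt_login()
        if "error" not in last:
            return last
        time.sleep(delay * (2 ** i))  # back off: 2s, 4s, 8s, ...
    raise RuntimeError("login kept failing: %s" % last["error"]["code"])
```

That obviously only papers over the server-side exception; a real fix is still needed.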
--
Sergey Vladimirov
---- Andrew Dunbar <hippytrail(a)gmail.com> writes:
> I know a MediaWiki extension can add its own api.php action=... but
> can I add my own rvprop=... for action=query&prop=revisions?
>
> I'd like to experiment with creating some Wiktionary-specific APIs that
> can parse the article format and return grammatical information on
> words.
>
That's kind of non-standard, but it's probably doable. What you want to do is subclass ApiQueryRevisions [1], override whichever method produces the output you want to extend, and set $wgAPIPropModules['revisions'] = 'MyClass';. Hint: use parent::methodName() instead of duplicating core code.
Roan Kattouw (Catrope)
[1] http://svn.wikimedia.org/doc/classApiQueryRevisions.html
---- Simon Lehmann <simon.lehmann(a)gmx.de> writes:
> I am not quite sure if "no id and missing" is indicating a deleted page.
The "missing" attribute definitely indicates a non-existent page. That convention is followed throughout the API.
> If I send a query with a title that never existed it also returns a page
> without an id and is marked as missing, so it can't be distinguished
> from a deleted page (or vice versa).
The distinction between deleted pages and pages that never existed does not exist in MediaWiki (for people who don't have the right to view deleted revisions, that is). Pages either exist right now, or they don't.
> But my point was, as you have already said, that by default it shouldn't
> return missing pages at all, no matter if they never have existed or
> don't exist anymore.
It's probably a good idea to drop missing pages. The reason they show up at all is that Wikipedia uses a search extension called Lucene, which is kind of slow on the uptake. This means recently deleted pages are only periodically removed from the search index. The standard MW search doesn't have this "bug".
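Until that happens, you can drop missing pages client-side. A minimal sketch in Python, assuming format=json output (in the JSON format, the missing flag shows up as a "missing" key on the page object, and missing pages get negative keys in the pages map):

```python
def existing_pages(query_result):
    """Filter a decoded query result (format=json) down to pages
    that actually exist, i.e. pages without the "missing" flag."""
    pages = query_result.get("query", {}).get("pages", {})
    return {page_id: page for page_id, page in pages.items()
            if "missing" not in page}
```

The same check works for any pageset-based module, since the missing-page convention is followed throughout the API.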
Roan Kattouw (Catrope)
Hello list,
I am wondering if the search module should return pages that don't
exist. If I am searching for something, I probably want to find
something that already exists, especially if I use srwhat=text. I don't
even know where the API gets the text to search in, for a page that
doesn't even exist.
Just look at the example:
http://en.wikipedia.org/w/api.php?format=xml&action=query&gsrsearch=Does%20not%20exist&generator=search&gsrnamespace=0
Besides that, it also seems to find pages that don't even belong in the
main namespace, even if they existed, like:
- Http://en.wikipedia.org/wiki/Talk:State-sponsored terrorism by the
United States/Archive 9 (This isn't even a valid title)
- User tаIk:Jj137/Archivе 5
Is this desired behaviour or is it a bug?
Simon Lehmann
---- Brianna Laugher <brianna.laugher(a)gmail.com> writes:
> Hi,
>
> Would it be possible to add to the API, the ability to report totals
> of particular log actions over given time periods?
> Actions: un/protect, un/block, file upload, page creation,
> un/deletion, un/assigning user rights, move, user creation, user
> rename.
> It could be fixed time periods (eg days, weeks, months) if constantly
> calculating them for arbitrary time periods was considered too
> intensive.
>
I'm sorry, but you're just gonna have to use the old-fashioned way of paging through list=logevents's output and counting stuff. Use lelimit=max and a bot/sysop account and you'll get 5000 entries per request.
> I often only want totals and having to page through results with heaps
> of detail I don't care about is a drag.
There are two things you can do about this. To prevent having to throw away lots of entries that don't match your criteria, use filtering parameters like letype. To get only the details you're interested in (saves bandwidth), use leprop. You can't set leprop to empty, unfortunately, but you can do something like leprop=ids to get very little data.
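If you end up scripting that, the paging-and-counting loop is short. A rough Python sketch, assuming format=json and the query-continue continuation block; fetch stands in for whatever your tool already uses to call the API with a logged-in session:

```python
def count_log_events(fetch, **filters):
    """Count log entries by paging through list=logevents.

    fetch is any callable that takes the request parameters as a dict
    and returns the decoded JSON response. Pass filters like
    letype="delete", lestart=..., leend=... to narrow the count
    server-side.
    """
    req = {"action": "query", "list": "logevents", "lelimit": "max",
           "leprop": "ids", "format": "json"}
    req.update(filters)
    total = 0
    while True:
        resp = fetch(dict(req))
        total += len(resp["query"]["logevents"])
        cont = resp.get("query-continue", {}).get("logevents")
        if not cont:
            return total
        req.update(cont)  # carry the continuation offset forward
```

With lelimit=max on a bot account, that's one request per 5000 entries.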
Roan Kattouw (Catrope)