The ApiQueryDeletedrevs module has several issues: since it's a list module
rather than a prop module, it isn't handled correctly by simplified
continuation; it doesn't support input of specific revids; and so on.
Gerrit change 168646[1] deprecates list=deletedrevs in favor of two new
modules:
* list=alldeletedrevisions will query the deleted revisions in a namespace
and/or for a user.
* prop=deletedrevisions will work like prop=revisions, querying deleted
revisions for a page, or specific revisions requested using action=query's
revids parameter.
For the latter to work properly, the revids parameter will now recognize
deleted revision IDs as valid if the querying user has the 'deletedhistory'
user right; before this change deleted revision IDs were treated as
non-existent.
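As a rough sketch of what queries against the two new modules might look
like (the module and top-level parameter names are taken from the
announcement above, but the 'adr' parameter prefix is an assumption based on
the API's usual naming conventions and should be checked against the
deployed docs):

```python
from urllib.parse import urlencode

API = "https://en.wikipedia.org/w/api.php"  # any wiki running 1.25wmf7 or later

def build_query(params):
    """Assemble an action=query URL; the request itself is not performed here."""
    base = {"action": "query", "format": "json"}
    base.update(params)
    return API + "?" + urlencode(base)

# Deleted revisions for a user and/or namespace (replaces list=deletedrevs).
url_all = build_query({
    "list": "alldeletedrevisions",
    "adruser": "Example",       # 'adr' prefix is an assumption
    "adrnamespace": "0",
})

# Deleted revisions for specific pages, analogous to prop=revisions.
url_prop = build_query({"prop": "deletedrevisions", "titles": "Example_page"})
```

Both queries require the 'deletedhistory' right to return anything useful.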
These changes should be deployed to WMF wikis with 1.25wmf7, see
https://www.mediawiki.org/wiki/MediaWiki_1.25/Roadmap for the schedule.
At some point in the future, limited information from the archive table may
be made available to all users, as is already done on Tool Labs.[2] If that
happens, recognition of deleted revision IDs as valid for the revids
parameter will likely be extended to all users.
[1]: https://gerrit.wikimedia.org/r/#/c/168646/
[2]: https://bugzilla.wikimedia.org/show_bug.cgi?id=49088
--
Brad Jorsch (Anomie)
Software Engineer
Wikimedia Foundation
_______________________________________________
Mediawiki-api-announce mailing list
Mediawiki-api-announce(a)lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-api-announce
On 25/10/2014 10:40 UTC, Harald Groven wrote:
> How do I get a parameter of pages using a template using the API?
> E.g. extract 12345 from pages tagged with {{mytemplate|12345}}, when
> mytemplate does not contain categories.
>
> --
> Harald Groven
AFAIK, there is no way to get such information with the core API modules
without directly adding categories via templates.
But if the target wiki uses Parsoid
<https://www.mediawiki.org/wiki/Parsoid>, you may use its API to obtain
RDFa data (combined with the HTML5 content) that also describes
transclusions in a machine-readable format. The specification
<https://www.mediawiki.org/wiki/Parsoid/MediaWiki_DOM_spec#Transclusion_cont…>
is available.
Another approach could be the great mwparserfromhell
<https://github.com/earwig/mwparserfromhell> Python library, widely used
by bots for advanced wikitext parsing.
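For simple, well-behaved pages, the first positional parameter can also be
pulled straight out of the raw wikitext (fetched e.g. via prop=revisions)
with a regex. This is a naive sketch only; it ignores nested templates and
named parameters, which mwparserfromhell handles properly:

```python
import re

def first_param(wikitext, template="mytemplate"):
    """Return the first positional parameter of {{template|...}}, or None.

    Naive sketch: does not handle nested templates, named parameters,
    or parameters containing '|' or '}'.
    """
    m = re.search(r"\{\{\s*" + re.escape(template) + r"\s*\|([^|}]*)", wikitext)
    return m.group(1).strip() if m else None

first_param("Some text {{mytemplate|12345}} more text")  # → "12345"
```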
Please let us know your choice.
The API supports two methods for continuing action=query when more results
are available, the simple method[1] and the raw method.[2] The raw method
is currently the default for historical reasons, but as the simple method
is much easier for new users to use *correctly*, it really should be the
default.
To make this transition easy for clients, the current plan is to make the
change on the following timetable:
* Starting with 1.24wmf22,[3][4] action=query will recognize a
"rawcontinue" boolean input parameter. Clients that wish to continue using
the raw method for continuation should begin supplying this parameter with
all action=query queries.
* Sometime during the MediaWiki 1.25 development cycle, the API will begin
reporting warnings when neither "continue" nor "rawcontinue" are supplied
with action=query.[5]
* Sometime during the MediaWiki 1.26 development cycle, simplified
continuation will become the default.[6]
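With simplified continuation, a client loop just merges the returned
"continue" object into the next request's parameters until it disappears.
A minimal sketch of that loop (the fetch callable stands in for a real HTTP
request returning decoded JSON):

```python
def query_all(fetch, params):
    """Iterate an action=query result set using simplified continuation.

    `fetch` is any callable taking a dict of request parameters and
    returning the decoded JSON response; it stands in for an HTTP call.
    """
    params = dict(params, action="query")
    while True:
        result = fetch(params)
        yield result.get("query", {})
        if "continue" not in result:
            break
        # Merge the opaque continuation values into the next request.
        params = dict(params, **result["continue"])
```

The values inside "continue" are opaque to the client; the whole object is
copied into the follow-up request unchanged, which is what makes this method
hard to get wrong compared with raw continuation.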
Note this is also documented at <
https://www.mediawiki.org/wiki/Requests_for_comment/API_roadmap#Simplified_…>.
See other sections on that page for additional planned API changes.
[1]: https://www.mediawiki.org/wiki/API:Query#Continuing_queries
[2]: https://www.mediawiki.org/wiki/API:Raw_Query_Continue
[3]: https://gerrit.wikimedia.org/r/#/c/154092/
[4]: See https://www.mediawiki.org/wiki/MediaWiki_1.24/Roadmap for the
schedule of deployments to WMF wikis.
[5]: https://gerrit.wikimedia.org/r/#/c/160222/
[6]: https://gerrit.wikimedia.org/r/#/c/160223/
--
Brad Jorsch (Anomie)
Software Engineer
Wikimedia Foundation
I need to retrieve Wikipedia articles (the URLs of which are not known until run time) and display them in an iFrame using jQuery AJAX.
Could someone please give me some sample code on how to do this?
Thanks!
ja
http://vispo.com
Legoktm has just merged Gerrit change 160798,[1] Gerrit change 161093,[2]
Gerrit change 160819,[3] and a few associated patches. Collectively, these
make the following changes to the human-readable aspects of the API output,
which will be deployed to WMF wikis with 1.25wmf4 (see
https://www.mediawiki.org/wiki/MediaWiki_1.25/Roadmap for the schedule).
* action=help now outputs a nice HTML page instead of a plain-text document
wrapped in XML.
* Hitting api.php with no parameters will display help for the main module
only, with links to each submodule. One of the examples provided shows how
to get everything on one page as with the old help.
* The default for the 'format' parameter is now jsonfm rather than xmlfm.
* Syntax highlighting for format=jsonfm and format=xmlfm now uses
Extension:SyntaxHighlight_GeSHi rather than ad-hoc code that only worked
right for xmlfm. Other formats are not highlighted.
The patches also change the aspects of action=paraminfo that return
help-related information:
* Module description is no longer returned by default; a new parameter
selects html, wikitext, or raw Message data.
* The 'examples' result property is now an array rather than a string.
* The 'allexamples' result property has been removed.
* Parameter descriptions are no longer returned by default; a new parameter
selects html, wikitext, or raw Message data. Machine-readable parameter
info remains available.
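A client that still needs descriptions would request them explicitly; a
sketch of such a request (the selector parameter name, assumed here to be
'helpformat', is not stated in the announcement and should be checked
against the deployed action=paraminfo documentation):

```python
from urllib.parse import urlencode

# Hypothetical paraminfo request asking for HTML-formatted descriptions;
# 'helpformat' is an assumption, not confirmed by the announcement above.
params = {
    "action": "paraminfo",
    "modules": "query+revisions",
    "helpformat": "html",
    "format": "json",
}
url = "https://www.mediawiki.org/w/api.php?" + urlencode(params)
```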
The impact of these changes on existing bots and scripts is expected to be
minimal, as non-human consumption is likely to be limited to ApiSandbox.
[1]: https://gerrit.wikimedia.org/r/#/c/160798
[2]: https://gerrit.wikimedia.org/r/#/c/161093
[3]: https://gerrit.wikimedia.org/r/#/c/160819
--
Brad Jorsch (Anomie)
Software Engineer
Wikimedia Foundation
Hi,
A user has reported problems with my tool WPCleaner in the last few hours,
but my tool hasn't changed at all in a few weeks.
It seems there's a problem with an API request for list=backlinks.
For example, the request below now returns an internal API error, but it
was working correctly before.
Request:
https://fr.wikipedia.org/w/api.php?bltitle=BNF&action=query&blredirect=&lis…
Result:
<?xml version="1.0"?><api servedby="mw1053"><error code="
internal_api_error_MWException" info="Exception Caught: Internal error in
ApiFormatXml::recXmlPrint: (redirlinks, ...) has integer keys without
_element value. Use ApiResult::setIndexedTagName()." xml:space="preserve" />
</api>
Nico
Hi!
When I query the Estonian Wikipedia's web API for an article's first
sentence, I sometimes get an empty response. Actually, it gives back a
horizontal rule and that's it.
For example:
https://et.wikipedia.org/w/api.php?action=query&prop=extracts|categories&ex…
gives only a horizontal rule as the extract:
"extract": "<hr />",
Can anyone say what is happening here? Is the article's source organized
in the wrong way, or is it a problem on the API's sentence parser side?
Best regards
Kristian Kankainen