Hello,
I am writing a Java program to extract the abstract of a Wikipedia page
given the title of the page. I have done some research and found out that
the abstract will be in rvsection=0.
So, for example, if I want the abstract of the 'Eiffel Tower' wiki page, I
query the API in the following way:
http://en.wikipedia.org/w/api.php?action=query&prop=revisions&titles=Eiffel…
and parse the XML response, taking the wikitext in the tag <rev
xml:space="preserve">, which represents the abstract of the Wikipedia page.
But this wikitext also contains the infobox data, which I do not need. I
would like to know if there is any way to remove the infobox data and get
only the wikitext related to the page's abstract, or if there is an
alternative method by which I can get the abstract of the page directly.
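For reference, here is a minimal sketch of the approach described above (in
Python for brevity; my real program is in Java, and since the URL above is
cut off, the exact parameter list here is my reconstruction):

    import requests
    import xml.etree.ElementTree as ET

    # Fetch the wikitext of section 0 (the lead section) of the page.
    params = {
        'action': 'query',
        'prop': 'revisions',
        'rvprop': 'content',
        'rvsection': '0',
        'titles': 'Eiffel Tower',
        'format': 'xml',
    }
    resp = requests.get('https://en.wikipedia.org/w/api.php', params=params)

    # Take the wikitext from the <rev xml:space="preserve"> element.
    rev = ET.fromstring(resp.content).find('.//rev')
    wikitext = rev.text if rev is not None else ''
    print(wikitext)  # begins with the {{Infobox ...}} markup I want to drop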
Looking forward to your help.
Thanks in advance,
Aditya Uppu
When list=allusers is used with auactiveusers, a property 'recenteditcount'
is returned in the result. In bug 67301[1] it was pointed out that this
property includes various other logged actions, and so it should really be
named something like "recentactions".
Gerrit change 130093,[2] merged today, adds the "recentactions" result
property. "recenteditcount" is also returned for backwards compatibility,
but it will be removed at some point during the MediaWiki 1.25 development
cycle.
Any clients using this property should be updated to use the new property
name. The new property will be available on WMF wikis with 1.24wmf12; see
https://www.mediawiki.org/wiki/MediaWiki_1.24/Roadmap for the schedule.
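For example, a client could prefer the new property and fall back to the
old one while both are present (a Python sketch, not official client code):

    import requests

    params = {
        'action': 'query',
        'list': 'allusers',
        'auactiveusers': '',  # boolean flag; presence enables it
        'aulimit': '10',
        'format': 'json',
    }
    data = requests.get('https://en.wikipedia.org/w/api.php', params=params).json()

    for user in data['query']['allusers']:
        # Prefer 'recentactions'; fall back to the deprecated
        # 'recenteditcount' on wikis that do not have the change yet.
        actions = user.get('recentactions', user.get('recenteditcount'))
        print(user['name'], actions)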
[1]: https://bugzilla.wikimedia.org/show_bug.cgi?id=67301
[2]: https://gerrit.wikimedia.org/r/#/c/130093/
--
Brad Jorsch (Anomie)
Software Engineer
Wikimedia Foundation
Now that MediaWiki has a pure-PHP tidying implementation, we are
deprecating non-tidy output.[1] Further, the future rewrite of Parsoid in
PHP[2] and its merge to core will have "tidying" as an integral feature.
Thus, the disabletidy parameter to action=parse is being deprecated and
will be removed at some point in the future. Clients should stop using the
parameter and begin using tidied HTML output.
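For example, a client that previously passed disabletidy should simply drop
the parameter and consume the tidied HTML (a Python sketch, not official
client code; the page name is just an example):

    import requests

    # Note: no 'disabletidy' in the parameter list.
    params = {
        'action': 'parse',
        'page': 'Sandbox',
        'prop': 'text',
        'format': 'json',
    }
    data = requests.get('https://en.wikipedia.org/w/api.php', params=params).json()

    # This is the tidied HTML output.
    html = data['parse']['text']['*']
    print(html[:200])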
This change should be deployed to Wikimedia wikis with 1.32.0-wmf.24 or
later; see https://www.mediawiki.org/wiki/MediaWiki_1.32/Roadmap for the
schedule.
[1]: https://phabricator.wikimedia.org/T198214
[2]: https://phabricator.wikimedia.org/tag/parsoid-php/
--
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation
Hi,
I am trying to adapt a program that searches and fetches pages from the
main English Wikipedia to do the same against a private MediaWiki
installation. The program is written using the wikipedia Python library,
https://github.com/goldsmith/Wikipedia.
Unfortunately, I cannot figure out how to get the library to "point" at
MediaWiki installations other than the Wikipedia ones.
Can anyone either
a) show me how to change the target endpoint with this library (a sketch of
what I have tried so far is below), or
b) point me to another Python library where I can do the same,
and
c) provide me with a URL to a publicly accessible non-Wikipedia MediaWiki
site with the API enabled that I can use to test the results of a/b?
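For what it's worth, the closest I have come on (a) is overriding the
module-level API_URL that the library's request code reads. That is an
undocumented internal (normally only set_lang() changes it), so I am not
sure it is a supported approach, and the wiki URL below is made up:

    import wikipedia

    # WARNING: API_URL is an internal of the goldsmith/Wikipedia library;
    # overriding it may break with other versions of the library.
    wikipedia.wikipedia.API_URL = 'https://wiki.example.org/w/api.php'

    # After this, searches and page fetches should hit the private wiki.
    results = wikipedia.search('Main Page')
    page = wikipedia.page(results[0])
    print(page.title)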
Any help will be much appreciated.
Fred
--
Fred Zimmerman
Hi there!
I'm trying to generate 10 random articles in Hebrew using this request:
https://he.wikipedia.org/w/api.php?format=xml&action=query&generator=random…
As you can see, I'd like to get their intros, but for some of the articles
the intro is empty in the API response, even though it is not empty in
Wikipedia itself.
I have tried making the same request in English, and the problem didn't
happen there.
This is the first time I have encountered this weird problem after making
this request in my app for over a year... Am I missing something? Is there
any new parameter that I should add to the request, or is it just a bug?
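In case it helps, here is roughly what the request looks like from my code
(a Python sketch; since the URL above is truncated, the generator/extracts
parameters shown are my reconstruction and may not match exactly, and I am
using JSON here for readability even though my app requests format=xml):

    import requests

    params = {
        'action': 'query',
        'generator': 'random',
        'grnnamespace': '0',   # articles only
        'grnlimit': '10',      # 10 random pages
        'prop': 'extracts',
        'exintro': '',         # intro section only
        'format': 'json',
    }
    data = requests.get('https://he.wikipedia.org/w/api.php', params=params).json()

    for page in data['query']['pages'].values():
        # 'extract' comes back empty or missing for some Hebrew pages.
        print(page['title'], '->', repr(page.get('extract', ''))[:80])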
Thanks!