Hello,
I am writing a Java program to extract the abstract of a Wikipedia page,
given the page's title. I have done some research and found out that the
abstract will be in rvsection=0.
So, for example, if I want the abstract of the 'Eiffel Tower' wiki page, I
query the API in the following way:
http://en.wikipedia.org/w/api.php?action=query&prop=revisions&titles=Eiffel…
and parse the XML response, taking the wikitext in the tag <rev
xml:space="preserve">, which represents the abstract of the page.
But this wikitext also contains the infobox data, which I do not need. I
would like to know if there is any way to remove the infobox data and get
only the wikitext related to the page's abstract, or if there is an
alternative method by which I can get the abstract of the page directly.
Looking forward to your help.
Thanks in Advance
Aditya Uppu
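(Editor's note: one alternative worth knowing about is the TextExtracts extension, i.e. prop=extracts with exintro and explaintext, which returns just the intro as plain text and avoids wikitext parsing entirely. If you stay with rvsection=0 wikitext, a heuristic that strips leading {{...}} template blocks such as infoboxes is sketched below; this is a rough heuristic, not a full wikitext parser.)

```java
// Heuristic sketch (not a full wikitext parser): drop leading {{...}}
// template blocks such as infoboxes from section-0 wikitext, using
// brace matching so nested templates like {{Infobox|a={{b}}}} are handled.
public class WikitextUtil {
    public static String stripLeadingTemplates(String wikitext) {
        String s = wikitext.trim();
        while (s.startsWith("{{")) {
            int depth = 0, i = 0, end = -1;
            while (i < s.length() - 1) {
                if (s.charAt(i) == '{' && s.charAt(i + 1) == '{') {
                    depth++; i += 2;
                } else if (s.charAt(i) == '}' && s.charAt(i + 1) == '}') {
                    depth--; i += 2;
                    if (depth == 0) { end = i; break; }
                } else {
                    i++;
                }
            }
            if (end < 0) break;            // unbalanced braces: give up
            s = s.substring(end).trim();   // loop again for stacked templates
        }
        return s;
    }
}
```

The brace counting matters because infoboxes routinely nest other templates, so a naive search for the first `}}` would cut the block short.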
When list=allusers is used with auactiveusers, a property 'recenteditcount'
is returned in the result. In bug 67301[1] it was pointed out that this
property is including various other logged actions, and so should really be
named something like "recentactions".
Gerrit change 130093,[2] merged today, adds the "recentactions" result
property. "recenteditcount" is also returned for backwards compatibility,
but will be removed at some point during the MediaWiki 1.25 development
cycle.
Any clients using this property should be updated to use the new property
name. The new property will be available on WMF wikis with 1.24wmf12, see
https://www.mediawiki.org/wiki/MediaWiki_1.24/Roadmap for the schedule.
[1]: https://bugzilla.wikimedia.org/show_bug.cgi?id=67301
[2]: https://gerrit.wikimedia.org/r/#/c/130093/
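(Editor's note: for client code, the migration can be as simple as preferring the new key with a fallback. A minimal sketch, where the map stands in for one parsed allusers result entry; the property names are the ones from the announcement above.)

```java
import java.util.Map;

// Forward-compatible reader for one allusers result entry: prefer the new
// "recentactions" property, and fall back to the deprecated
// "recenteditcount" (removed during the 1.25 cycle) on older wikis.
public class AllUsersCompat {
    public static int recentActions(Map<String, Object> userEntry) {
        Object v = userEntry.get("recentactions");
        if (v == null) {
            v = userEntry.get("recenteditcount"); // pre-1.25 fallback
        }
        return (v instanceof Number) ? ((Number) v).intValue() : 0;
    }
}
```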
--
Brad Jorsch (Anomie)
Software Engineer
Wikimedia Foundation
_______________________________________________
Mediawiki-api-announce mailing list
Mediawiki-api-announce(a)lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-api-announce
We have decided to officially retire the rest.wikimedia.org domain in
favor of /api/rest_v1/ at each individual project domain. For example,
https://rest.wikimedia.org/en.wikipedia.org/v1/?doc
becomes
https://en.wikipedia.org/api/rest_v1/?doc
Most clients already use the new path, and benefit from better
performance thanks to geo-distributed caching, no additional DNS lookups,
and shared TLS / HTTP2 connections.
We intend to shut down the rest.wikimedia.org entry point around
March, so please adjust your clients to use /api/rest_v1/ soon.
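(Editor's note: the rewrite a client needs can be sketched as below, assuming old URLs always follow the https://rest.wikimedia.org/{domain}/v1/{path} shape shown in the example above.)

```java
// Rewrite a legacy rest.wikimedia.org URL to the per-project /api/rest_v1/
// form, e.g. https://rest.wikimedia.org/en.wikipedia.org/v1/?doc
//        ->   https://en.wikipedia.org/api/rest_v1/?doc
public class RestUrlMigration {
    private static final String OLD_PREFIX = "https://rest.wikimedia.org/";

    public static String migrate(String url) {
        if (!url.startsWith(OLD_PREFIX)) {
            return url;                       // already on the new path
        }
        String tail = url.substring(OLD_PREFIX.length());
        int slash = tail.indexOf('/');
        String domain = tail.substring(0, slash);   // e.g. en.wikipedia.org
        String rest = tail.substring(slash + 1);    // e.g. v1/?doc
        if (rest.startsWith("v1/")) {
            rest = rest.substring("v1/".length());
        }
        return "https://" + domain + "/api/rest_v1/" + rest;
    }
}
```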
Thank you for your cooperation,
Gabriel
--
Gabriel Wicke
Principal Engineer, Wikimedia Foundation
Hi,
we are considering a policy for REST API end point result format
versioning and negotiation. The background and considerations are
spelled out in a task and mw.org page:
https://phabricator.wikimedia.org/T124365
https://www.mediawiki.org/wiki/Talk:API_versioning
Based on the discussion so far, we have come up with the following
candidate solution:
1) Clearly advise clients to explicitly request the expected mime type
with an Accept header. Support older mime types (with on-the-fly
transformations) until usage has fallen below a very low percentage,
with an explicit sunset announcement.
2) Always return the latest content type if no explicit Accept header
was specified.
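(Editor's note: a server-side sketch of the two rules, with placeholder version strings standing in for the real profile mime types.)

```java
import java.util.List;

// Sketch of the proposed negotiation: rule 1 honors an explicitly requested
// format while it is still supported; rule 2 serves the latest format when
// no Accept header was sent. The version strings are placeholders, not
// actual Parsoid profile mime types.
public class FormatNegotiation {
    public static String negotiate(String acceptHeader,
                                   List<String> supportedFormats,
                                   String latestFormat) {
        if (acceptHeader == null || acceptHeader.isEmpty()) {
            return latestFormat;              // rule 2: default to latest
        }
        for (String format : supportedFormats) {
            if (acceptHeader.contains(format)) {
                return format;                // rule 1: explicit request
            }
        }
        return latestFormat;                  // unknown/sunset format: latest
    }
}
```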
We are interested in hearing your thoughts on this.
Once we have reached rough consensus on the way forward, we intend to
apply the newly minted policy to an evolution of the Parsoid HTML
format, which will move the data-mw attribute to a separate metadata
blob.
Gabriel Wicke
I've created a wiki page providing an overview of the objectives, current
status, and obstacles for the experimental Wiktionary popups, which you can
find here:
https://www.mediawiki.org/wiki/Wikimedia_Apps/Wiktionary_definition_popups_…
-m.
On Sun, Feb 14, 2016 at 2:08 PM, Dmitry Brant <dbrant(a)wikimedia.org> wrote:
> Hi Nemo,
>
> As Gabriel notes, the Wiktionary endpoint is still very much experimental,
> and is subject to change. One of the ongoing goals for the Android app is
> to integrate more rich content into the browsing experience. One such
> feature is to allow the user to highlight words in an article and see a
> quick popup definition of the word from Wiktionary[1]. To facilitate this
> action, we set up a RESTBase endpoint for fetching the desired term from
> Wiktionary[2].
>
> This feature is currently only available in the Wikipedia Beta app, and is
> restricted only to English wiktionary. Further work on this endpoint will
> depend on the level of user engagement with the feature, once it's rolled
> out to the main Wikipedia app. So, once again, even though we're building
> the endpoint with the hope that it would be used by other consumers besides
> the Android app (and expanded to all languages), at the moment it's by no
> means ready for general consumption.
>
> We do have a wiki page[3] with some more details on the service endpoints
> that are used by the apps, which you, as well as the Wiktionary community,
> are welcome to comment on.
>
> -Dmitry
>
> [1] https://phabricator.wikimedia.org/T115484
> [2] https://phabricator.wikimedia.org/T119235
> [3]
> https://www.mediawiki.org/wiki/Wikimedia_Apps/Team/RESTBase_services_for_ap…
>
>
> On Sun, Feb 14, 2016 at 12:18 PM, Gabriel Wicke <gwicke(a)wikimedia.org>
> wrote:
>
>> Federico,
>>
>> as indicated by the classification as "experimental" [1], the
>> definition end point [2] is at a very early point of its development.
>> The mobile app team has added preliminary support for extracting
>> definitions in the content service [3] using Parsoid's template
>> metadata, and is using this end point to power a "define this word"
>> feature in the next version of the Android app. You can preview the
>> feature in the beta Android app when browsing English Wikipedia by
>> selecting a word, and then hitting the 'definition' icon next to
>> 'copy'.
>>
>> In this first iteration, only English Wiktionary is supported.
>> Generalizing the service and API end point to provide definitions
>> using more or all Wiktionaries will require more work and planning. In
>> the next iteration, I would expect a focus on enabling collaborative
>> definition and maintenance of extraction rules, as well as broader
>> involvement of Wiktionary communities in the planning process. The
>> timing for the next iteration depends partly on the mobile app team's
>> priorities, so I will defer to the team to comment on this.
>>
>> To summarize: We are aiming to gradually develop this into a generally
>> useful, stable and well-documented API entry point for word
>> definitions. The experimental end point published right now is just
>> the beginning, and you are very much invited to help shape the way
>> forward.
>>
>> Gabriel
>>
>> [1]: https://www.mediawiki.org/wiki/API_versioning#Experimental
>> [2]:
>> https://en.wiktionary.org/api/rest_v1/?doc#!/Page_content/get_page_definiti…
>> [3]:
>> https://github.com/wikimedia/mediawiki-services-mobileapps/blob/master/lib/…
>>
>>
>> --
>> Gabriel Wicke
>> Principal Engineer, Wikimedia Foundation
>>
>> _______________________________________________
>> Mobile-l mailing list
>> Mobile-l(a)lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/mobile-l
>>
>
>
>
> --
> Dmitry Brant
> Software Engineer / Product Owner (Android)
> Wikimedia Foundation
> https://www.mediawiki.org/wiki/Wikimedia_mobile_engineering
>
>
Federico,
as indicated by the classification as "experimental" [1], the
definition end point [2] is at a very early point of its development.
The mobile app team has added preliminary support for extracting
definitions in the content service [3] using Parsoid's template
metadata, and is using this end point to power a "define this word"
feature in the next version of the Android app. You can preview the
feature in the beta Android app when browsing English Wikipedia by
selecting a word, and then hitting the 'definition' icon next to
'copy'.
In this first iteration, only English Wiktionary is supported.
Generalizing the service and API end point to provide definitions
using more or all Wiktionaries will require more work and planning. In
the next iteration, I would expect a focus on enabling collaborative
definition and maintenance of extraction rules, as well as broader
involvement of Wiktionary communities in the planning process. The
timing for the next iteration depends partly on the mobile app team's
priorities, so I will defer to the team to comment on this.
To summarize: We are aiming to gradually develop this into a generally
useful, stable and well-documented API entry point for word
definitions. The experimental end point published right now is just
the beginning, and you are very much invited to help shape the way
forward.
Gabriel
[1]: https://www.mediawiki.org/wiki/API_versioning#Experimental
[2]: https://en.wiktionary.org/api/rest_v1/?doc#!/Page_content/get_page_definiti…
[3]: https://github.com/wikimedia/mediawiki-services-mobileapps/blob/master/lib/…
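(Editor's note: for anyone experimenting, a minimal client sketch; the /page/definition/{term} path is the one documented at [2], and percent-encoding the term is the caller's job.)

```java
import java.io.UnsupportedEncodingException;
import java.net.URLEncoder;

// Build the (experimental, English-only for now) definition end point URL.
// URLEncoder form-encodes, so spaces become '+', which we rewrite to %20
// because the term is a path segment rather than a query parameter.
public class WiktionaryClient {
    public static String definitionUrl(String term) {
        try {
            String encoded = URLEncoder.encode(term, "UTF-8").replace("+", "%20");
            return "https://en.wiktionary.org/api/rest_v1/page/definition/" + encoded;
        } catch (UnsupportedEncodingException e) {
            throw new IllegalStateException("UTF-8 unsupported", e); // cannot happen
        }
    }
}
```

Since the end point is experimental, treat the response shape as subject to change and guard any parsing accordingly.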
--
Gabriel Wicke
Principal Engineer, Wikimedia Foundation
Hi Luigi,
On Fri, Jan 29, 2016 at 12:31 PM, Luigi Assom <itsawesome.yes(a)gmail.com> wrote:
> - how to extract _ID from ETag in headers:
> GET /page/title/{title}
the page id is indeed not directly exposed in the HTML response.
However, the revision number is exposed as part of the ETag. This can
then be used to request revision metadata including the page id at
https://en.wikipedia.org/api/rest_v1/?doc#!/Page_content/get_page_revision_….
This is admittedly not very convenient, so I created
https://phabricator.wikimedia.org/T125453 for generally improved page
id support in the REST API.
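(Editor's note: a sketch of pulling the revision out of the ETag, assuming the "<revision>/<render-id>" shape the REST API currently uses; treat that shape as an implementation detail that may change.)

```java
// Extract the revision id from a REST API ETag of the form
// "801914674/e2ab5c6c-..." (revision "/" render id); handles an optional
// W/ weak-validator prefix and the surrounding quotes.
public class EtagUtil {
    public static String revisionFromEtag(String etag) {
        String s = etag.trim();
        if (s.startsWith("W/")) {
            s = s.substring(2);     // drop weak-validator prefix
        }
        s = s.replace("\"", "");    // drop surrounding quotes
        int slash = s.indexOf('/');
        return (slash >= 0) ? s.substring(0, slash) : s;
    }
}
```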
> - how to ensure
> GET /page/title/{title with different char encoding or old titles are always
> resolved to last canonical version}
The storage backing this end point is automatically kept up to date
with edits and dependency changes. Edits in particular should be
reflected within a few seconds.
>> If you refer to
>>
>> https://en.wikipedia.org/api/rest_v1/?doc#!/Page_content/get_page_graph_png…,
>> this is an end point exposing rendered graph images for
>> https://www.mediawiki.org/wiki/Extension:Graph (as linked in the end
>> point documentation).
>
>
> Oh very interesting!
> So basically html markup can be extended ?
> Would it be possible to share json objects as html5 markup and embed them in
> wiki pages?
The graph extension is using the regular MediaWiki tag extension
mechanism: https://www.mediawiki.org/wiki/Manual:Tag_extensions
Graphs are indeed defined using JSON within this tag.
> I want to avoid to update my graph just because titles changes: entities are
> always the same.
Makes sense. The current API is optimized for the common case of
access by title, but we will consider adding access by page ID as
well.
> I still don't know what parsoid is.
Parsoid is the service providing semantic HTML and a bi-directional
conversion between that & wikitext:
https://www.mediawiki.org/wiki/Parsoid
Gabriel