I've created a wiki page providing an overview of the objectives, current
status, and obstacles for the experimental Wiktionary popups, which you can
find here:
https://www.mediawiki.org/wiki/Wikimedia_Apps/Wiktionary_definition_popups_…
-m.
On Sun, Feb 14, 2016 at 2:08 PM, Dmitry Brant <dbrant(a)wikimedia.org> wrote:
> Hi Nemo,
>
> As Gabriel notes, the Wiktionary endpoint is still very much experimental,
> and is subject to change. One of the ongoing goals for the Android app is
> to integrate more rich content into the browsing experience. One such
> feature is to allow the user to highlight words in an article and see a
> quick popup definition of the word from Wiktionary[1]. To facilitate this
> action, we set up a RESTBase endpoint for fetching the desired term from
> Wiktionary[2].
>
> This feature is currently available only in the Wikipedia Beta app, and is
> restricted to the English Wiktionary. Further work on this endpoint will
> depend on the level of user engagement with the feature, once it's rolled
> out to the main Wikipedia app. So, once again, even though we're building
> the endpoint in the hope that it will be used by other consumers besides
> the Android app (and expanded to all languages), at the moment it's by no
> means ready for general consumption.
>
> We do have a wiki page[3] with some more details on the service endpoints
> that are used by the apps, which you, as well as the Wiktionary community,
> are welcome to comment on.
>
> -Dmitry
>
> [1] https://phabricator.wikimedia.org/T115484
> [2] https://phabricator.wikimedia.org/T119235
> [3]
> https://www.mediawiki.org/wiki/Wikimedia_Apps/Team/RESTBase_services_for_ap…
>
>
> On Sun, Feb 14, 2016 at 12:18 PM, Gabriel Wicke <gwicke(a)wikimedia.org>
> wrote:
>
>
>
> --
> Dmitry Brant
> Software Engineer / Product Owner (Android)
> Wikimedia Foundation
> https://www.mediawiki.org/wiki/Wikimedia_mobile_engineering
>
>
> _______________________________________________
> Mobile-l mailing list
> Mobile-l(a)lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/mobile-l
>
>
Federico,
As indicated by its classification as "experimental" [1], the
definition endpoint [2] is at a very early stage of its development.
The mobile app team has added preliminary support for extracting
definitions in the content service [3] using Parsoid's template
metadata, and is using this end point to power a "define this word"
feature in the next version of the Android app. You can preview the
feature in the beta Android app when browsing English Wikipedia by
selecting a word, and then hitting the 'definition' icon next to
'copy'.
In this first iteration, only English Wiktionary is supported.
Generalizing the service and API endpoint to provide definitions
from more or all Wiktionaries will require more work and planning. In
the next iteration, I would expect a focus on enabling collaborative
definition and maintenance of extraction rules, as well as broader
involvement of Wiktionary communities in the planning process. The
timing for the next iteration depends partly on the mobile app team's
priorities, so I will defer to the team to comment on this.
To summarize: We are aiming to gradually develop this into a generally
useful, stable and well-documented API entry point for word
definitions. The experimental end point published right now is just
the beginning, and you are very much invited to help shape the way
forward.
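For the curious, fetching a definition can be sketched as follows (the
/page/definition/{term} path is an assumption based on the endpoint name
in [2]; the endpoint is experimental and subject to change):

```python
from urllib.parse import quote

def definition_url(term, domain="en.wiktionary.org"):
    # Build the URL of the experimental word-definition endpoint.
    # The path is assumed from the REST API docs linked in [2].
    return "https://{}/api/rest_v1/page/definition/{}".format(
        domain, quote(term, safe=""))
```

As noted above, only the English Wiktionary is served for now.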
Gabriel
[1]: https://www.mediawiki.org/wiki/API_versioning#Experimental
[2]: https://en.wiktionary.org/api/rest_v1/?doc#!/Page_content/get_page_definiti…
[3]: https://github.com/wikimedia/mediawiki-services-mobileapps/blob/master/lib/…
--
Gabriel Wicke
Principal Engineer, Wikimedia Foundation
Hi Luigi,
On Fri, Jan 29, 2016 at 12:31 PM, Luigi Assom <itsawesome.yes(a)gmail.com> wrote:
> - how to extract _ID from ETag in headers:
> GET /page/title/{title}
the page id is indeed not directly exposed in the HTML response.
However, the revision number is exposed as part of the ETag. This can
then be used to request revision metadata including the page id at
https://en.wikipedia.org/api/rest_v1/?doc#!/Page_content/get_page_revision_….
This is admittedly not very convenient, so I created
https://phabricator.wikimedia.org/T125453 for generally improved page
id support in the REST API.
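Until then, the revision can be recovered from the ETag along these lines
(a sketch; it assumes the RESTBase ETag carries the revision before a
slash, optionally wrapped in a weak-validator marker):

```python
def parse_restbase_etag(etag):
    # Split an ETag such as W/"123456/abc-def" into (revision, tid).
    value = etag.strip()
    if value.startswith("W/"):
        value = value[2:]
    value = value.strip('"')
    revision, _, tid = value.partition("/")
    return revision, tid
```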
> - how to ensure
> GET /page/title/{title with different char encoding or old titles are always
> resolved to last canonical version}
The storage backing this end point is automatically kept up to date
with edits and dependency changes. Edits in particular should be
reflected within a few seconds.
>> If you refer to
>>
>> https://en.wikipedia.org/api/rest_v1/?doc#!/Page_content/get_page_graph_png…,
>> this is an end point exposing rendered graph images for
>> https://www.mediawiki.org/wiki/Extension:Graph (as linked in the end
>> point documentation).
>
>
> Oh very interesting!
> So basically html markup can be extended ?
> Would it be possible to share json objects as html5 markup and embed them in
> wiki pages?
The graph extension is using the regular MediaWiki tag extension
mechanism: https://www.mediawiki.org/wiki/Manual:Tag_extensions
Graphs are indeed defined using JSON within this tag.
> I want to avoid to update my graph just because titles changes: entities are
> always the same.
Makes sense. The current API is optimized for the common case of
access by title, but we will consider adding access by page ID as
well.
> I still don't know what parsoid is.
Parsoid is the service providing semantic HTML and a bi-directional
conversion between that & wikitext:
https://www.mediawiki.org/wiki/Parsoid
Gabriel
On Fri, Jan 29, 2016 at 1:41 PM, Bináris <wikiposta(a)gmail.com> wrote:
> 2016-01-29 18:56 GMT+01:00 Brad Jorsch (Anomie) <bjorsch(a)wikimedia.org>:
>
> > by going to
> > https://www.mediawiki.org/wiki/Special:ApiFeatureUsage, entering your
> > agent
> > (or any useful prefix of it), and looking for "https-expected".
> >
>
> What does *unclear-"now"-timestamp* mean here?
>
For various API timestamp-typed parameters, you can pass unusual values
such as the empty string or "0", and they will be interpreted as meaning
"now". This doesn't make much sense, except that it has always worked that
way. If you really mean "now", you should pass that literal string as the
value instead.
action=edit even has to hack around this to avoid spurious edit conflicts
if you do it for the 'basetimestamp' parameter. Ideally we'd make empty
string and '0' be rejected as invalid timestamps, but first people have to
stop passing them in.
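A small illustration of the advice above (a sketch; it simply builds an
explicit value rather than relying on the legacy behaviour):

```python
from datetime import datetime, timezone

def explicit_timestamp():
    # Build an explicit ISO 8601 timestamp for timestamp-typed API
    # parameters, instead of relying on "" or "0" being read as "now".
    return datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")

# If "now" really is what you mean, pass the literal string instead:
basetimestamp = "now"
```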
--
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation
(+mediawiki-api-announce, 2nd try)
---------- Forwarded message ----------
From: Gabriel Wicke <gwicke(a)wikimedia.org>
Date: Mon, Jan 25, 2016 at 11:00 AM
Subject: Deprecating rest.wikimedia.org in favor of /api/rest_v1/
We have decided to officially retire the rest.wikimedia.org domain in
favor of /api/rest_v1/ at each individual project domain. For example,
https://rest.wikimedia.org/en.wikipedia.org/v1/?doc
becomes
https://en.wikipedia.org/api/rest_v1/?doc
Most clients already use the new path, and benefit from better
performance thanks to geo-distributed caching, no additional DNS lookups,
and shared TLS / HTTP2 connections.
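Migrating a client is mostly a matter of rewriting the base URL; a sketch:

```python
import re

def migrate_rest_url(url):
    # Rewrite https://rest.wikimedia.org/<domain>/v1/<rest> to
    # https://<domain>/api/rest_v1/<rest>; other URLs pass through.
    m = re.match(r"https://rest\.wikimedia\.org/([^/]+)/v1/(.*)", url)
    if not m:
        return url
    return "https://{}/api/rest_v1/{}".format(m.group(1), m.group(2))
```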
We intend to shut down the rest.wikimedia.org entry point around
March, so please adjust your clients to use /api/rest_v1/ soon.
Thank you for your cooperation,
Gabriel
--
Gabriel Wicke
Principal Engineer, Wikimedia Foundation
_______________________________________________
Mediawiki-api-announce mailing list
Mediawiki-api-announce(a)lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-api-announce
This is now entering its final comment period, so please weigh in at
https://phabricator.wikimedia.org/T124365.
Based on your input, the Parsing, Editing & Services teams will make a
decision on this next Wednesday, Feb 2nd.
Thanks,
Gabriel
On Thu, Jan 21, 2016 at 4:29 PM, Gabriel Wicke <gwicke(a)wikimedia.org> wrote:
> Hi,
>
> we are considering a policy for REST API end point result format
> versioning and negotiation. The background and considerations are
> spelled out in a task and mw.org page:
>
> https://phabricator.wikimedia.org/T124365
> https://www.mediawiki.org/wiki/Talk:API_versioning
>
> Based on the discussion so far, we have come up with the following
> candidate solution:
>
> 1) Clearly advise clients to explicitly request the expected mime type
> with an Accept header. Support older mime types (with on-the-fly
> transformations) until usage has fallen below a very low percentage,
> with an explicit sunset announcement.
>
> 2) Always return the latest content type if no explicit Accept header
> was specified.
>
> We are interested in hearing your thoughts on this.
>
> Once we have reached rough consensus on the way forward, we intend to
> apply the newly minted policy to an evolution of the Parsoid HTML
> format, which will move the data-mw attribute to a separate metadata
> blob.
>
> Gabriel Wicke
--
Gabriel Wicke
Principal Engineer, Wikimedia Foundation
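From the client side, point 1 of the proposal amounts to always sending an
explicit Accept header; a sketch (the profile URI and version number here
are illustrative, not the final spec values):

```python
def accept_header(version):
    # Request a specific content-type version explicitly, so servers
    # can keep serving older formats to clients that still expect them.
    profile = "https://www.mediawiki.org/wiki/Specs/HTML/{}".format(version)
    return {"Accept": 'text/html; charset=utf-8; profile="{}"'.format(profile)}
```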
It currently still works to POST to the API via http instead of https. But
we'd really like to stop allowing that, see
https://phabricator.wikimedia.org/T105794. Thus, the API will now return a
warning if https was expected but not used.
If you run a bot, please check your configuration to make sure that you're
using https rather than http. If you're using a distinctive user agent for
your bot (which you all are, right?[1]), you can now check whether your bot
is using http by going to
https://www.mediawiki.org/wiki/Special:ApiFeatureUsage, entering your agent
(or any useful prefix of it), and looking for "https-expected".
If for some reason your bot cannot currently support https, you really
should upgrade it so that it can.
[1]: https://meta.wikimedia.org/wiki/User-Agent_policy
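A defensive helper for bot configurations (a sketch using only the
standard library):

```python
from urllib.parse import urlsplit, urlunsplit

def force_https(url):
    # Upgrade an http:// API URL to https://; leave anything else alone.
    parts = urlsplit(url)
    if parts.scheme == "http":
        parts = parts._replace(scheme="https")
    return urlunsplit(parts)
```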
--
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation
On Mon, Jan 25, 2016 at 11:38 AM, Oliver Keyes <okeyes(a)wikimedia.org> wrote:
> Will it apply to the pageviews API as well?
It will, but the canonical URL for this has always been
https://wikimedia.org/api/rest_v1/?doc, which will continue to work.
Are you aware of any pageview users hitting rest.wikimedia.org?
In any case, we'll check the logs for remaining rest.wikimedia.org
accesses & make an effort to remind remaining users before
decommissioning it.
Gabriel
Hello,
I would like to better understand the difference between using list=search
and generator=search for full-text search.
I've read that list=search relies on Elasticsearch: what are the differences
in indexing and in returned results between list=search and
generator=search?
I also need to query the page IDs of the returned articles.
I can do this with generator=search: the page IDs correspond to the
returned pages (example in sandbox
<https://en.wikipedia.org/wiki/Special:ApiSandbox#action=query&prop=extracts…>
).
But I cannot do it with list=search:
I tried list=search + generator=allpages + the indexpageids parameter.
The page IDs in query['pageids'] *are not related* to the articles in the
query['search'] list - it looks like the generator is querying new pages by
itself, instead of taking the list as input.
Could you please help me write a query using list=search that also fetches
the page IDs of the returned pages?
My sandbox attempt is:
https://en.wikipedia.org/wiki/Special:ApiSandbox#action=query&list=search&f…
Thank you!
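For reference, a generator=search request returns its results keyed by
page ID in query['pages'], so the IDs stay attached to the matched
articles; a sketch of the request parameters (action API parameter names,
no network call shown):

```python
def search_query_params(term):
    # Parameters for action=query with generator=search; the response's
    # query['pages'] object is keyed by page ID.
    return {
        "action": "query",
        "generator": "search",
        "gsrsearch": term,
        "format": "json",
    }
```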
With the merge of Gerrit change 264309,[1] there are two changes to the
handling of login and createaccount tokens. The changes should be deployed
to WMF wikis with 1.27.0-wmf.12, see
https://www.mediawiki.org/wiki/MediaWiki_1.27/Roadmap for the schedule.
Neither of these changes should break existing API clients, unless the
client is treating API warnings as errors or is doing something unusual
with these tokens.
The first change is that login and createaccount tokens now use the same
token generation mechanism as other CSRF tokens, though without the special
case that makes other CSRF tokens always "+\" when not logged in. This
means that login and createaccount tokens will be longer, will end in
"+\", and will include an embedded timestamp, so that a potential future
change could have them expire after a defined time period rather than
lasting for the duration of the session.
The second change is that login and createaccount tokens can now be
fetched via action=query&meta=tokens, in the same manner as other
CSRF tokens. Fetching them by submitting an action=login or
action=createaccount request without a token (to receive a NeedToken
response) is now deprecated, and a warning will be returned along with the
NeedToken response indicating this deprecation. There is no plan to
actually remove the NeedToken response from action=login at this time, and
any future plan for its removal will be announced separately with
appropriate lead time. The NeedToken response will remain in
action=createaccount until the previously-announced breaking change to that
module,[2] and will be removed from action=createaccount along with that
breaking change.
[1]: https://gerrit.wikimedia.org/r/#/c/264309/
[2]:
https://lists.wikimedia.org/pipermail/mediawiki-api-announce/2016-January/0…
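The new fetch path described above can be sketched as follows (request
parameters only; is_new_style_token is a hypothetical helper name):

```python
def login_token_params():
    # Fetch a login token via meta=tokens, instead of submitting a
    # token-less action=login request to get a NeedToken response.
    return {
        "action": "query",
        "meta": "tokens",
        "type": "login",
        "format": "json",
    }

def is_new_style_token(token):
    # New-style login/createaccount tokens end in "+\" like other
    # CSRF tokens.
    return token.endswith("+\\")
```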
--
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation