We have decided to officially retire the rest.wikimedia.org domain in
favor of /api/rest_v1/ at each individual project domain. For example,
https://rest.wikimedia.org/en.wikipedia.org/v1/?doc
becomes
https://en.wikipedia.org/api/rest_v1/?doc
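For clients that build request URLs against the old host, this is
typically just a base-URL rewrite. A rough sketch of such a rewrite (a
hypothetical helper, not part of any client library):

    // Hypothetical helper: map an old rest.wikimedia.org URL to the
    // equivalent /api/rest_v1/ URL on the project's own domain.
    // Assumes old URLs look like
    //   https://rest.wikimedia.org/{project-domain}/v1/{path}
    function toProjectRestUrl(oldUrl: string): string {
      const u = new URL(oldUrl);
      if (u.hostname !== 'rest.wikimedia.org') {
        return oldUrl; // already using the new entry point
      }
      // First path segment is the project domain (e.g. "en.wikipedia.org");
      // the "/v1/" prefix maps to "/api/rest_v1/".
      const [, project, , ...rest] = u.pathname.split('/');
      return `https://${project}/api/rest_v1/${rest.join('/')}${u.search}`;
    }

    // toProjectRestUrl('https://rest.wikimedia.org/en.wikipedia.org/v1/?doc')
    //   -> 'https://en.wikipedia.org/api/rest_v1/?doc'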
Most clients already use the new path, and benefit from better
performance thanks to geo-distributed caching, no extra DNS lookups,
and shared TLS / HTTP2 connections.
We intend to shut down the rest.wikimedia.org entry point around
March, so please adjust your clients to use /api/rest_v1/ soon.
Thank you for your cooperation,
Gabriel
--
Gabriel Wicke
Principal Engineer, Wikimedia Foundation
Luigi,
On Thu, Jan 28, 2016 at 2:09 AM, XDiscovery Team <info(a)xdiscovery.com> wrote:
> I tried /rest_v1/ endpoint and it is terribly fast.
That is great to hear. A major goal is indeed to provide high-volume,
low-latency access to our content.
> @Strainu / @Gabriel , what does 'graph' extension do ?
If you are referring to
https://en.wikipedia.org/api/rest_v1/?doc#!/Page_content/get_page_graph_png…,
this is an end point exposing rendered graph images for
https://www.mediawiki.org/wiki/Extension:Graph (as linked in the end
point documentation).
> I have few questions for using proxy cache:
> 1# Is it possible to query a page by page_ID and include redirect?
We don't currently provide access by page ID. Could you describe your
use case a bit, to help us understand how access by page ID would help
you?
> /page/title/{title}
> allow to get metadata by page, including the pageID , but I would like to
> have final page redirect (e.g. dna return 7956 and I would like to fetch
> 7955 of redirected 'DNA' )
We are looking into improving our support for redirects:
https://phabricator.wikimedia.org/T118548. Your input on this topic
would be much appreciated.
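Until then, one possible stopgap is the existing action API, which can
resolve the redirect and return the target page's ID in a single query.
A rough sketch (field handling kept minimal, error handling omitted):

    // Rough sketch: resolve a redirect via the action API and return the
    // target page's pageid (e.g. "dna" -> the page id of "DNA").
    // Uses action=query with the "redirects" flag.
    async function resolveRedirect(title: string): Promise<number | undefined> {
      const url = 'https://en.wikipedia.org/w/api.php'
        + '?action=query&format=json&redirects=1&titles='
        + encodeURIComponent(title);
      const data = await (await fetch(url)).json();
      const pages = data.query && data.query.pages;
      const first = pages && pages[Object.keys(pages)[0]];
      return first && first.pageid;
    }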
> /page/html/{title} get the article but page_ID / curid is missing in source
> I would like to get the two combined.
This information is actually included in the response, both in the
`ETag` header and in the <head> of the HTML itself. I have updated the
documentation to spell this out more clearly in [1]. The relevant
addition is this:
The response provides an `ETag` header indicating the revision and
render timeuuid separated by a slash (ex: `ETag:
701384379/154d7bca-c264-11e5-8c2f-1b51b33b59fc`). This ETag can be
passed to the HTML save end point (as `base_etag` POST parameter), and
can also be used to retrieve the exact corresponding data-parsoid
metadata, by requesting the specific `revision` and `tid` indicated by
the `ETag`.
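As a concrete illustration, a client could split the header into its
two parts after fetching the HTML. A rough sketch (error handling
omitted):

    // Rough sketch: fetch rendered HTML and split the ETag into the
    // revision and render timeuuid (format: "<revision>/<tid>").
    async function fetchHtmlWithEtag(title: string) {
      const res = await fetch(
        'https://en.wikipedia.org/api/rest_v1/page/html/'
        + encodeURIComponent(title));
      const etag = (res.headers.get('etag') || '').replace(/"/g, '');
      const [revision, tid] = etag.split('/');
      const html = await res.text();
      return { revision, tid, html };
    }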
> 2# The rest are experimental:
> what could happen if a query fail?
> Does it raise an error, return 404 page or what else?
The stability markers are primarily about request and response
formats, not about technical availability. Experimental end points
can change at any time, which can result in errors (if the request
interface changed) or in a different response format being returned.
We are currently discussing the use of `Accept` headers for response
format versioning at
https://www.mediawiki.org/wiki/Talk:API_versioning. This will allow us
to more aggressively stabilize end points by giving us the option of
tweaking response formats without breaking existing clients.
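The rough idea is that a client would pin the response format it
understands via the `Accept` header, along these lines (the profile
string below is purely illustrative, not a committed interface):

    // Illustrative only: pin a specific HTML format via the Accept header.
    // The profile string here is a placeholder, not a committed interface.
    async function getHtmlPinned(title: string): Promise<string> {
      const res = await fetch(
        'https://en.wikipedia.org/api/rest_v1/page/html/'
        + encodeURIComponent(title), {
          headers: {
            Accept: 'text/html; profile="mediawiki.org/specs/html/1.x.x"'
          }
        });
      return res.text();
    }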
> I am thinking if possible to use api.wikipedia as fallback, and use proxy
> cache as primary source any ajax example for doing that to handle possible
> failures?
Yes, this is certainly possible. However, you can generally rely on
end points currently marked as "unstable" in the REST API: practically
all of them are already used by many production clients, and are very
reliable. Once we introduce general `Accept` support, basically all of
the unstable end points will likely become officially "stable", and
several `experimental` end points will graduate to `unstable`.
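A minimal sketch of that pattern, with the REST API as the primary
source and the action API as fallback (assuming a fetch-capable
client; error handling kept to the status check):

    // Rough sketch: try the REST API first, fall back to the action API.
    async function getArticleHtml(title: string): Promise<string> {
      const restUrl = 'https://en.wikipedia.org/api/rest_v1/page/html/'
        + encodeURIComponent(title);
      try {
        const res = await fetch(restUrl);
        if (res.ok) {
          return await res.text();
        }
      } catch (e) {
        // Network error: fall through to the action API.
      }
      // Fallback: action=parse returns the rendered HTML wrapped in JSON.
      const apiUrl = 'https://en.wikipedia.org/w/api.php'
        + '?action=parse&format=json&prop=text&page='
        + encodeURIComponent(title);
      const data = await (await fetch(apiUrl)).json();
      return data.parse.text['*'];
    }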
> 3# Does /rest/ endpoint exist also for other languages?
Yes, it is available for all 800+ public Wikimedia projects at /api/rest_v1/.
[1]: https://github.com/wikimedia/restbase/pull/488/files#diff-2b6b60416eaafdf0a…
--
Gabriel Wicke
Principal Engineer, Wikimedia Foundation