Hi Dario,
That's great news! I am trying to use it. The documentation for /page/title/ at https://rest.wikimedia.org/en.wikipedia.org/v1/?doc#!/Page_content/page_title__get says "List all pages.", but the request produces fewer than 1,000 titles. There should be another parameter like "limit=max" or similar, shouldn't there? How can I get ALL titles, in one request or several? Where can I find documentation for this?
Appreciate your help!
Regards, Alex Druk
On Wed, Mar 11, 2015 at 5:49 AM, Dario Taraborelli < dtaraborelli@wikimedia.org> wrote:
Cross-posting from wikitech-l, this will definitely be of interest to those of you on this list who work with our APIs.
Begin forwarded message:
*From:* Gabriel Wicke gwicke@wikimedia.org *Date:* March 10, 2015 at 15:23:03 PDT *To:* Wikimedia developers wikitech-l@lists.wikimedia.org, wikitech-ambassadors@lists.wikimedia.org, Development and Operations Engineers engineering@lists.wikimedia.org, mediawiki-api@lists.wikimedia.org *Subject:* *[Engineering] Wikimedia REST content API is now available in beta*
Hello all,
I am happy to announce the beta release of the Wikimedia REST Content API at https://rest.wikimedia.org/.
Each domain has its own API documentation, which is auto-generated from Swagger API specs. For example, here is the link for the English Wikipedia:
https://rest.wikimedia.org/en.wikipedia.org/v1/?doc
At present, this API provides convenient and low-latency access to article HTML, page metadata and content conversions between HTML and wikitext. After extensive testing we are confident that these endpoints are ready for production use, but have marked them as 'unstable' until we have also validated this with production users. You can start writing applications that depend on it now, if you aren't afraid of possible minor changes before transitioning to 'stable' status. For the definition of the terms 'stable' and 'unstable' see https://www.mediawiki.org/wiki/API_versioning .
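To make the access pattern concrete, here is a minimal Python (stdlib-only) sketch of fetching article HTML. The /page/html/{title} path is an assumption based on the documentation linked above; verify it in the per-domain sandbox before depending on it.

```python
from urllib.parse import quote
from urllib.request import urlopen

BASE = "https://rest.wikimedia.org"

def html_url(domain: str, title: str) -> str:
    # Per MediaWiki convention, spaces in titles become underscores;
    # everything else is percent-encoded.
    encoded = quote(title.replace(" ", "_"), safe="")
    return f"{BASE}/{domain}/v1/page/html/{encoded}"

def fetch_html(domain: str, title: str, timeout: float = 10.0) -> str:
    # Returns the Parsoid HTML for the given article.
    with urlopen(html_url(domain, title), timeout=timeout) as resp:
        return resp.read().decode("utf-8")
```

For example, `html_url("en.wikipedia.org", "San Francisco")` yields `https://rest.wikimedia.org/en.wikipedia.org/v1/page/html/San_Francisco`.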
While the API is general and not specific to VisualEditor, the selection of endpoints reflects this release's focus on speeding up VisualEditor. By storing private Parsoid round-trip information separately, we were able to reduce the HTML size by about 40%. This in turn reduces network transfer and processing times, which will make loading and saving with VisualEditor faster. We are also switching from a cache to actual storage, which will eliminate slow VisualEditor loads caused by cache misses. Other users of Parsoid HTML, such as Flow, HTML dumps, the OCG PDF renderer and Content Translation, will benefit similarly.
But, we are not done yet. In the medium term, we plan to further reduce the HTML size by separating out all read-write metadata. This should allow us to use Parsoid HTML with its semantic markup https://www.mediawiki.org/wiki/Parsoid/MediaWiki_DOM_spec directly for both views and editing without increasing the HTML size over the current output. Combined with performance work in VisualEditor, this has the potential to make switching to visual editing instantaneous and free of any scrolling.
We are also investigating a sub-page-level edit API for micro-contributions and very fast VisualEditor saves. HTML saves don't necessarily have to wait for the page to re-render from wikitext, which means that we can potentially make them faster than wikitext saves. For this to work we'll need to minimize network transfer and processing time on both client and server.
More generally, this API is intended to be the beginning of a multi-purpose content API. Its implementation (RESTBase http://www.mediawiki.org/wiki/RESTBase) is driven by a declarative Swagger API specification, which helps to make it straightforward to extend the API with new entry points. The same API spec is also used to auto-generate the aforementioned sandbox environment, complete with handy "try it" buttons. So, please give it a try and let us know what you think!
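The content-conversion entry points mentioned above can be exercised the same way. The sketch below builds a POST request for a wikitext-to-HTML transform; the /transform/wikitext/to/html path and the `wikitext` form field are assumptions inferred from the API docs, so check them against the "try it" sandbox first.

```python
from urllib.parse import urlencode
from urllib.request import Request

def wikitext_to_html_request(domain: str, wikitext: str) -> Request:
    # Assumed transform endpoint; confirm the exact path and field name
    # in the per-domain Swagger documentation.
    url = f"https://rest.wikimedia.org/{domain}/v1/transform/wikitext/to/html"
    data = urlencode({"wikitext": wikitext}).encode("utf-8")
    return Request(url, data=data, method="POST")
```

Passing the returned `Request` to `urllib.request.urlopen` would perform the conversion; constructing it separately keeps the sketch testable offline.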
This API is currently unmetered; we recommend that users stay below 200 requests per second. We may introduce rate limits if that becomes necessary.
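A client can honor that recommendation with simple pacing. This is a minimal sketch (not an official client requirement): it spaces successive calls so the configured per-second rate is never exceeded.

```python
import time

class Throttle:
    """Pacing helper: allows at most `rate` calls per second."""

    def __init__(self, rate: float):
        self.interval = 1.0 / rate  # minimum seconds between calls
        self._last = 0.0

    def wait(self) -> None:
        # Sleep just long enough to keep the configured spacing,
        # then record this call's timestamp.
        now = time.monotonic()
        delay = self._last + self.interval - now
        if delay > 0:
            time.sleep(delay)
        self._last = time.monotonic()
```

Call `throttle.wait()` before each request; with `Throttle(200)` consecutive calls are spaced at least 5 ms apart.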
I also want to use this opportunity to thank all contributors who made this possible:
- Marko Obrovac, Eric Evans, James Douglas and Hardik Juneja on the Services team worked hard to build RESTBase, and to make it as extensible and clean as it is now.
- Filippo Giunchedi, Alex Kosiaris, Andrew Otto, Faidon Liambotis, Rob Halsell and Mark Bergsma helped to procure and set up the Cassandra storage cluster backing this API.
- The Parsoid team with Subbu Sastry, Arlo Breault, C. Scott Ananian and Marc Ordinas i Llopis is solving the extremely difficult task of converting between wikitext and HTML, and built a new API that lets us retrieve and pass in metadata separately.
- On the MediaWiki core team, Brad Jorsch quickly created a minimal authorization API that will let us support private wikis, and Aaron Schulz, Alex Monk and Ori Livneh built and extended the VirtualRestService that lets VisualEditor and MediaWiki in general easily access external services.
We welcome your feedback here: https://www.mediawiki.org/wiki/Talk:RESTBase and in Phabricator: https://phabricator.wikimedia.org/maniphest/task/create/?projects=RESTBase&title=Feedback:
Sincerely --
Gabriel Wicke
Principal Software Engineer, Wikimedia Foundation
Engineering mailing list Engineering@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/engineering
Analytics mailing list Analytics@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/analytics