Thanks Gabriel, Jaime. It worked OK and it seems to be as fast as you
On 04/08/16 at 23:34, Gabriel Wicke wrote:
Toni, we heavily use caching to speed up the REST API, so making individual
requests is the fastest way to retrieve content. You can use parallelism to
achieve your desired throughput, and with HTTP/2 all those parallel
requests can even share a single TCP connection. The request limit for the
API overall is 200 req/s, as documented in
Hope this helps,
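A minimal sketch of the parallel-fetch approach described above, assuming Python with concurrent.futures; the endpoint URL and the `fetch` helper are illustrative stand-ins, since a real client would do an HTTP GET (ideally over HTTP/2, so all the parallel requests share one TCP connection):

```python
# Sketch: retrieve several pages as individual parallel requests,
# rather than batching them, per the REST API caching model.
from concurrent.futures import ThreadPoolExecutor

# Illustrative endpoint; a real client would append the page title
# and perform an HTTP GET against it.
BASE = "https://en.wikipedia.org/api/rest_v1/page/html/"

def fetch(title):
    # Placeholder for an actual HTTP GET of BASE + title.
    return f"<html>content of {title}</html>"

titles = ["Barcelona", "Catalonia", "Wikipedia"]

# Issue individual requests in parallel for throughput; keep the
# worker count modest to stay well under the 200 req/s limit.
with ThreadPoolExecutor(max_workers=10) as pool:
    results = dict(zip(titles, pool.map(fetch, titles)))
```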
On Thu, Aug 4, 2016 at 2:20 PM, Jaime Crespo <jcrespo(a)wikimedia.org> wrote:
> Sorry, I am not 100% sure if that is true; maybe creating a feature
> request would help suggest its implementation?
> On Thu, Aug 4, 2016 at 3:09 PM, Toni Hermoso Pulido <toniher(a)cau.cat> wrote:
>> Thanks Jaime, so it only works with the Action (MediaWiki default) API
>> so far, doesn't it?
>> On 08/04/2016 at 10:07 AM, Jaime Crespo wrote:
>>> Hi, you can combine multiple pages with the "pipe" sign:
>>> (change 'jsonfm' for 'json' on a real request)
>>> There is a limit on the number of pages depending on your account
>>> rights, but it is very helpful for avoiding round-trip latencies for
>>> those of us in high-latency places.
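For reference, a pipe-combined Action API query can be built as below (a sketch in Python; the en.wikipedia.org endpoint and the page titles are just examples). As noted above, use format=json on a real request, since jsonfm is only the human-readable debugging format:

```python
from urllib.parse import urlencode

# Combine several pages in one Action API request by joining their
# titles with the pipe sign.
params = {
    "action": "query",
    "prop": "revisions",
    "rvprop": "content",
    "titles": "|".join(["Barcelona", "Catalonia"]),
    "format": "json",  # 'jsonfm' only for human-readable debugging
}
url = "https://en.wikipedia.org/w/api.php?" + urlencode(params)
```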
>>> On Thu, Aug 4, 2016 at 9:34 AM, Toni Hermoso Pulido <toniher(a)cau.cat> wrote:
>>>> is it already possible to retrieve data from different pages just by
>>>> using one request?
>>>> E.g. by combining: