On 30/11/2015 17:10, Petr Bena wrote:
Hi,
I created this ticket: https://phabricator.wikimedia.org/T119878
The basic idea is that it shouldn't be a big problem to compress the output of the api.php script using some widely available library, such as gzip.
That way the amount of data transferred between client and server would be much smaller, and users with slow internet connections would benefit. I am not sure how much the data would shrink, but in some cases the savings could be significant.
What do you think about it? Is there any reason not to do that?
Note that I am not proposing a breaking change, just an optional "compression" parameter that could be passed with API requests.
Hello,
That is already supported by HTTP clients and servers via a request header.
curl --verbose https://www.mediawiki.org/w/api.php >/dev/null
  request header sent: Accept: */*
  payload: ~23 kB

curl --compressed --verbose https://www.mediawiki.org/w/api.php >/dev/null
  request headers sent: Accept: */*, Accept-Encoding: deflate, gzip
  payload: ~6 kB
So just send the 'Accept-Encoding: gzip' request header, and MediaWiki should serve you gzipped content.
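
Outside of curl, any HTTP client can do the same thing: set the header and decompress the body. Here is a minimal sketch using Python's standard library (the specific query parameters are only illustrative, not taken from the thread):

import gzip
import urllib.request

# Ask the server for gzip-compressed output; the URL and query are an example.
req = urllib.request.Request(
    "https://www.mediawiki.org/w/api.php?action=query&meta=siteinfo&format=json",
    headers={"Accept-Encoding": "gzip"},
)
with urllib.request.urlopen(req) as resp:
    raw = resp.read()
    # The server signals compression via the Content-Encoding response header.
    if resp.headers.get("Content-Encoding") == "gzip":
        body = gzip.decompress(raw)
    else:
        body = raw

print(len(raw), "bytes on the wire,", len(body), "bytes decompressed")

Higher-level clients (for example Python's requests library, or curl with --compressed) send the Accept-Encoding header and decompress the response for you automatically.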