Hi everyone,
For the last couple of days I've been trying to write a script around the WP API (http://en.wikipedia.org/w/api.php), and I'm struggling with "action=edit" while trying to edit a page, because there seems to be a limitation on the POST size.
The error that I'm receiving is vague:

<error>
Request: POST http://en.wikipedia.org/w/api.php?action=edit&format=xml, from 84.22.62.67 via knsq23.knams.wikimedia.org (squid/2.7.STABLE6) to ()
Error: ERR_INVALID_REQ, errno [No Error] at Thu, 11 Mar 2010 11:46:27 GMT
</error>
AFAICS there is no problem with POST sizes up to 2.5 KB; anything more than that results in that error.
So, is there any size limitation regarding POST?
Regards,
2010/3/11 Krenar Qehaja kedadi@gmail.com:
> AFAICS there is no problem with POST sizes up to 2.5 KB; anything more than that results in that error.
> So, is there any size limitation regarding POST?
No, there are no such limitations. ERR_INVALID_REQ seems to indicate there's something wrong with your request. Could you pastebin an example of a full request that fails, including all headers? Also, does retrying the request help?
Roan Kattouw (Catrope)
On Thursday 11 March 2010 15:14:49 Roan Kattouw wrote:
> No, there are no such limitations. ERR_INVALID_REQ seems to indicate there's something wrong with your request. Could you pastebin an example of a full request that fails, including all headers? Also, does retrying the request help?
I pasted a snippet of the code on pastebin.ca and you can find it at http://pastebin.ca/1833958 (I posted only the function that I'm having trouble with).

It is written in PHP. The commented-out "$this->text" (the first one) fails each time I try, with "Error: ERR_INVALID_REQ, errno [No Error]". The uncommented "$this->text" variable (the second one) works all the time, and this is the response:

<?xml version="1.0"?><api><edit result="Success" pageid="26495486" title="Wikipedia:WikiProject Albania/publicwatchlistauto" oldrevid="349201137" newrevid="349241149" newtimestamp="2010-03-11T16:15:39Z" /></api>
This is the page that I'm dealing with: http://en.wikipedia.org/wiki/Wikipedia%3AWikiProject_Albania/publicwatchlist...
Basically the script takes all articles tagged with "WikiProject Albania" and puts them in a Public Watchlist, to track changes easily.
Regards,
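The pastebin link has long since expired, so here is a rough sketch of the kind of function under discussion (hypothetical names; login and token fetching omitted, not the exact pastebinned code):

<?php
// Hypothetical sketch: one edit POSTed to the API with cURL.
// $text carries the full watchlist wikitext.
function editPage($title, $text, $token, $cookies) {
    $ch = curl_init('http://en.wikipedia.org/w/api.php?action=edit&format=xml');
    curl_setopt($ch, CURLOPT_POST, true);
    // http_build_query() percent-encodes every field and produces an
    // application/x-www-form-urlencoded body.
    curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query(array(
        'title' => $title,
        'text'  => $text,
        'token' => $token,
    )));
    curl_setopt($ch, CURLOPT_USERAGENT, 'PublicWatchlistBot/0.1');
    curl_setopt($ch, CURLOPT_COOKIE, $cookies);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $result = curl_exec($ch);
    curl_close($ch);
    return $result;
}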
2010/3/11 Krenar Qehaja kedadi@gmail.com:
> I pasted a snippet of the code on pastebin.ca and you can find it at http://pastebin.ca/1833958 (I posted only the function that I'm having trouble with).
From that code, I can't really see what's wrong. Could you intercept the raw HTTP request and response and pastebin those?
Roan Kattouw (Catrope)
On Thursday 11 March 2010 21:28:56 Roan Kattouw wrote:
> From that code, I can't really see what's wrong. Could you intercept the raw HTTP request and response and pastebin those?
I'm sorry Roan, I didn't quite understand the "raw HTTP request".
What exactly do you mean by that? Wireshark-sniffed packets?
Regards,
I suspect that he is not sending a User-Agent header, which is causing the problem.
On Thu, Mar 11, 2010 at 3:34 PM, Krenar Qehaja kedadi@gmail.com wrote:
> I'm sorry Roan, I didn't quite understand the "raw HTTP request".
> What exactly do you mean by that? Wireshark-sniffed packets?
On Thursday 11 March 2010 21:37:17 Betacommand wrote:
> I suspect that he is not sending a User-Agent header, which is causing the problem.
I had the User-Agent problem some time ago, but now I'm passing it; see http://pastebin.ca/1833958.
Regards,
2010/3/11 Krenar Qehaja kedadi@gmail.com:
> I'm sorry Roan, I didn't quite understand the "raw HTTP request".
> What exactly do you mean by that? Wireshark-sniffed packets?
Unless cURL has a feature for capturing HTTP request and response headers, you may have to do that, yes.
Roan Kattouw (Catrope)
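cURL does in fact have such a feature: with CURLOPT_VERBOSE it logs the raw request and response headers itself, no packet sniffer required. A sketch in PHP, assuming $ch is the handle from the pastebinned function:

// Write the raw HTTP conversation to a temp stream instead of STDERR.
curl_setopt($ch, CURLOPT_VERBOSE, true);
$log = fopen('php://temp', 'w+');
curl_setopt($ch, CURLOPT_STDERR, $log);
curl_exec($ch);
rewind($log);
// Dump it: "*" status lines, then request headers, then "<" response headers.
echo stream_get_contents($log);

The dump in the next message is exactly this kind of output.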
On Thursday 11 March 2010 21:52:51 Roan Kattouw wrote:
> Unless cURL has a feature for capturing HTTP request and response headers, you may have to do that, yes.
* About to connect() to en.wikipedia.org port 80 (#0)
*   Trying 91.198.174.2... * connected
* Connected to en.wikipedia.org (91.198.174.2) port 80 (#0)

POST /w/api.php?action=edit&format=xml HTTP/1.1
User-Agent: Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.1.8) Gecko/20100214 Ubuntu/9.10 (karmic) Firefox/3.5.8
Host: en.wikipedia.org
Accept: */*
Cookie: enwikiUserName=Kedadi; enwikiUserID=252701; enwikiToken=e90204af77a78cc5fdfdd035a69c45c2; enwiki_session=39ee0fb34e4f5f68ce4977a3987f474e
Content-Length: 1324
Content-Type: application/x-www-form-urlencoded
Expect: 100-continue

* HTTP 1.0, assume close after body
< HTTP/1.0 417 Expectation failed
< Server: squid/2.7.STABLE6
< Date: Thu, 11 Mar 2010 22:30:07 GMT
< Content-Type: text/html
< Content-Length: 61649
< X-Squid-Error: ERR_INVALID_REQ 0
< X-Cache: MISS from knsq30.knams.wikimedia.org
< X-Cache-Lookup: NONE from knsq30.knams.wikimedia.org:80
< Connection: close
* Closing connection #0
2010/3/11 Krenar Qehaja kedadi@gmail.com:
> POST /w/api.php?action=edit&format=xml HTTP/1.1
> [...]
> Expect: 100-continue
>
> * HTTP 1.0, assume close after body
> < HTTP/1.0 417 Expectation failed
> < Server: squid/2.7.STABLE6
Looks like our Squids don't support HTTP 1.1 continue requests for some weird reason. Try telling cURL to use HTTP 1.0 instead of HTTP 1.1, or (if possible) to not use continue requests.
Roan Kattouw (Catrope)
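Both workarounds are single curl_setopt() calls in PHP; a sketch, with $ch standing for the script's cURL handle:

// Workaround 1: drop to HTTP/1.0, which has no 100-continue mechanism.
curl_setopt($ch, CURLOPT_HTTP_VERSION, CURL_HTTP_VERSION_1_0);

// Workaround 2: keep HTTP/1.1 but send an empty Expect header, which
// stops libcurl from adding "Expect: 100-continue" on its own. libcurl
// only adds that header for POST bodies over a size threshold, which is
// why the small edits were going through and the larger ones failed.
curl_setopt($ch, CURLOPT_HTTPHEADER, array('Expect:'));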
On Thursday 11 March 2010 23:42:56 Roan Kattouw wrote:
> Looks like our Squids don't support HTTP 1.1 continue requests for some weird reason. Try telling cURL to use HTTP 1.0 instead of HTTP 1.1, or (if possible) to not use continue requests.
Yes, seems to be a Squid problem.
http://www.gnegg.ch/2007/02/the-return-of-except-100-continue/ solved the problem by disabling "100-continue".
Thanks a lot for your time.
Regards,
2010/3/12 Krenar Qehaja kedadi@gmail.com:
> Yes, seems to be a Squid problem.
Turns out it's not even a problem with our Squid configuration, but a lack of HTTP 1.1 support in the Squid software.
Roan Kattouw (Catrope)
On Thu, Mar 11, 2010 at 1:14 PM, Krenar Qehaja kedadi@gmail.com wrote:
> Request: POST http://en.wikipedia.org/w/api.php?action=edit&format=xml, from 84.22.62.67 via knsq23.knams.wikimedia.org (squid/2.7.STABLE6) to ()

You need to urlencode your query string, not htmlencode it.
Bryan
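In PHP terms, the distinction looks like this (hypothetical variables):

// Wrong: htmlspecialchars() is for HTML output. It turns "&" and "<"
// in the values into entities but leaves "=", "+" and newlines alone.
$body = 'title=' . htmlspecialchars($title) . '&text=' . htmlspecialchars($text);

// Right: percent-encode each value, or let PHP build the whole
// application/x-www-form-urlencoded body in one call.
$body = http_build_query(array('title' => $title, 'text' => $text));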