Hi,
I am using mwclient 0.6.4 (r93) to import some wiki pages from en.wikipedia into another wiki installation (presumably running MediaWiki 1.15).
Everything works fine, except when I try to import 'big' pages, e.g.:
http://en.wikipedia.org/wiki/Grb2
content = the MediaWiki wikitext to be imported (111579 characters in this case)
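For reference, the surrounding setup is roughly the following (the target wiki's host name, path and credentials are placeholders here, not the real values):

import mwclient

# Source: English Wikipedia; target: the MediaWiki 1.15 installation.
source = mwclient.Site('en.wikipedia.org')
site = mwclient.Site('target.example.org', path='/w/')   # placeholder host/path
site.login('ImportBot', 'secret')                        # placeholder credentials

# Fetch the wikitext of the page to be imported.
content = source.Pages['Grb2'].edit()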
When I try to write this page (or pages of similar size), I get the following error:
page = site.Pages['Grb2']
page.save(content)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/lib64/python2.6/site-packages/mwclient/page.py", line 142, in save
    result = do_edit()
  File "/usr/lib64/python2.6/site-packages/mwclient/page.py", line 137, in do_edit
    **data)
  File "/usr/lib64/python2.6/site-packages/mwclient/client.py", line 165, in api
    info = self.raw_api(action, **kwargs)
  File "/usr/lib64/python2.6/site-packages/mwclient/client.py", line 250, in raw_api
    return json.loads(json_data)
  File "/usr/lib64/python2.6/json/__init__.py", line 307, in loads
    return _default_decoder.decode(s)
  File "/usr/lib64/python2.6/json/decoder.py", line 319, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "/usr/lib64/python2.6/json/decoder.py", line 338, in raw_decode
    raise ValueError("No JSON object could be decoded")
ValueError: No JSON object could be decoded
I wonder if this error is due to a server timeout or to exceeding some maximum number of characters... Any suggestions?
Thank you, Sal
2010/7/13 Sal976 salvatore.loguercio@googlemail.com:
I wonder if this error is due to a server timeout or to exceeding some maximum number of characters... Any suggestions?
The server spends so much time parsing the wikitext you supplied that PHP's max execution time limit is exceeded, which results in an empty (0-byte) response. You can improve the error message by checking for a zero-length response before you try to JSON-decode it.
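For instance, a minimal, illustrative guard around the json.loads() call shown in the traceback (mwclient itself might prefer to raise one of its own exception classes):

import json

def decode_api_response(json_data):
    # A zero-length body usually means the request was cut short on the
    # server side (e.g. PHP hit max_execution_time), not that the API
    # produced malformed JSON, so report that instead.
    if not json_data:
        raise ValueError('Empty response from the API; the server probably '
                         'hit its execution time limit')
    return json.loads(json_data)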
Roan Kattouw (Catrope)
Thanks very much for the quick answer; I see the problem now. I wonder how these large wikitexts could be written to my target wiki. Is there a way to 'force' the PHP max execution time limit through the API? If not, I guess I will have to contact a sysop.
2010/7/14 Salvatore Loguercio salvatore.loguercio@googlemail.com:
Thanks very much for the quick answer; I see the problem now. I wonder how these large wikitexts could be written to my target wiki. Is there a way to 'force' the PHP max execution time limit through the API? If not, I guess I will have to contact a sysop.
That would kind of defeat the purpose of the max execution time. AFAIK the page should still have been saved, just not have been parsed completely. You can check this in the history view.
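For instance, something along these lines (illustrative; it reuses the site and content objects from the original snippet and simply re-fetches the page) would show whether the text made it in:

# Re-fetch the page from the target wiki and compare it with what was sent.
page = site.Pages['Grb2']
saved_text = page.edit()
if saved_text == content:
    print 'The save went through despite the empty response'
else:
    print 'Saved text differs (%d chars on the wiki vs %d sent)' % (
        len(saved_text), len(content))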
You can ask a sysop to raise the max exec time, or to import these large pages using the importTextFile.php maintenance script.
Roan Kattouw (Catrope)