Hi,
I am using mwclient 0.6.4 (r93) to import some wiki pages from en.wikipedia to another wiki installation (presumably running MediaWiki 1.15).
Everything works fine, except when I try to import 'big' pages, e.g.:
http://en.wikipedia.org/wiki/Grb2
content = the wikitext to be imported (111,579 characters in this case)
When I try to write this page (or pages of similar size), I get the following error:
page = site.Pages['Grb2']
page.save(content)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/lib64/python2.6/site-packages/mwclient/page.py", line 142, in save
    result = do_edit()
  File "/usr/lib64/python2.6/site-packages/mwclient/page.py", line 137, in do_edit
    **data)
  File "/usr/lib64/python2.6/site-packages/mwclient/client.py", line 165, in api
    info = self.raw_api(action, **kwargs)
  File "/usr/lib64/python2.6/site-packages/mwclient/client.py", line 250, in raw_api
    return json.loads(json_data)
  File "/usr/lib64/python2.6/json/__init__.py", line 307, in loads
    return _default_decoder.decode(s)
  File "/usr/lib64/python2.6/json/decoder.py", line 319, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "/usr/lib64/python2.6/json/decoder.py", line 338, in raw_decode
    raise ValueError("No JSON object could be decoded")
ValueError: No JSON object could be decoded
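In case it matters, the surrounding code is essentially the following; the hosts and credentials here are placeholders, and this is just a sketch of what my script does:

import mwclient

# Placeholder hosts/credentials; the real script points at my own wiki
source = mwclient.Site('en.wikipedia.org')
content = source.Pages['Grb2'].edit()    # fetch the wikitext (~111,579 chars)

dest = mwclient.Site('wiki.example.org')
dest.login('username', 'password')
dest.Pages['Grb2'].save(content)         # this is the call that fails

Smaller pages go through this exact path without problems.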
I wonder whether this error is due to a server timeout or to the request exceeding some maximum size. Any suggestions?
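One thing I have been thinking of trying is bypassing mwclient and posting an oversized request to api.php directly, to see what the server actually sends back; the traceback suggests json.loads is choking on a non-JSON body (an HTML error page, perhaps). A rough sketch, with API_URL as a placeholder for the destination wiki's api.php:

import json
import urllib
import urllib2

# Placeholder: the destination wiki's API endpoint
API_URL = 'http://wiki.example.org/w/api.php'

# A harmless query plus a dummy field padded to roughly the size of
# the failing edit, to see how the server handles requests this big
data = urllib.urlencode({
    'action': 'query',
    'meta': 'siteinfo',
    'format': 'json',
    'padding': 'x' * 111579,
})

try:
    body = urllib2.urlopen(API_URL, data).read()
    json.loads(body)
    print 'Server returned valid JSON; size alone may not be the problem'
except ValueError:
    # Same failure mode as mwclient: the body is not JSON
    print 'Non-JSON response, first 500 bytes:'
    print body[:500]
except urllib2.HTTPError, e:
    print 'HTTP error %d' % e.code

If the body turns out to be an HTML error page, that would point at a web-server or PHP limit rather than mwclient itself.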
Thank you,
Sal