Setting "bot": 1 is working now. Thanks!
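For anyone finding this later, a minimal sketch of what that flag looks like in an action=edit request (the title, text, and token here are placeholders, not real values; a real token must be fetched from the API first):

```python
from urllib.parse import urlencode

# Hypothetical edit request marking the edit as a bot edit.
params = {
    'action': 'edit',
    'title': 'Sandbox',          # placeholder page title
    'text': 'Hello, world',      # placeholder page text
    'bot': 1,                    # flags the edit as a bot edit in recent changes
    'format': 'json',
    'token': '+\\',              # placeholder; request a real edit token first
}
body = urlencode(params)         # POST body for api.php
```

The bot flag only takes effect if the logged-in account actually has the bot right on the wiki.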
On Thu, Jul 22, 2010 at 4:28 PM, mediawiki-api-request@lists.wikimedia.org wrote:
Send Mediawiki-api mailing list submissions to mediawiki-api@lists.wikimedia.org
To subscribe or unsubscribe via the World Wide Web, visit https://lists.wikimedia.org/mailman/listinfo/mediawiki-api or, via email, send a message with subject or body 'help' to mediawiki-api-request@lists.wikimedia.org
You can reach the person managing the list at mediawiki-api-owner@lists.wikimedia.org
When replying, please edit your Subject line so it is more specific than "Re: Contents of Mediawiki-api digest..."
Today's Topics:
- Problem with mwclient page.save and large text files (Sal976)
- Re: Problem with mwclient page.save and large text files (Roan Kattouw)
- Re: Problem with mwclient page.save and large text files (Salvatore Loguercio)
- Re: Problem with mwclient page.save and large text files (Roan Kattouw)
- login from multiple (two) places? (Robert Ullmann)
- Re: login from multiple (two) places? (Roan Kattouw)
- Re: login from multiple (two) places? (Robert Ullmann)
- Re: login from multiple (two) places? (Carl (CBM))
- Mediawiki adds extra lines (rashi dhing)
- how to set flags to on in the api? (Python Script)
- Re: how to set flags to on in the api? (Brad Jorsch)
Message: 1
Date: Tue, 13 Jul 2010 08:06:19 -0700 (PDT)
From: Sal976 <salvatore.loguercio@googlemail.com>
Subject: [Mediawiki-api] Problem with mwclient page.save and large text files
To: mediawiki-api@lists.wikimedia.org
Message-ID: <29151604.post@talk.nabble.com>
Content-Type: text/plain; charset=us-ascii
Hi,
I am using mwclient 0.6.4 (r93) to import some Wiki pages from en.wikipedia to another wiki installation (presumably running Mediawiki 1.15).
Everything works fine, except when I try to import 'big' pages, e.g.:
http://en.wikipedia.org/wiki/Grb2
content = the wikitext to be imported (111,579 characters in this case)
When I try to write this page (or pages of similar size), I get the following error:
page = site.Pages['Grb2']
page.save(content)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/lib64/python2.6/site-packages/mwclient/page.py", line 142, in save
    result = do_edit()
  File "/usr/lib64/python2.6/site-packages/mwclient/page.py", line 137, in do_edit
    **data)
  File "/usr/lib64/python2.6/site-packages/mwclient/client.py", line 165, in api
    info = self.raw_api(action, **kwargs)
  File "/usr/lib64/python2.6/site-packages/mwclient/client.py", line 250, in raw_api
    return json.loads(json_data)
  File "/usr/lib64/python2.6/json/__init__.py", line 307, in loads
    return _default_decoder.decode(s)
  File "/usr/lib64/python2.6/json/decoder.py", line 319, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "/usr/lib64/python2.6/json/decoder.py", line 338, in raw_decode
    raise ValueError("No JSON object could be decoded")
ValueError: No JSON object could be decoded
I wonder if this error is due to a server timeout or to exceeding some character limit. Any suggestions?
Thank you, Sal
-- View this message in context: http://old.nabble.com/Problem-with-mwclient-page.save-and-large-text-files-t... Sent from the WikiMedia API mailing list archive at Nabble.com.
Message: 2
Date: Tue, 13 Jul 2010 17:20:23 +0200
From: Roan Kattouw <roan.kattouw@gmail.com>
Subject: Re: [Mediawiki-api] Problem with mwclient page.save and large text files
To: "MediaWiki API announcements & discussion" <mediawiki-api@lists.wikimedia.org>
Message-ID: <AANLkTinSMXRCMzaPNfr1PqTHYLinVREKIruVrlsrQRPt@mail.gmail.com>
Content-Type: text/plain; charset=ISO-8859-1
2010/7/13 Sal976 salvatore.loguercio@googlemail.com:
> I wonder if this error is due to server timeout or exceeding number of characters.. Any suggestions?
The server spent so much time parsing the wikitext you supplied that PHP's max execution time limit was exceeded, which caused an empty (0-byte) response. You can get a clearer error message by checking for a zero-length response before you try to JSON-decode it.
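A minimal sketch of that check, wrapping the JSON decode in client code (the function name and error message are my own, not part of mwclient's API):

```python
import json

def decode_api_response(raw):
    """Decode an API response body, failing clearly on an empty reply."""
    if not raw:
        # A zero-byte body usually means PHP hit its max execution time.
        raise RuntimeError("Empty response from the API (server timeout?)")
    return json.loads(raw)
```

With this in place, a timed-out request raises a readable error instead of the opaque "No JSON object could be decoded".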
Roan Kattouw (Catrope)
Message: 3
Date: Wed, 14 Jul 2010 01:39:12 -0700 (PDT)
From: Salvatore Loguercio <salvatore.loguercio@googlemail.com>
Subject: Re: [Mediawiki-api] Problem with mwclient page.save and large text files
To: mediawiki-api@lists.wikimedia.org
Message-ID: <29159629.post@talk.nabble.com>
Content-Type: text/plain; charset=us-ascii
Thanks for the quick answer; I see the problem now. I wonder how these large wikitexts could be written to my target wiki. Is there a way to 'force' the PHP max execution time limit through the API? If not, I guess I will have to contact a sysop.
Roan Kattouw-2 wrote:
> 2010/7/13 Sal976 salvatore.loguercio@googlemail.com:
>> I wonder if this error is due to server timeout or exceeding number of characters.. Any suggestions?
> It spends so much time parsing the wikitext you supplied that PHP's max execution time limit was exceeded, which causes an empty (0-byte) response. You can fix the error message by checking for a response of length zero before you try to JSON-decode it.
> Roan Kattouw (Catrope)
Mediawiki-api mailing list Mediawiki-api@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/mediawiki-api
Message: 4
Date: Wed, 14 Jul 2010 12:22:58 +0200
From: Roan Kattouw <roan.kattouw@gmail.com>
Subject: Re: [Mediawiki-api] Problem with mwclient page.save and large text files
To: "MediaWiki API announcements & discussion" <mediawiki-api@lists.wikimedia.org>
Message-ID: <AANLkTinrrlRsfOIufQdpxyo36gifWJ5znIudlRUqf2j8@mail.gmail.com>
Content-Type: text/plain; charset=ISO-8859-1
2010/7/14 Salvatore Loguercio salvatore.loguercio@googlemail.com:
> Thanks much for the quick answer, I see the problem now. I wonder how these large wikitexts could be written on my target Wiki. Is there a way to 'force' the PHP max execution time limit through the API? If not, I guess I will have to contact a sysop..
That would rather defeat the purpose of the max execution time. AFAIK the page should still have been saved, just not parsed completely; you can check this in the history view.
You can ask a sysop to raise the max exec time, or to import these large pages using the importTextFile.php maintenance script.
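A script can also verify that the edit actually landed by asking for the page's latest revision. Here is a hedged sketch against the standard action=query revisions response shape (the helper names and the sample values are illustrative, not from mwclient):

```python
import json
from urllib.parse import urlencode

def revisions_query_url(api_base, title):
    """Build a URL asking for the newest revision of one page."""
    params = {
        'action': 'query',
        'prop': 'revisions',
        'titles': title,
        'rvprop': 'timestamp|comment',  # enough to recognize our own edit
        'format': 'json',
    }
    return api_base + '?' + urlencode(params)

def latest_revision(response_text):
    """Pull the newest revision out of an action=query JSON response."""
    pages = json.loads(response_text)['query']['pages']
    page = next(iter(pages.values()))   # one title queried, one page back
    return page['revisions'][0]
```

Fetch `revisions_query_url(...)`, then compare the returned timestamp and edit summary against the save you just attempted.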
Roan Kattouw (Catrope)
Message: 5
Date: Wed, 14 Jul 2010 14:39:58 +0300
From: Robert Ullmann <rlullmann@gmail.com>
Subject: [Mediawiki-api] login from multiple (two) places?
To: "MediaWiki API announcements & discussion" <mediawiki-api@lists.wikimedia.org>
Message-ID: <AANLkTillFYQg-wnd6UMibKx6VUF7EsKmF4erp39KdAMR@mail.gmail.com>
Content-Type: text/plain; charset=UTF-8
Hi,
I haven't been able to figure this out from the doc ...
Can I log in a user (bot) from more than one place (IP address) at the same time? I have an impending need to run Interwicket from more than one place, as the primary will be unavailable at times. Will logging in from the other system log out the first, or some such? Any bad effects?
Robert
Message: 6
Date: Wed, 14 Jul 2010 13:55:46 +0200
From: Roan Kattouw <roan.kattouw@gmail.com>
Subject: Re: [Mediawiki-api] login from multiple (two) places?
To: "MediaWiki API announcements & discussion" <mediawiki-api@lists.wikimedia.org>
Message-ID: <AANLkTikGvBlnNhmwINZUSpfJZQaxfyR8wUN1lnvxGiph@mail.gmail.com>
Content-Type: text/plain; charset=ISO-8859-1
2010/7/14 Robert Ullmann rlullmann@gmail.com:
> Hi,
> I haven't been able to figure this out from the doc ...
> Can I log in a user (bot) from more than one place (IP address) at the same time? I have an impending need to run Interwicket from more than one place, as the primary will be unavailable at times. Is logging in the other system going to log out the first, or some such? any bad effects?
I believe it will cause such an effect, yes. AFAIK the only reliable way to be logged in in two places at once is to log in in one place and transfer the login cookies to the second place.
I think you should just try it and see what happens. Worst case, you can detect that you've been logged out and log in again, although that might not be very nice if you run things simultaneously from two places, as they'll spend a lot of time competing for logged-in status.
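A sketch of the cookie-transfer approach using only Python's standard library (the file name, cookie name, and values are illustrative; mwclient may expose its own cookie handling):

```python
import http.cookiejar

# On the first machine: after logging in normally, persist the session
# cookies to a file that can be copied to the second machine.
jar = http.cookiejar.LWPCookieJar('login-cookies.txt')
jar.set_cookie(http.cookiejar.Cookie(
    version=0, name='enwikiUserName', value='MyBot',  # placeholder cookie
    port=None, port_specified=False,
    domain='en.wikipedia.org', domain_specified=True, domain_initial_dot=False,
    path='/', path_specified=True,
    secure=False, expires=2147483647, discard=False,
    comment=None, comment_url=None, rest={},
))
jar.save(ignore_discard=True)

# On the second machine: load the same file instead of logging in again,
# so both places share one session rather than competing for it.
jar2 = http.cookiejar.LWPCookieJar('login-cookies.txt')
jar2.load(ignore_discard=True)
```

In practice you would capture every cookie the login response sets, not just one, and attach the jar to whatever HTTP client the bot uses.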
Roan Kattouw (Catrope)
Message: 7
Date: Wed, 14 Jul 2010 15:54:00 +0300
From: Robert Ullmann <rlullmann@gmail.com>
Subject: Re: [Mediawiki-api] login from multiple (two) places?
To: "MediaWiki API announcements & discussion" <mediawiki-api@lists.wikimedia.org>
Message-ID: <AANLkTinZEQJxvQ_QPe03kB1LzxiWZ_t-8q5tBlsPvwYl@mail.gmail.com>
Content-Type: text/plain; charset=UTF-8
Interesting.
I log in from two places and I get different session identifiers (not surprising), but the *same* token.
Apparently the problem is that if one logs out from anywhere, it invalidates the session everywhere. I'm not too sure about the exact conditions, because testing this is rather painful. (;-) But bots don't usually log out, so that isn't an issue; they stay logged in for years.
Might be okay.
On Wed, Jul 14, 2010 at 2:55 PM, Roan Kattouw roan.kattouw@gmail.com wrote:
> 2010/7/14 Robert Ullmann rlullmann@gmail.com:
>> Hi,
>> I haven't been able to figure this out from the doc ...
>> Can I log in a user (bot) from more than one place (IP address) at the same time? I have an impending need to run Interwicket from more than one place, as the primary will be unavailable at times. Is logging in the other system going to log out the first, or some such? any bad effects?
> I believe it will cause such an effect, yes. AFAIK the only reliable way to be logged in in two places at once is to log in in one place and transfer the information in the login cookies to the second place.
> I think you should just try it and see what happens; worst case you can detect you've been logged out and log in again, although that might not be very nice if you run stuff simultaneously from two places, as they'll spend a lot of time competing for logged-in status.
> Roan Kattouw (Catrope)
Message: 8
Date: Thu, 15 Jul 2010 23:37:57 -0400
From: "Carl (CBM)" <cbm.wikipedia@gmail.com>
Subject: Re: [Mediawiki-api] login from multiple (two) places?
To: "MediaWiki API announcements & discussion" <mediawiki-api@lists.wikimedia.org>
Message-ID: <AANLkTimckxckEmCjvLa-7qUziinfByG78EFdJUyY-TQk@mail.gmail.com>
Content-Type: text/plain; charset=ISO-8859-1
On Wed, Jul 14, 2010 at 7:39 AM, Robert Ullmann rlullmann@gmail.com wrote:
> Can I log in a user (bot) from more than one place (IP address) at the same time?
Yes; it works fine. I do it all the time with a script running on a remote server at the same time I am logged in on my local computer. I have never encountered any difficulties.
- Carl
Message: 9
Date: Sun, 18 Jul 2010 14:36:51 +0300
From: rashi dhing <rashi.dhing@gmail.com>
Subject: [Mediawiki-api] Mediawiki adds extra lines
To: mediawiki-api@lists.wikimedia.org
Message-ID: <AANLkTinRh59-9g_1YZK_i3jM0RnkDHHmGzrthdaozkAt@mail.gmail.com>
Content-Type: text/plain; charset="iso-8859-1"
Does anyone know how to remove the line breaks and the <p> and <pre> tags that MediaWiki seems to add internally when rendering a page? I am working on a project where any text added by a user in the wiki is processed and transferred to an editor, and while I am trying to maintain the alignment, MediaWiki tends to add extra spaces and lines. How can I take care of this?
Thanks,
Rashi D.