Bugs item #3587728, was opened at 2012-11-15 19:23
Message generated for change (Settings changed) made by xqt
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=3587728&group_…
Please note that this message will contain a full copy of the comment thread,
including the initial issue submission, for this request,
not just the latest update.
Category: General
Group: trunk
>Status: Closed
>Resolution: Fixed
Priority: 7
Private: No
Submitted By: Hazard-SJ (hazard-sj)
Assigned to: Nobody/Anonymous (nobody)
Summary: Non-items on Wikidata cannot be edited
Initial Comment:
I've been trying to run the archivebot.py script on Wikidata, but it fails for a number of reasons, the first of which is a KeyError from wikipedia.py. I've tried several things, including removing and modifying code in wikipedia.py, to get further into the archivebot.py script, and I believe the errors I get after that stem from my removals. At the moment the remaining problem is getting the text of the page: I changed "lines = self.get().split('\n')" to "pg = pywikibot.Page(Site, self.title())" followed by "lines = pg.get().split('\n')", but pg.get() returns None, so the split obviously cannot take place at all.
It seems PyWikipedia was only changed to handle edit queries, or is part of the problem on the server's end? Thanks.
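A minimal sketch of the workaround described above, assuming the compat framework where wikipedia.py is imported as pywikibot; fetch_lines() is a hypothetical helper, not part of archivebot.py, and the None check is an added guard:

    import wikipedia as pywikibot

    def fetch_lines(page):
        # Re-fetch the page through a fresh Page object, as tried above.
        site = pywikibot.getSite()
        pg = pywikibot.Page(site, page.title())
        text = pg.get()
        if text is None:
            # The failure reported here: get() returns nothing for
            # non-item pages on Wikidata, so there is nothing to split.
            raise pywikibot.Error(u'No text returned for %s' % pg.title())
        return text.split('\n')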
----------------------------------------------------------------------
Comment By: Amir (amird)
Date: 2012-11-22 04:03
Message:
I changed:
https://www.mediawiki.org/wiki/Special:Code/pywikipedia/10746
and tested:
http://www.wikidata.org/w/index.php?title=User:Dexbot&curid=59002&diff=5316…
and that was OK.
----------------------------------------------------------------------
Comment By: xqt (xqt)
Date: 2012-11-21 00:11
Message:
Wikidata handling is hard-coded in pywikibot.Page._getEditPage() for all
namespaces and data items (since r10615), so this bug has not been fixed.
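For illustration only (this is not the r10615 code, and the helper below is hypothetical): the kind of namespace guard that would limit the data-item handling to item pages, so pages in other namespaces keep being treated as plain wikitext:

    def _is_data_item(page):
        # Hypothetical check: on Wikidata only main-namespace pages are
        # items; user, project and talk pages are ordinary wikitext.
        site = page.site()
        return site.family.name == 'wikidata' and page.namespace() == 0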
----------------------------------------------------------------------
Comment By: Hazard-SJ (hazard-sj)
Date: 2012-11-20 19:34
Message:
Nope, just checked out r10745 and ran "python archivebot.py -S
User:Hazard-Bot/Archiver" and got the error attached.
----------------------------------------------------------------------
Comment By: Amir (amird)
Date: 2012-11-20 05:32
Message:
Is it ok now?
----------------------------------------------------------------------
Comment By: xqt (xqt)
Date: 2012-11-18 03:24
Message:
This bug is caused by r10615
http://www.mediawiki.org/w/index.php?title=Special:Code/pywikipedia/10615&p…
----------------------------------------------------------------------
Comment By: Hazard-SJ (hazard-sj)
Date: 2012-11-15 19:23
Message:
Pywikipedia [http] trunk/pywikipedia (r10732, 2012/11/14, 21:06:52)
Python 2.7.1 (r271:86832, Jan 4 2011, 13:57:14)
[GCC 4.5.2]
config-settings:
use_api = True
use_api_login = True
unicode test: ok
----------------------------------------------------------------------
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=3587728&group_…
Patches item #3367839, was opened at 2011-07-15 03:11
Message generated for change (Comment added) made by nobody
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603140&aid=3367839&group_…
Please note that this message will contain a full copy of the comment thread,
including the initial issue submission, for this request,
not just the latest update.
Category: None
Group: None
Status: Open
Resolution: None
Priority: 5
Private: No
Submitted By: lankier (lankier)
Assigned to: Nobody/Anonymous (nobody)
Summary: Retrieve / edit the section
Initial Comment:
This patch adds a new 'section' parameter to Page.get and Page.put.
(See also the feature request https://sourceforge.net/tracker/?func=detail&atid=603141&aid=3104703&group_… )
Examples:
Add a new section:
    page.put('New section text', comment='New section header', section='new')
Edit the top section:
    text = page.get(section=0)
    page.put(text+'\n\n==New section==\nNew text', section=0)
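For reference, a sketch of the api.php parameters such a call presumably ends up sending for the 'new' case (these are the standard action=edit parameters; the surrounding put() plumbing and token handling are assumed rather than quoted from the patch):

    params = {
        'action':  'edit',
        'title':   page.title(),
        'section': 'new',        # or 0, 1, 2, ... for an existing section
        'text':    'New section text',
        'summary': 'New section header',
        'token':   page.site().getToken(),
    }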
----------------------------------------------------------------------
Comment By: mpaa (mpaa)
Date: 2012-11-17 13:46
Message:
This is a useful one; I was looking for it recently.
Are there any plans for this?
----------------------------------------------------------------------
Comment By: Merlijn S. van Deen (valhallasw)
Date: 2012-03-21 11:06
Message:
Housekeeper's note: the patch applies cleanly to r10035 (with some fuzz).
----------------------------------------------------------------------
Comment By: Merlijn S. van Deen (valhallasw)
Date: 2012-03-21 10:03
Message:
Is referring to sections by numeric ID a good idea? Isn't it worth the
extra query (by calling action=render) to get the section ID first? We
currently 'support' sections by having a Page object with the title "page
title#section", and I think that is the sensible way to work with sections.
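To make that current behaviour concrete, a small example (the page title is made up; section() and sectionFreeTitle() only expose the fragment, while get() and put() still act on the whole page):

    import wikipedia as pywikibot

    site = pywikibot.getSite('en', 'wikipedia')
    page = pywikibot.Page(site, u'Project page#Discussion')
    print page.section()           # u'Discussion'
    print page.sectionFreeTitle()  # u'Project page'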
----------------------------------------------------------------------
Comment By: Bináris (binbot)
Date: 2011-07-15 11:32
Message:
A great and important contribution; I support its fast acceptance! This
feature has long been missing from Pywiki.
----------------------------------------------------------------------
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603140&aid=3367839&group_…
Feature Requests item #3588882, was opened at 2012-11-21 01:09
Message generated for change (Tracker Item Submitted) made by dixond
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603141&aid=3588882&group_…
Please note that this message will contain a full copy of the comment thread,
including the initial issue submission, for this request,
not just the latest update.
Category: interwiki
Group: None
Status: Open
Resolution: None
Priority: 5
Private: No
Submitted By: DixonD (dixond)
Assigned to: Nobody/Anonymous (nobody)
Summary: Interwiki graph: use a different color for duplicate links
Initial Comment:
It would be very helpful if duplicate links for the same language were shown in a different color. Sometimes an interwiki graph gets too large, and it is not easy to spot an issue. It would help if we could easily see that, for instance, there are two links to en-wiki.
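A rough sketch of the requested behaviour, assuming the pydot graph that interwiki_graph.py builds (the function name and the way nodes are added here are illustrative, not the existing script's API):

    import pydot

    def colour_duplicate_languages(graph, found_pages):
        # Group the linked pages by language code.
        by_lang = {}
        for page in found_pages:
            by_lang.setdefault(page.site().lang, []).append(page)
        # Re-add the nodes of any language that occurs more than once
        # with a filled red style, so e.g. a second en: link stands out.
        for lang, pages in by_lang.items():
            if len(pages) > 1:
                for page in pages:
                    graph.add_node(pydot.Node('"%s"' % page.title(),
                                              style='filled',
                                              fillcolor='red'))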
----------------------------------------------------------------------
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603141&aid=3588882&group_…
Bugs item #3588463, was opened at 2012-11-19 01:25
Message generated for change (Tracker Item Submitted) made by eugo
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=3588463&group_…
Please note that this message will contain a full copy of the comment thread,
including the initial issue submission, for this request,
not just the latest update.
Category: General
Group: trunk
Status: Open
Resolution: None
Priority: 5
Private: No
Submitted By: euku (eugo)
Assigned to: Nobody/Anonymous (nobody)
Summary: Bot saves big pages but doesn't know that it was saved
Initial Comment:
The bot can edit (i.e. the edit is actually made) a huge page (>900 KB), but it gets back an error and cannot proceed:
"""
Updating page [[Wikipedia:Bibliotheksrecherche/Anfragen/Archiv/2012]] via API
<urlopen error [Errno 35] Resource temporarily unavailable>
WARNING: Could not open 'http://de.wikipedia.org/w/api.php'. Maybe the server or
your connection is down. Retrying in 1 minutes...
"""
Run the attached script to see this error.
I remember there was a similar problem with Wikimedia's server configuration some time ago, but I don't recall the details anymore...
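Not what the framework does today, just a sketch of the kind of check this suggests: after such an error on a very large page, look at the live page before retrying, so an edit that actually went through is not submitted a second time (put_with_verification is a hypothetical helper):

    import wikipedia as pywikibot

    def put_with_verification(page, newtext, comment):
        try:
            page.put(newtext, comment=comment)
        except pywikibot.Error:
            # The server may have stored the edit even though the HTTP
            # response never arrived; re-fetch and compare before
            # retrying.  (A plain text comparison ignores server-side
            # normalisation, so this is only an approximation.)
            if page.get(force=True).strip() == newtext.strip():
                pywikibot.output(u'Edit was stored despite the error.')
                return
            page.put(newtext, comment=comment)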
---
Pywikipedia [http] trunk/pywikipedia (r10741, 2012/11/18, 20:22:23)
Python 2.7.3 (v2.7.3:70274d53c1dd, Apr 9 2012, 20:52:43)
[GCC 4.2.1 (Apple Inc. build 5666) (dot 3)]
config-settings:
use_api = True
use_api_login = True
unicode test: ok
----------------------------------------------------------------------
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=3588463&group_…