Bugs item #2119685, was opened at 2008-09-19 20:53
Message generated for change (Settings changed) made by nicdumz
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=2119685&group_…
Please note that this message will contain a full copy of the comment thread,
including the initial issue submission, for this request,
not just the latest update.
Category: interwiki
Group: None
Status: Open
Resolution: None
>Priority: 8
Private: No
Submitted By: Nobody/Anonymous (nobody)
Assigned to: Nobody/Anonymous (nobody)
Summary: interwiki.py does reverts
Initial Comment:
I got this message. As far as I know this is an old bug that should have been fixed a long time ago. At the moment I am working with revision 5909, but I don't know which version I was running when the reverts were made. For a description of the problem, see the message below, sent by someone else who noticed it.
Hello Carsrac,
I would like to draw your attention to these edits made by your bot: [http://de.wikipedia.org/w/index.php?title=Auge&diff=48157069&oldid=48020013 article de:Auge] and [http://la.wikipedia.org/w/index.php?title=The_Beatles&diff=562763&oldid=562… article la:The Beatles]. Obviously, the bot did not edit the version that was current at the time, but an older one. That is equivalent to doing a revert. It seems that the bot edited the second-to-last version that had been created by a bot. In these cases that meant a revert of [http://de.wikipedia.org/w/index.php?title=Auge&diff=48157069&oldid=47723213 two versions] of de:Auge and of [http://la.wikipedia.org/w/index.php?title=The_Beatles&diff=562763&oldid=551… 24 versions] of la:The Beatles.
How could that happen? Is there any reason to believe this didn't happen more often? I believe these are questions worth careful consideration, since the bot seems to have made several thousand, if not tens of thousands, of edits during that time. And it would be very reassuring to know that the possibility that this can or will happen again can be ruled out.
--[[de:user:Lax]] - [[Gebruiker:88.65.169.226|88.65.169.226]] 19 sep 2008 10:15 (CEST)
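The failure mode described above is a classic lost update: the bot loads a revision, works on it for a long time, then saves without verifying that the page is still unchanged. A minimal sketch of the kind of compare-before-save guard that prevents it (all names here are hypothetical, not the actual interwiki.py or MediaWiki code):

```python
def safe_put(page_store, title, base_revid, new_text):
    """Save new_text only if the page has not changed since revision
    base_revid was loaded; otherwise signal an edit conflict.
    page_store is a plain dict standing in for the wiki."""
    current = page_store[title]
    if current["revid"] != base_revid:
        # Someone edited in the meantime: refuse instead of reverting them.
        raise RuntimeError("edit conflict: page changed since it was loaded")
    page_store[title] = {"revid": base_revid + 1, "text": new_text}
```

MediaWiki performs an equivalent check server-side when an edit supplies the timestamp of the revision it was based on; a bot that omits that information silently overwrites any intermediate edits, which is exactly the revert behavior reported here.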
----------------------------------------------------------------------
Comment By: Nobody/Anonymous (nobody)
Date: 2008-10-01 22:41
Message:
This seems more likely to be the same as [ 1790473 ] Interwiki bot
overwrites changes, no edit conflict.
----------------------------------------------------------------------
Comment By: Nobody/Anonymous (nobody)
Date: 2008-10-01 22:22
Message:
I don't know whether this and "[ 1999895 ] iw bot doesn't detect edit
conflict" are the same bug, but here it does not really seem to be an edit
conflict. We are talking about 11 and 19 days, repeat DAYS, old versions
here.
----------------------------------------------------------------------
Comment By: Carsrac (carsrac)
Date: 2008-09-20 03:16
Message:
It is the same problem as [ 1999895 ] iw bot doesn't detect edit conflict
----------------------------------------------------------------------
Comment By: NicDumZ Nicolas Dumazet (nicdumz)
Date: 2008-09-20 03:07
Message:
rciw.py, for example, *will* trigger this behavior quite often, sometimes
working on a version more than 10 hours old. I am not, however, able to
confirm this with the standard interwiki.py.
----------------------------------------------------------------------
Bugs item #2158228, was opened at 2008-10-10 23:48
Message generated for change (Comment added) made by nicdumz
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=2158228&group_…
Category: other
Group: None
Status: Deleted
Resolution: Duplicate
Priority: 5
Private: No
Submitted By: Nobody/Anonymous (nobody)
Assigned to: Nobody/Anonymous (nobody)
Summary: weblinkchecker.py doesn't report archive.org links anymore
Initial Comment:
Weblinkchecker does not report archive.org links anymore. On my run on Sept 26 it still reported the archive links; on Oct 3 it did not report a single one (out of several hundred dead links on that run).
For example, http://web.archive.org/web/*/http://www.gruene-muenchen.de/landesverband.64… is available, but it is not reported on http://de.wikipedia.org/wiki/Diskussion:Theresa_Schopper
During the run weblinkchecker gives the output:
Consulting the Internet Archive for http://www.gruene-muenchen.de/landesverband.6417.0.html
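For context, the check weblinkchecker performs here amounts to asking the Internet Archive whether a snapshot of the dead URL exists. A minimal sketch of parsing such an answer, using the response shape of today's Wayback Machine availability API (a JSON endpoint that postdates this 2008 report; the endpoint and field names are for illustration, not what weblinkchecker actually calls):

```python
import json

def parse_wayback_response(body):
    """Extract the snapshot URL from a Wayback availability response,
    or return None when no archived copy exists."""
    data = json.loads(body)
    snap = data.get("archived_snapshots", {}).get("closest")
    if snap and snap.get("available"):
        return snap["url"]
    return None

# Example response shape from http://archive.org/wayback/available?url=...
sample = ('{"archived_snapshots": {"closest": {"available": true, '
          '"url": "http://web.archive.org/web/2008/http://example.com/"}}}')
```

A bug like the one reported would show up as this parse step returning None for every URL, even ones the Archive does hold.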
python version.py
Pywikipedia [http] trunk/pywikipedia (r5945, Oct 10 2008, 11:16:07)
Python 2.5.2 (r252:60911, Oct 5 2008, 19:24:49)
[GCC 4.3.2]
----------------------------------------------------------------------
Comment By: NicDumZ Nicolas Dumazet (nicdumz)
Date: 2008-10-20 05:19
Message:
bug 2158249 contains the exact same text
----------------------------------------------------------------------
Bugs item #2169485, was opened at 2008-10-15 22:56
Message generated for change (Settings changed) made by nicdumz
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=2169485&group_…
Category: None
Group: None
Status: Closed
Resolution: Fixed
Priority: 5
Private: No
Submitted By: giurrero (giurrero)
Assigned to: Nobody/Anonymous (nobody)
>Summary: image.py bug (improper re escaping)
Initial Comment:
image.py has a bug here:
if not site.nocapitalize:
    old = '[' + self.oldImage[0].upper() + self.oldImage[0].lower() + ']' + self.oldImage[1:]
else:
    old = self.oldImage
old = re.sub('[_ ]', '[_ ]', old)
escaped = re.escape(old)
if not self.loose or not self.newImage:
    ImageRegex = re.compile(r'\[\[ *(?:' + '|'.join(site.namespace(6, all = True)) + ')\s*:\s*' + escaped + ' *(?P<parameters>\|[^\n]+|) *\]\]')
else:
    ImageRegex = re.compile(r'' + escaped)
The escaping must be the first thing you do. As it stands, the code first replaces '_'/' ' with the character class '[_ ]' and then escapes, so the brackets of the class themselves get escaped:
"my_image" -> "my[_ ]image" -> "my\[_\ \]image"
I think the solution used in wikipedia.py replaceImage would be the best fix.
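To illustrate the ordering problem on a modern Python (3.7+, where re.escape leaves letters and '_' alone; the Python 2.5 of this era escaped more characters, but the bracket problem is the same):

```python
import re

old = "my_image"

# Buggy order from image.py: build the character class, then escape.
# re.escape() neutralises the brackets, so the final pattern matches
# the literal text "my[_ ]image" instead of "underscore or space".
buggy = re.escape(re.sub('[_ ]', '[_ ]', old))
assert re.search(buggy, "my image") is None
assert re.search(buggy, "my_image") is None

# Correct order: escape the literal title first, then insert the
# unescaped character class.
fixed = re.sub('[_ ]', '[_ ]', re.escape(old))
assert re.search(fixed, "my image")
assert re.search(fixed, "my_image")
```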
Pywikipedia [http] trunk/pywikipedia (r5976, Oct 15 2008, 17:28:48)
Python 2.5.2 (r252:60911, Aug 1 2008, 00:37:21)
[GCC 4.3.1 20080507 (prerelease) [gcc-4_3-branch revision 135036]]
----------------------------------------------------------------------
Comment By: NicDumZ Nicolas Dumazet (nicdumz)
Date: 2008-10-20 05:13
Message:
And similarly, if not self.nocapitalize:
imageName -> [iI]mageName -> \[iI\]mageName
Proper fix committed in r6002, thanks for the report !!
----------------------------------------------------------------------
Bugs item #2180544, was opened at 2008-10-19 21:36
Message generated for change (Comment added) made by nicdumz
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=2180544&group_…
Category: General
Group: None
Status: Closed
Resolution: Fixed
Priority: 5
Private: No
Submitted By: Bernhard Mayr (falk_steinhauer)
Assigned to: Nobody/Anonymous (nobody)
Summary: Bug in wikipedia.py and corresponding fix
Initial Comment:
Pywikipedia [http] trunk/pywikipedia (r6000, Oct 19 2008, 13:59:03)
Python 2.5.1 (r251:54863, Apr 18 2007, 08:51:08) [MSC v.1310 32 bit (Intel)]
The line preceding line 5229 of wikipedia.py needs to return from the allpages() function when the NotImplementedError is caught.
Otherwise every robot will crash when used on older wikis, since the function falls through into code that is only appropriate for newer wikis.
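The fix described above amounts to returning from the fallback branch instead of falling through into it. A minimal sketch of that control flow (function and site names are illustrative, not the actual wikipedia.py code):

```python
def fetch_new_api(site):
    """Stand-in for the API code path available only on newer wikis."""
    if not site.get("has_api"):
        raise NotImplementedError
    return ["Page:A", "Page:B"]

def fetch_old_style(site):
    """Stand-in for the legacy code path used by older wikis."""
    return ["Page:A"]

def allpages(site):
    try:
        result = fetch_new_api(site)
    except NotImplementedError:
        # The reported fix: return here. Without this return, execution
        # falls through into the code below, which assumes the new API
        # exists and therefore crashes on older wikis.
        return fetch_old_style(site)
    # ... code from here on is only appropriate for newer wikis ...
    return result
```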
----------------------------------------------------------------------
Comment By: NicDumZ Nicolas Dumazet (nicdumz)
Date: 2008-10-20 04:47
Message:
Thanks, fix committed in r6001 !
----------------------------------------------------------------------