Bugs item #2193543, was opened at 2008-10-25 09:35
Message generated for change (Settings changed) made by silvonen
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=2193543&group_…
Please note that this message will contain a full copy of the comment thread,
including the initial issue submission, for this request,
not just the latest update.
Category: General
Group: None
Status: Open
Resolution: None
Priority: 5
Private: No
Submitted By: Mikko Silvonen (silvonen)
Assigned to: Nobody/Anonymous (nobody)
>Summary: Page generation with -start crashes with TypeError
Initial Comment:
What is causing this crash?
>interwiki.py -start:A
Checked for running processes. 1 processes currently running, including the current process.
NOTE: Number of pages queued is 0, trying to add 60 more.
Dump fi (wikipedia) saved
Traceback (most recent call last):
  File "C:\svn\pywikipedia\interwiki.py", line 1771, in <module>
    bot.run()
  File "C:\svn\pywikipedia\interwiki.py", line 1520, in run
    self.queryStep()
  File "C:\svn\pywikipedia\interwiki.py", line 1494, in queryStep
    self.oneQuery()
  File "C:\svn\pywikipedia\interwiki.py", line 1462, in oneQuery
    site = self.selectQuerySite()
  File "C:\svn\pywikipedia\interwiki.py", line 1436, in selectQuerySite
    self.generateMore(globalvar.maxquerysize - mycount)
  File "C:\svn\pywikipedia\interwiki.py", line 1370, in generateMore
    page = self.pageGenerator.next()
  File "c:\svn\pywikipedia\pagegenerators.py", line 684, in DuplicateFilterPageGenerator
    for page in generator:
  File "c:\svn\pywikipedia\pagegenerators.py", line 235, in AllpagesPageGenerator
    for page in site.allpages(start = start, namespace = namespace, includeredirects = includeredirects):
  File "c:\svn\pywikipedia\wikipedia.py", line 5247, in allpages
    for p in soup.api.query.allpages:
TypeError: 'NoneType' object is not iterable
>python version.py
Pywikipedia [http] trunk/pywikipedia (r6015, Oct 24 2008, 18:29:39)
Python 2.5.1 (r251:54863, May 1 2007, 17:47:05) [MSC v.1310 32 bit (Intel)]
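For reference, the traceback shows that `soup.api.query.allpages` was `None` when the code tried to iterate over it, which is exactly what raises this `TypeError`. A minimal sketch of the kind of guard that avoids the crash when an API response comes back empty or malformed (the names here are hypothetical stand-ins, not the actual pywikipedia fix):

```python
def iter_allpages(query_result):
    """Yield pages from a parsed API result, tolerating a missing list.

    An empty or malformed response can leave the 'allpages' attribute
    unset or None; iterating over None raises TypeError, so we check
    first and yield nothing instead of crashing.
    """
    pages = getattr(query_result, "allpages", None)
    if pages is None:
        return
    for page in pages:
        yield page
```

A caller can then loop over `iter_allpages(result)` unconditionally, whether or not the server returned any pages.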
----------------------------------------------------------------------
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=2193543&group_…
Revision: 6015
Author: a_engels
Date: 2008-10-24 18:29:39 +0000 (Fri, 24 Oct 2008)
Log Message:
-----------
Fixed a bug where the number of pages to do was updated incorrectly when more than one hint was given for the same language.
Modified Paths:
--------------
trunk/pywikipedia/interwiki.py
Modified: trunk/pywikipedia/interwiki.py
===================================================================
--- trunk/pywikipedia/interwiki.py 2008-10-23 21:56:01 UTC (rev 6014)
+++ trunk/pywikipedia/interwiki.py 2008-10-24 18:29:39 UTC (rev 6015)
@@ -554,13 +554,16 @@
self.todo.append(page)
self.foundIn[page] = [None]
- def openSites(self):
+ def openSites(self, allowdoubles = False):
"""Return a list of sites for all things we still need to do"""
distinctSites = {}
for page in self.todo:
site = page.site()
- distinctSites[site] = site
+ if allowdoubles:
+ distinctSites[page] = site
+ else:
+ distinctSites[site] = site
return distinctSites.values()
def willWorkOn(self, site):
@@ -1332,7 +1335,7 @@
"""Add a single subject to the list"""
subj = Subject(page, hints = hints)
self.subjects.append(subj)
- for site in subj.openSites():
+ for site in subj.openSites(allowdoubles = True):
# Keep correct counters
self.plus(site)
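The fix above changes the dictionary key so that, with `allowdoubles` set, two pending pages on the same site each keep their own entry and the per-site counters stay correct when several hints target the same language. A small sketch of that before/after counting behaviour, using plain strings as hypothetical stand-ins for the real `Page` and `Site` objects:

```python
def open_sites(todo, allowdoubles=False):
    """Return the sites still to be worked on.

    With allowdoubles=False, duplicate sites collapse to a single
    entry (keyed by site); with allowdoubles=True each pending page
    keeps its own entry (keyed by page), so counters incremented once
    per returned value reflect every pending page.
    """
    distinct = {}
    for page, site in todo:
        if allowdoubles:
            distinct[page] = site  # key by page: one entry per page
        else:
            distinct[site] = site  # key by site: duplicates collapse
    return list(distinct.values())
```

For two pages hinted on the same site, the old behaviour returns one entry while the new behaviour returns two, which is what makes the "number of pages to do" counter come out right.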
Patches item #2192349, was opened at 2008-10-24 19:20
Message generated for change (Comment added) made by roboticuskhan
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603140&aid=2192349&group_…
Please note that this message will contain a full copy of the comment thread,
including the initial issue submission, for this request,
not just the latest update.
Category: Translations
Group: None
Status: Open
Resolution: None
Priority: 5
Private: No
Submitted By: Christoffer (roboticuskhan)
Assigned to: Nobody/Anonymous (nobody)
Summary: Faroese (fo) translations for interwiki.py
Initial Comment:
I have submitted a patch file containing the Faroese translations for interwiki.py provided by User:Quackor (http://fo.wikipedia.org/wiki/Br%C3%BAkari:Quackor), a native speaker of Faroese. Thanks.
----------------------------------------------------------------------
>Comment By: Christoffer (roboticuskhan)
Date: 2008-10-24 19:23
Message:
I forgot; I got the translations here:
http://fo.wikipedia.org/wiki/Wikipedia_kjak:%C3%81heitan_um_bott_st%C3%B8%C…
----------------------------------------------------------------------
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603140&aid=2192349&group_…
Patches item #2192349, was opened at 2008-10-24 19:20
Message generated for change (Tracker Item Submitted) made by Item Submitter
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603140&aid=2192349&group_…
Please note that this message will contain a full copy of the comment thread,
including the initial issue submission, for this request,
not just the latest update.
Category: Translations
Group: None
Status: Open
Resolution: None
Priority: 5
Private: No
Submitted By: Christoffer (roboticuskhan)
Assigned to: Nobody/Anonymous (nobody)
Summary: Faroese (fo) translations for interwiki.py
Initial Comment:
I have submitted a patch file containing the Faroese translations for interwiki.py provided by User:Quackor (http://fo.wikipedia.org/wiki/Br%C3%BAkari:Quackor), a native speaker of Faroese. Thanks.
----------------------------------------------------------------------
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603140&aid=2192349&group_…
Revision: 6014
Author: russblau
Date: 2008-10-23 21:56:01 +0000 (Thu, 23 Oct 2008)
Log Message:
-----------
better handling of sysop accounts; add new randompages parameter
Modified Paths:
--------------
branches/rewrite/pywikibot/site.py
Modified: branches/rewrite/pywikibot/site.py
===================================================================
--- branches/rewrite/pywikibot/site.py 2008-10-23 21:55:14 UTC (rev 6013)
+++ branches/rewrite/pywikibot/site.py 2008-10-23 21:56:01 UTC (rev 6014)
@@ -100,7 +100,7 @@
# to implement a specific interface, define a Site class that inherits
# from this
- def __init__(self, code, fam=None, user=None):
+ def __init__(self, code, fam=None, user=None, sysop=None):
"""
@param code: the site's language code
@type code: str
@@ -108,6 +108,8 @@
@type fam: str or Family
@param user: bot user name (optional)
@type user: str
+ @param sysop: sysop account user name (optional)
+ @type sysop: str
"""
self.__code = code.lower()
@@ -133,7 +135,7 @@
raise NoSuchSite("Language %s does not exist in family %s"
% (self.__code, self.__family.name))
- self._username = user
+ self._username = [user, sysop]
# following are for use with lock_page and unlock_page methods
self._pagemutex = threading.Lock()
@@ -184,8 +186,10 @@
def user(self):
"""Return the currently-logged in bot user, or None."""
- if self.logged_in():
- return self._username
+ if self.logged_in(True):
+ return self._username[True]
+ elif self.logged_in(False):
+ return self._username[False]
return None
def __getattr__(self, attr):
@@ -480,7 +484,7 @@
@param sysop: if True, require sysop privileges.
"""
- if self.userinfo['name'] != self._username:
+ if self.userinfo['name'] != self._username[sysop]:
return False
return (not sysop) or 'sysop' in self.userinfo['groups']
@@ -499,13 +503,13 @@
self._getsiteinfo()
# check whether a login cookie already exists for this user
if hasattr(self, "_userinfo"):
- if self.userinfo['name'] == self._username:
+ if self.userinfo['name'] == self._username[sysop]:
return
if not self.logged_in(sysop):
loginMan = api.LoginManager(site=self, sysop=sysop,
- user=self._username)
+ user=self._username[sysop])
if loginMan.login(retry = True):
- self._username = loginMan.username
+ self._username[sysop] = loginMan.username
if hasattr(self, "_userinfo"):
del self._userinfo
self.getuserinfo()
@@ -526,8 +530,10 @@
- blockinfo: present if user is blocked (dict)
"""
- if not hasattr(self, "_userinfo") or "rights" not in self._userinfo \
- or self._userinfo['name'] != self._username:
+ if (not hasattr(self, "_userinfo")
+ or "rights" not in self._userinfo
+ or self._userinfo['name']
+ != self._username["sysop" in self._userinfo["groups"]]):
uirequest = api.Request(
site=self,
action="query",
@@ -1854,7 +1860,7 @@
usprop="blockinfo|groups|editcount|registration")
return usgen
- def randompages(self, limit=1, namespaces=None):
+ def randompages(self, limit=1, namespaces=None, redirects=False):
"""Iterate a number of random pages.
Pages are listed in a fixed sequence, only the starting point is
@@ -1862,6 +1868,8 @@
@param limit: the maximum number of pages to iterate (default: 1)
@param namespaces: only iterate pages in these namespaces.
+ @param redirects: if True, include only redirect pages in results
+ (default: include only non-redirects)
"""
rngen = api.PageGenerator("random", site=self)
@@ -1871,6 +1879,8 @@
for ns in namespaces)
elif namespaces is not None:
rngen.request["grnnamespace"] = str(namespaces)
+ if redirects:
+ rngen.request["grnredirect"] = ""
return rngen
# catalog of editpage error codes, for use in generating messages
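The `_username = [user, sysop]` change above relies on the fact that Python's `bool` is a subclass of `int`, so `self._username[sysop]` indexes slot 0 (the bot account) when `sysop` is false and slot 1 (the sysop account) when it is true. A small sketch of the pattern with hypothetical names, not the pywikibot API:

```python
class Credentials(object):
    """Hold a bot user name and an optional sysop user name."""

    def __init__(self, user, sysop=None):
        # Index 0 (False) -> regular bot account,
        # index 1 (True)  -> sysop account.
        self._username = [user, sysop]

    def username(self, sysop=False):
        # bool subclasses int, so it works directly as a list index.
        return self._username[sysop]
```

The same two-slot lookup is what lets `logged_in(sysop)` and `login(sysop)` share one code path for both account types.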
Bugs item #2158249, was opened at 2008-10-11 00:01
Message generated for change (Comment added) made by wikipedian
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=2158249&group_…
Please note that this message will contain a full copy of the comment thread,
including the initial issue submission, for this request,
not just the latest update.
Category: other
Group: None
>Status: Closed
>Resolution: Fixed
Priority: 5
Private: No
Submitted By: Nobody/Anonymous (nobody)
Assigned to: Nobody/Anonymous (nobody)
Summary: weblinkchecker.py doesn't report archive.org links anymore
Initial Comment:
Weblinkchecker does not report archive.org links anymore. On my run on Sept 26 it still reported the archive links; on Oct 3 weblinkchecker did not report a single one (out of several hundred dead links on that run).
For example, http://web.archive.org/web/*/http://www.gruene-muenchen.de/landesverband.64… is available but is not reported on http://de.wikipedia.org/wiki/Diskussion:Theresa_Schopper
During the run weblinkchecker gives the output:
Consulting the Internet Archive for http://www.gruene-muenchen.de/landesverband.6417.0.html
python version.py
Pywikipedia [http] trunk/pywikipedia (r5945, Oct 10 2008, 11:16:07)
Python 2.5.2 (r252:60911, Oct 5 2008, 19:24:49)
[GCC 4.3.2]
----------------------------------------------------------------------
>Comment By: Daniel Herding (wikipedian)
Date: 2008-10-21 23:48
Message:
Fixed.
The reason was that the Internet Archive now uses GZIP compression.
urllib2 doesn't handle the decompression for us, so we have to do it
ourselves.
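For context: `urllib2` in Python 2 returned the raw response body, so a gzip-encoded reply from the Internet Archive had to be decompressed by hand after checking the `Content-Encoding` header. A minimal sketch of that step (shown with Python 3's `gzip.decompress` for brevity; the exact code in weblinkchecker.py may differ):

```python
import gzip

def maybe_decompress(body, content_encoding):
    """Return the response body as plain bytes, undoing gzip if needed.

    urllib/urllib2 do not transparently decode Content-Encoding: gzip,
    so the caller must inspect the header and decompress manually.
    """
    if content_encoding and "gzip" in content_encoding.lower():
        return gzip.decompress(body)
    return body
```

Without this step, the bot would try to parse compressed bytes as HTML and find no archive links at all, which matches the symptom reported here.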
----------------------------------------------------------------------
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=2158249&group_…