Revision: 6509
Author: filnik
Date: 2009-03-16 19:54:27 +0000 (Mon, 16 Mar 2009)
Log Message:
-----------
Bugfix
Modified Paths:
--------------
trunk/pywikipedia/checkimages.py
Modified: trunk/pywikipedia/checkimages.py
===================================================================
--- trunk/pywikipedia/checkimages.py 2009-03-15 12:35:14 UTC (rev 6508)
+++ trunk/pywikipedia/checkimages.py 2009-03-16 19:54:27 UTC (rev 6509)
@@ -1056,6 +1056,7 @@
This function check this.
"""
if template in self.list_licenses: # the list_licenses are loaded in the __init__ (not to load them multimple times)
+ self.license_selected = template.title().replace('Template:', '')
self.seems_ok = True
self.license_found = self.license_selected # let the last "fake" license normally detected
return True
@@ -1076,7 +1077,6 @@
we can find something in the info that we already have, then make a deeper check.
"""
for template in self.licenses_found:
- self.license_selected = template.title().replace('Template:', '')
result = self.miniTemplateCheck(template)
if result:
break
@@ -1086,12 +1086,11 @@
template.pageAPInfo()
except wikipedia.IsRedirectPage:
template = template.getRedirectTarget()
+ result = self.miniTemplateCheck(template)
+ if result:
+ break
except wikipedia.NoPage:
- continue
- self.license_selected = template.title().replace('Template:', '')
- result = self.miniTemplateCheck(template)
- if result:
- break
+ continue
def smartDetection(self):
""" The bot instead of checking if there's a simple template in the
@@ -1113,8 +1112,8 @@
# Found the templates ONLY in the image's description
for template_selected in templatesInTheImageRaw:
for templateReal in self.licenses_found:
- if self.convert_to_url(template_selected).lower().replace('template:', '') == \
- self.convert_to_url(templateReal.title().replace('template:', '')).lower():
+ if self.convert_to_url(template_selected).lower().replace('template%3a', '') == \
+ self.convert_to_url(templateReal.title()).lower().replace('template%3a', ''):
if templateReal not in self.allLicenses: # don't put the same template, twice.
self.allLicenses.append(templateReal)
if self.licenses_found != []:
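The two hunks above are easier to follow together: r6509 moves the `license_selected` assignment from the callers into `miniTemplateCheck` itself, so that when a redirect template is resolved to its target, the recorded license name comes from the template actually matched. A minimal sketch of the resulting flow, using hypothetical stand-in classes rather than the bot's real ones:

```python
class Template:
    """Hypothetical stand-in for a wiki template page."""
    def __init__(self, title, target=None):
        self._title = title
        self._target = target

    def title(self):
        return self._title

    def getRedirectTarget(self):
        return self._target


class Checker:
    """Sketch of the checkimages logic after r6509."""
    def __init__(self, list_licenses):
        self.list_licenses = list_licenses  # loaded once in __init__
        self.license_selected = None
        self.seems_ok = False

    def miniTemplateCheck(self, template):
        if template.title() in self.list_licenses:
            # r6509: the assignment lives here now, so it always refers
            # to the template actually matched -- redirects included.
            self.license_selected = template.title().replace('Template:', '')
            self.seems_ok = True
            return True
        return False


checker = Checker(['Template:GFDL'])
redirect = Template('Template:GFDL-self', target=Template('Template:GFDL'))
# Resolve the redirect first, then check the target:
checker.miniTemplateCheck(redirect.getRedirectTarget())
```

Before r6509, the caller recorded the redirect's own title before resolving it, so `license_selected` could name the redirect instead of the real license template.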
Bugs item #2577598, was opened at 2009-02-07 15:49
Message generated for change (Comment added) made by twisted86
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=2577598&group_…
Please note that this message will contain a full copy of the comment thread,
including the initial issue submission, for this request,
not just the latest update.
Category: General
Group: None
Status: Open
Resolution: None
Priority: 5
Private: No
Submitted By: Mikko Silvonen (silvonen)
Assigned to: Nobody/Anonymous (nobody)
Summary: AttributeError: 'NoneType' object has no attribute 'query'
Initial Comment:
My autonomous interwiki run on all fiwiki categories crashed today with the following error.
...
======Post-processing [[fi:Luokka:Cradle of Filth]]======
Updating links on page [[es:Categoría:Cradle of Filth]].
No changes needed
Updating links on page [[ja:Category:kureidoru obu fuirusu]].
No changes needed
Updating links on page [[sk:Kategória:Cradle of Filth]].
No changes needed
Updating links on page [[en:Category:Cradle of Filth]].
No changes needed
Updating links on page [[fi:Luokka:Cradle of Filth]].
No changes needed
NOTE: The first unfinished subject is [[fi:Luokka:Cradle of Filthin albumit]]
NOTE: Number of pages queued is 99, trying to add 60 more.
Dump fi (wikipedia) saved
Traceback (most recent call last):
File "interwiki.py", line 1818, in <module>
bot.run()
File "interwiki.py", line 1538, in run
self.queryStep()
File "interwiki.py", line 1512, in queryStep
self.oneQuery()
File "interwiki.py", line 1480, in oneQuery
site = self.selectQuerySite()
File "interwiki.py", line 1454, in selectQuerySite
self.generateMore(globalvar.maxquerysize - mycount)
File "interwiki.py", line 1388, in generateMore
page = self.pageGenerator.next()
File "c:\svn\pywikipedia\pagegenerators.py", line 670, in NamespaceFilterPageGenerator
for page in generator:
File "c:\svn\pywikipedia\pagegenerators.py", line 688, in DuplicateFilterPageGenerator
for page in generator:
File "c:\svn\pywikipedia\pagegenerators.py", line 239, in AllpagesPageGenerator
for page in site.allpages(start = start, namespace = namespace, includeredirects = includeredirects):
File "c:\svn\pywikipedia\wikipedia.py", line 5424, in allpages
for p in soup.api.query.allpages:
AttributeError: 'NoneType' object has no attribute 'query'
>python version.py
Pywikipedia [http] trunk/pywikipedia (r6334, Feb 06 2009, 16:42:40)
Python 2.5.1 (r251:54863, May 1 2007, 17:47:05) [MSC v.1310 32 bit (Intel)]
----------------------------------------------------------------------
Comment By: Michael Heggen (twisted86)
Date: 2009-03-16 19:50
Message:
I am also getting this error when trying the following:
python replace.py -start:! epee épée
(or any other replace query using "start:!", for that matter)
Exact error:
Checked for running processes. 1 processes currently running, including
the current process.
Traceback (most recent call last):
File "/Users/michael/pywikipedia/pagegenerators.py", line 776, in
__iter__
for page in self.wrapped_gen:
File "/Users/michael/pywikipedia/pagegenerators.py", line 709, in
DuplicateFilterPageGenerator
for page in generator:
File "/Users/michael/pywikipedia/pagegenerators.py", line 248, in
AllpagesPageGenerator
for page in site.allpages(start = start, namespace = namespace,
includeredirects = includeredirects):
File "/Users/michael/pywikipedia/wikipedia.py", line 5503, in allpages
for p in soup.api.query.allpages:
AttributeError: 'NoneType' object has no attribute 'query'
'NoneType' object has no attribute 'query'
I am using:
Pywikipedia [http] trunk/pywikipedia (r6508, Mar 15 2009, 12:35:14)
Python 2.5.1 (r251:54863, Jan 13 2009, 10:26:13)
[GCC 4.0.1 (Apple Inc. build 5465)]
MediaWiki 1.14.0
PHP 5.2.8 (cgi-fcgi)
MySQL 5.0.67.d7-ourdelta-log
I just installed PyWikipedia, so this is pretty frustrating. If I knew how
to do anything in Python beyond "Hello, world!", I'd try to fix it, but I
am 15 years out of practice on coding anything.
----------------------------------------------------------------------
Comment By: Multichill (multichill)
Date: 2009-02-07 16:37
Message:
I had the same error yesterday. There seems to be something wrong with the
allpages generator.
The API was changed recently; maybe that has something to do with it.
soup = BeautifulSoup(text, convertEntities=BeautifulSoup.HTML_ENTITIES)
(line 5421 in wikipedia.py) should return an object.
It looks like soup exists, but api doesn't. That's strange: when I look
at http://commons.wikimedia.org/w/api.php, api is the root element.
We should probably build in some checks to verify that we received everything,
instead of assuming we got it right straight away.
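The check Multichill suggests could be sketched like this: a guard that fails with a readable message instead of the bare AttributeError in the traceback. It is written library-agnostically (BeautifulSoup returns None for a missing tag, which the `getattr` chain tolerates); the helper name is hypothetical, and the attribute names follow the traceback above.

```python
def iter_allpages(soup):
    """Yield page nodes from soup.api.query.allpages, but raise a clear
    error when the API reply is incomplete instead of crashing with
    "'NoneType' object has no attribute 'query'" (hypothetical helper)."""
    api = getattr(soup, 'api', None)
    query = getattr(api, 'query', None) if api is not None else None
    allpages = getattr(query, 'allpages', None) if query is not None else None
    if allpages is None:
        raise RuntimeError('unexpected API reply: no <api><query><allpages> element')
    for p in allpages:
        yield p
```

With a guard like this, allpages() could retry or report the truncated server reply instead of dying mid-generator.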
----------------------------------------------------------------------
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=2577598&group_…
Bugs item #2688795, was opened at 2009-03-16 17:15
Message generated for change (Tracker Item Submitted) made by Item Submitter
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=2688795&group_…
Category: category
Group: None
Status: Open
Resolution: None
Priority: 5
Private: No
Submitted By: Iván Pérez (murodeaguas)
Assigned to: Nobody/Anonymous (nobody)
Summary: "category.py add" doesn't follow redirects
Initial Comment:
When I run the command "category.py add" and follow the on-screen instructions, the bot begins adding categories to the articles I've specified, but if any of them are redirects it skips them; it should instead work on the redirect target.
I have made a list [http://es.wikipedia.org/wiki/Usuario:Muro_de_Aguas/Bugs/pywikipedia] where (I think) you can see the problem easily.
Thanks.
=====================================================
Output of version.py
=====================================================
Pywikipedia [http] trunk/pywikipedia (r6508, Mar 15 2009, 12:35:14)
Python 2.5.1 (r251:54863, Apr 18 2007, 08:51:08) [MSC v.1310 32 bit (Intel)]
----------------------------------------------------------------------
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=2688795&group_…
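One possible direction for this fix, sketched under stated assumptions: resolve the redirect chain before tagging the page. FakePage below is a hypothetical stand-in for the framework's Page class (the real compat Page does provide isRedirectPage() and getRedirectTarget()); this is not the actual category.py implementation.

```python
class FakePage:
    """Minimal stand-in for a pywikipedia Page, for illustration only."""
    def __init__(self, title, target=None):
        self._title = title
        self._target = target  # the redirect target page, if any

    def title(self):
        return self._title

    def isRedirectPage(self):
        return self._target is not None

    def getRedirectTarget(self):
        return self._target


def resolve_redirect(page, max_hops=5):
    """Follow a redirect chain up to max_hops and return the final page,
    so the category is added to the target rather than the redirect."""
    hops = 0
    while page.isRedirectPage() and hops < max_hops:
        page = page.getRedirectTarget()
        hops += 1
    return page


article = FakePage('Cradle of Filth')
redirect = FakePage('CoF', target=article)
```

category.py's add action could call `resolve_redirect(page)` before saving; the hop limit guards against circular redirects.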
Bugs item #2627985, was opened at 2009-02-22 16:20
Message generated for change (Comment added) made by carsrac
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=2627985&group_…
Category: interwiki
Group: None
Status: Open
Resolution: None
Priority: 5
Private: No
Submitted By: Carsrac (carsrac)
Assigned to: Nobody/Anonymous (nobody)
Summary: disambig in haw.wikipedia
Initial Comment:
{{Anakuhi:Huaʻōlelo puana like}} is the Hawaiian (haw) equivalent of {{template:disambig}}
----------------------------------------------------------------------
>Comment By: Carsrac (carsrac)
Date: 2009-03-13 22:16
Message:
Could someone solve this, or tell me how I can solve it locally?
----------------------------------------------------------------------
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=2627985&group_…
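For solving it locally: in the compat framework, per-language disambiguation templates were declared in the family file, so a workaround might look like the fragment below. Both the `disambiguationTemplates` attribute name and the exact template string are assumptions, not verified against the actual haw family configuration.

```python
# Assumed location: inside Family.__init__ in wikipedia_family.py.
# The 'Anakuhi:' namespace prefix is stripped here on the assumption
# that entries are bare template names, as for other languages.
self.disambiguationTemplates['haw'] = [u'Huaʻōlelo puana like']
```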
Revision: 6507
Author: filnik
Date: 2009-03-13 16:00:28 +0000 (Fri, 13 Mar 2009)
Log Message:
-----------
Bugfix -.- lower() seems not working in THAT position -.-
Modified Paths:
--------------
trunk/pywikipedia/checkimages.py
Modified: trunk/pywikipedia/checkimages.py
===================================================================
--- trunk/pywikipedia/checkimages.py 2009-03-11 17:54:39 UTC (rev 6506)
+++ trunk/pywikipedia/checkimages.py 2009-03-13 16:00:28 UTC (rev 6507)
@@ -1114,7 +1114,7 @@
for template_selected in templatesInTheImageRaw:
for templateReal in self.licenses_found:
if self.convert_to_url(template_selected).lower().replace('template:', '') == \
- self.convert_to_url(templateReal.title().lower().replace('template:', '')):
+ self.convert_to_url(templateReal.title().replace('template:', '')).lower():
if templateReal not in self.allLicenses: # don't put the same template, twice.
self.allLicenses.append(templateReal)
if self.licenses_found != []:
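The one-line change in r6507 above is subtle: `convert_to_url` percent-encodes the title, and Python's `quote` emits uppercase hex digits, so lowercasing *before* encoding leaves escapes like `%3A` uppercase, while the comparison's other side lowercases *after* encoding. A minimal demonstration, using `urllib.parse.quote` as a stand-in for the bot's `convert_to_url`:

```python
from urllib.parse import quote  # stand-in for the bot's convert_to_url

title = 'Template:GFDL'

# r6506 (buggy): lowercase first, then encode -> hex escape stays uppercase
lowered_then_encoded = quote(title.lower())   # 'template%3Agfdl'

# r6507 (fixed): encode first, then lowercase the whole encoded string
encoded_then_lowered = quote(title).lower()   # 'template%3agfdl'

# Only the fixed form matches a left-hand side built the same way:
assert lowered_then_encoded != encoded_then_lowered
```

The same hex-case mismatch is why r6509 later switched both sides of the comparison to strip the already-encoded, already-lowercased 'template%3a' prefix.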