Bugs item #2035835, was opened at 2008-08-02 10:02
Message generated for change (Settings changed) made by sf-robot
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=2035835&group_…
Please note that this message will contain a full copy of the comment thread,
including the initial issue submission, for this request,
not just the latest update.
Category: None
Group: None
>Status: Closed
Resolution: Out of Date
Priority: 5
Private: No
Submitted By: Purodha B Blissenbach (purodha)
Assigned to: Nobody/Anonymous (nobody)
Summary: SaxParseBug caused error invalid literal for int()
Initial Comment:
I got an error message and trace dump from interwiki.py, which afterwards continued gracefully. Here are the messages:
python /home/purodha/pywikipedia/interwiki.py -v -initialredirect -new:3
Checked for running processes. 1 processes currently running, including the current process.
Pywikipediabot (r5776 (wikipedia.py), Aug 01 2008, 15:39:04)
Python 2.5.2 (r252:60911, May 28 2008, 19:19:25)
[GCC 4.2.4 (Debian 4.2.4-1)]
Retrieving mediawiki messages from Special:Allmessages
WARNING: No character set found.
NOTE: Number of pages queued is 0, trying to add 60 more.
Getting 3 pages from wikipedia:ksh...
-- some lines skipped --
Getting 1 pages from wikipedia:am...
ERROR: SaxParseBug caused error invalid literal for int() with base 10: 'NS_CATEGORY'. Dump SaxParseBug_wikipedia_am__Sat_Aug__2_09-54-57_2008.dump created.
Traceback (most recent call last):
File "/home/purodha/pywikipedia/pagegenerators.py", line 768, in __iter__
for loaded_page in self.preload(somePages):
File "/home/purodha/pywikipedia/pagegenerators.py", line 785, in preload
wikipedia.getall(site, pagesThisSite)
File "/home/purodha/pywikipedia/wikipedia.py", line 2950, in getall
_GetAll(site, pages, throttle, force).run()
File "/home/purodha/pywikipedia/wikipedia.py", line 2798, in run
xml.sax.parseString(data, handler)
File "/usr/lib/python2.5/site-packages/_xmlplus/sax/__init__.py", line 47, in parseString
parser.parse(inpsrc)
File "/usr/lib/python2.5/site-packages/_xmlplus/sax/expatreader.py", line 109, in parse
xmlreader.IncrementalParser.parse(self, source)
File "/usr/lib/python2.5/site-packages/_xmlplus/sax/xmlreader.py", line 123, in parse
self.feed(buffer)
File "/usr/lib/python2.5/site-packages/_xmlplus/sax/expatreader.py", line 216, in feed
self._parser.Parse(data, isFinal)
File "/usr/lib/python2.5/site-packages/_xmlplus/sax/expatreader.py", line 312, in start_element
self._cont_handler.startElement(name, AttributesImpl(attrs))
File "/home/purodha/pywikipedia/xmlreader.py", line 150, in startElement
self.namespaceid = int(attrs['key'])
ValueError: invalid literal for int() with base 10: 'NS_CATEGORY'
invalid literal for int() with base 10: 'NS_CATEGORY'
Getting page [[am:????]]
etc.
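The failing line in xmlreader.py assumes every &lt;namespace&gt; key in the export XML is numeric, but this wiki apparently emits a symbolic constant instead. A tolerant parse might look like this (a sketch only; the symbolic mapping below is an illustrative assumption, not part of pywikipedia):

```python
def parse_namespace_key(key):
    """Return a namespace id for an export-XML <namespace> 'key'
    attribute, tolerating non-numeric values such as 'NS_CATEGORY'."""
    # Symbolic names seen in the report; this mapping is an assumption
    # for illustration, not an exhaustive MediaWiki list.
    symbolic = {
        'NS_MAIN': 0,
        'NS_TALK': 1,
        'NS_CATEGORY': 14,
        'NS_CATEGORY_TALK': 15,
    }
    try:
        return int(key)
    except ValueError:
        # Fall back to the main namespace for unknown symbolic names.
        return symbolic.get(key.strip().upper(), 0)
```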
----------------------------------------------------------------------
>Comment By: SourceForge Robot (sf-robot)
Date: 2010-01-12 02:22
Message:
This Tracker item was closed automatically by the system. It was
previously set to a Pending status, and the original submitter
did not respond within 14 days (the time period specified by
the administrator of this Tracker).
----------------------------------------------------------------------
Comment By: xqt (xqt)
Date: 2009-12-28 11:18
Message:
There have been a lot of changes to the related scripts since then. Is this
bug still valid? Otherwise it could be closed as out of date.
----------------------------------------------------------------------
Comment By: DarkoNeko (darkoneko)
Date: 2008-08-02 11:14
Message:
Logged In: YES
user_id=1809111
Originator: NO
Same for me
----version----
Pywikipedia [http] trunk/pywikipedia (r5781, Aug 01 2008, 21:44:26)
Python 2.5.1 (r251:54863, Apr 18 2007, 08:51:08) [MSC v.1310 32 bit
(Intel)]
----trace----
Updating links on page [[scn:1054]].
No changes needed
Updating links on page [[be:1054]].
No changes needed
Getting 60 pages from wikipedia:oc...
Getting 60 pages from wikipedia:mk...
Checked for running processes. 1 processes currently running, including the current process.
Getting 60 pages from wikipedia:sw...
Getting 60 pages from wikipedia:pi...
Getting 60 pages from wikipedia:sa...
Getting 60 pages from wikipedia:am...
ERROR: SaxParseBug caused error invalid literal for int() with base 10: 'NS_CATEGORY'. Dump SaxParseBug_wikipedia_am__Sat_Aug_02_13-07-31_2008.dump created.
Traceback (most recent call last):
File "C:\Program Files\TortoiseSVN\pywikipedia\pagegenerators.py", line 762, in __iter__
for loaded_page in self.preload(somePages):
File "C:\Program Files\TortoiseSVN\pywikipedia\pagegenerators.py", line 785, in preload
wikipedia.getall(site, pagesThisSite)
File "C:\Program Files\TortoiseSVN\pywikipedia\wikipedia.py", line 2950, in getall
_GetAll(site, pages, throttle, force).run()
File "C:\Program Files\TortoiseSVN\pywikipedia\wikipedia.py", line 2798, in run
xml.sax.parseString(data, handler)
File "c:\Program Files\Python25\lib\xml\sax\__init__.py", line 49, in parseString
parser.parse(inpsrc)
File "c:\Program Files\Python25\lib\xml\sax\expatreader.py", line 107, in parse
xmlreader.IncrementalParser.parse(self, source)
File "c:\Program Files\Python25\lib\xml\sax\xmlreader.py", line 123, in parse
self.feed(buffer)
File "c:\Program Files\Python25\lib\xml\sax\expatreader.py", line 207, in feed
self._parser.Parse(data, isFinal)
File "c:\Program Files\Python25\lib\xml\sax\expatreader.py", line 301, in start_element
self._cont_handler.startElement(name, AttributesImpl(attrs))
File "C:\Program Files\TortoiseSVN\pywikipedia\xmlreader.py", line 150, in startElement
self.namespaceid = int(attrs['key'])
ValueError: invalid literal for int() with base 10: 'NS_CATEGORY'
invalid literal for int() with base 10: 'NS_CATEGORY'
----------------------------------------------------------------------
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=2035835&group_…
Bugs item #2930108, was opened at 2010-01-11 22:34
Message generated for change (Tracker Item Submitted) made by multichill
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=2930108&group_…
Category: None
Group: None
Status: Open
Resolution: None
Priority: 5
Private: No
Submitted By: Multichill (multichill)
Assigned to: Nobody/Anonymous (nobody)
Summary: -usercontribs: broken
Initial Comment:
The page generator -usercontribs should return all edits by a given user. It doesn't: it stops after a couple of pages:
D:\Wikipedia\pywikipedia>version.py
Pywikipedia [svn+ssh] multichill@trunk/pywikipedia (r7850, 2010/01/02, 12:59:20)
Python 2.5.1 (r251:54863, Apr 18 2007, 08:51:08) [MSC v.1310 32 bit (Intel)]
D:\Wikipedia\pywikipedia>imagecopy.py -lang:en -usercontribs:R.J.Oosterbaan
WARNING: Configuration variable 'CommonsDelinker' is defined but unknown. Misspelled?
Getting 60 pages from wikipedia:en...
Skipping File:HablehRud.JPG
Skipping File:PunataFan.png
Skipping File:KhuzdarFan.JPG
Skipping File:Flood&recession.JPG
Skipping File:BundPhoto.JPG
Skipping File:OkavangoPhoto.JPG
Skipping File:OkavangoMap.JPG
Skipping File:PunataReserv.JPG
Skipping File:PunataTransmis.JPG
Skipping File:PunataRegion.JPG
Skipping File:PunataCropping.JPG
Skipping File:StripCropping.jpg
Skipping File:GarmsarWatBal.JPG
Skipping File:GarmsarAquifer.JPG
Skipping File:GarmsarMap.JPG
Skipping File:Well&Karez.JPG
Skipping File:KhuzdarBund.JPG
Skipping File:KhuzdarAgri.JPG
Skipping File:OkavangoFan.png
Skipping File:GarmsarFan.png
Skipping File:TidalCrossSection.GIF
Getting 29 pages from wikipedia:en...
Skipping File:WellDrain.gif
Skipping File:WellDrain.png
Skipping File:AbadanIsland.GIF
Skipping File:Drainsection.PNG
Skipping File:AnisotropicSoil.PNG
Skipping File:DrainSection.jpg
Still 1 active threads, lets wait
All threads are done
D:\Wikipedia\pywikipedia>
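For comparison, a generator that keeps following MediaWiki API continuation until the server stops returning a continue block would drain all contributions rather than just the first batches. This is a sketch: `fetch` stands in for whatever HTTP layer the bot uses, and the field names follow the list=usercontribs API:

```python
def all_user_contribs(fetch):
    """Yield every contribution from a continued list=usercontribs
    query. `fetch(params)` performs one API request and returns the
    decoded JSON reply (a stand-in for the bot's HTTP layer)."""
    params = {'action': 'query', 'list': 'usercontribs', 'uclimit': 'max'}
    while True:
        data = fetch(dict(params))
        for contrib in data['query']['usercontribs']:
            yield contrib
        cont = data.get('continue')
        if not cont:
            break          # server says there is nothing left
        params.update(cont)  # carry uccontinue into the next request
```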
----------------------------------------------------------------------
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=2930108&group_…
Bugs item #2929809, was opened at 2010-01-11 14:24
Message generated for change (Tracker Item Submitted) made by basilicofresco
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=2929809&group_…
Category: General
Group: None
Status: Open
Resolution: None
Priority: 5
Private: No
Submitted By: Davide Bolsi (basilicofresco)
Assigned to: Nobody/Anonymous (nobody)
Summary: 'NoneType' object has no attribute 'strip'
Initial Comment:
I was getting a strange error with replace.py and some large dump files, so I did some testing...
Well, I discovered that replace.py halts while loading certain pages: e.g. "Technical Architecture Group" on en.wikipedia, but I got the same error with a page on it.wikipedia as well.
It halts on the very same page both with a dump file and with direct page loading. This error is particularly annoying because, for example, I'm unable to scan the whole dump.
Examples:
1) direct page loading
C:\pywikipedia>replace.py -lang:en -page:"Technical Architecture Group" "a" "b"
Getting 1 pages from wikipedia:en...
Traceback (most recent call last):
File "C:\pywikipedia\pagegenerators.py", line 860, in __iter__
for loaded_page in self.preload(somePages):
File "C:\pywikipedia\pagegenerators.py", line 879, in preload
wikipedia.getall(site, pagesThisSite)
File "C:\pywikipedia\wikipedia.py", line 4159, in getall
_GetAll(site, pages, throttle, force).run()
File "C:\pywikipedia\wikipedia.py", line 3842, in run
xml.sax.parseString(data, handler)
File "C:\Python26\lib\xml\sax\__init__.py", line 49, in parseString
parser.parse(inpsrc)
File "C:\Python26\lib\xml\sax\expatreader.py", line 107, in parse
xmlreader.IncrementalParser.parse(self, source)
File "C:\Python26\lib\xml\sax\xmlreader.py", line 123, in parse
self.feed(buffer)
File "C:\Python26\lib\xml\sax\expatreader.py", line 207, in feed
self._parser.Parse(data, isFinal)
File "C:\Python26\lib\xml\sax\expatreader.py", line 304, in end_element
self._cont_handler.endElement(name)
File "C:\pywikipedia\xmlreader.py", line 182, in endElement
text, self.username,
AttributeError: MediaWikiXmlHandler instance has no attribute 'username'
MediaWikiXmlHandler instance has no attribute 'username'
2) dump file (on this dump "Successions of Philosophers" immediately precedes "Technical Architecture Group")
C:\pywikipedia>replace.py -xml:enwiki-20091128-pages-articles.xml -lang:en -xmlstart:"Successions of
Philosophers" "a" "b"
Reading XML dump...
Getting 1 pages from wikipedia:en...
>>> Successions of Philosophers <<<
[...cut......cut......cut...]
Do you want to accept these changes? ([y]es, [N]o, [e]dit, open in [b]rowser, [a]ll, [q]uit) n
Traceback (most recent call last):
File "C:\pywikipedia\pagegenerators.py", line 847, in __iter__
for page in self.wrapped_gen:
File "C:\pywikipedia\pagegenerators.py", line 779, in DuplicateFilterPageGenerator
for page in generator:
File "C:\pywikipedia\replace.py", line 218, in __iter__
for entry in self.parser:
File "C:\pywikipedia\xmlreader.py", line 295, in new_parse
for rev in self._parse(event, elem):
File "C:\pywikipedia\xmlreader.py", line 304, in _parse_only_latest
yield self._create_revision(revision)
File "C:\pywikipedia\xmlreader.py", line 341, in _create_revision
redirect=self.isredirect
File "C:\pywikipedia\xmlreader.py", line 64, in __init__
self.username = username.strip()
AttributeError: 'NoneType' object has no attribute 'strip'
'NoneType' object has no attribute 'strip'
Thanks in advance!
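Deleted or revision-hidden contributors can appear in a dump with no &lt;username&gt; element, so the handler ends up holding None. A minimal guard before the failing .strip() call might look like this (a sketch, not the committed fix):

```python
def safe_username(username):
    """Normalise a possibly missing dump username: revisions whose
    contributor was deleted or hidden carry no <username> element, so
    the parser hands us None; map that to an empty string before any
    string method is called on it."""
    return (username or '').strip()
```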
----------------------------------------------------------------------
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=2929809&group_…
Bugs item #2890078, was opened at 2009-10-31 16:53
Message generated for change (Comment added) made by nakor-wikipedia
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=2890078&group_…
Category: None
Group: None
Status: Open
Resolution: None
Priority: 5
Private: No
Submitted By: Nakor Wikipedia (nakor-wikipedia)
Assigned to: Nobody/Anonymous (nobody)
Summary: Issue in userlib.py
Initial Comment:
userlib.contributions() changed its output format. Please revert to the previous format %Y-%m-%dT%H:%M:%SZ .
The new format should be offered as an option; just changing the format without backward compatibility breaks existing scripts.
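One backward-compatible way to offer the new format as an option is an opt-in keyword whose default preserves the raw API timestamp. This is an illustrative sketch, not the actual userlib.contributions signature:

```python
from datetime import datetime

def contributions(entries, raw_timestamps=True):
    """Yield (title, revid, timestamp, comment) tuples from decoded
    list=usercontribs entries. With the default raw_timestamps=True the
    API's %Y-%m-%dT%H:%M:%SZ string passes through unchanged (the old
    behaviour); callers wanting parsed datetimes opt in explicitly."""
    for c in entries:
        ts = c['timestamp']
        if not raw_timestamps:
            ts = datetime.strptime(ts, '%Y-%m-%dT%H:%M:%SZ')
        yield (c['title'], c['revid'], ts, c['comment'])
```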
----------------------------------------------------------------------
>Comment By: Nakor Wikipedia (nakor-wikipedia)
Date: 2010-01-10 18:53
Message:
Here is the patch for fixing this issue. Please commit it.
Index: userlib.py
===================================================================
--- userlib.py (revision 7869)
+++ userlib.py (working copy)
@@ -223,7 +223,7 @@
for c in result['query']['usercontribs']:
yield (wikipedia.Page(self.site(), c['title'],
defaultNamespace=c['ns']),
c['revid'],
- wikipedia.parsetime2stamp(c['timestamp']),
+ c['timestamp'],
c['comment']
)
nbresults += 1
----------------------------------------------------------------------
Comment By: Nakor Wikipedia (nakor-wikipedia)
Date: 2009-10-31 17:01
Message:
Below is the patch for fixing this issue:
Index: userlib.py
===================================================================
--- userlib.py (revision 7576)
+++ userlib.py (working copy)
@@ -249,7 +249,7 @@
for c in result['query']['usercontribs']:
yield (wikipedia.Page(self.site(), c['title'],
defaultNamespace=c['ns']),
c['revid'],
- wikipedia.parsetime2stamp(c['timestamp']),
+ c['timestamp'],
c['comment']
)
nbresults += 1
----------------------------------------------------------------------
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=2890078&group_…
Patches item #2929350, was opened at 2010-01-10 16:41
Message generated for change (Tracker Item Submitted) made by nobody
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603140&aid=2929350&group_…
Category: None
Group: None
Status: Open
Resolution: None
Priority: 5
Private: No
Submitted By: Nobody/Anonymous (nobody)
Assigned to: Nobody/Anonymous (nobody)
Summary: KeyError in userlib.py
Initial Comment:
Try to block a user with:
badUser = userlib.User(site=self.mySite, name="123.123.123.123")
badUser.block(expiry="2 hours", reason="did bad things", noCreate=True, onAutoblock=True, allowUsertalk=False)
You will get a KeyError. This is a patch to solve the problem.
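The patch itself is not attached to this digest, but the defensive style such a fix likely needs is to put optional flags into the request dict only when they are set, so nothing later indexes a key that was never added. A sketch; only the keyword names come from the call above, the API field names are assumptions:

```python
def build_block_params(expiry, reason, noCreate=False,
                       onAutoblock=False, allowUsertalk=True):
    """Assemble action=block request parameters. Boolean flags are
    added only when active, so later code can test membership with
    'in' instead of risking a KeyError on an absent key."""
    params = {'action': 'block', 'expiry': expiry, 'reason': reason}
    if noCreate:
        params['nocreate'] = ''
    if onAutoblock:
        params['autoblock'] = ''
    if allowUsertalk:
        params['allowusertalk'] = ''
    return params
```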
----------------------------------------------------------------------
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603140&aid=2929350&group_…
Patches item #2926457, was opened at 2010-01-05 21:28
Message generated for change (Comment added) made by xqt
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603140&aid=2926457&group_…
Category: Translations
Group: None
>Status: Closed
>Resolution: Fixed
Priority: 5
Private: No
Submitted By: masti (masti01)
>Assigned to: xqt (xqt)
Summary: +pl for category_redirect.py
Initial Comment:
add pl translation for category_redirect.py
----------------------------------------------------------------------
>Comment By: xqt (xqt)
Date: 2010-01-10 15:31
Message:
Done in r7868. Thanks masti :)
BTW: your patch didn't work.
----------------------------------------------------------------------
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603140&aid=2926457&group_…
Support Requests item #2923020, was opened at 2009-12-29 21:17
Message generated for change (Comment added) made by keyril
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603139&aid=2923020&group_…
Category: Install Problem
Group: None
>Status: Closed
Priority: 1
Private: No
Submitted By: Keyril (keyril)
Assigned to: xqt (xqt)
Summary: Login problem
Initial Comment:
Sorry, but nothing I try works.
I just downloaded r7844 and tried login.py again, but it failed with new error messages.
It should be easy:
* local installed MediaWiki 1.13.1
* Windows XP
* path: d:\xampp\htdocs\elysium
* Wikiname: Elysium
* user-config.py:
family = 'elysium'
mylang = 'en'
usernames['elysium']['en'] = 'WikiBot'
*
# -*- coding: utf-8 -*-          # REQUIRED
import config, family, urllib    # REQUIRED

class Family(family.Family):     # REQUIRED
    def __init__(self):          # REQUIRED
        family.Family.__init__(self)   # REQUIRED
        self.name = 'elysium'    # REQUIRED; replace with actual name
        self.langs = {           # REQUIRED
            'en': '127.0.0.1/elysium', # Include one line for each wiki in family
        }

    # IMPORTANT: if your wiki does not support the api.php interface,
    # you must uncomment the second line of this method:
    def apipath(self, code):
        raise NotImplementedError, "%s wiki family does not support api.php" % self.name
        return '%s/api.php' % self.scriptpath(code)

    # Which version of MediaWiki is used? REQUIRED
    def version(self, code):
        # Replace with the actual version being run on your wiki
        return '1.13.1'

    def code2encoding(self, code):
        """Return the encoding for a specific language wiki"""
        # Most wikis nowadays use UTF-8, but change this if yours uses
        # a different encoding
        return 'utf-8'

    #def path(self, code):
    #    return '/elysium'
----------------------------------------------------------------------
>Comment By: Keyril (keyril)
Date: 2010-01-09 22:12
Message:
Thank you very much! Now it works without the /w/ workaround!
Sorry for the late answer - I have been on vacation (without a PC).
----------------------------------------------------------------------
Comment By: xqt (xqt)
Date: 2010-01-04 10:36
Message:
You may use the following entries in the family file but do not redefine
the path instance:
self.langs = {
    'en': 'localhost',
}

def scriptpath(self, code):
    return '/elysium'
----------------------------------------------------------------------
Comment By: Keyril (keyril)
Date: 2010-01-01 19:28
Message:
It seems that the login works fine if you use the default path (which is
not described), e.g. 127.0.0.1/elysium/w/index.php...
In this case you can delete the "def path" definition, uncomment "apipath"
(for an unknown reason, since the API is accessible), and then the login
works.
The family file should look like this (the whole wiki MUST be under
htdocs\elysium\w\...):
# -*- coding: utf-8 -*-          # REQUIRED
import config, family, urllib    # REQUIRED

class Family(family.Family):     # REQUIRED
    def __init__(self):          # REQUIRED
        family.Family.__init__(self)   # REQUIRED
        self.name = 'elysium'    # REQUIRED; replace with actual name
        self.langs = {           # REQUIRED
            'en': '127.0.0.1/elysium', # Include one line for each wiki in family
        }

    # IMPORTANT: if your wiki does not support the api.php interface,
    # you must uncomment the second line of this method:
    def apipath(self, code):
        raise NotImplementedError, "%s wiki family does not support api.php" % self.name
        return '%s/api.php' % self.scriptpath(code)

    # Which version of MediaWiki is used? REQUIRED
    def version(self, code):
        # Replace with the actual version being run on your wiki
        return '1.13.1'

    def code2encoding(self, code):
        """Return the encoding for a specific language wiki"""
        # Most wikis nowadays use UTF-8, but change this if yours uses
        # a different encoding
        return 'utf-8'
----------------------------------------------------------------------
Comment By: Keyril (keyril)
Date: 2010-01-01 19:23
Message:
workaround possible (see comment)
----------------------------------------------------------------------
Comment By: Keyril (keyril)
Date: 2009-12-31 08:35
Message:
The correct URL would be:
http://127.0.0.1/elysium/index.php/Main_Page
----------------------------------------------------------------------
Comment By: xqt (xqt)
Date: 2009-12-30 18:27
Message:
What is the right URL to get the Mainpage of your wiki?
----------------------------------------------------------------------
Comment By: Keyril (keyril)
Date: 2009-12-29 21:24
Message:
Sorry, I hit the button too quickly:
If I try the above version of elysium_family.py, I get an error message for
the URL 127.0.0.1/elysium/w/index.php...
How do I get rid of the "w"?
If I uncomment the lowest part with "def path", the path becomes
127.0.0.1/elysium/elysium/index.php...
If I set self.langs to 127.0.0.1 without "elysium", the path should be
correct, but the error message is "wrong password or CAPTCHA answer".
I really don't know how to get it working.
Thanks in advance.
----------------------------------------------------------------------
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603139&aid=2923020&group_…
Bugs item #2926171, was opened at 2010-01-05 12:16
Message generated for change (Comment added) made by xqt
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=2926171&group_…
>Category: login
Group: None
>Status: Closed
>Resolution: Works For Me
Priority: 5
Private: No
Submitted By: Yr Wyddfa (yrwyddfa)
Assigned to: xqt (xqt)
Summary: Errorkey: query
Initial Comment:
Hi!
I've got a problem activating the pywikipediabot scripts on our private wiki, and since today I get the same error message while logging in.
I'm using Ubuntu Linux 9.04 and MediaWiki 1.14.0.
Here is the terminal output:
/pywikipedia$ python login.py
Password for user weBot on Dairpedia:de:
Logging in to Dairpedia:de as weBot
Traceback (most recent call last):
File "login.py", line 397, in <module>
main()
File "login.py", line 393, in main
loginMan.login()
File "login.py", line 282, in login
cookiedata = self.getCookie(api)
File "login.py", line 170, in getCookie
response, data = self.site.postData(address, self.site.urlEncode(predata), sysop=self.sysop)
File "/home/heiko/Dokumente/DairAlainn/Dairpedia/pywikipedia/wikipedia.py", line 5897, in postData
self._getUserDataOld(text, sysop = sysop)
File "/home/heiko/Dokumente/DairAlainn/Dairpedia/pywikipedia/wikipedia.py", line 6168, in _getUserDataOld
blocked = self._getBlock(sysop = sysop)
File "/home/heiko/Dokumente/DairAlainn/Dairpedia/pywikipedia/wikipedia.py", line 5500, in _getBlock
data = query.GetData(params, self)['query']['userinfo']
KeyError: 'query'
This KeyError also occurred when the login was successful and I tried to start any script.
Please tell me if you need any additional data.
Thanks a lot in advance!
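A defensive wrapper around the failing lookup would surface the server's own error message instead of a bare KeyError. This is a sketch; the 'error' reply shape follows the general MediaWiki API convention:

```python
def get_userinfo(data):
    """Pull query.userinfo out of a decoded API reply. When the reply
    carries an 'error' member (or lacks 'query' entirely), raise a
    message naming what the server actually sent, rather than letting
    a bare KeyError: 'query' escape."""
    if 'error' in data:
        info = data['error'].get('info', data['error'])
        raise RuntimeError('API error: %s' % info)
    try:
        return data['query']['userinfo']
    except KeyError:
        raise RuntimeError('unexpected API reply, top-level keys: %s'
                           % sorted(data))
```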
----------------------------------------------------------------------
>Comment By: xqt (xqt)
Date: 2010-01-09 19:37
Message:
That would have been my next question: asking for the family file and the
URLs. :)
----------------------------------------------------------------------
Comment By: Nobody/Anonymous (nobody)
Date: 2010-01-09 11:53
Message:
Me again ;)
I searched a bit in the folder structure of our server and found something
interesting: there is a folder called "/w" (whereas the standard folder of
our wiki is "/wiki"). So I talked to our server admin, and he told me that
he created the folder and an api.php which redirects to the api.php in the
/wiki folder. The reason: login.py always tries to call a non-existing
/w/api.php, even though
def path(self, code):
    return '/wiki'
is enabled in the _family.py.
But I guess this should be solved in a separate thread, so this one could
be marked as "solved" - sorry for that!
----------------------------------------------------------------------
Comment By: Yr Wyddfa (yrwyddfa)
Date: 2010-01-06 00:06
Message:
Using the -verbose option, the terminal gives back this information:
/pywikipedia$ python login.py -verbose
Pywikipediabot [http] trunk/pywikipedia (r7850, 2010/01/02, 12:59:20)
Python 2.6.2 (release26-maint, Apr 19 2009, 01:56:41)
[GCC 4.3.3]
Password for user weBot on Dairpedia:de:
Logging in to Dairpedia:de as weBot
==== API action:query ====
meta: userinfo
uiprop: blockinfo
----------------
Requesting API query from Dairpedia:de
Traceback (most recent call last):
File "login.py", line 397, in <module>
main()
File "login.py", line 393, in main
loginMan.login()
File "login.py", line 282, in login
cookiedata = self.getCookie(api)
File "login.py", line 170, in getCookie
response, data = self.site.postData(address, self.site.urlEncode(predata), sysop=self.sysop)
File "/home/heiko/Dokumente/DairAlainn/Dairpedia/pywikipedia/wikipedia.py", line 5897, in postData
self._getUserDataOld(text, sysop = sysop)
File "/home/heiko/Dokumente/DairAlainn/Dairpedia/pywikipedia/wikipedia.py", line 6168, in _getUserDataOld
blocked = self._getBlock(sysop = sysop)
File "/home/heiko/Dokumente/DairAlainn/Dairpedia/pywikipedia/wikipedia.py", line 5500, in _getBlock
data = query.GetData(params, self)['query']['userinfo']
KeyError: 'query'
I added your suggested lines to my user-config.py to disable the API.
Unfortunately the terminal output was exactly the same.
Could my Python version (2.6.2) be the problem? I heard about several
compatibility problems with Python 3.0, so I didn't install that version.
Thanks for your time!
----------------------------------------------------------------------
Comment By: xqt (xqt)
Date: 2010-01-05 18:51
Message:
Might be a bug. Please try again with the -verbose option to get more
information about this. It seems the structure has been changed after
MediaWiki 1.14.
You could also try to run your bot with the API disabled. Just put the
following statements in your user-config.py:
use_api = False
use_api_login = False
----------------------------------------------------------------------
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=2926171&group_…
Feature Requests item #2928894, was opened at 2010-01-09 17:56
Message generated for change (Settings changed) made by
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603141&aid=2928894&group_…
Category: None
Group: None
Status: Open
Priority: 5
Private: No
Submitted By: Iván Pérez ()
Assigned to: Nobody/Anonymous (nobody)
>Summary: Adding titles to external links
Initial Comment:
Hi all. It would be nice to have a script (based on reflinks.py) that adds titles to external links that don't have one.
For example: "[http://en.wikipedia.org]" → "[http://en.wikipedia.org Wikipedia, the free encyclopedia]"
I'm asking for a script that does the same as reflinks.py (or a fix to reflinks to allow this), but not only for external links in references.
Thanks in advance.
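The core of such a script could be a rewrite pass that only touches bracketed links without a label, with the title supplied by a caller-provided function. This is a sketch; `fetch_title` is a hypothetical helper, e.g. one that downloads the URL and extracts its HTML &lt;title&gt;:

```python
import re

# Matches [url] with no label: the URL runs until whitespace or ']',
# and the closing bracket must follow immediately.
BARE_LINK = re.compile(r'\[(https?://[^\s\]]+)\]')

def title_bare_links(wikitext, fetch_title):
    """Rewrite [url] into [url Title]. `fetch_title(url)` returns the
    target page's title text, or None to leave the link untouched.
    Links that already carry a label never match the pattern."""
    def repl(match):
        url = match.group(1)
        title = fetch_title(url)
        if not title:
            return match.group(0)   # keep the bare link as-is
        return '[%s %s]' % (url, title)
    return BARE_LINK.sub(repl, wikitext)
```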
----------------------------------------------------------------------
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603141&aid=2928894&group_…