Bugs item #2927337, was opened at 2010-01-07 06:12
Message generated for change (Settings changed) made by silvonen
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=2927337&group_…
Please note that this message will contain a full copy of the comment thread,
including the initial issue submission, for this request,
not just the latest update.
Category: General
Group: None
Status: Open
Resolution: None
>Priority: 8
Private: No
Submitted By: Mikko Silvonen (silvonen)
Assigned to: Nobody/Anonymous (nobody)
Summary: Interwiki run terminates with KeyError in _getEditPage
Initial Comment:
This now happens every time I try to restart my autonomous interwiki run:
>interwiki.py -autonomous -lang:fi -namespace:0 -start:"Steffi Graf"
...
Updating links on page [[sv:Stena Line]].
No changes needed
Getting 2 pages from wikipedia:tg...
Getting 2 pages from wikipedia:nds-nl...
Getting 2 pages from wikipedia:bar...
Traceback (most recent call last):
File "c:\svn\pywikipedia\pagegenerators.py", line 860, in __iter__
for loaded_page in self.preload(somePages):
File "c:\svn\pywikipedia\pagegenerators.py", line 879, in preload
wikipedia.getall(site, pagesThisSite)
File "c:\svn\pywikipedia\wikipedia.py", line 4158, in getall
_GetAll(site, pages, throttle, force).run()
File "c:\svn\pywikipedia\wikipedia.py", line 3837, in run
xml.sax.parseString(data, handler)
File "C:\Python25\lib\xml\sax\__init__.py", line 49, in parseString
parser.parse(inpsrc)
File "C:\Python25\lib\xml\sax\expatreader.py", line 107, in parse
xmlreader.IncrementalParser.parse(self, source)
File "C:\Python25\lib\xml\sax\xmlreader.py", line 123, in parse
self.feed(buffer)
File "C:\Python25\lib\xml\sax\expatreader.py", line 207, in feed
self._parser.Parse(data, isFinal)
File "C:\Python25\lib\xml\sax\expatreader.py", line 304, in end_element
self._cont_handler.endElement(name)
File "c:\svn\pywikipedia\xmlreader.py", line 182, in endElement
text, self.username,
AttributeError: MediaWikiXmlHandler instance has no attribute 'username'
MediaWikiXmlHandler instance has no attribute 'username'
Dump fi (wikipedia) appended.
Traceback (most recent call last):
File "C:\svn\pywikipedia\interwiki.py", line 2370, in <module>
main()
File "C:\svn\pywikipedia\interwiki.py", line 2344, in main
bot.run()
File "C:\svn\pywikipedia\interwiki.py", line 2104, in run
self.queryStep()
File "C:\svn\pywikipedia\interwiki.py", line 2077, in queryStep
self.oneQuery()
File "C:\svn\pywikipedia\interwiki.py", line 2073, in oneQuery
subject.batchLoaded(self)
File "C:\svn\pywikipedia\interwiki.py", line 1240, in batchLoaded
if not page.exists():
File "c:\svn\pywikipedia\wikipedia.py", line 996, in exists
self.get()
File "c:\svn\pywikipedia\wikipedia.py", line 684, in get
self._contents = self._getEditPage(get_redirect = get_redirect, throttle = throttle, sysop = sysop)
File "c:\svn\pywikipedia\wikipedia.py", line 767, in _getEditPage
self._userName = pageInfo['revisions'][0]['user']
KeyError: 'user'
>python version.py
Pywikipedia [http] trunk/pywikipedia (r7862, 2010/01/06, 10:53:08)
Python 2.5.4 (r254:67916, Jan 29 2009, 12:02:11) [MSC v.1310 32 bit (Intel)]
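For context, the crash comes from indexing the revision dict directly (`pageInfo['revisions'][0]['user']`). A minimal defensive sketch, assuming the API can legitimately omit 'user' (for example when the username of a revision is hidden) — the function name is illustrative, not the project's actual fix:

```python
# Hedged sketch, not the r7862 code: guard against a revision entry
# that lacks the 'user' key instead of indexing it directly.
def extract_username(page_info):
    """Return the last editor's name, or None if the API omitted it."""
    revisions = page_info.get('revisions') or []
    if not revisions:
        return None
    # .get() returns None when 'user' is suppressed (e.g. userhidden)
    return revisions[0].get('user')

info_ok = {'revisions': [{'user': 'Example', 'timestamp': '2010-01-07T06:12:00Z'}]}
info_hidden = {'revisions': [{'userhidden': '', 'timestamp': '2010-01-07T06:12:00Z'}]}
```

With this shape, a hidden username would surface as `None` rather than aborting the whole interwiki run with a KeyError.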
----------------------------------------------------------------------
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=2927337&group_…
Patches item #2835479, was opened at 2009-08-11 06:12
Message generated for change (Comment added) made by russblau
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603140&aid=2835479&group_…
Please note that this message will contain a full copy of the comment thread,
including the initial issue submission, for this request,
not just the latest update.
>Category: None
Group: None
Status: Open
Resolution: None
Priority: 5
Private: No
Submitted By: Jean-Daniel Fekete (jdfekte)
Assigned to: Nobody/Anonymous (nobody)
Summary: Allowing xmlreader to read from stdin
Initial Comment:
XML dumps are huge and distributed in 7zip format now. This very small patch allows dumps to be read from the standard input using '-' as file name.
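The proposed convention is the usual one: treat '-' as standard input and anything else as a path. A sketch of the idea (function name is illustrative):

```python
import sys

def open_dump(filename):
    """Return a readable stream; '-' selects standard input,
    as the patch proposes for xmlreader."""
    if filename == '-':
        return sys.stdin
    return open(filename, 'rb')
```

This lets a decompressor feed the parser directly, e.g. `7za e -so dump.xml.7z | python somescript.py -` (command line is illustrative).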
----------------------------------------------------------------------
>Comment By: Russell Blau (russblau)
Date: 2010-01-06 14:07
Message:
This does not belong in the 'rewrite' category, as the rewrite branch does
not yet support reading xml files at all
----------------------------------------------------------------------
Comment By: siebrand (siebrand)
Date: 2009-11-30 06:48
Message:
@valhallasw: Do you suggest that the patch should be rejected based on your
reasoning?
----------------------------------------------------------------------
Comment By: Jean-Daniel Fekete (jdfekte)
Date: 2009-10-08 06:12
Message:
Decompressing on the fly is important to avoid creating huge files. I need
to process the full dump with revisions of the French Wikipedia (for now),
and decompressing it to disk is unreasonable. Using pipes is standard
practice, and they are not slow compared to the parsing and processing time
of Python.
I understand the aesthetic objection, but pragmatically the "-" syntax is
quite convenient and the changes are quite minimal.
----------------------------------------------------------------------
Comment By: siebrand (siebrand)
Date: 2009-10-02 05:43
Message:
Submitter, please address comment dated 2009-08-11 13:45 by valhallasw.
Otherwise this patch will be rejected for certain after 2 weeks.
----------------------------------------------------------------------
Comment By: Merlijn S. van Deen (valhallasw)
Date: 2009-08-11 07:45
Message:
I never see the point of programs adding '-' as a magic filename. Unix has
/dev/stdin; DOS/Windows have CON. Secondly, it is probably better to use an
internal 7zip decompressor, as pipes tend to be slow.
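The "internal decompressor" idea can be sketched with the stdlib's bz2 module (Python's standard library has no 7z reader, so bz2 — the other format dumps ship in — stands in here); the point is incremental decompression without a temporary file:

```python
import bz2

def iter_decompressed(path, chunk_size=64 * 1024):
    """Yield decompressed chunks of a .bz2 file without ever
    materialising the whole dump in memory or on disk."""
    decomp = bz2.BZ2Decompressor()
    with open(path, 'rb') as f:
        while True:
            chunk = f.read(chunk_size)
            if not chunk:
                break
            data = decomp.decompress(chunk)
            if data:  # a chunk may not yet yield output
                yield data
```

An actual 7z equivalent would need a third-party library or an external process, which is exactly the trade-off being debated above.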
----------------------------------------------------------------------
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603140&aid=2835479&group_…
Bugs item #2776167, was opened at 2009-04-20 08:49
Message generated for change (Comment added) made by russblau
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=2776167&group_…
Please note that this message will contain a full copy of the comment thread,
including the initial issue submission, for this request,
not just the latest update.
Category: rewrite
Group: None
>Status: Closed
>Resolution: Works For Me
Priority: 5
Private: No
Submitted By: liangent (liangent)
Assigned to: Nobody/Anonymous (nobody)
Summary: bug in site.py of rewrite branch
Initial Comment:
Traceback (most recent call last):
File "./archiver.py", line 7, in <module>
now = Site().getcurrenttime() # or from link's site?
File "/usr/lib/python2.5/site-packages/pywikibot/site.py", line 794, in getcurrenttime
ts = self.getcurrenttimestamp()
File "/usr/lib/python2.5/site-packages/pywikibot/site.py", line 789, in getcurrenttimestamp
result = r.submit()
File "/usr/lib/python2.5/site-packages/pywikibot/data/api.py", line 184, in submit
self.site.throttle(write=write)
File "/usr/lib/python2.5/site-packages/pywikibot/site.py", line 176, in __getattr__
return self.__class__.attr
AttributeError: type object 'APISite' has no attribute 'attr'
----------------------------------------------------------------------
>Comment By: Russell Blau (russblau)
Date: 2010-01-06 14:06
Message:
Appears to have been fixed at some point.
----------------------------------------------------------------------
Comment By: liangent (liangent)
Date: 2009-05-06 07:22
Message:
does the problem occur only when getting server time?
----------------------------------------------------------------------
Comment By: Russell Blau (russblau)
Date: 2009-05-05 13:26
Message:
I have also seen this problem on rare occasions but I haven't been able to
track down the cause.
----------------------------------------------------------------------
Comment By: liangent (liangent)
Date: 2009-05-05 02:08
Message:
I didn't find any problem with self.site.throttle(write=write),
and this problem never occurred again
(actually I don't use Site().getcurrenttime() anymore).
But what was wrong?
----------------------------------------------------------------------
Comment By: liangent (liangent)
Date: 2009-04-20 09:32
Message:
i forgot to log in too...
----------------------------------------------------------------------
Comment By: Nobody/Anonymous (nobody)
Date: 2009-04-20 09:30
Message:
No no no ... I met another problem, and I don't know whether it has any
connection with my patch ... they seem to occur randomly ...
liangent@oiweb:~$ wiki/bot/archiver/archiver.py
Found 1 wikipedia:zh processes running, including this one.
Traceback (most recent call last):
File "wiki/bot/archiver/archiver.py", line 7, in <module>
now = Site().getcurrenttime() # or from link's site?
File "/usr/lib/python2.5/site-packages/pywikibot/site.py", line 794, in
getcurrenttime
ts = self.getcurrenttimestamp()
File "/usr/lib/python2.5/site-packages/pywikibot/site.py", line 789, in
getcurrenttimestamp
result = r.submit()
File "/usr/lib/python2.5/site-packages/pywikibot/data/api.py", line 184,
in submit
self.site.throttle(write=write)
TypeError: 'property' object is not callable
liangent@oiweb:~$
----------------------------------------------------------------------
Comment By: liangent (liangent)
Date: 2009-04-20 09:25
Message:
It seems the problem occurs only when throttling;
try running a lot of instances to test.
My code starts with:
#!/usr/bin/env python
# -*- coding: utf_8 -*-
import re
import pywikibot
from pywikibot import Link, Site, Page
import datetime
now = Site().getcurrenttime()
----------------------------------------------------------------------
Comment By: Russell Blau (russblau)
Date: 2009-04-20 09:04
Message:
Sorry, forgot to log in before last comment!
----------------------------------------------------------------------
Comment By: Nobody/Anonymous (nobody)
Date: 2009-04-20 09:03
Message:
I'll apply this patch, but the original code works for me in Python 2.5.2:
>>> import pywikibot
>>> s = pywikibot.Site()
>>> now = s.getcurrenttime()
Found 1 wikipedia:en processes running, including this one.
>>> print now
2009-04-20T13:00:47Z
>>>
Question: in your 'archiver.py' script, did you import Site from
pywikibot, or from some other module? If you import it directly from
pywikibot.site, it won't work!
----------------------------------------------------------------------
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=2776167&group_…
Bugs item #2629586, was opened at 2009-02-23 03:41
Message generated for change (Comment added) made by russblau
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=2629586&group_…
Please note that this message will contain a full copy of the comment thread,
including the initial issue submission, for this request,
not just the latest update.
Category: rewrite
Group: None
>Status: Pending
Resolution: None
Priority: 5
Private: No
Submitted By: NicDumZ — Nicolas Dumazet (nicdumz)
Assigned to: Russell Blau (russblau)
Summary: TerminalHandler.emit crashes when message is a string
Initial Comment:
Sometimes self.format(record) can return a string, and this case is currently not handled.
How to trigger this behavior?
1) Modify Site.loadpageinfo() so that the error u"loadpageinfo: Query on %s returned data on '%s'" is raised every time.
2) Run category.py move -from:nonasciititle -to:nonasciititle2
3) The loadpageinfo error will be triggered; category.py will catch this Error at top level and call pywikibot.logging.exception("Fatal error:")
Here, this gives :
pywikibot/scripts$ python category.py move -from:"Athlète du combiné nordique aux Jeux olympiques" -to:"Coureur du combiné nordique aux Jeux olympiques" -debug
Reading dump from category.dump.bz2
Found 1 wikipedia:fr processes running, including this one.
Traceback (most recent call last):
File ".../pywikibot/bot.py", line 95, in emit
"xmlcharrefreplace"))
UnicodeDecodeError: 'ascii' codec can't decode byte 0xc3 in position 671: ordinal not in range(128)
Dumping to category.dump.bz2, please wait...
I don't understand exactly how a string is returned by format(), and why a unicode message is expected, but it happens.
The stack trace here is particularly cryptic. True, I'm still not used to the logging system, but I had to manually place old-fashioned "print"s everywhere to track the issue and understand what caused it. :/
I have patched emit() in r6423 so it doesn't crash on a string message. However, Russ, I think you might want to fix the source of the problem, in the logging system itself, rather than patching the symptom. Feel free to revert this :)
----------------------------------------------------------------------
>Comment By: Russell Blau (russblau)
Date: 2010-01-06 14:04
Message:
I'm not sure if there is anything here that still needs to be fixed; fixing
bugs in the logging module is certainly outside the abilities of this
project! ;-)
----------------------------------------------------------------------
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=2629586&group_…
Bugs item #2619054, was opened at 2009-02-20 03:04
Message generated for change (Comment added) made by russblau
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=2619054&group_…
Please note that this message will contain a full copy of the comment thread,
including the initial issue submission, for this request,
not just the latest update.
Category: rewrite
Group: None
>Status: Pending
>Resolution: Fixed
Priority: 5
Private: No
Submitted By: NicDumZ — Nicolas Dumazet (nicdumz)
Assigned to: Russell Blau (russblau)
Summary: clarify between limit, number, batch and step parameters
Initial Comment:
I had a strange behavior of replace.py -weblink: that I couldn't quite diagnose: some pages were not treated.
First of all, those detailed logs are a great gift. They are a bit messy to understand at first, but thanks to them I found the bug and fixed it in r6386 ( http://svn.wikimedia.org/viewvc/pywikipedia?view=rev&revision=6386 ).
I believe that this parameter confusion is a very bad habit we inherited from the old framework. (The only reason we have those bugs here is that we merged pagegenerators from trunk.) We need to agree on common parameters for generators that have a global meaning, and stick to them.
I personally think that -limit might be a bit confusing (is it an API limit, a limit enforced by the local application on a huge fetched set, etc.?), while -number seems a bit clearer. But that's a personal opinion =)
What about -number for "number of items to retrieve", and -step, or -maxstep, for the maximum number of items to retrieve at once?
Actually, I don't mind about the names; we just need to agree on something meaningful enough and document it in the file headings.
On a sidenote, replace.py -fix:yu-tld -weblink:*.yu is actually running on fr.wp. No issues sighted. =)
----------------------------------------------------------------------
>Comment By: Russell Blau (russblau)
Date: 2010-01-06 13:59
Message:
This was fixed a while back but I neglected to close the bug; please reopen
if any continuing problems exist.
----------------------------------------------------------------------
Comment By: NicDumZ — Nicolas Dumazet (nicdumz)
Date: 2009-02-21 23:50
Message:
Well I think that one of the first steps here is to consider what is
currently done in the old pagegenerators =)
Here's a small summary of the "limits" enforced by our old
pagegenerators.
The overall internal naming consistency factor is quite low for now, not
to mention the surprising facts I found :s
I've considered for each generator, the pagegenerators function, and its
Site/Page/Image/Category counterpart: unless noted, both function parameter
namings are consistent.
* shortpages, new(pages|images), unusedfiles, withoutinterwiki,
uncategorized(images|categories|pages), unwatchedpages, ancientpages,
deadendpages, longpages, shortpages, search
They use "number" (meant as "batch"/"max") + boolean "repeat". Overall,
you can get either "number" items, or all.
* random(page|redirect) are good examples of inconsistencies:
they use number (batch/max) + repeat, but since Special:Random gives only
one page at a time, the actual "batch" parameter is always 1. (behavior is
"for _ in range(number), fetch one page")
And if repeat=True ... those functions never stop, if I'm right.
irrrk !!
* filelinks, imagelinks, interwiki
they scrap the article wikipage, and yield everything in one step from the
wikitext
* categorymembers, subcategories
they scrap category pages. No parameter is available, since the UI doesn't
let us customize the number of displayed links. Follows the (next) links on
the category page. Stops when all the items have been retrieved.
* allpages, prefixindex, getReferences
no function parameters. They use config.special_page_limit as "batch/max",
and all items are retrieved through repeated queries.
if special_page_limit > 999, getReferences sets it back to 999. (?!)
* linksearch
pagegenerators has a "step=500" parameter, the corresponding Site function
uses "limit=500". Meant as "batch/batch": all the links are retrieved
through repeated queries
* usercontribs
number=250, meant as "batch/max". All the contribs are retrieved through
repeated queries. if number>500, sets it back to 500
It seems that the most commonly used combination is number+repeat. But I
really don't think that is the way to go, since you cannot accurately
describe the total number of items you want to retrieve: it's either
number items or all items...
I think a "batch" + "total" pair of integer parameters could be more
useful here (the namings are illustrative).
On the other hand, users should be able to say "I want to retrieve all the
items": looking into the code, I see that a "-1" convention is used now. If
I understand things correctly, it is used in a "batch" context: if we call
set_maximum_items(-1), in most of the cases, the API uses its default
xxlimit number. We could use such a convention for our "total" parameter
too. Be it -1, or None, whatever, but I think that with such a policy, we
should cover all the use cases.
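The batch/total proposal above can be sketched as a small generator. This is illustrative only (names are stand-ins, as the comment itself says): fetch in batches of `batch`, stop after `total` items, and let `None` or `-1` mean "retrieve everything":

```python
# Illustrative sketch of the proposed "batch" + "total" convention;
# fetch_batch(offset, count) stands in for one API request and
# returns a list of items (empty when the source is exhausted).
def limited_pages(fetch_batch, batch=500, total=None):
    yielded = 0
    offset = 0
    # total=None or total=-1 means "all items", per the convention above
    while total is None or total < 0 or yielded < total:
        want = batch
        if total is not None and total >= 0:
            want = min(batch, total - yielded)  # don't overshoot total
        items = fetch_batch(offset, want)
        if not items:
            break
        for item in items:
            yield item
        yielded += len(items)
        offset += len(items)
```

With this shape, "batch" controls per-request size (the API-side limit) while "total" controls the overall count — the two distinct limits russblau describes below.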
Given what I found, I really don't think that backwards compatibility
should be a priority here. I would rather introduce a breaking change in
namings, so that people don't expect the new limits to work "as in the old
framework"... because in the old framework, limit behaviors were not even
internally consistent...
----------------------------------------------------------------------
Comment By: Russell Blau (russblau)
Date: 2009-02-20 10:00
Message:
A good point. A query can have two different types of limits: the limit on
the number of pages/links/whatever retrieved from the API in a single
request (defaults to "max"), and the limit on the total number of items to
be retrieved from a repeated query. We should do this in a way that is (a)
internally consistent among all generators, and (b) as much as possible,
backwards-compatible with the old pagegenerators module (but this is
secondary to getting something that works).
----------------------------------------------------------------------
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=2619054&group_…
Bugs item #2926171, was opened at 2010-01-05 12:16
Message generated for change (Settings changed) made by xqt
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=2926171&group_…
Please note that this message will contain a full copy of the comment thread,
including the initial issue submission, for this request,
not just the latest update.
Category: None
Group: None
>Status: Open
Resolution: None
Priority: 5
Private: No
Submitted By: Yr Wyddfa (yrwyddfa)
Assigned to: xqt (xqt)
Summary: Errorkey: query
Initial Comment:
Hi!
I've got a problem activating pywikipediabot scripts at our private wiki, and since today I get the same error message while logging in.
I'm using Linux ubuntu 9.04 and MediaWiki 1.14.0
Here is the terminal output:
/pywikipedia$ python login.py
Password for user weBot on Dairpedia:de:
Logging in to Dairpedia:de as weBot
Traceback (most recent call last):
File "login.py", line 397, in <module>
main()
File "login.py", line 393, in main
loginMan.login()
File "login.py", line 282, in login
cookiedata = self.getCookie(api)
File "login.py", line 170, in getCookie
response, data = self.site.postData(address, self.site.urlEncode(predata), sysop=self.sysop)
File "/home/heiko/Dokumente/DairAlainn/Dairpedia/pywikipedia/wikipedia.py", line 5897, in postData
self._getUserDataOld(text, sysop = sysop)
File "/home/heiko/Dokumente/DairAlainn/Dairpedia/pywikipedia/wikipedia.py", line 6168, in _getUserDataOld
blocked = self._getBlock(sysop = sysop)
File "/home/heiko/Dokumente/DairAlainn/Dairpedia/pywikipedia/wikipedia.py", line 5500, in _getBlock
data = query.GetData(params, self)['query']['userinfo']
KeyError: 'query'
This KeyError also occurred when the login was successful and I tried to start any script.
Please tell me if you need any additional data.
Thanks a lot in advance!
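The crash site indexes `['query']['userinfo']` directly, so any API reply without a 'query' key (such as an error document) surfaces as a bare KeyError. A hypothetical guard (function name and messages are illustrative, not the wikipedia.py code) would make the real cause visible:

```python
# Hypothetical guard for _getBlock's pattern: inspect the API reply
# before indexing it, so error replies produce a readable message
# instead of KeyError: 'query'.
def get_userinfo(response):
    if 'error' in response:
        raise RuntimeError('API error: %s'
                           % response['error'].get('info', 'unknown'))
    try:
        return response['query']['userinfo']
    except KeyError:
        raise RuntimeError('unexpected API reply, top-level keys: %r'
                           % sorted(response))
```

Seeing the server's actual error text (or the unexpected reply shape) would help tell a MediaWiki 1.14 compatibility problem apart from a login failure.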
----------------------------------------------------------------------
Comment By: Yr Wyddfa (yrwyddfa)
Date: 2010-01-06 00:06
Message:
Using the -verbose option, the terminal gives back this information:
/pywikipedia$ python login.py -verbose
Pywikipediabot [http] trunk/pywikipedia (r7850, 2010/01/02, 12:59:20)
Python 2.6.2 (release26-maint, Apr 19 2009, 01:56:41)
[GCC 4.3.3]
Password for user weBot on Dairpedia:de:
Logging in to Dairpedia:de as weBot
==== API action:query ====
meta: userinfo
uiprop: blockinfo
----------------
Requesting API query from Dairpedia:de
Traceback (most recent call last):
File "login.py", line 397, in <module>
main()
File "login.py", line 393, in main
loginMan.login()
File "login.py", line 282, in login
cookiedata = self.getCookie(api)
File "login.py", line 170, in getCookie
response, data = self.site.postData(address,
self.site.urlEncode(predata), sysop=self.sysop)
File
"/home/heiko/Dokumente/DairAlainn/Dairpedia/pywikipedia/wikipedia.py", line
5897, in postData
self._getUserDataOld(text, sysop = sysop)
File
"/home/heiko/Dokumente/DairAlainn/Dairpedia/pywikipedia/wikipedia.py", line
6168, in _getUserDataOld
blocked = self._getBlock(sysop = sysop)
File
"/home/heiko/Dokumente/DairAlainn/Dairpedia/pywikipedia/wikipedia.py", line
5500, in _getBlock
data = query.GetData(params, self)['query']['userinfo']
KeyError: 'query'
I added your suggested lines to my user_config.py to disable the API.
Unfortunately the terminal output was exactly the same.
Could the version of my Python (2.6.2) be the problem? I heard about
several compatibility problems with Python 3.0, so I didn't install that
version.
Thanks for your time!
----------------------------------------------------------------------
Comment By: xqt (xqt)
Date: 2010-01-05 18:51
Message:
Might be a bug. Please try again with the -verbose option to get more
information about this. It seems the structure has changed after MediaWiki
1.14.
You could also try to run your bot with the API disabled. Just put the
following statements in your user-config.py:
use_api = False
use_api_login = False
----------------------------------------------------------------------
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=2926171&group_…
Bugs item #2926481, was opened at 2010-01-05 22:12
Message generated for change (Comment added) made by xqt
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=2926481&group_…
Please note that this message will contain a full copy of the comment thread,
including the initial issue submission, for this request,
not just the latest update.
Category: other
Group: None
>Status: Closed
>Resolution: Fixed
Priority: 5
Private: No
Submitted By: masti (masti01)
>Assigned to: xqt (xqt)
Summary: disambredir crashes on pl.wiki
Initial Comment:
pwpl]$ python pywikipedia/disambredir.py
Getting [[Kategoria:Strony ujednoznaczniające]] list starting at !...
Getting 60 pages from wikipedia:pl...
Getting 55 pages from wikipedia:pl...
Sleeping for 4.6 seconds, 2010-01-05 22:04:48
>>> 1 Armia <<<
>>> 1 Armia Pancerna <<<
>>> 1 Batalion Ciężkich Karabinów Maszynowych <<<
Traceback (most recent call last):
File "pywikipedia/disambredir.py", line 165, in <module>
main()
File "pywikipedia/disambredir.py", line 159, in main
workon(page,links)
File "pywikipedia/disambredir.py", line 121, in workon
text = treat(text, page2, target)
File "pywikipedia/disambredir.py", line 38, in treat
linkR = re.compile(r'\[\[(?P<title>[^\]\|#]*)(?P<section>#[^\]\|]*)?(\|(?P<label>[^\]]*))?\]\](?P<linktrail>' + linktrail + ')')
NameError: global name 'linktrail' is not defined
Pywikipedia (r7857 (wikipedia.py), 2010/01/05, 17:43:13)
Python 2.6.2 (r262:71600, Aug 21 2009, 12:22:21)
[GCC 4.4.1 20090818 (Red Hat 4.4.1-6)]
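The NameError means `linktrail` was never defined before the regex was built; the fix in r7861 presumably wires in the wiki's per-language link-trail pattern. An illustrative sketch (the default `[a-z]*` and the function name are stand-ins, not the actual patch):

```python
import re

def make_link_regex(linktrail=r'[a-z]*'):
    # linktrail must come from the wiki's per-language configuration
    # (e.g. the site family's link-trail setting); [a-z]* is only a
    # stand-in default for illustration.
    return re.compile(
        r'\[\[(?P<title>[^\]\|#]*)(?P<section>#[^\]\|]*)?'
        r'(\|(?P<label>[^\]]*))?\]\](?P<linktrail>' + linktrail + ')')
```

The named groups split a wikilink like `[[1 Armia|armia]]s` into title, optional section, optional label, and the trailing characters the wiki counts as part of the link.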
----------------------------------------------------------------------
>Comment By: xqt (xqt)
Date: 2010-01-06 07:50
Message:
fixed in r7861
----------------------------------------------------------------------
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=2926481&group_…
Patches item #2926489, was opened at 2010-01-05 22:23
Message generated for change (Comment added) made by xqt
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603140&aid=2926489&group_…
Please note that this message will contain a full copy of the comment thread,
including the initial issue submission, for this request,
not just the latest update.
Category: Translations
Group: None
>Status: Closed
>Resolution: Accepted
Priority: 5
Private: No
Submitted By: masti (masti01)
>Assigned to: xqt (xqt)
Summary: +pl/szl for wikipedia_family
Initial Comment:
Removal of old unused Disambig templates for pl.wiki.
Add Disambig template and category for szl.wiki.
----------------------------------------------------------------------
>Comment By: xqt (xqt)
Date: 2010-01-06 07:08
Message:
done in r7860. Thanks!
----------------------------------------------------------------------
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603140&aid=2926489&group_…
Patches item #2926428, was opened at 2010-01-05 20:38
Message generated for change (Comment added) made by xqt
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603140&aid=2926428&group_…
Please note that this message will contain a full copy of the comment thread,
including the initial issue submission, for this request,
not just the latest update.
Category: Translations
Group: None
>Status: Closed
>Resolution: Accepted
Priority: 5
Private: No
Submitted By: masti (masti01)
>Assigned to: xqt (xqt)
Summary: +pl for category.py
Initial Comment:
add missing pl messages for category.py
----------------------------------------------------------------------
>Comment By: xqt (xqt)
Date: 2010-01-06 06:51
Message:
done in r7859
----------------------------------------------------------------------
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603140&aid=2926428&group_…