Bugs item #2922193, was opened at 2009-12-28 11:56
Message generated for change (Comment added) made by xqt
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=2922193&group_…
Please note that this message will contain a full copy of the comment thread,
including the initial issue submission, for this request,
not just the latest update.
Category: interwiki
Group: None
Status: Open
Resolution: None
Priority: 5
Private: No
Submitted By: masti (masti01)
Assigned to: Nobody/Anonymous (nobody)
Summary: interwiki fails on pl.wikibooks
Initial Comment:
While running interwiki.py on pl.wikibooks, the script goes through pages, but from time to time it stalls with the error "Received incomplete XML data. Sleeping for 15 seconds..." and stays like that forever.
example:
$ python pywikipedia/interwiki.py -start:"H. K. T." -auto
NOTE: Number of pages queued is 0, trying to add 60 more.
Getting 60 pages from wikisource:pl...
[then some output about processing]
[and then ...]
======Post-processing [[pl:Ha! jeszcze o mnie...]]======
Updating links on page [[pl:Ha! jeszcze o mnie...]].
No changes needed
======Post-processing [[pl:H. K. T.]]======
Updating links on page [[pl:H. K. T.]].
No changes needed
NOTE: The first unfinished subject is [[pl:Had we never loved so kindly]]
NOTE: Number of pages queued is 10, trying to add 60 more.
Getting 60 pages from wikisource:pl...
Received incomplete XML data. Sleeping for 15 seconds...
Received incomplete XML data. Sleeping for 30 seconds...
Received incomplete XML data. Sleeping for 45 seconds...
Received incomplete XML data. Sleeping for 60 seconds...
Received incomplete XML data. Sleeping for 75 seconds...
Received incomplete XML data. Sleeping for 135 seconds...
Received incomplete XML data. Sleeping for 195 seconds...
Received incomplete XML data. Sleeping for 255 seconds...
Received incomplete XML data. Sleeping for 315 seconds...
and so on ...
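(For reference: the waits above follow a simple grow-the-sleep retry pattern. A minimal sketch of that pattern, not wikipedia.py's actual code; fetch stands in for any zero-argument callable returning the raw export text:)
import time

def fetch_with_backoff(fetch, max_tries=10):
    # Retry the fetch, growing the sleep after each incomplete response,
    # roughly mirroring the log output above.
    wait = 15
    for attempt in range(max_tries):
        data = fetch()
        if data.endswith('</mediawiki>\n'):  # a complete XML export
            return data
        print 'Received incomplete XML data. Sleeping for %d seconds...' % wait
        time.sleep(wait)
        wait += 15 if attempt < 4 else 60    # step widens later, as in the log
    raise RuntimeError('data still incomplete after %d tries' % max_tries)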
python pywikipedia/version.py
Pywikipedia (r7830 (wikipedia.py), 2009/12/27, 14:20:21)
Python 2.6.2 (r262:71600, Aug 21 2009, 12:22:21)
[GCC 4.4.1 20090818 (Red Hat 4.4.1-6)]
----------------------------------------------------------------------
>Comment By: xqt (xqt)
Date: 2009-12-29 19:55
Message:
You can try to export this page via [[s:pl:Specjalna:Eksport]]. That does not
work either; maybe the page is corrupt. We should ask in the tech channel.
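(A minimal way to run that export check from Python 2, using the generic Special:Export endpoint, which MediaWiki aliases to Specjalna:Eksport on pl projects:)
import urllib

# Fetch the raw XML export of the suspect page and inspect its tail.
title = urllib.quote('Historyja literatury angielskiej')
data = urllib.urlopen('http://pl.wikisource.org/wiki/Special:Export/' + title).read()
print repr(data[-20:])                 # a complete dump ends with '</mediawiki>\n'
print data.endswith('</mediawiki>\n')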
----------------------------------------------------------------------
Comment By: masti (masti01)
Date: 2009-12-29 19:49
Message:
You're right; the output for this page is:
'\n <id>14057</id>\n'
----------------------------------------------------------------------
Comment By: xqt (xqt)
Date: 2009-12-29 19:35
Message:
I guess this is a problem with the page 'Historyja literatury angielskiej'.
You may try the statements below, but with
page=wikipedia.Page(site, 'Historyja literatury angielskiej')
it won't work. But I have no idea why for now.
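(To narrow it down title by title, the same test can be looped; a small sketch built only from the calls used elsewhere in this thread:)
import wikipedia

# Fetch each suspect page on its own and report whether the export is complete.
site = wikipedia.getSite('pl', 'wikisource')
for title in ['H. K. T.', 'Historyja literatury angielskiej']:
    ga = wikipedia._GetAll(site, pages=[], throttle=0, force=True)
    ga.pages = [wikipedia.Page(site, title)]
    data = ga.getData()
    print '%-40s complete: %s' % (title, data.endswith('</mediawiki>\n'))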
----------------------------------------------------------------------
Comment By: masti (masti01)
Date: 2009-12-29 15:51
Message:
Yes, it works:
[mst@pl37007 pywikipedia]$ python
Python 2.6.2 (r262:71600, Aug 21 2009, 12:22:21)
[GCC 4.4.1 20090818 (Red Hat 4.4.1-6)] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> import wikipedia
>>> site = wikipedia.getSite('pl','wikisource')
>>> ga = wikipedia._GetAll(site, pages=[], throttle=0, force=True)
>>> page=wikipedia.Page(site, 'H. K. T.')
>>> ga.pages = [page]
>>> data=ga.getData()
>>> data[-20:]
'/page>\n</mediawiki>\n'
----------------------------------------------------------------------
Comment By: xqt (xqt)
Date: 2009-12-29 09:16
Message:
Could you test the following statements in your Python shell (IDLE):
>>> import wikipedia
>>> site = wikipedia.getSite('pl', 'wikisource')
>>> ga = wikipedia._GetAll(site, pages=[], throttle=0, force=True)
>>> page=wikipedia.Page(site, 'H. K. T')
>>> ga.pages = [page]
>>> data=ga.getData()
>>> data[-20:]
As a result you should see this:
'einfo>\n</mediawiki>\n'
>>>
----------------------------------------------------------------------
Comment By: masti (masti01)
Date: 2009-12-28 18:50
Message:
One thing was wrong: it's pl.wikisource, not wikibooks. This error has
persisted for some months already, and it happens every time I run a bot, so
it should not be due to server performance. I was running a bot on several
other projects today as well, and this happens only on pl.wikisource. This
is just one example; the bot stalls on several different articles.
I am using the API.
----------------------------------------------------------------------
Comment By: xqt (xqt)
Date: 2009-12-28 18:38
Message:
I found the server was very slow today for the pl sites; I had a similar
delay on pl-wiki. Maybe it will work later.
On the other hand, you can try it via the API. Does that work?
----------------------------------------------------------------------
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=2922193&group_…
Bugs item #2922896, was opened at 2009-12-29 17:25
Message generated for change (Comment added) made by wikimercy
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=2922896&group_…
Please note that this message will contain a full copy of the comment thread,
including the initial issue submission, for this request,
not just the latest update.
Category: General
Group: None
>Status: Open
Resolution: None
Priority: 5
Private: No
Submitted By: Mercy (wikimercy)
Assigned to: Nobody/Anonymous (nobody)
Summary: Bot corrupting characters on eo.wp
Initial Comment:
See http://en.wikipedia.org/w/index.php?title=User_talk:Mercy&diff=334694634&ol… for more information.
An example of a wrong edit can be found here: http://eo.wikipedia.org/w/index.php?title=Maigret&diff=2662069&oldid=2500605
The whole situation happened when I used the feature.py script to add Link_FA templates to the articles.
I'm using Python 2.6.3.
----------------------------------------------------------------------
>Comment By: Mercy (wikimercy)
Date: 2009-12-29 19:15
Message:
Hi, my current SVN revision is r7844, and I'm using the API.
----------------------------------------------------------------------
Comment By: xqt (xqt)
Date: 2009-12-29 19:03
Message:
I need further information:
- the actual release of pywikipediabot you are using
- whether use_api=True is set in your user-config.py (see the sketch below)
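(For reference, a minimal user-config.py excerpt with that switch set; family, language, and the account name here are illustrative:)
# user-config.py (excerpt)
family = 'wikipedia'
mylang = 'eo'
usernames['wikipedia']['eo'] = u'ExampleBot'  # illustrative account name
use_api = True                                # route requests through api.php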
----------------------------------------------------------------------
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=2922896&group_…
Bugs item #2825996, was opened at 2009-07-23 16:01
Message generated for change (Comment added) made by xqt
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=2825996&group_…
Please note that this message will contain a full copy of the comment thread,
including the initial issue submission, for this request,
not just the latest update.
Category: None
Group: None
>Status: Closed
>Resolution: Works For Me
Priority: 5
Private: No
Submitted By: JAn (jandudik)
>Assigned to: xqt (xqt)
Summary: neverending loading
Initial Comment:
I ran the bot:
interwiki.py -force -whenneeded -continue -lang:na
The dump contained a list of articles, one unique title per line, but the
bot still kept loading new pages:
(NOTE: The first unfinished subject is [[na:Uetersen]]
NOTE: Number of pages queued is 99, trying to add 60 more.)
So I interrupted it, and the dump then contained many links more than once
(see attachments).
After [[Żagań]] it loaded [[Tarnów]].
----------------------------------------------------------------------
>Comment By: xqt (xqt)
Date: 2009-12-29 13:25
Message:
I tested both examples with pagegenerators and it stopped.
(release 7844)
----------------------------------------------------------------------
Comment By: Mikko Silvonen (silvonen)
Date: 2009-07-29 19:16
Message:
Bots now also get stuck when you use the -start parameter near the end of
the wiki. I do think this is a bug.
Here's an example on a small wiki:
> python version.py
Pywikipedia [http] trunk/pywikipedia (r7101, 2009/07/27, 15:54:13)
Python 2.5.4 (r254:67916, Jan 29 2009, 12:02:11) [MSC v.1310 32 bit
(Intel)]
> pagegenerators.py -lang:fiu-vro -start:Z
Ähijärv
Äksi kihlkund
Ängä oja
Äniniidü lump
Ärnu jõgi
Ärq ei lääq
Ärqseletüs
Äühvoja
Õdagu-Viro maakund
Õdagumeresoomõ keeleq
Õdagumeri
Õdri järv
Õigustiidüs
Õuraasia
Õuro
Õuruupa
Õuruupa Liit
Üleherküs
Ülembjärv
Ülemine järv
Ülemäne jõgi
Ülene internetitunnus
Üräski oja
Ütidse osaga hulgaq
Ütidse osalda hulgaq
Ütine osa
Ütiskuningriik
Üts ummamuudu liin
Ütsik täht
Üvvärjärv
Üü tulõk
Üübjärv
Dông Hoi
Swinoujscie
Swietochlowice
Zemaidi kiil
When I press Ctrl+C, I get this traceback.
Traceback (most recent call last):
File "C:\svn\pywikipedia\pagegenerators.py", line 1138, in <module>
for page in gen:
File "C:\svn\pywikipedia\pagegenerators.py", line 738, in
DuplicateFilterPageGenerator
for page in generator:
File "C:\svn\pywikipedia\pagegenerators.py", line 263, in
AllpagesPageGenerator
for page in site.allpages(start = start, namespace = namespace,
includeredirects = includeredirects):
File "c:\svn\pywikipedia\wikipedia.py", line 5681, in allpages
get_throttle()
File "c:\svn\pywikipedia\wikipedia.py", line 3373, in __call__
time.sleep(waittime)
KeyboardInterrupt
----------------------------------------------------------------------
Comment By: Nobody/Anonymous (nobody)
Date: 2009-07-24 09:14
Message:
It is a bug. In older versions the bot loaded only the rest of the pages and
then stopped; now there is an *infinite loop*.
-restore is only useful for a dump in which the last article is present.
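(For context, a page generator like this has to detect that the listing has run past the last title. A minimal sketch of the intended stop condition, illustrative rather than pywikipedia's actual allpages code; fetch_batch is a hypothetical helper:)
def allpages_batches(fetch_batch, start, batch_size=60):
    # Yield pages batch by batch; a short batch means the end of the wiki
    # was reached, so stop instead of looping forever.
    while True:
        batch = fetch_batch(start, batch_size)
        for page in batch:
            yield page
        if len(batch) < batch_size:  # ran off the end of the title list
            return
        start = batch[-1]  # a real client would use the server's continue value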
----------------------------------------------------------------------
Comment By: xqt (xqt)
Date: 2009-07-24 07:24
Message:
It's not a bug. Use -restore instead of the -continue option and it will
stop.
----------------------------------------------------------------------
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=2825996&group_…
Bugs item #2882418, was opened at 2009-10-20 17:30
Message generated for change (Comment added) made by xqt
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=2882418&group_…
Please note that this message will contain a full copy of the comment thread,
including the initial issue submission, for this request,
not just the latest update.
Category: interwiki
Group: None
>Status: Closed
>Resolution: Out of Date
Priority: 5
Private: No
Submitted By: aindrij Bondarenko (paleozavr)
>Assigned to: xqt (xqt)
Summary: double interwiki
Initial Comment:
A curious addition by my bot was detected on pl-wiki [http://pl.wikipedia.org/w/index.php?title=Cigliano&action=historysubmit&dif…]. Please make such edits impossible for bots using interwiki.py.
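(A minimal sketch of the kind of guard that would prevent this, assuming link objects expose their target site the way pywikipedia Page objects do; illustrative, not the actual interwiki.py code:)
def unique_interwiki(links):
    # Keep only the first link per language so a page never ends up
    # with two interwikis pointing at the same wiki.
    seen = set()
    result = []
    for link in links:
        lang = link.site().lang
        if lang not in seen:
            seen.add(lang)
            result.append(link)
    return result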
----------------------------------------------------------------------
>Comment By: xqt (xqt)
Date: 2009-12-29 13:18
Message:
I guess this has been fixed in r7234.
----------------------------------------------------------------------
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=2882418&group_…
Bugs item #2616329, was opened at 2009-02-19 14:47
Message generated for change (Comment added) made by xqt
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=2616329&group_…
Please note that this message will contain a full copy of the comment thread,
including the initial issue submission, for this request,
not just the latest update.
Category: interwiki
Group: None
>Status: Closed
>Resolution: Fixed
Priority: 5
Private: No
Submitted By: Nobody/Anonymous (nobody)
>Assigned to: xqt (xqt)
Summary: -until: doesn't work correctly
Initial Comment:
When I run the bot with -start:foo and [[foo]] does not exist, the bot
starts from the next title (fop).
But when I run it with -until:foo and foo does not exist, the bot does not
stop at [[fonz]] or [[fop]] but continues to the last article.
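(A minimal sketch of the stop check this implies: compare titles lexicographically instead of testing for equality, so a missing -until target still ends the run; illustrative only:)
def past_until(title, until):
    # True once the scan has passed 'until', even if that exact page
    # does not exist; an equality test would never fire in that case.
    return bool(until) and title > until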
----------------------------------------------------------------------
>Comment By: xqt (xqt)
Date: 2009-12-29 13:10
Message:
fixed in r7844.
----------------------------------------------------------------------
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=2616329&group_…