https://bugzilla.wikimedia.org/show_bug.cgi?id=59649
Web browser: ---
Bug ID: 59649
Summary: transferbot does not support namespaces in right way
Product: Pywikibot
Version: core (2.0)
Hardware: All
OS: All
Status: NEW
Severity: normal
Priority: Unprioritized
Component: General
Assignee: Pywikipedia-bugs(a)lists.wikimedia.org
Reporter: info(a)gno.de
Classification: Unclassified
Mobile Platform: ---
c:\Pywikipedia\ssh\pywikibot\core>pwb.py transferbot -simulate -page:user:xqt/Test -tolang:en -overwrite -tofamily:wiktionary
Page transfer configuration
---------------------------
Source: Site("de", "wikipedia")
Target: Site("en", "wiktionary")
Pages to transfer: -page:user:xqt/Test
Prefix for transferred pages:
Moving [[Benutzer:Xqt/Test]] to [[wiktionary:en:Benutzer:Xqt/Test]]...
<noinclude>
The appended summary text is:
<small>This page was moved from [[wikipedia:de:Benutzer:Xqt/Test]]. Its edit history can be viewed at [[Benutzer:Xqt/Test/edithistory]]</small>
The namespace aliases on the target site are wrong. That could be fixed easily, but I don't know how to handle the prefix, because the prefix itself might be a namespace. What is the right order:
prefix + namespace + title
or
prefix + title
or
namespace + prefix + title
where title is without namespace in this sample.
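The three candidate orderings can be sketched as follows. This is an illustrative helper, not the transferbot API; the function name, the `order` strings, and the example prefix are assumptions for demonstration only.

```python
# Hypothetical sketch of the three candidate orderings for composing the
# target title. Names here are illustrative, not transferbot internals.
def compose_title(prefix, namespace, title, order="namespace+prefix+title"):
    """Compose a target page title from a prefix, a namespace, and a bare title."""
    ns = namespace + ":" if namespace else ""
    if order == "prefix+namespace+title":
        return prefix + ns + title
    if order == "prefix+title":
        return prefix + title
    if order == "namespace+prefix+title":
        return ns + prefix + title
    raise ValueError("unknown order: %s" % order)

# The ambiguity the report describes: if the prefix is itself a namespace
# name on the target wiki, "prefix+namespace+title" yields a doubled
# namespace, while "namespace+prefix+title" treats the prefix as a
# subpage-style path component inside the namespace.
print(compose_title("Transferred/", "User", "Xqt/Test"))
# -> User:Transferred/Xqt/Test
print(compose_title("Transferred/", "User", "Xqt/Test",
                    order="prefix+namespace+title"))
# -> Transferred/User:Xqt/Test
```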
--
You are receiving this mail because:
You are the assignee for the bug.
https://bugzilla.wikimedia.org/show_bug.cgi?id=62964
Bug ID: 62964
Summary: RuntimeError: maximum recursion depth exceeded in cmp
Product: Pywikibot
Version: core (2.0)
Hardware: All
OS: All
Status: NEW
Severity: normal
Priority: Unprioritized
Component: General
Assignee: Pywikipedia-bugs(a)lists.wikimedia.org
Reporter: maarten(a)mdammers.nl
Web browser: ---
Mobile Platform: ---
I'm running:
pwb.py claimit.py -lang:tl P31 Q55488 P17 Q928
-catr:Category:Mga_estasyon_ng_daangbakal_ng_Pilipinas
Processing [[tl:Estasyong daangbakal ng Abenidang E. Delos Santos]]
Processing [[tl:Estasyong daangbakal ng Laong Laan]]
Processing [[tl:Estasyong daangbakal ng Nichols]]
Processing [[tl:Estasyong daangbakal ng Santa Mesa]]
Processing [[tl:Estasyong Abad Santos ng LRT]]
Processing [[tl:Estasyong Anonas ng LRT]]
[[tl:Estasyong Anonas ng LRT]] doesn't have a wikidata item :(
Processing [[tl:Estasyong Baclaran ng LRT]]
Processing [[tl:Estasyong Betty Go-Belmonte ng LRT]]
Processing [[tl:Estasyong Blumentritt ng LRT]]
Processing [[tl:Estasyong Sentrong Araneta-Cubao ng LRT]]
[[tl:Estasyong Sentrong Araneta-Cubao ng LRT]] doesn't have a wikidata item :(
Processing [[tl:Estasyong Abenidang E.Delos Santos ng LRT]]
Processing [[tl:Estasyong Doroteo Jose ng LRT]]
[[tl:Estasyong Doroteo Jose ng LRT]] doesn't have a wikidata item :(
Processing [[tl:Estasyong Gilmore ng LRT]]
[[tl:Estasyong Gilmore ng LRT]] doesn't have a wikidata item :(
Processing [[tl:Estasyong J. Ruiz ng LRT]]
Processing [[tl:Estasyong Katipunan ng LRT]]
Processing [[tl:Estasyong Legarda ng LRT]]
Processing [[tl:Estasyong Quirino ng LRT]]
Processing [[tl:Estasyong Recto ng LRT]]
Processing [[tl:Estasyong Santolan ng LRT]]
[[tl:Estasyong Santolan ng LRT]] doesn't have a wikidata item :(
Processing [[tl:Estasyong Tayuman ng LRT]]
Processing [[tl:Estasyong V. Mapa ng LRT]]
Processing [[tl:Estasyong Ayala ng MRT]]
Processing [[tl:Estasyong Boni ng MRT]]
Processing [[tl:Estasyong Buendia ng MRT]]
Processing [[tl:Estasyong Sentrong Araneta-Cubao ng MRT]]
Processing [[tl:Estasyong Guadalupe ng MRT]]
Processing [[tl:Estasyong Abenida Hilaga ng MRT]]
Processing [[tl:Estasyong Kamuning ng MRT]]
Processing [[tl:Estasyong Magallanes ng MRT]]
Processing [[tl:Estasyong Ortigas ng MRT]]
Processing [[tl:Estasyong Abenida Quezon ng MRT]]
Processing [[tl:Estasyong Bulebar Shaw ng MRT]]
Processing [[tl:Estasyong Santolan ng MRT]]
Processing [[tl:Estasyong Abenida Taft ng MRT]]
Traceback (most recent call last):
File "C:\pywikibot\core\pwb.py", line 143, in <module>
run_python_file(fn, argv, argvu)
File "C:\pywikibot\core\pwb.py", line 67, in run_python_file
exec(compile(source, filename, "exec"), main_mod.__dict__)
File "C:\pywikibot\core\scripts\claimit.py", line 219, in <module>
main()
File "C:\pywikibot\core\scripts\claimit.py", line 216, in main
bot.run()
File "C:\pywikibot\core\scripts\claimit.py", line 111, in run
for page in self.generator:
File "C:\pywikibot\core\pywikibot\pagegenerators.py", line 799, in DuplicateFilterPageGenerator
for page in generator:
File "C:\pywikibot\core\pywikibot\pagegenerators.py", line 661, in CategorizedPageGenerator
for a in category.articles(**kwargs):
File "C:\pywikibot\core\pywikibot\page.py", line 1879, in articles
endsort=endsort
(.....)
File "C:\pywikibot\core\pywikibot\page.py", line 1879, in articles
endsort=endsort
File "C:\pywikibot\core\pywikibot\page.py", line 1879, in articles
endsort=endsort
File "C:\pywikibot\core\pywikibot\page.py", line 1879, in articles
endsort=endsort
File "C:\pywikibot\core\pywikibot\page.py", line 1879, in articles
endsort=endsort
File "C:\pywikibot\core\pywikibot\page.py", line 1879, in articles
endsort=endsort
File "C:\pywikibot\core\pywikibot\page.py", line 1871, in articles
for subcat in self.subcategories(step=step):
File "C:\pywikibot\core\pywikibot\page.py", line 1777, in subcategories
total=total, content=content):
File "C:\pywikibot\core\pywikibot\site.py", line 1833, in categorymembers
**cmargs)
File "C:\pywikibot\core\pywikibot\site.py", line 832, in _generator
gen = gen_class(type_arg, site=self, **args)
File "C:\pywikibot\core\pywikibot\data\api.py", line 786, in __init__
QueryGenerator.__init__(self, generator=generator, **kwargs)
File "C:\pywikibot\core\pywikibot\data\api.py", line 517, in __init__
self.request = Request(**kwargs)
File "C:\pywikibot\core\pywikibot\data\api.py", line 137, in __init__
self.update(**kwargs)
File "C:\Python27\lib\_abcoll.py", line 492, in update
if isinstance(other, Mapping):
File "C:\Python27\lib\abc.py", line 141, in __instancecheck__
subtype in cls._abc_negative_cache):
File "C:\Python27\lib\_weakrefset.py", line 73, in __contains__
return wr in self.data
RuntimeError: maximum recursion depth exceeded in cmp
<type 'exceptions.RuntimeError'>
CRITICAL: Waiting for 1 network thread(s) to finish. Press ctrl-c to abort
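The repeating `articles` / `subcategories` frames suggest the category tree is being walked recursively, and category graphs on a wiki can contain cycles. A minimal sketch of a cycle-safe traversal, using an explicit stack and a visited set instead of recursion (the `dict` graph is a stand-in for the wiki's category structure, not a pywikibot API):

```python
# Cycle-safe category traversal: an explicit stack plus a visited set
# guarantees each category is expanded at most once, so a cycle like
# A -> B -> C -> A cannot cause unbounded recursion.
def walk_categories(graph, start):
    """Yield every category reachable from start, each exactly once."""
    seen = set()
    stack = [start]
    while stack:
        cat = stack.pop()
        if cat in seen:
            continue  # already visited: this is where a cycle is cut
        seen.add(cat)
        yield cat
        stack.extend(graph.get(cat, ()))

# A graph with a cycle: A -> B -> C -> A
graph = {"A": ["B"], "B": ["C"], "C": ["A"]}
print(sorted(walk_categories(graph, "A")))  # -> ['A', 'B', 'C']
```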
https://bugzilla.wikimedia.org/show_bug.cgi?id=55272
Web browser: ---
Bug ID: 55272
Summary: redirectRegex throws type error
Product: Pywikibot
Version: unspecified
Hardware: All
OS: All
Status: ASSIGNED
Severity: normal
Priority: Unprioritized
Component: General
Assignee: Pywikipedia-bugs(a)lists.wikimedia.org
Reporter: legoktm.wikipedia(a)gmail.com
Classification: Unclassified
Mobile Platform: ---
Originally from: http://sourceforge.net/p/pywikipediabot/bugs/1201/
Reported by: dnessett
Created on: 2010-06-24 16:31:01
Subject: redirectRegex throws type error
Assigned to: xqt
Original description:
Running MW 1.13.2, the following command throws a type error:
$ python add_text.py -cat:Pages_with_too_many_expensive_parser_function_calls -text:" " -summary:"Test edit:Category jog for [[:Category:Pages with too many expensive parser function calls|Pages with too many expensive parser function calls]]"
The result is:
Getting [[Category:Pages with too many expensive parser function calls]]...
Loading 2009 White House Forum on Health Reform/Related Articles...
Do you want to accept these changes? ([y]es, [N]o, [a]ll) a
Updating page [[2009 White House Forum on Health Reform/Related Articles]] via API
Loading 2010 United Kingdom general election/Related Articles...
Traceback (most recent call last):
File "add_text.py", line 417, in <module>
main()
File "add_text.py", line 413, in main
create=talkPage)
File "add_text.py", line 201, in add_text
text = page.get()
File "/usr/local/src/python/pywikipedia/local_sites/wikipedia.py", line 619, in get
self._contents = self._getEditPage(get_redirect = get_redirect, throttle = throttle, sysop = sysop)
File "/usr/local/src/python/pywikipedia/local_sites/wikipedia.py", line 727, in _getEditPage
m = self.site().redirectRegex().match(pagetext)
File "/usr/local/src/python/pywikipedia/local_sites/wikipedia.py", line 6644, in redirectRegex
pattern = r'(?:' + '|'.join(keywords) + ')'
TypeError
version.py output is:
$ python version.py
Pywikipedia [http] trunk/pywikipedia (r8311, 2010/06/22, 13:20:10)
Python 2.5.2 (r252:60911, Jan 20 2010, 21:48:48)
[GCC 4.2.4 (Ubuntu 4.2.4-1ubuntu3)]
config-settings:
use_api = True
use_api_login = True
This error occurs due to the following bug in the code. At line 6642 is the following code fragment:
try:
keywords = self.getmagicwords('redirect')
pattern = r'(?:' + '|'.join(keywords) + ')'
except KeyError:
# no localized keyword for redirects
pattern = r'#%s' % default
getmagicwords is a one-line method that simply calls siteinfo (line 5480) with the key 'magicwords'. At line 5518, siteinfo calls getData to obtain site data. When looking for magicwords, the method executes "for entry in data[key]" at line 5527. For certain versions of MW, magicwords are not returned as part of the site data, so data[key] returns a null result. Eventually, this leads to the KeyError exception at line 5538.
The bug arises because siteinfo catches the KeyError exception and returns a result of None. By the time the call unwinds back to line 6643, the provision for a KeyError at line 6645 is vacuous: the KeyError has already been caught inside siteinfo.
Consequently, the statement at line 6644 executes. This causes a TypeError, since the keywords argument to .join() is null.
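Following that analysis, a sketch of one possible fix: test for a falsy result instead of (or in addition to) catching KeyError, since siteinfo already swallows the KeyError and hands back None. This is a standalone illustration, not the actual wikipedia.py code; the `default` value is an assumption.

```python
import re

# Sketch of the described fix: '|'.join(None) raises TypeError, so the
# None returned by siteinfo/getmagicwords on old MediaWiki versions must
# be caught explicitly before the pattern is built.
def redirect_regex(keywords, default="REDIRECT"):
    """Build a redirect-detection regex from localized magic words."""
    if not keywords:  # None (no magicwords in siteinfo) or an empty list
        pattern = r'#%s' % default
    else:
        pattern = r'(?:' + '|'.join(keywords) + r')'
    return re.compile(pattern, re.IGNORECASE)

print(redirect_regex(None).pattern)                          # falls back to '#REDIRECT'
print(redirect_regex(["REDIRECT", "WEITERLEITUNG"]).pattern) # localized keywords joined
```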
https://bugzilla.wikimedia.org/show_bug.cgi?id=64846
Bug ID: 64846
Summary: Provide sensible documentation platform
Product: Pywikibot
Version: core (2.0)
Hardware: All
OS: All
Status: NEW
Severity: normal
Priority: Unprioritized
Component: General
Assignee: Pywikipedia-bugs(a)lists.wikimedia.org
Reporter: valhallasw(a)arctus.nl
CC: info(a)gno.de, maarten(a)mdammers.nl
Blocks: 64840
Web browser: ---
Mobile Platform: ---
The typical documentation platform for Python projects is Sphinx (usually combined with readthedocs). However, we have mediawiki.org, which has the advantage of translated pages.
It would be nice to have the English docs in the git repository, as that would
make keeping them up-to-date *much* easier. In addition, that would make it
much easier to auto-generate parts of the documentation (e.g. script parameters
and API docs).
It would be nice if we could find a way to have the best of both worlds, e.g.
by having something that parses docs to Translate-enabled wikitext.
https://bugzilla.wikimedia.org/show_bug.cgi?id=64841
Bug ID: 64841
Summary: Document possible configuration settings
Product: Pywikibot
Version: core (2.0)
Hardware: All
OS: All
Status: NEW
Severity: normal
Priority: Unprioritized
Component: documentation
Assignee: Pywikipedia-bugs(a)lists.wikimedia.org
Reporter: valhallasw(a)arctus.nl
Blocks: 64840
Web browser: ---
Mobile Platform: ---
Currently, https://www.mediawiki.org/wiki/Manual:Pywikibot/user-config.py documents some options, but it is aimed more at creating a config file from scratch (for which we now have generate-user-config.py).
It would be good to have nicer documentation of all the possible settings.
https://bugzilla.wikimedia.org/show_bug.cgi?id=55234
Web browser: ---
Bug ID: 55234
Summary: bug with section titles in interwiki.py
Product: Pywikibot
Version: unspecified
Hardware: All
OS: All
Status: NEW
Severity: normal
Priority: Unprioritized
Component: interwiki.py
Assignee: Pywikipedia-bugs(a)lists.wikimedia.org
Reporter: legoktm.wikipedia(a)gmail.com
Classification: Unclassified
Mobile Platform: ---
Originally from: http://sourceforge.net/p/pywikipediabot/bugs/1350/
Reported by: toto-azero
Created on: 2011-09-25 11:54:52
Subject: bug with section titles in interwiki.py
Original description:
There is a bug in the interwiki.py script: when the bot finds an interwiki link which must be deleted (without the -force or -cleanup options), and this link is at the end of a section, the bot destroys the next section.
See this for example:
http://ja.wikipedia.org/w/index.php?title=ヘンリー銃&diff=37754031&oldid=37738852
Although this diff is quite old, the bug is still present in the script (I've made a test to check: see http://fr.wikipedia.org/w/index.php?diff=70277802).
--------------------
python version.py
Pywikipedia [http] trunk/pywikipedia (r9543, 2011/09/25, 09:08:55)
Python 2.7.1 (r271:86832, Jan 4 2011, 13:57:14)
[GCC 4.5.2]
config-settings:
use_api = True
use_api_login = True
unicode test: ok
https://bugzilla.wikimedia.org/show_bug.cgi?id=55584
Web browser: ---
Bug ID: 55584
Summary: Don't clutter /usr/lib/python2.7/site-packages/
Product: Pywikibot
Version: unspecified
Hardware: All
OS: Linux
Status: UNCONFIRMED
Severity: normal
Priority: Unprioritized
Component: General
Assignee: Pywikipedia-bugs(a)lists.wikimedia.org
Reporter: yardenack(a)gmail.com
Classification: Unclassified
Mobile Platform: ---
It's reasonable to expect setup.py to install things in this directory:
/usr/lib/python2.7/site-packages/pywikibot/
But it's not reasonable for it to also put things in these directories:
/usr/lib/python2.7/site-packages/externals/
/usr/lib/python2.7/site-packages/scripts/
/usr/lib/python2.7/site-packages/tests/
What if every python project tried to do that?
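The usual fix is to restrict which packages setup.py installs to the project's own top level. A hedged sketch using setuptools' `find_packages` filtering (the throwaway directory tree below merely mimics the repository layout; it is not the actual setup.py):

```python
# Demonstrate restricting find_packages() so only pywikibot and its
# subpackages would be installed, instead of also dropping top-level
# "externals", "scripts", and "tests" packages into site-packages.
import os
import tempfile

from setuptools import find_packages

# Build a throwaway tree mimicking the repository layout.
root = tempfile.mkdtemp()
for pkg in ("pywikibot", "externals", "scripts", "tests"):
    os.makedirs(os.path.join(root, pkg))
    open(os.path.join(root, pkg, "__init__.py"), "w").close()

# Unfiltered: every top-level package would be installed.
print(sorted(find_packages(where=root)))
# -> ['externals', 'pywikibot', 'scripts', 'tests']

# Filtered: only the project's own namespace is picked up, so a
# setup(packages=...) call using this list no longer clutters
# site-packages with generic directory names.
print(sorted(find_packages(where=root, include=["pywikibot", "pywikibot.*"])))
# -> ['pywikibot']
```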
https://bugzilla.wikimedia.org/show_bug.cgi?id=55323
Web browser: ---
Bug ID: 55323
Summary: interwiki links on subpages in templates
Product: Pywikibot
Version: unspecified
Hardware: All
OS: All
Status: ASSIGNED
Severity: normal
Priority: Unprioritized
Component: interwiki.py
Assignee: Pywikipedia-bugs(a)lists.wikimedia.org
Reporter: legoktm.wikipedia(a)gmail.com
Classification: Unclassified
Mobile Platform: ---
Originally from: http://sourceforge.net/p/pywikipediabot/bugs/665/
Reported by: Anonymous user
Created on: 2008-03-24 13:19:34
Subject: interwiki links on subpages in templates
Assigned to: bewareofdoug
Original description:
In the English and some other major Wikipedias, the interwiki links of templates are placed on a /doc subpage (or whatever it is called locally). The interwiki bot should check whether such a page exists and, if so, place or update the interwiki links on that subpage rather than on the main template page. Otherwise, every time a bot places interwikis on a template with this structure, the main template page needs to be cleaned up and the interwiki links moved to the subpage manually.
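The requested behaviour amounts to a single pre-edit check. An illustrative sketch (the `page_exists` callable is a stand-in for a real existence check, not a pywikibot API):

```python
# Decide which page should carry a template's interwiki links: the /doc
# subpage when it exists, the template page itself otherwise.
def interwiki_target(template_title, page_exists):
    """Return the title of the page that should hold the interwiki links."""
    doc = template_title + "/doc"
    return doc if page_exists(doc) else template_title

# Stand-in for the wiki: a set of pages known to exist.
existing = {"Template:Infobox/doc"}
print(interwiki_target("Template:Infobox", existing.__contains__))  # -> Template:Infobox/doc
print(interwiki_target("Template:Cite", existing.__contains__))     # -> Template:Cite
```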
https://bugzilla.wikimedia.org/show_bug.cgi?id=55321
Web browser: ---
Bug ID: 55321
Summary: interwiki.py moving away {{Link FA}}
Product: Pywikibot
Version: unspecified
Hardware: All
OS: All
Status: NEW
Severity: normal
Priority: Unprioritized
Component: interwiki.py
Assignee: Pywikipedia-bugs(a)lists.wikimedia.org
Reporter: legoktm.wikipedia(a)gmail.com
Classification: Unclassified
Mobile Platform: ---
Originally from: http://sourceforge.net/p/pywikipediabot/bugs/725/
Reported by: Anonymous user
Created on: 2008-06-02 13:02:43
Subject: interwiki.py moving away {{Link FA}}
Original description:
featured.py puts {{Link FA}} either before all interwikis or right next to the affected interwiki, depending on the wiki setting.
Nevertheless, interwiki.py always moves {{Link FA}} to the top of the interwikis, no matter what language is being updated.
https://bugzilla.wikimedia.org/show_bug.cgi?id=66063
Bug ID: 66063
Summary: network errors not handled correctly
Product: Pywikibot
Version: core (2.0)
Hardware: All
OS: All
Status: NEW
Severity: normal
Priority: Unprioritized
Component: General
Assignee: Pywikipedia-bugs(a)lists.wikimedia.org
Reporter: jayvdb(a)gmail.com
Web browser: ---
Mobile Platform: ---
In the event of an HTTP issue of any kind, pywikibot will by default retry 25 times with a 2-minute delay, meaning a 50-minute wait until it returns to the command line, unless the user interrupts the process with ^C.
On each retry it prints an exception.
In the case I am seeing now, the issue is a firewall preventing access to the wiki. The firewall responds immediately, so this is not a timeout, and 25 retries 2 minutes apart will not magically make the problem disappear.
WARNING: Waiting 120 seconds before retrying.
ERROR: Traceback (most recent call last):
File "..pywikibot/data/api.py", line 306, in submit
body=paramstring)
File "..pywikibot/comms/http.py", line 155, in request
raise request.data
error: ...
And at the end it will print a long traceback, ending with
File "..pywikibot/data/api.py", line 434, in wait
raise TimeoutError("Maximum retries attempted without success.")
pywikibot.data.api.TimeoutError
<class 'pywikibot.data.api.TimeoutError'>
CRITICAL: Waiting for 1 network thread(s) to finish. Press ctrl-c to abort
The error need not have been a TimeoutError: all HTTP errors are reported as a TimeoutError.
The HTTP layer should differentiate real timeout errors from other errors that it cannot fix (especially permanent errors, like a 403 caused by a firewall).
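The requested distinction can be sketched with a simple status-code classifier (this is an illustration of the idea, not pywikibot's retry code; the status-code sets are conventional assumptions):

```python
# Retry only errors that can plausibly go away on their own; fail fast
# on permanent ones, such as a 403 from a firewall.
RETRYABLE_STATUS = {408, 429, 500, 502, 503, 504}  # transient by nature
PERMANENT_STATUS = {400, 401, 403, 404, 410}       # retrying cannot help

def should_retry(status_code):
    """Return True only for HTTP errors that a later retry might fix."""
    if status_code in PERMANENT_STATUS:
        return False
    return status_code in RETRYABLE_STATUS

print(should_retry(503))  # -> True: server hiccup, wait and retry
print(should_retry(403))  # -> False: firewall block, give up immediately
```

With this in place, the retry loop could abort immediately on a permanent error instead of burning through 25 two-minute waits.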