https://bugzilla.wikimedia.org/show_bug.cgi?id=55522
Web browser: ---
Bug ID: 55522
Summary: Option to delete image in imagecopy.py once it's been
transferred
Product: Pywikibot
Version: unspecified
Hardware: All
OS: All
Status: NEW
Severity: enhancement
Priority: Unprioritized
Component: General
Assignee: Pywikipedia-bugs(a)lists.wikimedia.org
Reporter: legoktm.wikipedia(a)gmail.com
Classification: Unclassified
Mobile Platform: ---
Once an image has been copied with imagecopy.py and a sysop account is
configured for that wiki, we should provide an option to delete the image.
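A hypothetical sketch of what such an option could look like (nothing here is existing imagecopy.py code; the rights check and the delete() call follow the pywikibot core API as assumptions):

    import pywikibot

    def delete_source_after_transfer(image_page, delete_after_done):
        # Only attempt deletion if the option is enabled and the configured
        # account actually has the right to delete on the source wiki.
        site = image_page.site
        if delete_after_done and site.has_right('delete'):
            image_page.delete(
                reason='File has been transferred to Wikimedia Commons',
                prompt=False)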
--
You are receiving this mail because:
You are the assignee for the bug.
https://bugzilla.wikimedia.org/show_bug.cgi?id=55024
Web browser: ---
Bug ID: 55024
Summary: Support of preload param in page.get()
Product: Pywikibot
Version: unspecified
Hardware: All
OS: All
Status: NEW
Severity: enhancement
Priority: Unprioritized
Component: General
Assignee: Pywikipedia-bugs(a)lists.wikimedia.org
Reporter: legoktm.wikipedia(a)gmail.com
Classification: Unclassified
Mobile Platform: ---
Originally from: http://sourceforge.net/p/pywikipediabot/feature-requests/320/
Reported by: Anonymous user
Created on: 2012-11-17 21:33:36
Subject: Support of preload param in page.get()
Original description:
It would be nice to support 'inprop': 'preload' together with 'prop': 'info' when
calling query.GetData(), so that text can be fetched even if the page does not exist.
It could be added as an argument, page.get(..., preload=False), and used to
return data['query']['pages']["-1"]['preload'] with something like:
if data['query']['pages'].keys()[0] == "-1":
    if 'missing' in pageInfo:
        if preload:
            ....
That would be useful when trying to get the text of a page that has not been
created yet. Such pages are quite common on en:Wikisource, because the text
layer is already present in the DjVu files used.
E.g. [Page:Debates in the Several State Conventions, v1.djvu/189].
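A rough sketch of how the requested lookup could be done with a raw API request (the Request usage and the shape of the response are assumptions based on the MediaWiki inprop=preload behaviour, not code from page.get()):

    import pywikibot
    from pywikibot.data.api import Request

    site = pywikibot.Site('en', 'wikisource')
    req = Request(site=site, parameters={
        'action': 'query',
        'prop': 'info',
        'inprop': 'preload',
        'titles': 'Page:Debates in the Several State Conventions, v1.djvu/189',
    })
    data = req.submit()
    page_info = list(data['query']['pages'].values())[0]
    if 'missing' in page_info:
        # For a missing page the API can still return the preload text here.
        text = page_info.get('preload', '')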
Thanks and bye
--
You are receiving this mail because:
You are the assignee for the bug.
https://bugzilla.wikimedia.org/show_bug.cgi?id=55157
Web browser: ---
Bug ID: 55157
Summary: -always invalid in movepages.py
Product: Pywikibot
Version: unspecified
Hardware: All
OS: All
Status: NEW
Severity: normal
Priority: Unprioritized
Component: General
Assignee: Pywikipedia-bugs(a)lists.wikimedia.org
Reporter: legoktm.wikipedia(a)gmail.com
Classification: Unclassified
Mobile Platform: ---
Originally from: http://sourceforge.net/p/pywikipediabot/bugs/1556/
Reported by: yfdyh000
Created on: 2013-01-03 01:33:19
Subject: -always invalid in movepages.py
Original description:
Pywikipedia trunk/pywikipedia/ (r10857, 2013/01/01, 10:55:12)
Python 2.7.3 (default, Apr 10 2012, 23:31:26) [MSC v.1500 32 bit (Intel)]
config-settings:
use_api = True
use_api_login = True
unicode test: ok
The bot always starts submitting automatically after running instead of
prompting the user to confirm.
Because 'if self.addprefix or self.appendAll or self.regexAll:' is False, the
move is submitted directly through the 'else:' branch.
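A rough, self-contained sketch of the control flow described above (the flag names come from the report; the prompt helper and the page.move call follow the pywikibot core API as assumptions, this is not the actual movepages.py code):

    import pywikibot

    def treat(page, new_title, addprefix, appendAll, regexAll, always):
        if addprefix or appendAll or regexAll:
            pass  # interactive handling that can honour the -always option
        else:
            # Currently the move is submitted here unconditionally; the
            # expected behaviour is to prompt unless -always was given:
            if always or pywikibot.input_yn(
                    'Move %s to %s?' % (page.title(), new_title)):
                page.move(new_title, reason='moved by bot')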
--
You are receiving this mail because:
You are the assignee for the bug.
https://bugzilla.wikimedia.org/show_bug.cgi?id=55133
Web browser: ---
Bug ID: 55133
Summary: Parsing error for Link instances
Product: Pywikibot
Version: unspecified
Hardware: All
OS: All
Status: NEW
Severity: normal
Priority: Unprioritized
Component: General
Assignee: Pywikipedia-bugs(a)lists.wikimedia.org
Reporter: legoktm.wikipedia(a)gmail.com
Classification: Unclassified
Mobile Platform: ---
Originally from: http://sourceforge.net/p/pywikipediabot/bugs/1643/
Reported by: xqt
Created on: 2013-07-14 14:07:44.612000
Subject: Parsing error for Link instances
Original description:
I get a parsing error for Link.parse() e.g. for the following statements:
>>> import pwb; import pywikibot as wp
>>> l = wp.Link(u'w:de:Foo')
>>> l
Traceback (most recent call last):
File "<pyshell#135>", line 1, in <module>
l
File "pywikibot\page.py", line 2931, in __repr__
return "pywikibot.page.Link(%r, %r)" % (self.title, self.site)
File "pywikibot\page.py", line 3101, in title
self.parse()
File "pywikibot\page.py", line 3007, in parse
% self._text)
Error: Improperly formatted interwiki link 'w:de:Foo'
Using 'wikipedia:' instead of 'w:', the Link is wrong:
>>> l = wp.Link(u'wikipedia:de:Foo')
>>> l
pywikibot.page.Link(u'De:Foo', Site("de", "wikipedia"))
>>>
It works right for wikt: and wiktionary:
>>> l = wp.Link(u'wikt:de:Foo')
>>> l
pywikibot.page.Link(u'Foo', Site("de", "wiktionary"))
>>>
>>> l = wp.Link(u'wiktionary:de:Foo')
>>> l
pywikibot.page.Link(u'Foo', Site("de", "wiktionary"))
>>>
--
You are receiving this mail because:
You are the assignee for the bug.
https://bugzilla.wikimedia.org/show_bug.cgi?id=55072
Web browser: ---
Bug ID: 55072
Summary: delete.py should accept pageids
Product: Pywikibot
Version: unspecified
Hardware: All
OS: All
Status: NEW
Severity: enhancement
Priority: Unprioritized
Component: General
Assignee: Pywikipedia-bugs(a)lists.wikimedia.org
Reporter: legoktm.wikipedia(a)gmail.com
Classification: Unclassified
Mobile Platform: ---
Originally from: http://sourceforge.net/p/pywikipediabot/feature-requests/231/
Reported by: mike_lifeguard
Created on: 2009-11-17 13:22:27
Subject: delete.py should accept pageids
Original description:
It is apparently possible to delete invalid titles using the API by specifying
the pageid. delete.py should therefore accept pageids to delete, rather than
titles. This should be possible with the -pageid:1234 parameter, or
-file:/whatever -pageid if the file is a list of pageids rather than titles.
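A rough sketch of deleting by page id through the raw API (action=delete accepting 'pageid' is a MediaWiki API feature; the Request usage and the token accessor name are assumptions, and the -pageid handling inside delete.py is exactly what is being requested here):

    import pywikibot
    from pywikibot.data.api import Request

    site = pywikibot.Site('en', 'wikipedia')
    req = Request(site=site, parameters={
        'action': 'delete',
        'pageid': 1234,                     # numeric id instead of a title
        'reason': 'Deleting page with an invalid title',
        'token': site.tokens['delete'],     # token accessor name is an assumption
    })
    req.submit()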
--
You are receiving this mail because:
You are the assignee for the bug.
https://bugzilla.wikimedia.org/show_bug.cgi?id=55069
Web browser: ---
Bug ID: 55069
Summary: Adding titles to external links
Product: Pywikibot
Version: unspecified
Hardware: All
OS: All
Status: NEW
Severity: enhancement
Priority: Unprioritized
Component: General
Assignee: Pywikipedia-bugs(a)lists.wikimedia.org
Reporter: legoktm.wikipedia(a)gmail.com
Classification: Unclassified
Mobile Platform: ---
Originally from: http://sourceforge.net/p/pywikipediabot/feature-requests/236/
Reported by: Anonymous user
Created on: 2010-01-09 16:56:40
Subject: Adding titles to external links
Original description:
Hi all. It would be nice to have a script (based on reflinks.py) that adds titles
to external links that don't have one.
For example: "[http://en.wikipedia.org]" → "[http://en.wikipedia.org
Wikipedia, the free encyclopedia]"
I am asking for a script that does the same as reflinks.py (or a fix to reflinks
to allow that), but not only for external links in references.
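A minimal sketch of the requested behaviour, fetching each target page and using its HTML <title> as the link title (this is only an illustration, not reflinks.py code; the regular expressions and the use of the requests library are assumptions):

    import re
    import requests

    BARE_LINK = re.compile(r'\[(https?://[^\s\]]+)\]')
    TITLE_TAG = re.compile(r'<title>(.*?)</title>', re.I | re.S)

    def add_link_titles(wikitext):
        def repl(match):
            url = match.group(1)
            try:
                html = requests.get(url, timeout=10).text
            except requests.RequestException:
                return match.group(0)        # leave the link untouched on failure
            m = TITLE_TAG.search(html)
            if not m:
                return match.group(0)
            title = ' '.join(m.group(1).split())
            return '[%s %s]' % (url, title)
        return BARE_LINK.sub(repl, wikitext)

    print(add_link_titles('See [http://en.wikipedia.org] for details.'))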
Thanks in advance.
--
You are receiving this mail because:
You are the assignee for the bug.
https://bugzilla.wikimedia.org/show_bug.cgi?id=55043
Web browser: ---
Bug ID: 55043
Summary: welcome.py
Product: Pywikibot
Version: unspecified
Hardware: All
OS: All
Status: NEW
Severity: enhancement
Priority: Unprioritized
Component: General
Assignee: Pywikipedia-bugs(a)lists.wikimedia.org
Reporter: legoktm.wikipedia(a)gmail.com
Classification: Unclassified
Mobile Platform: ---
Originally from: http://sourceforge.net/p/pywikipediabot/feature-requests/294/
Reported by: reza1615
Created on: 2011-11-10 13:54:49
Subject: welcome.py
Original description:
In the welcome.py bot, in the -random case:
is it possible to add the signature name that was chosen at random to the edit summary?
Currently it is not possible to tell from Special:Contributions which signature was used!
For example:
Edit summary ==> welcome! (by user:reza1615 sign)
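A hypothetical sketch of the idea (variable names are assumptions, not welcome.py's actual code): whichever signature is picked at random is echoed in the edit summary.

    import random

    signatures = ['user:reza1615 sign', 'user:example sign']  # assumed sign list
    chosen_sign = random.choice(signatures)
    summary = 'welcome! (by %s)' % chosen_sign
    # e.g. "welcome! (by user:reza1615 sign)", so Special:Contributions
    # shows which signature was used.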
--
You are receiving this mail because:
You are the assignee for the bug.
https://bugzilla.wikimedia.org/show_bug.cgi?id=55048
Web browser: ---
Bug ID: 55048
Summary: Add roman.py to framework
Product: Pywikibot
Version: unspecified
Hardware: All
OS: All
Status: NEW
Severity: enhancement
Priority: Unprioritized
Component: General
Assignee: Pywikipedia-bugs(a)lists.wikimedia.org
Reporter: legoktm.wikipedia(a)gmail.com
Classification: Unclassified
Mobile Platform: ---
Originally from: http://sourceforge.net/p/pywikipediabot/feature-requests/285/
Reported by: binbot
Created on: 2011-04-30 12:05:25
Subject: Add roman.py to framework
Original description:
I wrote a new script, see at
http://hu.wikipedia.org/wiki/Szerkeszt%C5%91:BinBot/roman.py
It converts any Arabic number to Roman and Roman numerals to Arabic from 1 to 3999
(the largest regular Roman number) and has a list comprehension function as
well.
I would like to have it in the framework, because TOCbot
(http://hu.wikipedia.org/wiki/Szerkeszt%C5%91:Bin%C3%A1ris/TOCbot) will need
it. Please help to supply a version number.
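For illustration, a minimal Arabic/Roman conversion for the 1-3999 range (this is just a sketch of the same idea, not the code from the linked roman.py):

    _VALUES = [(1000, 'M'), (900, 'CM'), (500, 'D'), (400, 'CD'),
               (100, 'C'), (90, 'XC'), (50, 'L'), (40, 'XL'),
               (10, 'X'), (9, 'IX'), (5, 'V'), (4, 'IV'), (1, 'I')]

    def to_roman(n):
        if not 1 <= n <= 3999:
            raise ValueError('n must be between 1 and 3999')
        result = []
        for value, numeral in _VALUES:
            count, n = divmod(n, value)
            result.append(numeral * count)
        return ''.join(result)

    def from_roman(s):
        digits = {'I': 1, 'V': 5, 'X': 10, 'L': 50,
                  'C': 100, 'D': 500, 'M': 1000}
        total = 0
        for ch, nxt in zip(s, s[1:] + ' '):
            value = digits[ch]
            # subtract when a smaller numeral precedes a larger one (e.g. CM)
            total += -value if nxt in digits and digits[nxt] > value else value
        return total

    assert to_roman(1987) == 'MCMLXXXVII'
    assert from_roman('MCMLXXXVII') == 1987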
--
You are receiving this mail because:
You are the assignee for the bug.
https://bugzilla.wikimedia.org/show_bug.cgi?id=55049
Web browser: ---
Bug ID: 55049
Summary: enable/disable cosmetic_changes per namespace
Product: Pywikibot
Version: unspecified
Hardware: All
OS: All
Status: NEW
Severity: enhancement
Priority: Unprioritized
Component: General
Assignee: Pywikipedia-bugs(a)lists.wikimedia.org
Reporter: legoktm.wikipedia(a)gmail.com
Classification: Unclassified
Mobile Platform: ---
Originally from: http://sourceforge.net/p/pywikipediabot/feature-requests/283/
Reported by: Anonymous user
Created on: 2011-02-09 07:24:27
Subject: enable/disable cosmetic_changes per namespace
Original description:
Cosmetic changes are helpful for most pages, but dangerous for some
namespaces, especially the Template namespace.
It would be helpful to have something like:
cosmetic_changes_disable_namespace = {'Template'}
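A hypothetical sketch of how such a setting could be used (the option name comes from the request above and does not exist yet; the namespace check is an assumption, not existing cosmetic_changes code):

    # in user-config.py (hypothetical option; 10 = Template namespace)
    cosmetic_changes_disable_namespaces = {10}

    # in the bot, before applying cosmetic changes to a page
    def may_apply_cosmetic_changes(page, disabled_namespaces):
        return page.namespace() not in disabled_namespaces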
--
You are receiving this mail because:
You are the assignee for the bug.
https://bugzilla.wikimedia.org/show_bug.cgi?id=55040
Web browser: ---
Bug ID: 55040
Summary: Ignore archives
Product: Pywikibot
Version: unspecified
Hardware: All
OS: All
Status: NEW
Severity: enhancement
Priority: Unprioritized
Component: General
Assignee: Pywikipedia-bugs(a)lists.wikimedia.org
Reporter: legoktm.wikipedia(a)gmail.com
Classification: Unclassified
Mobile Platform: ---
Originally from: http://sourceforge.net/p/pywikipediabot/feature-requests/297/
Reported by: hiw
Created on: 2011-12-29 20:09:27
Subject: Ignore archives
Original description:
Sometimes there are interwiki links on pages like this one:
http://de.wikipedia.org/w/index.php?title=Wikipedia:Auskunft/Archiv/2010/Wo…
Since this page is an archive, marked with the template {{Archiv}}, I
suggest adding all such templates to the ignore-page section in the
pagegenerator.
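A rough sketch of such a filter as a generator wrapper (the template names and the title() keyword are assumptions; this is not existing pagegenerators code):

    import pywikibot

    ARCHIVE_TEMPLATES = {'Archiv', 'Archive'}   # assumed per-wiki template list

    def skip_archives(generator):
        # Yield only pages that do not transclude any of the archive templates.
        for page in generator:
            names = {tpl.title(with_ns=False) for tpl in page.templates()}
            if names & ARCHIVE_TEMPLATES:
                continue
            yield page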
--
You are receiving this mail because:
You are the assignee for the bug.