https://bugzilla.wikimedia.org/show_bug.cgi?id=69090
Bug ID: 69090
Summary: upload.py attempts uploads on wikis that don't allow
uploads
Product: Pywikibot
Version: core (2.0)
Hardware: All
OS: All
Status: NEW
Severity: enhancement
Priority: Unprioritized
Component: General
Assignee: Pywikipedia-bugs(a)lists.wikimedia.org
Reporter: jayvdb(a)gmail.com
Web browser: ---
Mobile Platform: ---
Uploads are disabled on Wikidata: https://www.wikidata.org/wiki/Special:Upload
upload.py happily allows the user to attempt an upload, but then reports an
APIError. If pywikibot can't detect whether uploads are enabled before
uploading, it should catch this APIError and give the user a friendly message.
$ python pwb.py -family:wikidata -lang:wikidata scripts/upload.py
No input filename given
File or URL where image is now: blahblah.png
The filename on the target wiki will default to: blahblah.png
Enter a better name, or press enter to accept:
The suggested description is:
Do you want to change this description? ([y]es, [N]o) y
Uploading file to wikidata:wikidata via API....
Reading file blahblah.png
ERROR: Upload error:
Traceback (most recent call last):
File "scripts/upload.py", line 216, in upload_image
ignore_warnings=self.ignoreWarning)
File ".../pywikibot/site.py", line 3432, in upload
result = req.submit()
File ".../pywikibot/data/api.py", line 418, in submit
raise APIError(code, info, **result["error"])
APIError: uploaddisabled: Uploads are not enabled. Make sure $wgEnableUploads
is set to true in LocalSettings.php and the PHP ini setting file_uploads is
true
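The graceful handling suggested above could be sketched as follows. The `APIError` stand-in and the wrapper name are assumptions for illustration; only the error code `uploaddisabled` and the `code` attribute are taken from the traceback above.

```python
class APIError(Exception):
    """Minimal stand-in for pywikibot.data.api.APIError (assumption)."""

    def __init__(self, code, info):
        super().__init__('%s: %s' % (code, info))
        self.code = code
        self.info = info


def upload_with_friendly_error(do_upload, filename):
    """Attempt the upload; turn 'uploaddisabled' into a readable message."""
    try:
        do_upload(filename)
        return 'Upload succeeded'
    except APIError as error:
        if error.code == 'uploaddisabled':
            # Report the condition nicely instead of dumping a traceback.
            return 'ERROR: This wiki does not allow uploads.'
        raise  # unrelated API errors still propagate
```

A real fix would live in scripts/upload.py around the `site.upload()` call, catching only this specific error code.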
--
You are receiving this mail because:
You are the assignee for the bug.
https://bugzilla.wikimedia.org/show_bug.cgi?id=61555
Bug ID: 61555
Summary: Check for new messages on wiki during bot run
Product: Pywikibot
Version: core (2.0)
Hardware: All
OS: All
Status: NEW
Severity: normal
Priority: Unprioritized
Component: General
Assignee: Pywikipedia-bugs(a)lists.wikimedia.org
Reporter: valhallasw(a)arctus.nl
Web browser: ---
Mobile Platform: ---
At the moment (at least for core, and probably also for compat), new messages
are only seen if the bot is restarted. For long-running bots, it's sensible to
break/pause/stop editing on a certain wiki when a talk page is edited.
Getting the info from the wiki is easy -- just add meta=userinfo&uiprop=hasmsg
to the api query. Afterwards, meta=notifications can be used for details.
Added bonus: every API query then also confirms that the user is still logged
in correctly.
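The check described above could be sketched like this. The MediaWiki API includes the key `messages` under `query.userinfo` only when `uiprop=hasmsg` is requested and there are new messages; the helper name is an assumption.

```python
def has_new_messages(api_response):
    """Return True if uiprop=hasmsg reported unread talk-page messages.

    Expects the decoded JSON of a query that included
    meta=userinfo&uiprop=hasmsg.
    """
    userinfo = api_response.get('query', {}).get('userinfo', {})
    # MediaWiki emits the 'messages' key only when new messages exist.
    return 'messages' in userinfo
```

A long-running bot could call this on every API response and pause or stop editing when it returns True.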
https://bugzilla.wikimedia.org/show_bug.cgi?id=57995
Web browser: ---
Bug ID: 57995
Summary: Add function for list=watchlistraw
Product: Pywikibot
Version: unspecified
Hardware: All
OS: All
Status: NEW
Severity: normal
Priority: Unprioritized
Component: General
Assignee: Pywikipedia-bugs(a)lists.wikimedia.org
Reporter: legoktm.wikipedia(a)gmail.com
Classification: Unclassified
Mobile Platform: ---
Didn't see a function which retrieves a user's watchlist (Site.watchlist_revs
fetches changes on the watchlist)
This isn't as easy as it looks since the elements are returned directly under
'watchlistraw' and not 'query' like every other API module...
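A result handler for the quirk described above could be sketched like this; the function name is hypothetical, and the top-level placement of `watchlistraw` follows the report, with a fallback under `query` in case the API shape changes.

```python
def watchlistraw_titles(api_response):
    """Extract page titles from a list=watchlistraw response.

    Per the report, the items appear directly under 'watchlistraw' at
    the top level rather than under 'query' like other list modules,
    so check both locations.
    """
    items = (api_response.get('watchlistraw')
             or api_response.get('query', {}).get('watchlistraw', []))
    return [item['title'] for item in items]
```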
https://bugzilla.wikimedia.org/show_bug.cgi?id=55024
Web browser: ---
Bug ID: 55024
Summary: Support of preload param in page.get()
Product: Pywikibot
Version: unspecified
Hardware: All
OS: All
Status: NEW
Severity: enhancement
Priority: Unprioritized
Component: General
Assignee: Pywikipedia-bugs(a)lists.wikimedia.org
Reporter: legoktm.wikipedia(a)gmail.com
Classification: Unclassified
Mobile Platform: ---
Originally from: http://sourceforge.net/p/pywikipediabot/feature-requests/320/
Reported by: Anonymous user
Created on: 2012-11-17 21:33:36
Subject: Support of preload param in page.get()
Original description:
Would it be nice to support 'inprop': 'preload' for 'prop': 'info' when
calling query GetData() to get text even if the page does not exist.
It could be added as an argument to page.get(..., preload=False) and used to
return data['query']['pages']["-1"]['preload'] with something like:
if data['query']['pages'].keys()[0] == "-1":
    if 'missing' in pageInfo:
        if preload:
            ....
That would be useful when trying to get text on a not-yet-created page.
Such pages are quite common on en:Wikisource, because the text layer is
already present in the DjVu files used.
E.g. [Page:Debates in the Several State Conventions, v1.djvu/189].
Thanks and bye
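The handling the requester outlines could be sketched as below. The helper name and the exact key layout (`'-1'` for a missing page, `'missing'` and `'preload'` keys) follow the pseudocode in the report and are assumptions about the API response shape, not pywikibot's actual implementation.

```python
def get_preload_text(data, preload=False):
    """Return the preload text for a missing page, else None.

    `data` is the decoded JSON of a prop=info&inprop=preload query.
    """
    pages = data['query']['pages']
    page_info = pages.get('-1')  # the API keys missing pages as "-1"
    if preload and page_info is not None and 'missing' in page_info:
        return page_info.get('preload', '')
    return None
```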
https://bugzilla.wikimedia.org/show_bug.cgi?id=64853
Bug ID: 64853
Summary: Port djvutext.py to core
Product: Pywikibot
Version: core (2.0)
Hardware: All
OS: All
Status: NEW
Severity: normal
Priority: Unprioritized
Component: General
Assignee: Pywikipedia-bugs(a)lists.wikimedia.org
Reporter: valhallasw(a)arctus.nl
Blocks: 55880
Web browser: ---
Mobile Platform: ---
Might be non-trivial due to dependency on a djvu reading program ('djvused'),
part of djvulibre-bin
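Extracting a page's text layer could be sketched by building a djvused invocation like the one below. The `-e` expression syntax follows the djvused manual, but this exact invocation is an untested assumption, not what compat's djvutext.py does.

```python
def djvused_command(djvu_file, page):
    """Build the djvused call that prints one page's text layer.

    djvused ships with djvulibre-bin; 'select N' picks the page and
    'print-txt' emits its hidden text layer.
    """
    return ['djvused', '-e', 'select %d; print-txt' % page, djvu_file]
```

The command list can then be run with `subprocess.check_output(djvused_command(path, page))`, which is where the external dependency makes the port non-trivial.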
https://bugzilla.wikimedia.org/show_bug.cgi?id=55157
Web browser: ---
Bug ID: 55157
Summary: -always invalid in movepages.py
Product: Pywikibot
Version: unspecified
Hardware: All
OS: All
Status: NEW
Severity: normal
Priority: Unprioritized
Component: General
Assignee: Pywikipedia-bugs(a)lists.wikimedia.org
Reporter: legoktm.wikipedia(a)gmail.com
Classification: Unclassified
Mobile Platform: ---
Originally from: http://sourceforge.net/p/pywikipediabot/bugs/1556/
Reported by: yfdyh000
Created on: 2013-01-03 01:33:19
Subject: -always invalid in movepages.py
Original description:
Pywikipedia trunk/pywikipedia/ (r10857, 2013/01/01, 10:55:12)
Python 2.7.3 (default, Apr 10 2012, 23:31:26) [MSC v.1500 32 bit (Intel)]
config-settings:
use_api = True
use_api_login = True
unicode test: ok
The script always starts the submission automatically after running instead of
prompting the user to confirm.
Because 'if self.addprefix or self.appendAll or self.regexAll:' is False, the
move is submitted directly by the 'else:' branch.
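A fix for the fall-through described above could be sketched like this: prompt in every branch unless the user actually passed -always. The function and parameter names are hypothetical, not movepages.py's real ones.

```python
def confirm_move(always, ask=input):
    """Return True when the move should be submitted.

    With -always (always=True) the move proceeds without a prompt;
    otherwise the user must explicitly confirm.
    """
    if always:
        return True
    answer = ask('Do you want to move the page? ([y]es, [N]o) ')
    return answer.strip().lower() in ('y', 'yes')
```

The plain-move 'else:' branch would call this before submitting, so that omitting -always no longer behaves as if it were given.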
https://bugzilla.wikimedia.org/show_bug.cgi?id=68659
Bug ID: 68659
Summary: imagetransfer fails badly if specified page does not
begin with correct namespace
Product: Pywikibot
Version: core (2.0)
Hardware: All
OS: All
Status: NEW
Severity: normal
Priority: Unprioritized
Component: General
Assignee: Pywikipedia-bugs(a)lists.wikimedia.org
Reporter: jayvdb(a)gmail.com
Web browser: ---
Mobile Platform: ---
$ python pwb.py imagetransfer -family:wikipedia -lang:en
"Joachim_Holst-Jensen_1923.jpg"
Traceback (most recent call last):
File "pwb.py", line 157, in <module>
run_python_file(fn, argv, argvu)
File "pwb.py", line 67, in run_python_file
exec(compile(source, filename, "exec"), main_mod.__dict__)
File "scripts/imagetransfer.py", line 356, in <module>
main()
File "scripts/imagetransfer.py", line 353, in main
bot.run()
File "scripts/imagetransfer.py", line 273, in run
{'title': page.title(), 'ns': pywikibot.Site().image_namespace()})
File ".../pywikibot/data/api.py", line 856, in result
p = PageGenerator.result(self, pagedata)
File ".../pywikibot/data/api.py", line 840, in result
update_page(p, pagedata)
File ".../pywikibot/data/api.py", line 996, in update_page
"Page %s has neither 'pageid' nor 'missing' attribute" % pagedict['title'])
AssertionError: Page Joachim Holst-Jensen 1923.jpg has neither 'pageid' nor
'missing' attribute
<type 'exceptions.AssertionError'>
CRITICAL: Waiting for 1 network thread(s) to finish. Press ctrl-c to abort
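A guard against the bad title could be sketched like this: normalize the user-supplied page name into the file namespace before the API lookup. The helper name and the alias list are a small hypothetical subset; real wikis also have localized namespace names.

```python
def normalize_file_title(title, ns_name='File'):
    """Prepend the file namespace when the given title lacks one."""
    prefixes = ('file:', 'image:')  # canonical English aliases only
    if title.lower().startswith(prefixes):
        return title
    return '%s:%s' % (ns_name, title)
```

imagetransfer.py could apply this (or an equivalent check with a clear error message) instead of letting the raw title reach the page generator and trip the assertion.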
https://bugzilla.wikimedia.org/show_bug.cgi?id=55133
Web browser: ---
Bug ID: 55133
Summary: Parsing error for Link instances
Product: Pywikibot
Version: unspecified
Hardware: All
OS: All
Status: NEW
Severity: normal
Priority: Unprioritized
Component: General
Assignee: Pywikipedia-bugs(a)lists.wikimedia.org
Reporter: legoktm.wikipedia(a)gmail.com
Classification: Unclassified
Mobile Platform: ---
Originally from: http://sourceforge.net/p/pywikipediabot/bugs/1643/
Reported by: xqt
Created on: 2013-07-14 14:07:44.612000
Subject: Parsing error for Link instances
Original description:
I get a parsing error for Link.parse() e.g. for the following statements:
>>> import pwb; import pywikibot as wp
>>> l = wp.Link(u'w:de:Foo')
>>> l
Traceback (most recent call last):
File "<pyshell#135>", line 1, in <module>
l
File "pywikibot\page.py", line 2931, in __repr__
return "pywikibot.page.Link(%r, %r)" % (self.title, self.site)
File "pywikibot\page.py", line 3101, in title
self.parse()
File "pywikibot\page.py", line 3007, in parse
% self._text)
Error: Improperly formatted interwiki link 'w:de:Foo'
Using wikipedia: instead of w:, the Link parses but the result is wrong:
>>> l = wp.Link(u'wikipedia:de:Foo')
>>> l
pywikibot.page.Link(u'De:Foo', Site("de", "wikipedia"))
>>>
It works right for wikt: and wiktionary:
>>> l = wp.Link(u'wikt:de:Foo')
>>> l
pywikibot.page.Link(u'Foo', Site("de", "wiktionary"))
>>>
>>> l = wp.Link(u'wiktionary:de:Foo')
>>> l
pywikibot.page.Link(u'Foo', Site("de", "wiktionary"))
>>>
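The expected behavior for all the cases above could be sketched as below: peel interwiki prefixes one at a time, resolving family aliases first, then language codes. The alias and language tables here are tiny hypothetical subsets, not pywikibot's actual family data.

```python
FAMILY_ALIASES = {'w': 'wikipedia', 'wikipedia': 'wikipedia',
                  'wikt': 'wiktionary', 'wiktionary': 'wiktionary'}
LANGS = {'de', 'en', 'fr'}


def parse_link(text, default_family='wikipedia', default_lang='en'):
    """Resolve interwiki prefixes to (family, lang, title)."""
    family, lang = default_family, default_lang
    parts = text.split(':')
    while len(parts) > 1:
        head = parts[0].lower()
        if head in FAMILY_ALIASES:
            family = FAMILY_ALIASES[head]   # family prefix consumed
        elif head in LANGS:
            lang = head                     # language prefix consumed
        else:
            break                           # remainder is the title
        parts.pop(0)
    return family, lang, ':'.join(parts)
```

Under this scheme 'w:de:Foo', 'wikipedia:de:Foo', and 'wikt:de:Foo' all resolve cleanly, matching the behavior the reporter expects.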
https://bugzilla.wikimedia.org/show_bug.cgi?id=55072
Web browser: ---
Bug ID: 55072
Summary: delete.py should accept pageids
Product: Pywikibot
Version: unspecified
Hardware: All
OS: All
Status: NEW
Severity: enhancement
Priority: Unprioritized
Component: General
Assignee: Pywikipedia-bugs(a)lists.wikimedia.org
Reporter: legoktm.wikipedia(a)gmail.com
Classification: Unclassified
Mobile Platform: ---
Originally from: http://sourceforge.net/p/pywikipediabot/feature-requests/231/
Reported by: mike_lifeguard
Created on: 2009-11-17 13:22:27
Subject: delete.py should accept pageids
Original description:
It is apparently possible to delete invalid titles using the API by specifying
the pageid. delete.py should therefore accept pageids to delete, rather than
titles. This should be possible with the -pageid:1234 parameter, or
-file:/whatever -pageid if the file is a list of pageids rather than titles.
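The request the report calls for could be sketched like this. The parameter names follow the MediaWiki action=delete API (which accepts `pageid` as an alternative to `title`); the helper name is hypothetical and sending the request is left to the caller.

```python
def build_delete_request(pageid, token, reason=''):
    """Build action=delete parameters keyed by pageid instead of title.

    Using pageid lets the API delete pages whose titles are invalid
    and could not be addressed by name.
    """
    return {'action': 'delete',
            'pageid': pageid,
            'token': token,
            'reason': reason}
```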
https://bugzilla.wikimedia.org/show_bug.cgi?id=55069
Web browser: ---
Bug ID: 55069
Summary: Adding titles to external links
Product: Pywikibot
Version: unspecified
Hardware: All
OS: All
Status: NEW
Severity: enhancement
Priority: Unprioritized
Component: General
Assignee: Pywikipedia-bugs(a)lists.wikimedia.org
Reporter: legoktm.wikipedia(a)gmail.com
Classification: Unclassified
Mobile Platform: ---
Originally from: http://sourceforge.net/p/pywikipediabot/feature-requests/236/
Reported by: Anonymous user
Created on: 2010-01-09 16:56:40
Subject: Adding titles to external links
Original description:
Hi all. It would be nice to have a script (based on reflinks.py) that adds
titles to external links that lack them.
For example: "[http://en.wikipedia.org]" → "[http://en.wikipedia.org
Wikipedia, the free encyclopedia]"
I'm asking for a script that does the same as reflinks.py (or a fix to
reflinks.py allowing this), but not only for external links inside references.
Thanks in advance.
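The transformation asked for could be sketched like this. The `fetch_title` callable stands in for an HTTP request that reads the target page's `<title>`; it is an assumption, not reflinks.py's actual API.

```python
import re


def add_link_titles(wikitext, fetch_title):
    """Turn bare bracketed external links into titled ones.

    Only links of the form [http://...] with no label are touched;
    already-labeled links do not match the pattern.
    """
    def repl(match):
        url = match.group(1)
        title = fetch_title(url)
        # Leave the link unchanged if no title could be fetched.
        return '[%s %s]' % (url, title) if title else match.group(0)

    return re.sub(r'\[(https?://\S+)\]', repl, wikitext)
```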