https://bugzilla.wikimedia.org/show_bug.cgi?id=54544
Web browser: ---
Bug ID: 54544
Summary: important fix for i18n.py
Product: Pywikibot
Version: unspecified
Hardware: All
OS: All
Status: NEW
Severity: normal
Priority: Unprioritized
Component: General
Assignee: Pywikipedia-bugs(a)lists.wikimedia.org
Reporter: legoktm.wikipedia(a)gmail.com
Classification: Unclassified
Mobile Platform: ---
Originally from: http://sourceforge.net/p/pywikipediabot/patches/617/
Reported by: Anonymous user
Created on: 2013-05-28 23:05:42
Subject: important fix for i18n.py
Original description:
the "i18n.py" file in the "/pywikibot" directory has a useful "translate"
function; but it doesn't work well with multiple "\{\{PLURAL\}\}" directives in
the same string: for example, the following code:
pywikibot.i18n.translate\('en',\{'en':'%\(links\)d
\{\{PLURAL:%\(links\)d|link|links\}\} and %\(apples\)d
\{\{PLURAL:%\(apples\)d|apple|apples\}\}'\},\{'links':1,'apples':4\}\)
returns "1 link and 4 link" instead of "1 link and 4 apples".
I've fixed that doing a "while" loop for each \{\{PLURAL\}\} and
replacing/translating only one of them at a time; probably other functions need
such fixes as well.
My version is attached, feel free to improve it and include it in the
pywikipediabot rewrite branch.
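A minimal sketch of the approach described above (illustrative only, not the attached patch; helper names are made up):
import re

# Matches one {{PLURAL:%(name)d|singular|plural}} directive at a time.
PLURAL_RE = re.compile(r'{{PLURAL:%\((\w+)\)d\|([^|}]*)\|([^|}]*)}}')

def translate_plurals(msg, parameters):
    """Resolve each PLURAL directive with its own count, then substitute."""
    while True:
        match = PLURAL_RE.search(msg)
        if match is None:
            break
        key, singular, plural = match.groups()
        form = singular if parameters[key] == 1 else plural
        msg = msg[:match.start()] + form + msg[match.end():]
    return msg % parameters

print(translate_plurals(
    '%(links)d {{PLURAL:%(links)d|link|links}} and '
    '%(apples)d {{PLURAL:%(apples)d|apple|apples}}',
    {'links': 1, 'apples': 4}))
# prints: 1 link and 4 apples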
--
You are receiving this mail because:
You are the assignee for the bug.
https://bugzilla.wikimedia.org/show_bug.cgi?id=54573
Web browser: ---
Bug ID: 54573
Summary: rvdiffto parameter implementation
Product: Pywikibot
Version: unspecified
Hardware: All
OS: All
Status: NEW
Severity: normal
Priority: Unprioritized
Component: General
Assignee: Pywikipedia-bugs(a)lists.wikimedia.org
Reporter: legoktm.wikipedia(a)gmail.com
Classification: Unclassified
Mobile Platform: ---
Originally from: http://sourceforge.net/p/pywikipediabot/patches/447/
Reported by: Anonymous user
Created on: 2010-05-27 03:19:29
Subject: rvdiffto parameter implementation
Original description:
The framework has no function for loading revision diff text. Here is one:
Changelog:
Modified the site.loadrevisions() method to support the rvdiffto parameter.
Added a Page.Revision.Diff class for storing the diff text and the revto id.
Modified api.update_page() to save the new diff information.
A method in Page.py for getting diffs the same way you get a revision is still
missing, but for now you can read the diff text directly from
page._revision[id].diff.text.
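Until such a Page-level helper exists, a hedged sketch of fetching the diff text directly through the API (this assumes pywikibot core's api.Request keyword interface and the MediaWiki rvdiffto parameter; it is not the attached patch):
import pywikibot
from pywikibot.data import api

site = pywikibot.Site('en', 'wikipedia')
# Ask the API to also compute a diff of the latest revision against its parent.
req = api.Request(site=site, action='query', prop='revisions',
                  titles='Sandbox', rvlimit='1', rvdiffto='prev')
data = req.submit()
for pagedata in data['query']['pages'].values():
    rev = pagedata['revisions'][0]
    # The rendered diff, if any, is returned under revisions[i]['diff']['*'].
    print(rev.get('diff', {}).get('*', '<no diff returned>'))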
--
You are receiving this mail because:
You are the assignee for the bug.
https://bugzilla.wikimedia.org/show_bug.cgi?id=54549
Web browser: ---
Bug ID: 54549
Summary: Not load unnecessary data in token()
Product: Pywikibot
Version: unspecified
Hardware: All
OS: All
Status: ASSIGNED
Severity: normal
Priority: Unprioritized
Component: General
Assignee: Pywikipedia-bugs(a)lists.wikimedia.org
Reporter: legoktm.wikipedia(a)gmail.com
Classification: Unclassified
Mobile Platform: ---
Originally from: http://sourceforge.net/p/pywikipediabot/patches/606/
Reported by: Anonymous user
Created on: 2013-04-08 12:22:18
Subject: Not load unnecessary data in token()
Assigned to: legoktm
Original description:
In token(), it queries the info and *all* revisions of a page. Querying all
revisions is too expensive and unnecessary. For example, if I just want to
process the last revision of several pages and put them back, the old code
loads all revisions when putting. This makes putting in the rewrite branch
take about 10x as long as in the trunk.
The patch I am presenting simply does not load revisions. It shouldn't break
other functions, since the needed data is already included in the page info.
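Roughly, the underlying request (a sketch against the token API of the time, prop=info with intoken; this is not the attached patch): the edit token can be fetched without also asking for prop=revisions.
import pywikibot
from pywikibot.data import api

site = pywikibot.Site('en', 'wikipedia')
# prop=info with intoken=edit returns the token without any revision content.
req = api.Request(site=site, action='query', prop='info',
                  intoken='edit', titles='Sandbox')
data = req.submit()
for pagedata in data['query']['pages'].values():
    print(pagedata.get('edittoken', '<no token returned>'))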
----
Pywikibot branches/rewrite/ (r11357, 2013/04/07, 14:50:30, ok)
Python 2.7.3 (default, Sep 26 2012, 21:53:58)
[GCC 4.7.2]
unicode test: ok
--
You are receiving this mail because:
You are the assignee for the bug.
https://bugzilla.wikimedia.org/show_bug.cgi?id=54745
Web browser: ---
Bug ID: 54745
Summary: Page.iterlanglinks for a page on commons returns pages
on commons
Product: Pywikibot
Version: unspecified
Hardware: All
OS: All
Status: NEW
Severity: normal
Priority: Unprioritized
Component: General
Assignee: Pywikipedia-bugs(a)lists.wikimedia.org
Reporter: legoktm.wikipedia(a)gmail.com
Classification: Unclassified
Mobile Platform: ---
>>> import pywikibot as p
>>> s=p.Site('commons','commons')
>>> pg=p.Page(s, 'New York City')
>>> for i in pg.iterlanglinks(): print i.site
...
commons:commons
<snip>
There are a few things that are working together to cause this:
iterlanglinks calls Site.pagelanglinks which does:
yield pywikibot.Link.langlinkUnsafe(linkdata['lang'], linkdata['*'], source=self)
In langlinkUnsafe, there is:
link._site = pywikibot.Site(lang, source.family.name)
Now, unfortunately for commons:
>>> p.Site('en','commons')
Site("commons", "commons")
Another issue is that
https://commons.wikimedia.org/w/api.php?action=query&titles=New%20York%20Ci…
(the actual API query we make) only returns language codes, not full database
names.
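As a hedged illustration of that last point (a workaround sketch, not a fix): the langlinks can be queried directly and the returned codes mapped onto the 'wikipedia' family, which is the assumption a correct Link.site would have to make.
import pywikibot
from pywikibot.data import api

commons = pywikibot.Site('commons', 'commons')
req = api.Request(site=commons, action='query', prop='langlinks',
                  titles='New York City', lllimit='max')
data = req.submit()
for pagedata in data['query']['pages'].values():
    for ll in pagedata.get('langlinks', []):
        # Assumption: the returned codes ('en', 'de', ...) belong to the
        # wikipedia family, since the API gives no database name.
        print('%s -> %s' % (pywikibot.Site(ll['lang'], 'wikipedia'), ll['*']))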
--
You are receiving this mail because:
You are the assignee for the bug.
https://bugzilla.wikimedia.org/show_bug.cgi?id=54556
Web browser: ---
Bug ID: 54556
Summary: Showing content as summary of a page
Product: Pywikibot
Version: unspecified
Hardware: All
OS: All
Status: NEW
Severity: normal
Priority: Unprioritized
Component: General
Assignee: Pywikipedia-bugs(a)lists.wikimedia.org
Reporter: legoktm.wikipedia(a)gmail.com
Classification: Unclassified
Mobile Platform: ---
Originally from: http://sourceforge.net/p/pywikipediabot/patches/570/
Reported by: t-shrinivasan
Created on: 2012-10-25 19:02:20
Subject: Showing content as summary of a page
Original description:
It would be nice to have an option to use the content of a page as the edit
summary in pagefromfile.py.
I added this feature and am attaching the diff file.
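Roughly, the behaviour being requested (the option handling here is illustrative; the attached diff may differ):
def choose_summary(page_text, default_summary, use_content_as_summary):
    # Hypothetical option: when enabled, the page content itself becomes
    # the edit summary instead of the configured default.
    if use_content_as_summary:
        return page_text.strip()
    return default_summary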
--
You are receiving this mail because:
You are the assignee for the bug.
https://bugzilla.wikimedia.org/show_bug.cgi?id=54564
Web browser: ---
Bug ID: 54564
Summary: Ignore swiss-related articles in spellcheck.py
Product: Pywikibot
Version: unspecified
Hardware: All
OS: All
Status: NEW
Severity: normal
Priority: Unprioritized
Component: General
Assignee: Pywikipedia-bugs(a)lists.wikimedia.org
Reporter: legoktm.wikipedia(a)gmail.com
Classification: Unclassified
Mobile Platform: ---
Originally from: http://sourceforge.net/p/pywikipediabot/patches/550/
Reported by: loxley
Created on: 2012-05-12 12:22:48
Subject: Ignore swiss-related articles in spellcheck.py
Original description:
Add a switch (-ignoreswiss) to make the checker ignore articles that contain a
"<!--schweizbezogen-->" comment. This is useful on the German Wikipedia,
where spelling differs in Swiss-related articles.
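A minimal sketch of the proposed check (the option name comes from the request; the attached patch may implement it differently):
def skip_swiss_related(page_text, ignore_swiss):
    """Return True if a page marked as Swiss-related should be skipped."""
    # German Wikipedia marks Swiss-related articles with this HTML comment;
    # Swiss spelling (e.g. 'ss' instead of 'ß') is intentional there.
    return ignore_swiss and '<!--schweizbezogen-->' in page_text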
--
You are receiving this mail because:
You are the assignee for the bug.
https://bugzilla.wikimedia.org/show_bug.cgi?id=54569
Web browser: ---
Bug ID: 54569
Summary: Retrieve / edit the section
Product: Pywikibot
Version: unspecified
Hardware: All
OS: All
Status: NEW
Severity: normal
Priority: Unprioritized
Component: General
Assignee: Pywikipedia-bugs(a)lists.wikimedia.org
Reporter: legoktm.wikipedia(a)gmail.com
Classification: Unclassified
Mobile Platform: ---
Originally from: http://sourceforge.net/p/pywikipediabot/patches/521/
Reported by: lankier
Created on: 2011-07-15 10:11:55
Subject: Retrieve / edit the section
Original description:
This patch adds a new parameter 'section' to Page.get & Page.put.
(See also feature request
https://sourceforge.net/tracker/?func=detail&atid=603141&aid=3104703&group…
)
Examples:
Add a new section:
page.put('New section text', comment='New section header', section='new')
Edit the top section:
text = page.get(section=0)
page.put(text + '\n\n==New section==\nNew text', section=0)
--
You are receiving this mail because:
You are the assignee for the bug.
https://bugzilla.wikimedia.org/show_bug.cgi?id=54685
Web browser: ---
Bug ID: 54685
Summary: WindowsError: [Error 32] while renaming log file
Product: Pywikibot
Version: unspecified
Hardware: All
OS: All
Status: NEW
Severity: normal
Priority: Unprioritized
Component: General
Assignee: Pywikipedia-bugs(a)lists.wikimedia.org
Reporter: info(a)gno.de
Classification: Unclassified
Mobile Platform: ---
Getting [[Kategorie:Wikipedia:Seite mit fehlendem References-Tag]] list...
Traceback (most recent call last):
File "C:\Python27\lib\logging\handlers.py", line 78, in emit
self.doRollover()
File "C:\Python27\lib\logging\handlers.py", line 338, in doRollover
os.rename(self.baseFilename, dfn)
WindowsError: [Error 32] The process cannot access the file because it is being
used by another process
Logged from file wikipedia.py, line 9333
Getting 5 pages from wikipedia:de...
Traceback (most recent call last):
File "C:\Python27\lib\logging\handlers.py", line 78, in emit
self.doRollover()
File "C:\Python27\lib\logging\handlers.py", line 338, in doRollover
os.rename(self.baseFilename, dfn)
WindowsError: [Error 32] The process cannot access the file because it is being
used by another process
Logged from file wikipedia.py, line 9333
>>> Briefwahl <<<
Traceback (most recent call last):
File "C:\Python27\lib\logging\handlers.py", line 78, in emit
self.doRollover()
File "C:\Python27\lib\logging\handlers.py", line 338, in doRollover
os.rename(self.baseFilename, dfn)
WindowsError: [Error 32] The process cannot access the file because it is being
used by another process
Logged from file wikipedia.py, line 9333
No changes necessary: references tag found.
Traceback (most recent call last):
File "C:\Python27\lib\logging\handlers.py", line 78, in emit
self.doRollover()
File "C:\Python27\lib\logging\handlers.py", line 338, in doRollover
os.rename(self.baseFilename, dfn)
WindowsError: [Error 32] The process cannot access the file because it is being
used by another process
Logged from file wikipedia.py, line 9333
C:\pwb\compat>version.py
Pywikipedia wikipedia.py (r-1 (unknown), ???????, 2013/09/19, 07:37:28,
OUTDATED
)
Python 2.7.3 (default, Apr 10 2012, 23:24:47) [MSC v.1500 64 bit (AMD64)]
config-settings:
use_api = True
use_api_login = True
unicode test: ok
C:\pwb\compat>
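The tracebacks show os.rename() inside RotatingFileHandler.doRollover() failing with Windows error 32 (the log file is still open in another process, e.g. a second bot instance). A minimal sketch of one common workaround, assuming a custom handler is acceptable (this is not the reporter's setup and not an existing pywikipedia fix):
import logging.handlers

class TolerantRotatingFileHandler(logging.handlers.RotatingFileHandler):
    """Rotating handler that skips rollover while the file is locked."""

    def doRollover(self):
        try:
            logging.handlers.RotatingFileHandler.doRollover(self)
        except OSError:
            # WindowsError is a subclass of OSError: another process still
            # holds the log file, so keep writing to the current file and
            # let a later emit() attempt the rollover again.
            pass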
--
You are receiving this mail because:
You are the assignee for the bug.
https://bugzilla.wikimedia.org/show_bug.cgi?id=54572
Web browser: ---
Bug ID: 54572
Summary: non ascii in system messages and max retry
Product: Pywikibot
Version: unspecified
Hardware: All
OS: All
Status: ASSIGNED
Severity: normal
Priority: Unprioritized
Component: General
Assignee: Pywikipedia-bugs(a)lists.wikimedia.org
Reporter: legoktm.wikipedia(a)gmail.com
Classification: Unclassified
Mobile Platform: ---
Originally from: http://sourceforge.net/p/pywikipediabot/patches/477/
Reported by: lankier
Created on: 2010-10-22 10:26:44
Subject: non ascii in system messages and max retry
Assigned to: xqt
Original description:
This patch fixes two issues:
1. Ubuntu has non-ASCII characters in system messages.
Test:
$ sudo ifconfig eth0 down
$ cat test.py
import wikipedia
site = wikipedia.getSite()
page = wikipedia.Page(site, 'S')
text = page.get()
$ LANG=ru_RU.utf8 python test.py
Error downloading data: 'ascii' codec can't decode byte 0xd0 in position 27:
ordinal not in range(128)
Request
ru:/w/api.php?inprop=protection%7Ctalkid%7Csubjectid%7Curl%7Creadable&format=json&rvprop=content%7Cids%7Cflags%7Ctimestamp%7Cuser%7Ccomment%7Csize&prop=revisions%7Cinfo&titles=S&rvlimit=1&action=query
Retrying in 1 minutes...
^C
After the fix (added "e = unicode(str(e), locale.getpreferredencoding())"):
$ LANG=ru_RU.utf8 python test.py
<urlopen error [Errno 101] Сеть недоступна> ("Network is unreachable")
WARNING: Could not open [...]
2. Added a MaxTriesExceededError that is raised when the maximum number of
tries is exceeded.
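A hedged sketch of both changes in context (Python 2, matching the trunk code; simplified and not the attached patch):
import locale
import urllib2

class MaxTriesExceededError(Exception):
    """Raised when a request still fails after the allowed number of retries."""

def fetch(url, max_tries=5):
    for attempt in range(max_tries):
        try:
            return urllib2.urlopen(url).read()
        except IOError as e:
            # System messages may be non-ASCII byte strings (e.g. under a
            # Russian locale); decode them before building unicode output.
            message = unicode(str(e), locale.getpreferredencoding())
            print(u'WARNING: Could not open %s: %s' % (url, message))
    raise MaxTriesExceededError(url)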
--
You are receiving this mail because:
You are the assignee for the bug.
https://bugzilla.wikimedia.org/show_bug.cgi?id=54311
Web browser: ---
Bug ID: 54311
Summary: Random api errors
Product: Pywikibot
Version: unspecified
Hardware: All
OS: All
Status: NEW
Severity: normal
Priority: Unprioritized
Component: Wikidata
Assignee: Pywikipedia-bugs(a)lists.wikimedia.org
Reporter: maarten(a)mdammers.nl
Classification: Unclassified
Mobile Platform: ---
I'm running several bots and every once in a while a bot crashes:
Adding new template claim to [[wikidata:Q6614219]]
Traceback (most recent call last):
File "C:\pywikibot\core\add_template.py", line 120, in <module>
main()
File "C:\pywikibot\core\add_template.py", line 117, in main
bot.run()
File "C:\pywikibot\core\add_template.py", line 52, in run
item.addClaim(newclaim)
File "C:\pywikibot\core\pywikibot\page.py", line 2676, in addClaim
self.repo.addClaim(self, claim, bot=bot, **kwargs)
File "C:\pywikibot\core\pywikibot\site.py", line 709, in callee
return fn(self, *args, **kwargs)
File "C:\pywikibot\core\pywikibot\site.py", line 3578, in addClaim
data = req.submit()
File "C:\pywikibot\core\pywikibot\data\api.py", line 394, in submit
raise APIError(code, info, **result["error"])
pywikibot.data.api.APIError: badtoken: * '''Sorry! We could not process your
edit due to a loss of session data.'''
Please try again.
If it still does not work, try [[Special:UserLogout|logging out]] and logging
back in.
* There seems to be a problem with your login session;
this action has been canceled as a precaution against session hijacking.
Go back to the previous page, reload that page and then try again.
This exception should be caught and the addClaim should be retried.
Maybe a Wikidata bug should be filed too, to figure out why we seem to lose
session data every once in a while.
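A hedged sketch of the suggested handling (not existing pywikibot behaviour; the error-code check and the re-login strategy are assumptions):
import time
import pywikibot
from pywikibot.data import api

def add_claim_with_retry(item, claim, max_tries=3):
    """Retry addClaim when the API reports a lost session (badtoken)."""
    for attempt in range(max_tries):
        try:
            item.addClaim(claim)
            return
        except api.APIError as error:
            if error.code != 'badtoken' or attempt == max_tries - 1:
                raise
            # Session data was lost; log in again and retry after a pause.
            item.repo.login()
            time.sleep(5)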
--
You are receiving this mail because:
You are the assignee for the bug.