https://bugzilla.wikimedia.org/show_bug.cgi?id=55020
Web browser: ---
Bug ID: 55020
Summary: webcitearchiver.py
Product: Pywikibot
Version: unspecified
Hardware: All
OS: All
Status: NEW
Severity: enhancement
Priority: Unprioritized
Component: General
Assignee: Pywikipedia-bugs(a)lists.wikimedia.org
Reporter: legoktm.wikipedia(a)gmail.com
Classification: Unclassified
Mobile Platform: ---
Originally from: http://sourceforge.net/p/pywikipediabot/feature-requests/324/
Reported by: n-fran
Created on: 2013-01-23 15:58:51
Subject: webcitearchiver.py
Original description:
References to Internet resources often go dead, and information on external sites is often moved. A bot is therefore badly needed that would create archive copies of web pages on WebCite or a similar service. The principle of its operation:
http://ru.wikipedia.org/w/index.php?title=%D0%A0%D0%B0%D0%B9%D0%BA%D0%BE%D0…
The Russian Wikipedia has User:WebCite Archiver, and the English Wikipedia has User:WebCiteBOT. In addition, the English Wikipedia has User:lowercase sigmabot III, written in Python. The source code is here:
https://github.com/legoktm/webcite-bot
But setting up and using lowercase sigmabot III is practical only for experienced Python programmers. We would like a version of the bot that novice programmers and non-programmers could understand, and which, like all Pywikipediabot scripts, requires no special skills to use.
Thanks.
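The core of such a script would be submitting each external link to the archiving service. The sketch below only builds the submission URL; the endpoint and parameter names are illustrative assumptions about WebCite's interface, not a verified API.

```python
from urllib.parse import urlencode

def build_archive_request(target_url, email):
    # Assumed WebCite submission endpoint and parameters (not verified).
    base = "http://www.webcitation.org/archive"
    query = urlencode({"url": target_url, "email": email})
    return base + "?" + query
```

A real bot would issue this request for every dead external link it finds and then replace or supplement the reference with the archived copy.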
--
You are receiving this mail because:
You are the assignee for the bug.
https://bugzilla.wikimedia.org/show_bug.cgi?id=55019
Web browser: ---
Bug ID: 55019
Summary: standardize_notes.py
Product: Pywikibot
Version: unspecified
Hardware: All
OS: All
Status: NEW
Severity: enhancement
Priority: Unprioritized
Component: General
Assignee: Pywikipedia-bugs(a)lists.wikimedia.org
Reporter: legoktm.wikipedia(a)gmail.com
Classification: Unclassified
Mobile Platform: ---
Originally from: http://sourceforge.net/p/pywikipediabot/feature-requests/325/
Reported by: n-fran
Created on: 2013-01-24 13:21:43
Subject: standardize_notes.py
Original description:
The bot is old but still wanted: it can convert ref links into {{cite web}}, and such web links can then be archived with the help of WebCiteBOT. But the script needs to be updated. An example of its work at the moment:
http://u.to/WZHLAg
What should be corrected:
1) The template {{web reference}} is no longer used; the template {{cite web}} is used instead.
2) The template {{subst:Footnote3text}} is no longer used.
3) The bot creates a 'Notes' section even when this section is already there.
Thanks.
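Point 1 amounts to a template rename. A minimal sketch of that step, assuming the parameters of {{web reference}} carry over unchanged to {{cite web}}:

```python
import re

def web_reference_to_cite_web(text):
    # Rewrite the deprecated {{web reference}} opening to {{cite web}},
    # leaving the template's parameters untouched.
    return re.sub(r"\{\{\s*web reference\s*([|}])",
                  lambda m: "{{cite web" + m.group(1),
                  text, flags=re.IGNORECASE)
```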
https://bugzilla.wikimedia.org/show_bug.cgi?id=55018
Web browser: ---
Bug ID: 55018
Summary: standardize_notes.py encoding
Product: Pywikibot
Version: unspecified
Hardware: All
OS: All
Status: NEW
Severity: enhancement
Priority: Unprioritized
Component: General
Assignee: Pywikipedia-bugs(a)lists.wikimedia.org
Reporter: legoktm.wikipedia(a)gmail.com
Classification: Unclassified
Mobile Platform: ---
Originally from: http://sourceforge.net/p/pywikipediabot/feature-requests/327/
Reported by: n-fran
Created on: 2013-01-25 14:38:46
Subject: standardize_notes.py encoding
Original description:
If I add Russian text to the script, I get this error:
UnicodeDecodeError: 'ascii' codec can't decode byte 0xd0 in position 0: ordinal not in range(128)
To avoid this error, I think it is necessary to add these (or similar) lines to the bot's code:
# -*- coding: utf-8 -*-
import sys
reload(sys)
sys.setdefaultencoding('utf-8')
After that my bot started to work. Thanks.
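For reference, setdefaultencoding is widely considered a risky workaround, since it changes decoding behaviour for every library in the process; the more common Python 2 fix is to decode byte strings explicitly, and in Python 3 the problem disappears because str is Unicode by default. A small sketch of the Python 3 behaviour:

```python
# In Python 3, string literals are Unicode, so Russian text needs no
# decode step; explicit encode/decode is only needed at I/O boundaries.
text = "Привет"
encoded = text.encode("utf-8")   # bytes for files/network
decoded = encoded.decode("utf-8")
```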
https://bugzilla.wikimedia.org/show_bug.cgi?id=55148
Web browser: ---
Bug ID: 55148
Summary: update wikidata's item one time!
Product: Pywikibot
Version: unspecified
Hardware: All
OS: All
Status: NEW
Severity: normal
Priority: Unprioritized
Component: General
Assignee: Pywikipedia-bugs(a)lists.wikimedia.org
Reporter: legoktm.wikipedia(a)gmail.com
Classification: Unclassified
Mobile Platform: ---
Originally from: http://sourceforge.net/p/pywikipediabot/bugs/1589/
Reported by: reza1615
Created on: 2013-02-27 10:18:28
Subject: update wikidata's item one time!
Original description:
Right now, to add the data below we would have to edit the item 5 times! That clutters the history (imagine updating an item with 60 interwikis!):
lang:en > label:foo
lang:de > label:foo
lang:fa > label:foo
lang:ru > label:foo
lang:nl > label:foo
setitem should accept items in this style:
items=[{lang:en,label:foo},{lang:de,label:foo},{lang:fa,label:foo},{lang:ru,label:foo},{lang:nl,label:foo}]
The same should apply to sitelinks, descriptions, and aliases.
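The requested batched call could collect the per-language values into one payload in the shape the wbeditentity API expects, so the item is written in a single edit. The helper name below is an illustration, not an existing Pywikibot function:

```python
def build_edit_payload(labels=None, descriptions=None):
    """Collect per-language values into a single wbeditentity-style
    payload, so the item is updated in one edit instead of N."""
    data = {}
    if labels:
        data["labels"] = {lang: {"language": lang, "value": v}
                          for lang, v in labels.items()}
    if descriptions:
        data["descriptions"] = {lang: {"language": lang, "value": v}
                                for lang, v in descriptions.items()}
    return data
```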
https://bugzilla.wikimedia.org/show_bug.cgi?id=55004
Web browser: ---
Bug ID: 55004
Summary: claimit.py sample: add options for entry of time
values
Product: Pywikibot
Version: unspecified
Hardware: All
OS: All
Status: NEW
Severity: enhancement
Priority: Unprioritized
Component: General
Assignee: Pywikipedia-bugs(a)lists.wikimedia.org
Reporter: legoktm.wikipedia(a)gmail.com
Classification: Unclassified
Mobile Platform: ---
Originally from: http://sourceforge.net/p/pywikipediabot/feature-requests/342/
Reported by: apac1
Created on: 2013-08-29 03:52:03.793000
Subject: claimit.py sample: add options for entry of time values
Original description:
Add support for the currently unsupported time datatype.
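For context, a claim with the time datatype needs a structured value like the one below; the field names follow the Wikibase data model (calendar model Q1985727 is the proleptic Gregorian calendar), though the exact shape should be checked against the current API.

```python
def make_time_value(iso_date, precision=11):
    # Build a Wikibase-style time value; precision 11 = day,
    # 10 = month, 9 = year, per the Wikibase data model.
    return {
        "time": "+{}T00:00:00Z".format(iso_date),
        "timezone": 0,
        "before": 0,
        "after": 0,
        "precision": precision,
        "calendarmodel": "http://www.wikidata.org/entity/Q1985727",
    }
```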
https://bugzilla.wikimedia.org/show_bug.cgi?id=55001
Web browser: ---
Bug ID: 55001
Summary: claimit.py sample: add option to read from file (list
of items and properties)
Product: Pywikibot
Version: unspecified
Hardware: All
OS: All
Status: NEW
Severity: enhancement
Priority: Unprioritized
Component: General
Assignee: Pywikipedia-bugs(a)lists.wikimedia.org
Reporter: legoktm.wikipedia(a)gmail.com
Classification: Unclassified
Mobile Platform: ---
Originally from: http://sourceforge.net/p/pywikipediabot/feature-requests/345/
Reported by: apac1
Created on: 2013-08-29 03:57:41.898000
Subject: claimit.py sample: add option to read from file (list of items and
properties)
Original description:
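The request does not specify an input format; one plausible layout is one claim per line, e.g. "Q42 P31 Q5" meaning add P31=Q5 to item Q42. A sketch of the parsing step under that assumption:

```python
def parse_claim_lines(lines):
    # Parse whitespace-separated "item property target" lines,
    # skipping blanks and comment lines; the format is an assumption.
    claims = []
    for line in lines:
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        item, prop, target = line.split()
        claims.append((item, prop, target))
    return claims
```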
https://bugzilla.wikimedia.org/show_bug.cgi?id=55012
Web browser: ---
Bug ID: 55012
Summary: pagegenerators.py -new for other namespaces
Product: Pywikibot
Version: unspecified
Hardware: All
OS: All
Status: NEW
Severity: enhancement
Priority: Unprioritized
Component: General
Assignee: Pywikipedia-bugs(a)lists.wikimedia.org
Reporter: legoktm.wikipedia(a)gmail.com
Classification: Unclassified
Mobile Platform: ---
Originally from: http://sourceforge.net/p/pywikipediabot/feature-requests/334/
Reported by: cdpark
Created on: 2013-03-27 02:38:37
Subject: pagegenerators.py -new for other namespaces
Original description:
"pagegenerators.py -new -ns:14" does not work now. It was generated new
caterogories in past.
It would be helpful if command line supports namespace argument \(for
monitoring, ...\)
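Until -new learns a namespace option, the behaviour can be approximated by filtering any new-pages iterator on namespace (14 is the Category namespace). The filter itself is plain Python; the stand-in page class below exists only for demonstration:

```python
def filter_namespace(pages, namespace):
    # Keep only pages in the requested namespace; works with any
    # iterable of objects exposing namespace(), such as Pywikibot pages.
    return (p for p in pages if p.namespace() == namespace)

class FakePage:
    """Tiny stand-in for a Page object, for demonstration only."""
    def __init__(self, ns):
        self._ns = ns
    def namespace(self):
        return self._ns
```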
https://bugzilla.wikimedia.org/show_bug.cgi?id=55009
Web browser: ---
Bug ID: 55009
Summary: Manual mode in fixing_redirects.py
Product: Pywikibot
Version: unspecified
Hardware: All
OS: All
Status: NEW
Severity: enhancement
Priority: Unprioritized
Component: General
Assignee: Pywikipedia-bugs(a)lists.wikimedia.org
Reporter: legoktm.wikipedia(a)gmail.com
Classification: Unclassified
Mobile Platform: ---
Originally from: http://sourceforge.net/p/pywikipediabot/feature-requests/337/
Reported by: n-fran
Created on: 2013-04-09 11:43:17
Subject: Manual mode in fixing_redirects.py
Original description:
The script needs an option for manual, not automatic, operation. Reasons:
1) Say B is a redirect and A is the article name. When the bot scans articles, it automatically changes [[B]] to [[A|B]], which brings no benefit and can make the wikitext harder to read for inexperienced users. In my opinion, when [[B]] is the name of a redirect, the bot should not change the page.
2) In the Russian Wikipedia it is not recommended to fix redirects in some cases. Other language editions probably have similar rules.
http://ru.wikipedia.org/wiki/%D0%92%D0%B8%D0%BA%D0%B8%D0%BF%D0%B5%D0%B4%D0%…
Thanks.
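Both points can be sketched as a single decision helper: skip the no-benefit case from point 1, and otherwise ask the operator when manual mode is on. The prompt hook is generic Python, not the Pywikibot input API:

```python
def should_fix(link_text, redirect_title, manual, ask=input):
    # Point 1: [[B]] where B is the redirect's own name gains nothing
    # from becoming [[A|B]], so leave it alone.
    if link_text == redirect_title:
        return False
    # Manual mode: let the operator decide case by case.
    if manual:
        return ask("Fix this link? [y/N] ").strip().lower() == "y"
    return True
```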
https://bugzilla.wikimedia.org/show_bug.cgi?id=55005
Web browser: ---
Bug ID: 55005
Summary: Create pagegenerators for Wikidata (items from file,
items using a given property, items linking to items,
etc)
Product: Pywikibot
Version: unspecified
Hardware: All
OS: All
Status: ASSIGNED
Severity: enhancement
Priority: Unprioritized
Component: General
Assignee: Pywikipedia-bugs(a)lists.wikimedia.org
Reporter: legoktm.wikipedia(a)gmail.com
Classification: Unclassified
Mobile Platform: ---
Originally from: http://sourceforge.net/p/pywikipediabot/feature-requests/341/
Reported by: apac1
Created on: 2013-08-29 03:50:31.994000
Subject: Create pagegenerators for Wikidata (items from file, items using a
given property, items linking to items, etc)
Assigned to: legoktm
Original description:
Currently pagegenerators.py doesn't seem to work for Wikidata. Either existing
generators would need to be adapted or new ones created.
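One of the simplest requested generators, items from a file, reduces to reading Q-ids; the line format below (one id per line) is an assumption. In a real generator each id would then be wrapped in an ItemPage:

```python
def item_ids(lines):
    # Extract Wikidata item ids (Q...) from text lines, one per line;
    # each id would then become pywikibot.ItemPage(repo, qid).
    ids = []
    for line in lines:
        qid = line.strip()
        if qid.startswith("Q") and qid[1:].isdigit():
            ids.append(qid)
    return ids
```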
https://bugzilla.wikimedia.org/show_bug.cgi?id=54745
Web browser: ---
Bug ID: 54745
Summary: Page.iterlanglinks for a page on commons returns pages
on commons
Product: Pywikibot
Version: unspecified
Hardware: All
OS: All
Status: NEW
Severity: normal
Priority: Unprioritized
Component: General
Assignee: Pywikipedia-bugs(a)lists.wikimedia.org
Reporter: legoktm.wikipedia(a)gmail.com
Classification: Unclassified
Mobile Platform: ---
>>> import pywikibot as p
>>> s=p.Site('commons','commons')
>>> pg=p.Page(s, 'New York City')
>>> for i in pg.iterlanglinks(): print i.site
...
commons:commons
<snip>
There are a few things working together to cause this:
iterlanglinks calls Site.pagelanglinks, which does:
    yield pywikibot.Link.langlinkUnsafe(linkdata['lang'],
                                        linkdata['*'],
                                        source=self)
In langlinkUnsafe, there is:
    link._site = pywikibot.Site(lang, source.family.name)
Now, unfortunately for commons:
>>> p.Site('en','commons')
Site("commons", "commons")
Another issue is that
https://commons.wikimedia.org/w/api.php?action=query&titles=New%20York%20Ci…
(the actual API query we make) only returns language codes, not full database
names.
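A fix would need langlinkUnsafe to special-case single-site families: a language prefix in a langlink on commons cannot resolve within the commons family. The family set and the fallback to wikipedia below are illustrative assumptions, not Pywikibot's actual lookup:

```python
# Families with one site rather than one site per language; langlink
# prefixes on these cannot be resolved within the same family.
SINGLE_SITE_FAMILIES = {"commons", "meta", "wikidata", "species"}

def langlink_family(lang, source_family):
    # Assumed fallback: treat langlinks from single-site families as
    # pointing at the corresponding Wikipedia.
    if source_family in SINGLE_SITE_FAMILIES:
        return "wikipedia"
    return source_family
```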