Bugs item #3606641, was opened at 2013-03-02 14:39
Message generated for change (Comment added) made by amird
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=3606641&group_…
Please note that this message will contain a full copy of the comment thread,
including the initial issue submission, for this request,
not just the latest update.
Category: category
Group: None
Status: Open
Resolution: None
Priority: 5
Private: No
Submitted By: alchimista ()
Assigned to: Nobody/Anonymous (nobody)
Summary: category_redirect:NameError: global name 'query_type'
Initial Comment:
I'm getting the following error:
Done checking hard-redirect category pages.
Traceback (most recent call last):
File "category_redirect.py", line 579, in <module>
main()
File "category_redirect.py", line 574, in main
bot.run()
File "category_redirect.py", line 409, in run
prop='info|categoryinfo'):
File "category_redirect.py", line 266, in query_results
querydata.update(result['query-continue'][query_type])
NameError: global name 'query_type' is not defined
It ran fine in the last week of January, so it's probably related to a change in some other file in the meantime.
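For context, a generic MediaWiki "query-continue" loop that avoids relying on a module-level query_type can be sketched as follows. This is an illustrative stand-in, not category_redirect.py's actual code: `fetch` and the parameter names are assumptions, and the continuation key is read from the server's response instead of being hard-coded.

```python
def query_results(fetch, querydata):
    """Yield pages from repeated API queries, following continuations.

    `fetch` is a stand-in for whatever performs the HTTP request and
    returns the decoded JSON result; `querydata` holds the API params.
    """
    while True:
        result = fetch(querydata)
        for page in result.get("query", {}).get("pages", {}).values():
            yield page
        cont = result.get("query-continue")
        if not cont:
            break
        # Merge every continuation sub-dict rather than indexing a
        # single (possibly undefined) query_type key.
        for params in cont.values():
            querydata.update(params)
```

Reading the continuation parameters off the response keeps the loop working even when the query mixes several modules (e.g. `info|categoryinfo`), each of which may return its own continuation sub-dict.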
----------------------------------------------------------------------
Comment By: Amir (amird)
Date: 2013-03-07 03:23
Message:
It worked for me on Persian Wikipedia
----------------------------------------------------------------------
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=3606641&group_…
Bugs item #3594792, was opened at 2012-12-11 05:37
Message generated for change (Settings changed) made by amird
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=3594792&group_…
Please note that this message will contain a full copy of the comment thread,
including the initial issue submission, for this request,
not just the latest update.
Category: cosmetic changes
Group: None
>Status: Closed
>Resolution: Fixed
Priority: 5
Private: No
Submitted By: reza (reza1615)
Assigned to: Huji Lee (huji)
Summary: cosmetic_changes.py bug in persian
Initial Comment:
Hi,
cosmetic_changes.py has a bug: line 740 changes , to ٬
In cases where the text has Latin characters, it shouldn't change it:
http://fa.wikipedia.org/w/index.php?title=%D8%A8%D9%87%D8%A7%D8%B1_%D8%B4%D…
It changed
Zaman Əsgərli, ''XIX əsr Azərbaycan şeri antologiyası'', Bakı, "Şərq-Qərb", 2005, p. 254, ISBN 9952-418-69-5
to
Zaman Əsgərli، ''XIX əsr Azərbaycan şeri antologiyası''، Bakı، "Şərq-Qərb"، 2005، p. 254، ISBN 9952-418-69-5
which is not correct. Please fix it.
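One hedged way to avoid this class of false positive is to convert a comma only when it sits in Arabic-script context. The character range and rule below are assumptions for illustration, not the actual cosmetic_changes.py logic:

```python
import re

# Convert a Latin comma to the Arabic comma U+060C only when it
# directly follows an Arabic-script character, leaving commas in
# Latin passages (names, ISBNs, bibliographies) untouched.
ARABIC = u"\u0600-\u06FF"  # assumed range; real rules may differ

def localize_commas(text):
    return re.sub(u"(?<=[%s])," % ARABIC, u"\u060C", text)
```

With a rule like this, the Azerbaijani citation above would pass through unchanged, while commas inside Persian prose would still be localized.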
----------------------------------------------------------------------
Comment By: Huji Lee (huji)
Date: 2012-12-28 12:38
Message:
Fine. I was trying to push for unit testing these before pushing them.
Amird says that these tests have been done before. I assume good faith, and
oblige. Amird has a new idea to implement.
----------------------------------------------------------------------
Comment By: xqt (xqt)
Date: 2012-12-27 06:04
Message:
This is not fixed but ignored
----------------------------------------------------------------------
Comment By: Huji Lee (huji)
Date: 2012-12-25 12:34
Message:
I meant to say
https://www.mediawiki.org/wiki/Special:Code/pywikipedia/10832
----------------------------------------------------------------------
Comment By: Huji Lee (huji)
Date: 2012-12-25 12:33
Message:
With https://www.mediawiki.org/wiki/Special:Code/pywikipedia/10788 I
completely removed the comma conversion. It appears that the false positive
rate is too high.
----------------------------------------------------------------------
Comment By: reza (reza1615)
Date: 2012-12-24 05:33
Message:
No, it has another bug:
http://fa.wikipedia.org/w/index.php?title=%D8%A2%D9%84%DB%8C%D8%A7%DA%98%D9…
it changes , to ،
----------------------------------------------------------------------
Comment By: Amir (amird)
Date: 2012-12-11 17:52
Message:
Dear Reza,
I made a change:
https://www.mediawiki.org/wiki/Special:Code/pywikipedia/10788
Update your code and run it. Is it working correctly?
----------------------------------------------------------------------
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=3594792&group_…
Bugs item #3605408, was opened at 2013-02-20 03:39
Message generated for change (Settings changed) made by amird
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=3605408&group_…
Please note that this message will contain a full copy of the comment thread,
including the initial issue submission, for this request,
not just the latest update.
Category: None
Group: None
>Status: Closed
>Resolution: Out of Date
Priority: 5
Private: No
Submitted By: Nobody/Anonymous (nobody)
Assigned to: Nobody/Anonymous (nobody)
Summary: update items in wikidata
Initial Comment:
I want to add a new language to an item on Wikidata. I use this code; it says the page is updated, but it doesn't change the page!
dataSite=wikipedia.getSite('wikidata','wikidata')
data = pywikibot.DataPage(dataSite, 'Q1234')
lang='fa'
fa_title='تست'
data.setitem(summary,items={'type': u'item', 'label': lang, 'value': fa_title})
data.setitem(summary,items={'type': u'sitelink', 'site': lang, 'title': fa_title})
----------------------------------------------------------------------
Comment By: reza (reza1615)
Date: 2013-02-22 03:20
Message:
I'm confused :)
Please write two simple examples which can:
1. add a new item to Wikidata
2. add a language, title and label to an existing item
----------------------------------------------------------------------
Comment By: xqt (xqt)
Date: 2013-02-22 02:20
Message:
You first must call
data.get()
before changing a DataPage, otherwise DataPage.title() is unknown.
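To make the call-order point concrete without the legacy pywikipedia package installed, here is a minimal stand-in (not the real DataPage class) that fails the same way when get() is skipped:

```python
class DataPage(object):
    """Stand-in mimicking the call order the thread converges on."""

    def __init__(self, repo, item_id):
        self._repo = repo
        self._id = item_id
        self._loaded = False

    def get(self):
        # In the real class this fetches the item from the API;
        # here we just mark it as loaded.
        self._loaded = True

    def title(self):
        if not self._loaded:
            raise RuntimeError("call get() before title(): item not loaded")
        return self._id

    def setitem(self, summary, items):
        if not self._loaded:
            raise RuntimeError("call get() before setitem()")
        return items

data = DataPage(None, 'Q1234')
data.get()  # fetch first, as suggested above
assert data.title() == 'Q1234'
```

Skipping get() here raises immediately, which mirrors the reported symptom of the bot updating [[wikidata:None]] because the title was never resolved.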
----------------------------------------------------------------------
Comment By: reza (reza1615)
Date: 2013-02-21 09:14
Message:
As you said, I wrote the code:
#!/usr/bin/python
# -*- coding: utf-8 -*-
import wikipedia
lang='fa'
fa_title=u'آبشار لاتون'
summary=u'آبشار لاتون,آبشار لاتون'
site=wikipedia.getSite('fa',fam='wikipedia')
page=wikipedia.Page(site,fa_title)
data=wikipedia.DataPage(page)
data.setitem(summary,items={'type': u'item', 'label': lang, 'value': fa_title})
data.setitem(summary,items={'type': u'sitelink', 'site': lang, 'title': fa_title})
it shows
Sleeping for 19.6 seconds, 2013-02-21 17:10:27
Updating page [[wikidata:None]] via API
Sleeping for 19.6 seconds, 2013-02-21 17:10:47
Updating page [[wikidata:None]] via API
it doesn't edit Wikidata and doesn't add the fa interwiki to Q5058182
----------------------------------------------------------------------
Comment By: Amir (amird)
Date: 2013-02-21 08:46
Message:
By the way: there is a difference between changing an entity and making an item. I'm
working on making an item via the API, but John Blad (API manager of Wikidata)
is on vacation. I'll add this feature to PWB ASAP.
----------------------------------------------------------------------
Comment By: Amir (amird)
Date: 2013-02-21 08:42
Message:
I know the difference between Wikipedia and Wikidata. When I say do it,
trust me and do it, for God's sake. If it didn't work, tell me.
----------------------------------------------------------------------
Comment By: reza (reza1615)
Date: 2013-02-21 08:30
Message:
This is my code:
#!/usr/bin/python
# -*- coding: utf-8 -*-
import wikipedia
dataSite=wikipedia.getSite('wikidata','wikidata')
lang='fa'
fa_title=u'قنات آب خنک، بیرجند'
summary=u'قنات آب خنک، بیرجند, قنات آب خنک، بیرجند'
page=wikipedia.Page(dataSite,fa_title)
data = wikipedia.DataPage(page)
data.setitem(summary,items={'type': u'item', 'label': lang, 'value': fa_title})
data.setitem(summary,items={'type': u'sitelink', 'site': lang, 'title': fa_title})
it shows me this message
Sleeping for 9.6 seconds, 2013-02-21 16:27:25
Updating page [[wikidata:قنات آب خنک، بیرجند]] via API
Sleeping for 9.6 seconds, 2013-02-21 16:27:35
Updating page [[wikidata:قنات آب خنک، بیرجند]] via API
but it doesn't edit Wikidata! And if you check this page, you cannot find it!
----------------------------------------------------------------------
Comment By: reza (reza1615)
Date: 2013-02-21 08:13
Message:
@amir why should I write
site=wikipedia.getSite('fa',fam='wikipedia')
I want to edit Wikidata, not fa.wiki.
----------------------------------------------------------------------
Comment By: xqt (xqt)
Date: 2013-02-20 23:49
Message:
you may also use
data = wikipedia.DataPage(1234)
for creating the repository Page
or
site = wikipedia.getSite()
data = wikipedia.DataPage(site.data_repository(), "Q1234")
----------------------------------------------------------------------
Comment By: Amir (amird)
Date: 2013-02-20 04:03
Message:
you must run it in this way:
site=wikipedia.getSite('fa',fam='wikipedia')
page=wikipedia.Page(site,'تست')
data=wikipedia.DataPage(page)
data.setitem(summary,items={'type': u'item', 'label': lang, 'value': fa_title})
data.setitem(summary,items={'type': u'sitelink', 'site': lang, 'title': fa_title})
----------------------------------------------------------------------
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=3605408&group_…
Bugs item #3605546, was opened at 2013-02-21 08:04
Message generated for change (Settings changed) made by amird
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=3605546&group_…
Please note that this message will contain a full copy of the comment thread,
including the initial issue submission, for this request,
not just the latest update.
Category: interwiki
Group: None
>Status: Closed
Resolution: Fixed
Priority: 5
Private: No
Submitted By: reza (reza1615)
Assigned to: Nobody/Anonymous (nobody)
Summary: interwiki.py for other namespaces for en, it, hu, he
Initial Comment:
Hi, after https://www.mediawiki.org/wiki/Special:Code/pywikipedia/11073 we cannot add interwiki links to categories, and according to Wikidata we cannot add non-article pages to Wikidata.
Please solve this problem: now many categories need interwiki links, and we have made many categories (by bot) which are only connected to en.wiki!
----------------------------------------------------------------------
Comment By: JAn (jandudik)
Date: 2013-02-21 13:53
Message:
And is there some way to add interwiki links using the pywikipedia bot?
How?
There is nothing in the documentation...
----------------------------------------------------------------------
Comment By: JAn (jandudik)
Date: 2013-02-21 13:52
Message:
http://www.wikidata.org/wiki/Wikidata:Requests_for_comment/Inclusion_of_non…
<cite>I believe that we have now reached a strong consensus: pages of all
namespaces except User: are allowed into Wikidata</cite>
----------------------------------------------------------------------
Comment By: reza (reza1615)
Date: 2013-02-21 09:32
Message:
Wikidata:Requests for comment/Inclusion of non-article pages
is the link to the decision not to add other namespaces! So we should use the
interwiki bot for other namespaces.
----------------------------------------------------------------------
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=3605546&group_…
Bugs item #3606932, was opened at 2013-03-05 08:00
Message generated for change (Comment added) made by nettrom
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=3606932&group_…
Please note that this message will contain a full copy of the comment thread,
including the initial issue submission, for this request,
not just the latest update.
Category: rewrite
Group: rewrite
>Status: Closed
>Resolution: Duplicate
Priority: 5
Private: No
Submitted By: Morten Wang (nettrom)
Assigned to: Russell Blau (russblau)
Summary: Namespace alias error with preloading and page.exists()
Initial Comment:
As of revision 11144, the following attempt to preload a page in a specific aliased namespace on Portuguese Wikipedia fails:
import pywikibot
site = pywikibot.getSite('pt')
site.login()
from pywikibot.pagegenerators import PreloadingGenerator, PagesFromTitlesGenerator
pageGen = PreloadingGenerator(PagesFromTitlesGenerator([u"Usuário:Vitorvicentevalente"], site=site))
for page in pageGen:
    print page.title()
Gives the following output:
Retrieving 1 pages from wikipedia:pt.
WARNING: preloadpages: Query returned unexpected title 'Usuário:Vitorvicentevalente'
Creating a Page object and asking if the page exists also fails:
import pywikibot
site = pywikibot.getSite('pt')
site.login()
page = pywikibot.Page(site, u"Usuário:Vitorvicentevalente")
page.exists()
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "pywikibot/page.py", line 417, in exists
return self.site.page_exists(self)
File "pywikibot/site.py", line 1180, in page_exists
return page._pageid > 0
AttributeError: 'Page' object has no attribute '_pageid'
Output of python scripts/version.py
Pywikibot (r10326 (pywikibot/__init__.py), 2012/06/08, 12:08:53, OUTDATED)
Python 2.7.1 (r271:86832, Feb 8 2011, 09:38:37)
[GCC 4.2.3]
unicode test: triggers problem #3081100
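The warning suggests the preloader matches requested titles against returned titles by raw string equality, which a namespace alias like pt "Usuário:" defeats. A sketch of alias-insensitive matching follows; the alias table is illustrative and incomplete, not pt.wikipedia's real namespace data:

```python
# Hypothetical alias table: several surface forms map to one
# canonical namespace id (2 = User), so aliased titles compare equal.
NAMESPACE_ALIASES = {
    u"usuário": 2, u"usuário(a)": 2, u"user": 2,
}

def canonical_key(title):
    """Return (namespace id, page name) so aliases compare equal."""
    if u":" in title:
        ns, _, rest = title.partition(u":")
        ns_id = NAMESPACE_ALIASES.get(ns.strip().lower())
        if ns_id is not None:
            return (ns_id, rest.strip())
    # No recognized namespace prefix: treat as main namespace.
    return (0, title.strip())
```

Comparing `canonical_key(requested) == canonical_key(returned)` instead of the raw strings would keep "Usuário:X" and "User:X" from being flagged as unexpected.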
----------------------------------------------------------------------
>Comment By: Morten Wang (nettrom)
Date: 2013-03-05 08:09
Message:
Duplicate of bug #3606570
User error led to it being submitted again, sorry!
----------------------------------------------------------------------
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=3606932&group_…
Patches item #3606837, was opened at 2013-03-04 19:02
Message generated for change (Comment added) made by xqt
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603140&aid=3606837&group_…
Please note that this message will contain a full copy of the comment thread,
including the initial issue submission, for this request,
not just the latest update.
Category: None
Group: None
>Status: Pending
>Resolution: Rejected
Priority: 5
Private: No
Submitted By: Hazard-SJ (hazard-sj)
>Assigned to: xqt (xqt)
Summary: Alternative to (and per) r10976
Initial Comment:
See https://www.mediawiki.org/wiki/Special:Code/pywikipedia/10976 and patch
----------------------------------------------------------------------
>Comment By: xqt (xqt)
Date: 2013-03-05 00:21
Message:
Look around some scripts: there are a lot of variants of speedy deletion
request templates. This should be handled by an L10N file, and the family file
is a good place for that, rather than hard-coded tries of some variants.
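The suggestion could look roughly like this as an L10N-style mapping in a family file; the template names below are purely illustrative, not the real per-wiki lists:

```python
# Hypothetical per-language speedy-deletion template names, kept in
# one mapping instead of hard-coded tries scattered across scripts.
SPEEDY_DELETION_TEMPLATES = {
    'en': [u'Db', u'Delete'],
    'de': [u'Löschen'],
    'fa': [u'حذف سریع'],
}

def speedy_templates(lang, default_lang='en'):
    """Look up a language's templates, falling back to a default."""
    return SPEEDY_DELETION_TEMPLATES.get(
        lang, SPEEDY_DELETION_TEMPLATES[default_lang])
```

Scripts would then ask the family for `speedy_templates(site.lang)` rather than guessing variants, and adding a wiki becomes a one-line data change.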
----------------------------------------------------------------------
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603140&aid=3606837&group_…