Support Requests item #3610265, was opened at 2013-04-07 18:52
Message generated for change (Tracker Item Submitted) made by cdpark
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603139&aid=3610265&group_…
Please note that this message will contain a full copy of the comment thread,
including the initial issue submission, for this request,
not just the latest update.
Category: None
Group: None
Status: Open
Priority: 5
Private: No
Submitted By: ChongDae (cdpark)
Assigned to: Nobody/Anonymous (nobody)
Summary: Support for Wikimedia Labs and Toolserver
Initial Comment:
1. Source directory repository
Split the pywikipedia source from the user config/scripts. See also https://wikitech.wikimedia.org/wiki/Nova_Resource_Talk:Bots#Pywikipediabot
Currently, pywikipedia imports (or includes) other files from (1) the source code directory, (2) the current working directory, or (3) the pywikipedia environment, depending on the code. How about supporting a PYWIKIPEDIA_HOME (or similar) environment variable for the source code repository?
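A minimal sketch of the lookup order this proposal suggests. PYWIKIPEDIA_HOME is the proposed, not-yet-existing variable, and the function name is illustrative:

```python
import os
import sys

def config_base_dir():
    """Return the first directory containing user-config.py.

    Sketch of the proposal: an explicit PYWIKIPEDIA_HOME environment
    variable (the suggested name) wins over the current working
    directory, which wins over the script's own directory, so the
    source checkout and the user config can live apart."""
    candidates = [os.environ.get('PYWIKIPEDIA_HOME'),
                  os.getcwd(),
                  os.path.dirname(os.path.abspath(sys.argv[0]))]
    for directory in candidates:
        if directory and os.path.isfile(os.path.join(directory, 'user-config.py')):
            return directory
    raise RuntimeError('user-config.py not found in any candidate directory')
```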
2. Local MySQL mirror support
On WMLabs and the Toolserver, access to a Wikipedia mirror is available. Page.get() and pagegenerators could use this information. (BTW, Page.put() should still write to the original site, not the mirrors.) It would also be useful for other MediaWiki installations if an admin runs a pywikipedia-based script (e.g. replace.py) against his own database.
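A sketch of the mirror-read idea, using sqlite3 as a stand-in for the Labs/Toolserver MySQL replicas; the table layout and the `fetch_live` callback are hypothetical, not the real replica schema:

```python
import sqlite3

def get_via_mirror(conn, title, fetch_live):
    """Read page text from a local database mirror, falling back to the
    live site when the title is absent.

    Sketch only: the `page(title, text)` table is illustrative, and
    `fetch_live` stands in for a normal API fetch.  Writes (Page.put())
    must always go to the original wiki, never the mirror."""
    row = conn.execute('SELECT text FROM page WHERE title = ?',
                       (title,)).fetchone()
    return row[0] if row is not None else fetch_live(title)
```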
----------------------------------------------------------------------
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603139&aid=3610265&group_…
Bugs item #3609967, was opened at 2013-04-03 18:32
Message generated for change (Comment added) made by amird
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=3609967&group_…
Category: None
Group: None
Status: Open
Resolution: None
Priority: 5
Private: No
Submitted By: ChongDae (cdpark)
Assigned to: Amir (amird)
Summary: interwiki() fails for closed wikipedia
Initial Comment:
Page.interwiki() and PageData.interwiki() fail when Wikidata contains sitelinks to closed Wikipedias.
For example, the following code is broken now.
---------------------------------------------------------------------------------------------------
#!/usr/bin/python
# -*- coding: utf-8 -*-
import pywikibot
en = pywikibot.getSite('en', 'wikipedia')
mainpage = pywikibot.Page(en, u'Main Page')
interwiki = mainpage.interwiki()
print interwiki
--------------------------------------------------------------------------------------------------------------
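One way to keep interwiki() from failing on such sitelinks is to filter out codes the framework no longer knows before constructing Site objects; the names below are illustrative, not pywikipedia's actual API:

```python
def split_sitelinks(sitelinks, known_codes):
    """Separate Wikidata sitelinks into usable links and links that
    point at wikis the framework cannot construct a Site for (e.g. a
    closed Wikipedia), instead of letting interwiki() fail on the
    first bad code.

    `sitelinks` maps a language code to a page title; `known_codes`
    is the set of codes the framework still recognises."""
    kept, skipped = {}, []
    for code, title in sitelinks.items():
        if code in known_codes:
            kept[code] = title
        else:
            skipped.append(code)
    return kept, skipped
```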
----------------------------------------------------------------------
>Comment By: Amir (amird)
Date: 2013-04-07 16:36
Message:
Sorry, I misunderstood, but every time I run the code on Main Page the bot returns "invalid title", because the interwikis are invalid titles:
http://en.wikipedia.org/w/index.php?title=Template:Main_Page_interwikis&act…
Can you give me another example (e.g. a test userspace page)?
----------------------------------------------------------------------
Comment By: Amir (amird)
Date: 2013-04-07 16:30
Message:
The problem is from the API:
http://en.wikipedia.org/w/api.php?action=query&format=jsonfm&titles=Main%20…
in comparison with:
http://en.wikipedia.org/w/api.php?action=query&format=jsonfm&titles=Persian…
legoktm: I think your idea is good but unrelated; none of these wikis are obsolete.
----------------------------------------------------------------------
Comment By: Legoktm (legoktm)
Date: 2013-04-06 13:49
Message:
I posted this on the ML about a month ago
(http://lists.wikimedia.org/pipermail/pywikipedia-l/2013-March/007766.html)
and assumed that, because no one responded, it wasn't a big deal, so I didn't bother fixing it.
This issue is about more than just interwiki links: we prevent read (and, for stewards, write) access to a locked (but still readable) wiki.
I think the best way to fix it would be to split that list of obsolete wikis into three: locked, deleted, and obsolete (renamed, kept for backwards compatibility). Any wiki that is deleted or obsolete would throw an error, but a locked one would not.
(Setting group to none since this also affects rewrite branch)
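The three-way split proposed above might look roughly like this; the codes are examples only, not the real family data:

```python
# Example classification of formerly-active wikis -- illustrative codes.
LOCKED = {'aa', 'ho'}    # closed for editing, but still readable
DELETED = {'mo'}         # no longer served at all
RENAMED = {'dk'}         # old code kept for backwards compatibility

def check_wiki_code(code):
    """Raise only for wikis that genuinely cannot be used.

    A locked wiki stays accessible for reading (and, for stewards,
    writing); deleted and renamed/obsolete codes raise an error."""
    if code in DELETED or code in RENAMED:
        raise ValueError('wiki %r is deleted or obsolete' % code)
    return code
```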
----------------------------------------------------------------------
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=3609967&group_…
Bugs item #3609967, was opened at 2013-04-03 18:32
Message generated for change (Settings changed) made by amird
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=3609967&group_…
Category: interwiki
Group: trunk
Status: Open
Resolution: None
Priority: 5
Private: No
Submitted By: ChongDae (cdpark)
>Assigned to: Amir (amird)
Summary: interwiki() fails for closed wikipedia
----------------------------------------------------------------------
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=3609967&group_…
Bugs item #3610073, was opened at 2013-04-05 03:25
Message generated for change (Comment added) made by xqt
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=3610073&group_…
Category: other
Group: None
>Status: Closed
>Resolution: Fixed
Priority: 5
Private: No
Submitted By: Nobody/Anonymous (nobody)
>Assigned to: xqt (xqt)
Summary: isDisambig() does not work for some pages
Initial Comment:
The following code returns "False". For "Anemone (disambiguation)", it returns True.
-----------------------------------------------------------------------------------------------
import pywikibot
site = pywikibot.getSite('en')
page = pywikibot.Page(site, u'Missouri (disambiguation)')
print page.isDisambig()
---------------------------------------------------------------------------------------------------
$ python version.py
Pywikipedia [http] trunk/pywikipedia (r11339, 2013/04/04, 17:30:35, ok)
Python 2.7.1 (r271:86832, Jan 4 2011, 13:57:14)
[GCC 4.5.2]
config-settings:
use_api = True
use_api_login = True
unicode test: ok
---------------------------------------------------------------------------------------------------
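For reference, the membership test isDisambig() performs can be sketched as below. The normalisation shown is a guess at the kind of mismatch behind this report; the actual fix in r11347 may differ:

```python
def is_disambig(page_templates, disambig_templates):
    """Sketch of the check behind isDisambig(): a page counts as a
    disambiguation page when one of its templates appears in the
    family's disambiguation-template list.

    Normalising case and underscores on both sides guards against
    mismatches like 'Disambiguation' vs 'disambiguation'."""
    def normalise(name):
        return name.lower().replace('_', ' ')
    known = {normalise(t) for t in disambig_templates}
    return any(normalise(t) in known for t in page_templates)
```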
----------------------------------------------------------------------
>Comment By: xqt (xqt)
Date: 2013-04-06 06:49
Message:
fixed in r11347
----------------------------------------------------------------------
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=3610073&group_…
Support Requests item #3602096, was opened at 2013-01-25 04:13
Message generated for change (Comment added) made by n-fran
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603139&aid=3602096&group_…
Category: None
Group: None
Status: Open
Priority: 5
Private: No
Submitted By: Анима (n-fran)
Assigned to: Nobody/Anonymous (nobody)
Summary: When will archivebot.py and weblinkchecker.py start working?
Initial Comment:
1) archivebot.py
I put in the parameters:
|algo = old(1d)
Several days have passed, but it still prints:
Processing 10 threads
There are only 0 Threads. Skipped
When will it be back up?
2) weblinkchecker.py
I set the parameter -day:1. Several days have passed, but the bot is doing nothing.
And errors occur. I think the problem is not only in my family file, because the same error occurs when I run the bot on the Russian Wikipedia.
http://pastebin.com/x1zQipmU
Thanks.
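As a sanity check of the parameter above, the old(1d) syntax itself is well-formed; a sketch of how such an algo value can be parsed into an age threshold (days/hours only here; the real script accepts more units):

```python
import re
from datetime import timedelta

def parse_algo(value):
    """Parse an archivebot-style algo value, e.g. 'old(1d)', into a
    timedelta age threshold.  Sketch only: just the 'd' (days) and
    'h' (hours) units are handled."""
    match = re.match(r'^old\((\d+)([dh])\)$', value)
    if match is None:
        raise ValueError('unsupported algo value: %r' % value)
    count, unit = int(match.group(1)), match.group(2)
    return timedelta(days=count) if unit == 'd' else timedelta(hours=count)
```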
----------------------------------------------------------------------
>Comment By: Анима (n-fran)
Date: 2013-04-04 01:58
Message:
After 3 months... What are my errors?
http://pastebin.ru/6u5etpBc
http://pastebin.com/BSkXmFaa
----------------------------------------------------------------------
Comment By: Анима (n-fran)
Date: 2013-01-25 04:15
Message:
Pywikipedia (r10976 (wikipedia.py), 2013/01/23, 21:32:04, OUTDATED)
Python 2.6.4 (r264:75708, Oct 26 2009, 08:23:19) [MSC v.1500 32 bit
(Intel)]
config-settings:
use_api = True
use_api_login = True
unicode test: ok
----------------------------------------------------------------------
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603139&aid=3602096&group_…