http://www.mediawiki.org/wiki/Special:Code/pywikipedia/10189
Revision: 10189
Author: valhallasw
Date: 2012-05-05 21:29:41 +0000 (Sat, 05 May 2012)
Log Message:
-----------
Spellfix: Pwikipedia->Pywikipedia
Modified Paths:
--------------
trunk/pywikipedia/wikipedia.py
Modified: trunk/pywikipedia/wikipedia.py
===================================================================
--- trunk/pywikipedia/wikipedia.py 2012-05-05 21:28:01 UTC (rev 10188)
+++ trunk/pywikipedia/wikipedia.py 2012-05-05 21:29:41 UTC (rev 10189)
@@ -8266,7 +8266,7 @@
if not config.suppresssurvey:
output(
"""
-\03{lightyellow}Dear Pwikipedia user!\03{default}
+\03{lightyellow}Dear Pywikipedia user!\03{default}
Pywikibot has detected that you use this outdated version of Python:
%s.
We would like to hear your voice before ceasing support of this version.
http://www.mediawiki.org/wiki/Special:Code/pywikipedia/10188
Revision: 10188
Author: valhallasw
Date: 2012-05-05 21:28:01 +0000 (Sat, 05 May 2012)
Log Message:
-----------
Update for unicode normalisation bug warning ('issue 3081100')
- Ditched the goo.gl url. Now we again have a sf.net url, but a shorter one.
- Changed the advice to downgrade (<2.6.5) to upgrade (2.7.2+)
Modified Paths:
--------------
trunk/pywikipedia/wikipedia.py
Modified: trunk/pywikipedia/wikipedia.py
===================================================================
--- trunk/pywikipedia/wikipedia.py 2012-05-05 19:22:03 UTC (rev 10187)
+++ trunk/pywikipedia/wikipedia.py 2012-05-05 21:28:01 UTC (rev 10188)
@@ -7806,8 +7806,8 @@
================================================================================
\03{lightyellow}WARNING:\03{lightred} your python version might trigger issue #3081100\03{default}
-See http://goo.gl/W8lJB for more information.
-\03{lightyellow}Use an older python version (<2.6.5) if you are running on wikimedia sites!\03{default}
+More information: See https://sourceforge.net/support/tracker.php?aid=3081100
+\03{lightyellow}Please update python to 2.7.2+ if you are running on wikimedia sites!\03{default}
================================================================================
""")
http://www.mediawiki.org/wiki/Special:Code/pywikipedia/10187
Revision: 10187
Author: valhallasw
Date: 2012-05-05 19:22:03 +0000 (Sat, 05 May 2012)
Log Message:
-----------
Added transliteration feature, and added a hint for people running on Windows.
Based on r10048; fixes the FixMe in r10043 and resolves a feature request.
Reference: pywikipediabot-Feature Requests-3516383
Modified Paths:
--------------
trunk/pywikipedia/config.py
trunk/pywikipedia/userinterfaces/terminal_interface_base.py
Modified: trunk/pywikipedia/config.py
===================================================================
--- trunk/pywikipedia/config.py 2012-05-05 17:14:00 UTC (rev 10186)
+++ trunk/pywikipedia/config.py 2012-05-05 19:22:03 UTC (rev 10187)
@@ -108,6 +108,19 @@
#we get "StdioOnnaStick instance has no attribute 'encoding'"
console_encoding = None
+# The encoding the user would like to see text transliterated to. This can be
+# set to a charset (e.g. 'ascii', 'iso-8859-1' or 'cp850'), and we will output
+# only characters that exist in that charset. However, the characters will be
+# output using console_encoding.
+# If this is not defined on Windows, we emit a warning advising the user
+# either to switch to a Unicode-capable font and use
+# transliteration_target = None
+# or to keep using raster fonts and set
+# transliteration_target = console_encoding
+# After emitting the warning, this last option will be set.
+
+transliteration_target = 'not set'
+
# The encoding in which textfiles are stored, which contain lists of page
# titles. The most used is: 'utf-8'. 'utf-8-sig' recognizes BOM but it is
# available on Python 2.5 or higher. For a complete list please see:
@@ -598,6 +611,15 @@
else:
console_encoding = 'iso-8859-1'
+# Fix up transliteration_target
+if transliteration_target == 'not set':
+ if __sys.platform == 'win32':
+ transliteration_target = console_encoding
+ print "WARNING: Running on Windows and transliteration_target is not set."
+ print "Please see http://www.mediawiki.org/wiki/Manual:Pywikipediabot/Windows"
+ else:
+ transliteration_target = None
+
# Save base_dir for use by other modules
base_dir = _base_dir
if _verbose:
Modified: trunk/pywikipedia/userinterfaces/terminal_interface_base.py
===================================================================
--- trunk/pywikipedia/userinterfaces/terminal_interface_base.py 2012-05-05 17:14:00 UTC (rev 10186)
+++ trunk/pywikipedia/userinterfaces/terminal_interface_base.py 2012-05-05 19:22:03 UTC (rev 10187)
@@ -40,6 +40,7 @@
self.stdout = sys.stdout
self.stderr = sys.stderr
self.encoding = config.console_encoding
+ self.transliteration_target = config.transliteration_target
def printNonColorized(self, text, targetStream):
# We add *** after the text as a whole if anything needed to be colorized.
@@ -69,7 +70,12 @@
# Encode our unicode string in the encoding used by the user's console,
# and decode it back to unicode. Then we can see which characters
# can't be represented in the console encoding.
+ # We need to take min(console_encoding, transliteration_target)
+ # the first is what the terminal is capable of
+ # the second is how unicode-y the user would like the output
codecedText = text.encode(self.encoding, 'replace').decode(self.encoding)
+ if self.transliteration_target:
+ codecedText = codecedText.encode(self.transliteration_target, 'replace').decode(self.transliteration_target)
transliteratedText = ''
# Note: A transliteration replacement might be longer than the original
# character, e.g. ч is transliterated to ch.
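The round-trip in the hunk above can be sketched in isolation (Python 3 syntax, simplified from the patch): encode to the target charset with the `'replace'` error handler, decode back, and every character that comes back as `?` is one the target cannot represent, making it a candidate for transliteration.

```python
# Minimal sketch of the encode/decode round-trip used in
# terminal_interface_base.py: characters outside the target charset
# come back as '?', flagging them for transliteration.
def mark_unrepresentable(text, target='ascii'):
    """Return `text` with characters outside `target` replaced by '?'."""
    return text.encode(target, 'replace').decode(target)

# Cyrillic ch cannot be shown in ASCII, so it is flagged.
print(mark_unrepresentable(u'chervonets: \u0447'))  # → chervonets: ?
```

Taking the stricter of `console_encoding` and `transliteration_target`, as the comment in the patch explains, means the output respects both what the terminal can render and how much transliteration the user asked for.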
http://www.mediawiki.org/wiki/Special:Code/pywikipedia/10186
Revision: 10186
Author: xqt
Date: 2012-05-05 17:14:00 +0000 (Sat, 05 May 2012)
Log Message:
-----------
use xrange iterator instead of range list in for statements
Modified Paths:
--------------
trunk/pywikipedia/date.py
trunk/pywikipedia/interwiki.py
trunk/pywikipedia/pagegenerators.py
trunk/pywikipedia/wikipedia.py
Modified: trunk/pywikipedia/date.py
===================================================================
--- trunk/pywikipedia/date.py 2012-05-05 17:08:48 UTC (rev 10185)
+++ trunk/pywikipedia/date.py 2012-05-05 17:14:00 UTC (rev 10186)
@@ -204,28 +204,28 @@
# Helper for KN: digits representation
_knDigits = u'೦೧೨೩೪೫೬೭೮೯'
-_knDigitsToLocal = dict([(ord(unicode(i)), _knDigits[i]) for i in range(10)])
-_knLocalToDigits = dict([(ord(_knDigits[i]), unicode(i)) for i in range(10)])
+_knDigitsToLocal = dict([(ord(unicode(i)), _knDigits[i]) for i in xrange(10)])
+_knLocalToDigits = dict([(ord(_knDigits[i]), unicode(i)) for i in xrange(10)])
# Helper for Urdu/Persian languages
_faDigits = u'۰۱۲۳۴۵۶۷۸۹'
-_faDigitsToLocal = dict([(ord(unicode(i)), _faDigits[i]) for i in range(10)])
-_faLocalToDigits = dict([(ord(_faDigits[i]), unicode(i)) for i in range(10)])
+_faDigitsToLocal = dict([(ord(unicode(i)), _faDigits[i]) for i in xrange(10)])
+_faLocalToDigits = dict([(ord(_faDigits[i]), unicode(i)) for i in xrange(10)])
# Helper for HI:, MR:
_hiDigits = u'०१२३४५६७८९'
-_hiDigitsToLocal = dict([(ord(unicode(i)), _hiDigits[i]) for i in range(10)])
-_hiLocalToDigits = dict([(ord(_hiDigits[i]), unicode(i)) for i in range(10)])
+_hiDigitsToLocal = dict([(ord(unicode(i)), _hiDigits[i]) for i in xrange(10)])
+_hiLocalToDigits = dict([(ord(_hiDigits[i]), unicode(i)) for i in xrange(10)])
# Helper for BN:
_bnDigits = u'০১২৩৪৫৬৭৮৯'
-_bnDigitsToLocal = dict([(ord(unicode(i)), _bnDigits[i]) for i in range(10)])
-_bnLocalToDigits = dict([(ord(_bnDigits[i]), unicode(i)) for i in range(10)])
+_bnDigitsToLocal = dict([(ord(unicode(i)), _bnDigits[i]) for i in xrange(10)])
+_bnLocalToDigits = dict([(ord(_bnDigits[i]), unicode(i)) for i in xrange(10)])
# Helper for GU:
_guDigits = u'૦૧૨૩૪૫૬૭૮૯'
-_guDigitsToLocal = dict([(ord(unicode(i)), _guDigits[i]) for i in range(10)])
-_guLocalToDigits = dict([(ord(_guDigits[i]), unicode(i)) for i in range(10)])
+_guDigitsToLocal = dict([(ord(unicode(i)), _guDigits[i]) for i in xrange(10)])
+_guLocalToDigits = dict([(ord(_guDigits[i]), unicode(i)) for i in xrange(10)])
def intToLocalDigitsStr( value, digitsToLocalDict ):
# Encode an integer value into a textual form.
@@ -368,7 +368,7 @@
m = compPattern.match(value)
if m:
# decode each found value using provided decoder
- values = [ decoders[i][2](m.group(i+1)) for i in range(len(decoders))]
+ values = [ decoders[i][2](m.group(i+1)) for i in xrange(len(decoders))]
decValue = decf( values )
if decValue in _stringTypes:
@@ -391,7 +391,7 @@
if len(params) != len(decoders):
raise AssertionError("parameter count (%d) does not match decoder count (%d)" % (len(params), len(decoders)))
# convert integer parameters into their textual representation
- params = [ MakeParameter(decoders[i], params[i]) for i in range(len(params)) ]
+ params = [ MakeParameter(decoders[i], params[i]) for i in xrange(len(params)) ]
return strPattern % tuple(params)
else:
if 1 != len(decoders):
@@ -1503,7 +1503,7 @@
if len(patterns) != 12:
raise AssertionError(u'pattern %s does not have 12 elements' % lang )
- for i in range(12):
+ for i in xrange(12):
if patterns[i] is not None:
if isMnthOfYear:
formats[yrMnthFmts[i]][lang] = eval(u'lambda v: dh_mnthOfYear( v, u"%s" )' % patterns[i])
@@ -1511,7 +1511,7 @@
formats[dayMnthFmts[i]][lang] = eval(u'lambda v: dh_dayOfMnth( v, u"%s" )' % patterns[i])
def makeMonthList(pattern):
- return [pattern % m for m in range(1,13)]
+ return [pattern % m for m in xrange(1,13)]
def makeMonthNamedList(lang, pattern, makeUpperCase=None):
"""Creates a list of 12 elements based on the name of the month.
@@ -1526,7 +1526,7 @@
elif makeUpperCase == False:
f = lambda s: s[0].lower() + s[1:]
- return [ pattern % f(monthName(lang, m)) for m in range(1,13) ]
+ return [ pattern % f(monthName(lang, m)) for m in xrange(1,13) ]
def addFmt2( lang, isMnthOfYear, pattern, makeUpperCase = None ):
addFmt( lang, isMnthOfYear, makeMonthNamedList( lang, pattern, makeUpperCase ))
@@ -1646,7 +1646,7 @@
# Brazil uses "1añ" for the 1st of every month, and number without suffix for all other days
brMonthNames = makeMonthNamedList( 'br', u"%s", True )
-for i in range(0,12):
+for i in xrange(0,12):
formats[dayMnthFmts[i]]['br'] = eval(
(u'lambda m: multi( m, [' +
u'(lambda v: dh_dayOfMnth( v, u"%%dañ %s" ), lambda p: p == 1),' +
@@ -1669,7 +1669,7 @@
addFmt ('fr', True, [ u"Janvier %d", u"Février %d", u"Mars %d", u"Avril %d", u"Mai %d", u"Juin %d", u"Juillet %d", u"Août %d", u"Septembre %d", u"Octobre %d", u"Novembre %d", u"Décembre %d" ])
addFmt2('he', True, u"%s %%d", True )
addFmt2('it', True, u"Attualità/Anno %%d - %s", True )
-addFmt ('ja', True, [ u"「最近の出来事」%%d年%d月" % mm for mm in range(1,13)])
+addFmt ('ja', True, [ u"「最近の出来事」%%d年%d月" % mm for mm in xrange(1,13)])
addFmt2('ka', True, u"%s, %%d" )
addFmt ('ko', True, [ u"%d년 1월", u"%d년 2월", u"%d년 3월", u"%d년 4월", u"%d년 5월", u"%d년 6월", u"%d년 7월", u"%d년 8월", u"%d년 9월", u"%d년 10월", u"%d년 11월", u"%d년 12월" ])
addFmt ('li', True, [ u"januari %d", u"februari %d", u"miert %d", u"april %d", u"mei %d", u"juni %d", u"juli %d", u"augustus %d", u"september %d", u"oktober %d", u"november %d", u"december %d" ])
@@ -1725,7 +1725,7 @@
_formatLimit_DayOfMonth31 = (lambda v: 1 <= v and v < 32, 1, 32)
_formatLimit_DayOfMonth30 = (lambda v: 1 <= v and v < 31, 1, 31)
_formatLimit_DayOfMonth29 = (lambda v: 1 <= v and v < 30, 1, 30)
-for monthId in range(12):
+for monthId in xrange(12):
if (monthId + 1) in [1, 3, 5, 7, 8, 10, 12]:
formatLimits[dayMnthFmts[monthId]] = _formatLimit_DayOfMonth31 # 31 days a month
elif (monthId+1) == 2: # February
@@ -1804,7 +1804,7 @@
for code, convFunc in formats[formatName].iteritems():
# import time
# startClock = time.clock()
- for value in range(start, stop, step):
+ for value in xrange(start, stop, step):
try:
if not predicate(value):
raise AssertionError(" Not a valid value for this format.")
Modified: trunk/pywikipedia/interwiki.py
===================================================================
--- trunk/pywikipedia/interwiki.py 2012-05-05 17:08:48 UTC (rev 10185)
+++ trunk/pywikipedia/interwiki.py 2012-05-05 17:14:00 UTC (rev 10186)
@@ -2106,7 +2106,7 @@
% fs.originPage)
pywikibot.output(u"NOTE: Number of pages queued is %d, trying to add %d more."
% (len(self.subjects), number))
- for i in range(number):
+ for i in xrange(number):
try:
while True:
try:
@@ -2263,7 +2263,7 @@
def queryStep(self):
self.oneQuery()
# Delete the ones that are done now.
- for i in range(len(self.subjects)-1, -1, -1):
+ for i in xrange(len(self.subjects)-1, -1, -1):
subj = self.subjects[i]
if subj.isDone():
subj.finish(self)
@@ -2483,7 +2483,8 @@
for FileName in glob.iglob('interwiki-dumps/interwikidump-*.txt'):
s = FileName.split('\\')[1].split('.')[0].split('-')
sitename = s[1]
- for i in range(0,2): s.remove(s[0])
+ for i in xrange(0,2):
+ s.remove(s[0])
sitelang = '-'.join(s)
if site.family.name == sitename:
File2Restore.append([sitename, sitelang])
Modified: trunk/pywikipedia/pagegenerators.py
===================================================================
--- trunk/pywikipedia/pagegenerators.py 2012-05-05 17:08:48 UTC (rev 10185)
+++ trunk/pywikipedia/pagegenerators.py 2012-05-05 17:14:00 UTC (rev 10186)
@@ -762,13 +762,13 @@
def RandomPageGenerator(number = 10, site = None):
if site is None:
site = pywikibot.getSite()
- for i in range(number):
+ for i in xrange(number):
yield site.randompage()
def RandomRedirectPageGenerator(number = 10, site = None):
if site is None:
site = pywikibot.getSite()
- for i in range(number):
+ for i in xrange(number):
yield site.randomredirectpage()
def PagesFromTitlesGenerator(iterable, site=None):
Modified: trunk/pywikipedia/wikipedia.py
===================================================================
--- trunk/pywikipedia/wikipedia.py 2012-05-05 17:08:48 UTC (rev 10185)
+++ trunk/pywikipedia/wikipedia.py 2012-05-05 17:14:00 UTC (rev 10186)
@@ -4551,11 +4551,11 @@
# The first two chars represent an Esperanto letter.
# Following x's are doubled.
new = esperanto + ''.join([old[2 * i]
- for i in range(1, len(old)/2)])
+ for i in xrange(1, len(old)/2)])
else:
# The first character stays latin; only the x's are doubled.
new = latin + ''.join([old[2 * i + 1]
- for i in range(0, len(old)/2)])
+ for i in xrange(0, len(old)/2)])
result += text[pos : match.start() + pos] + new
pos += match.start() + len(old)
else:
@@ -4583,7 +4583,7 @@
if match:
old = match.group()
# the first letter stays; add an x after each X or x.
- new = old[0] + ''.join([old[i] + 'x' for i in range(1, len(old))])
+ new = old[0] + ''.join([old[i] + 'x' for i in xrange(1, len(old))])
result += text[pos : match.start() + pos] + new
pos += match.start() + len(old)
else:
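The motivation for the sweep above: in Python 2, `range(n)` eagerly builds a full list while `xrange(n)` yields values lazily in constant memory, so loops that only iterate should prefer `xrange`. Python 3's `range` already behaves like `xrange`, which this sketch uses to show the difference in footprint:

```python
# In Python 2, range(n) built a full list; xrange(n) was a lazy,
# constant-size iterator. Python 3's range behaves like xrange, so we
# can contrast the two by materializing one of them into a list.
import sys

eager = list(range(10**6))   # what Python 2's range() produced
lazy = range(10**6)          # what xrange() produced

print(sys.getsizeof(lazy) < 1024)          # True: constant-size object
print(sys.getsizeof(eager) > 10**6)        # True: megabytes of pointers
print(sum(1 for _ in lazy) == len(eager))  # True: same values iterated
```

For the `for` loops touched in this commit the iteration results are identical; only the allocation behavior changes.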
http://www.mediawiki.org/wiki/Special:Code/pywikipedia/10185
Revision: 10185
Author: xqt
Date: 2012-05-05 17:08:48 +0000 (Sat, 05 May 2012)
Log Message:
-----------
update property Page.site from rewrite
Modified Paths:
--------------
trunk/pywikipedia/interwiki.py
Modified: trunk/pywikipedia/interwiki.py
===================================================================
--- trunk/pywikipedia/interwiki.py 2012-05-05 16:39:51 UTC (rev 10184)
+++ trunk/pywikipedia/interwiki.py 2012-05-05 17:08:48 UTC (rev 10185)
@@ -732,7 +732,7 @@
return self.size
def add(self, page):
- site = page.site()
+ site = page.site
if not site in self.tree:
self.tree[site] = []
self.tree[site].append(page)
@@ -740,7 +740,7 @@
def remove(self, page):
try:
- self.tree[page.site()].remove(page)
+ self.tree[page.site].remove(page)
self.size -= 1
except ValueError:
pass
@@ -805,7 +805,7 @@
Site:
Code becomes:
- todo <- {originPage.site():[originPage]}
+ todo <- {originPage.site:[originPage]}
done <- []
while todo != {}:
site <- electSite()
@@ -1001,7 +1001,7 @@
page = StoredPage(page)
self.foundIn[page] = [linkingPage]
self.todo.add(page)
- counter.plus(page.site())
+ counter.plus(page.site)
return True
def skipPage(self, page, target, counter):
@@ -1022,9 +1022,9 @@
return False
elif self.originPage and self.originPage.namespace() != linkedPage.namespace():
# Allow for a mapping between different namespaces
- crossFrom = self.originPage.site().family.crossnamespace.get(self.originPage.namespace(), {})
- crossTo = crossFrom.get(self.originPage.site().language(), crossFrom.get('_default', {}))
- nsmatch = crossTo.get(linkedPage.site().language(), crossTo.get('_default', []))
+ crossFrom = self.originPage.site.family.crossnamespace.get(self.originPage.namespace(), {})
+ crossTo = crossFrom.get(self.originPage.site.language(), crossFrom.get('_default', {}))
+ nsmatch = crossTo.get(linkedPage.site.language(), crossTo.get('_default', []))
if linkedPage.namespace() in nsmatch:
return False
if globalvar.autonomous:
@@ -1035,7 +1035,7 @@
self.foundIn[linkedPage] = [linkingPage]
return True
else:
- preferredPage = self.getFoundInCorrectNamespace(linkedPage.site())
+ preferredPage = self.getFoundInCorrectNamespace(linkedPage.site)
if preferredPage:
pywikibot.output(u"NOTE: Ignoring link from page %s in namespace %i to page %s in namespace %i because page %s in the correct namespace has already been found."
% (linkingPage, linkingPage.namespace(), linkedPage,
@@ -1055,9 +1055,9 @@
self.makeForcedStop(counter)
elif choice == 'a':
newHint = pywikibot.input(u'Give the alternative for language %s, not using a language code:'
- % linkedPage.site().language())
+ % linkedPage.site.language())
if newHint:
- alternativePage = pywikibot.Page(linkedPage.site(), newHint)
+ alternativePage = pywikibot.Page(linkedPage.site, newHint)
if alternativePage:
# add the page that was entered by the user
self.addIfNew(alternativePage, counter, None)
@@ -1076,7 +1076,7 @@
if page.title().lower() != self.originPage.title().lower():
pywikibot.output(u"NOTE: Ignoring %s for %s in wiktionary mode" % (page, self.originPage))
return True
- elif page.title() != self.originPage.title() and self.originPage.site().nocapitalize and page.site().nocapitalize:
+ elif page.title() != self.originPage.title() and self.originPage.site.nocapitalize and page.site.nocapitalize:
pywikibot.output(u"NOTE: Ignoring %s for %s in wiktionary mode because both languages are uncapitalized."
% (page, self.originPage))
return True
@@ -1110,7 +1110,7 @@
else:
choice = 'y'
if self.originPage.isDisambig() and not page.isDisambig():
- disambig = self.getFoundDisambig(page.site())
+ disambig = self.getFoundDisambig(page.site)
if disambig:
pywikibot.output(
u"NOTE: Ignoring non-disambiguation page %s for %s because disambiguation page %s has already been found."
@@ -1123,7 +1123,7 @@
['Yes', 'No', 'Add an alternative', 'Give up'],
['y', 'n', 'a', 'g'])
elif not self.originPage.isDisambig() and page.isDisambig():
- nondisambig = self.getFoundNonDisambig(page.site())
+ nondisambig = self.getFoundNonDisambig(page.site)
if nondisambig:
pywikibot.output(u"NOTE: Ignoring disambiguation page %s for %s because non-disambiguation page %s has already been found."
% (page, self.originPage, nondisambig))
@@ -1138,8 +1138,8 @@
return (True, None)
elif choice == 'a':
newHint = pywikibot.input(u'Give the alternative for language %s, not using a language code:'
- % page.site().language())
- alternativePage = pywikibot.Page(page.site(), newHint)
+ % page.site.language())
+ alternativePage = pywikibot.Page(page.site, newHint)
return (True, alternativePage)
elif choice == 'g':
self.makeForcedStop(counter)
@@ -1148,7 +1148,7 @@
return (False, None)
def isIgnored(self, page):
- if page.site().language() in globalvar.neverlink:
+ if page.site.language() in globalvar.neverlink:
pywikibot.output(u"Skipping link %s to an ignored language" % page)
return True
if page in globalvar.ignore:
@@ -1221,15 +1221,15 @@
if dictName is not None:
if self.originPage:
pywikibot.output(u'WARNING: %s:%s relates to %s:%s, which is an auto entry %s(%s)'
- % (self.originPage.site().language(), self.originPage.title(),
- page.site().language(), page.title(), dictName, year))
+ % (self.originPage.site.language(), self.originPage,
+ page.site.language(), page, dictName, year))
# Abort processing if the bot is running in autonomous mode.
if globalvar.autonomous:
self.makeForcedStop(counter)
# Register this fact at the todo-counter.
- counter.minus(page.site())
+ counter.minus(page.site)
# Now check whether any interwiki links should be added to the
# todo list.
@@ -1279,7 +1279,7 @@
and not redirectTargetPage.isCategoryRedirect():
self.originPage = redirectTargetPage
self.todo.add(redirectTargetPage)
- counter.plus(redirectTargetPage.site())
+ counter.plus(redirectTargetPage.site)
else:
# This is a redirect page to the origin. We don't need to
# follow the redirection.
@@ -1295,7 +1295,7 @@
if not globalvar.quiet or pywikibot.verbose:
pywikibot.output(
u"NOTE: not following static %sredirects." % redir)
- elif page.site().family == redirectTargetPage.site().family \
+ elif page.site.family == redirectTargetPage.site.family \
and not self.skipPage(page, redirectTargetPage, counter):
if self.addIfNew(redirectTargetPage, counter, page):
if config.interwiki_shownew or pywikibot.verbose:
@@ -1346,7 +1346,7 @@
self.addIfNew(alternativePage, counter, None)
duplicate = None
- for p in self.done.filter(page.site()):
+ for p in self.done.filter(page.site):
if p != page and p.exists() and not p.isRedirectPage() and not p.isCategoryRedirect():
duplicate = p
break
@@ -1357,7 +1357,7 @@
# Ignore the interwiki links.
iw = ()
if globalvar.lacklanguage:
- if globalvar.lacklanguage in [link.site().language() for link in iw]:
+ if globalvar.lacklanguage in [link.site.language() for link in iw]:
iw = ()
self.workonme = False
if len(iw) < globalvar.minlinks:
@@ -1373,7 +1373,7 @@
pywikibot.config.datafilepath('autonomous_problems.dat'),
'a', 'utf-8')
f.write(u"* %s {Found more than one link for %s}"
- % (self.originPage, page.site()))
+ % (self.originPage, page.site))
if config.interwiki_graph and config.interwiki_graph_url:
filename = interwiki_graph.getFilename(self.originPage, extension = config.interwiki_graph_formats[0])
f.write(u" [%s%s graph]" % (config.interwiki_graph_url, filename))
@@ -1406,9 +1406,9 @@
if self.addIfNew(linkedPage, counter, page):
# It is new. Also verify whether it is the second on the
# same site
- lpsite=linkedPage.site()
+ lpsite=linkedPage.site
for prevPage in self.foundIn:
- if prevPage != linkedPage and prevPage.site() == lpsite:
+ if prevPage != linkedPage and prevPage.site == lpsite:
# Still, this could be "no problem" as either may be a
# redirect to the other. No way to find out quickly!
pywikibot.output(u"NOTE: %s: %s gives duplicate interwiki on same site %s"
@@ -1457,11 +1457,11 @@
new = {}
for page in self.done:
if page.exists() and not page.isRedirectPage() and not page.isCategoryRedirect():
- site = page.site()
+ site = page.site
if site.family.interwiki_forward:
#TODO: allow these cases to be propagated!
continue # inhibit the forwarding families pages to be updated.
- if site == self.originPage.site():
+ if site == self.originPage.site:
if page != self.originPage:
self.problem(u"Found link to %s" % page)
self.whereReport(page)
@@ -1609,10 +1609,10 @@
# Make sure new contains every page link, including the page we are processing
# TODO: should be move to assemble()
# replaceLinks will skip the site it's working on.
- if self.originPage.site() not in new:
+ if self.originPage.site not in new:
#TODO: make this possible as well.
- if not self.originPage.site().family.interwiki_forward:
- new[self.originPage.site()] = self.originPage
+ if not self.originPage.site.family.interwiki_forward:
+ new[self.originPage.site] = self.originPage
#self.replaceLinks(self.originPage, new, True, bot)
@@ -1621,7 +1621,7 @@
# Process all languages here
globalvar.always = False
if globalvar.limittwo:
- lclSite = self.originPage.site()
+ lclSite = self.originPage.site
lclSiteDone = False
frgnSiteDone = False
@@ -1646,7 +1646,7 @@
old={}
try:
for page in new[site].interwiki():
- old[page.site()] = page
+ old[page.site] = page
except pywikibot.NoPage:
pywikibot.output(u"BUG>>> %s no longer exists?"
% new[site])
@@ -1675,11 +1675,11 @@
# or the last edit wasn't a bot
# or the last edit was 1 month ago
smallWikiAllowed = True
- if globalvar.autonomous and page.site().sitename() == 'wikipedia:is':
+ if globalvar.autonomous and page.site.sitename() == 'wikipedia:is':
old={}
try:
- for mypage in new[page.site()].interwiki():
- old[mypage.site()] = mypage
+ for mypage in new[page.site].interwiki():
+ old[mypage.site] = mypage
except pywikibot.NoPage:
pywikibot.output(u"BUG>>> %s no longer exists?"
% new[site])
@@ -1691,10 +1691,10 @@
len(removing) > 0 or len(old) == 0 or \
len(adding) + len(modifying) > 2 or \
len(removing) + len(modifying) == 0 and \
- adding == [page.site()]
+ adding == [page.site]
if not smallWikiAllowed:
import userlib
- user = userlib.User(page.site(), page.userName())
+ user = userlib.User(page.site, page.userName())
if not 'bot' in user.groups() \
and not 'bot' in page.userName().lower(): #erstmal auch keine namen mit bot
smallWikiAllowed = True
@@ -1707,7 +1707,7 @@
else:
pywikibot.output(
u'NOTE: number of edits are restricted at %s'
- % page.site().sitename())
+ % page.site.sitename())
# if we have an account for this site
if site.family.name in config.usernames \
@@ -1783,18 +1783,18 @@
# remove interwiki links to ignore
for iw in re.finditer('<!-- *\[\[(.*?:.*?)\]\] *-->', pagetext):
try:
- ignorepage = pywikibot.Page(page.site(), iw.groups()[0])
+ ignorepage = pywikibot.Page(page.site, iw.groups()[0])
except (pywikibot.NoSuchSite, pywikibot.InvalidTitle):
continue
try:
- if (new[ignorepage.site()] == ignorepage) and \
- (ignorepage.site() != page.site()):
+ if (new[ignorepage.site] == ignorepage) and \
+ (ignorepage.site != page.site):
if (ignorepage not in interwikis):
pywikibot.output(
u"Ignoring link to %(to)s for %(from)s"
% {'to': ignorepage,
'from': page})
- new.pop(ignorepage.site())
+ new.pop(ignorepage.site)
else:
pywikibot.output(
u"NOTE: Not removing interwiki from %(from)s to %(to)s (exists both commented and non-commented)"
@@ -1805,7 +1805,7 @@
# sanity check - the page we are fixing must be the only one for that
# site.
- pltmp = new[page.site()]
+ pltmp = new[page.site]
if pltmp != page:
s = u"None"
if pltmp is not None: s = pltmp
@@ -1815,23 +1815,22 @@
raise SaveError(u'BUG: sanity check failed')
# Avoid adding an iw link back to itself
- del new[page.site()]
-
+ del new[page.site]
# Do not add interwiki links to foreign families that page.site() does not forward to
for stmp in new.keys():
- if stmp.family != page.site().family:
- if stmp.family.name != page.site().family.interwiki_forward:
+ if stmp.family != page.site.family:
+ if stmp.family.name != page.site.family.interwiki_forward:
del new[stmp]
# Put interwiki links into a map
old={}
for page2 in interwikis:
- old[page2.site()] = page2
+ old[page2.site] = page2
# Check what needs to get done
mods, mcomment, adding, removing, modifying = compareLanguages(old,
new,
- insite=page.site())
+ insite=page.site)
# When running in autonomous mode without -force switch, make sure we
# don't remove any items, but allow addition of the new ones
@@ -1841,15 +1840,15 @@
for rmsite in removing:
# Sometimes sites have an erroneous link to itself as an
# interwiki
- if rmsite == page.site():
+ if rmsite == page.site:
continue
rmPage = old[rmsite]
#put it to new means don't delete it
if not globalvar.cleanup and not globalvar.force or \
globalvar.cleanup and \
unicode(rmPage) not in globalvar.remove or \
- rmPage.site().lang in ['hak', 'hi', 'cdo'] and \
- pywikibot.unicode_error: #work-arround for bug #3081100 (do not remove hi-pages)
+ rmPage.site.lang in ['hak', 'hi', 'cdo'] and \
+ pywikibot.unicode_error: #work-arround for bug #3081100 (do not remove affected pages)
new[rmsite] = rmPage
pywikibot.output(
u"WARNING: %s is either deleted or has a mismatching disambiguation state."
@@ -1857,7 +1856,7 @@
# Re-Check what needs to get done
mods, mcomment, adding, removing, modifying = compareLanguages(old,
new,
- insite=page.site())
+ insite=page.site)
if not mods:
if not globalvar.quiet or pywikibot.verbose:
pywikibot.output(u'No changes needed on page %s' % page)
@@ -1870,7 +1869,7 @@
oldtext = page.get()
template = (page.namespace() == 10)
newtext = pywikibot.replaceLanguageLinks(oldtext, new,
- site=page.site(),
+ site=page.site,
template=template)
# This is for now. Later there should be different funktions for each
# kind
@@ -1893,7 +1892,7 @@
ask = False
# Allow for special case of a self-pointing interwiki link
- if removing and removing != [page.site()]:
+ if removing and removing != [page.site]:
self.problem(u'Found incorrect link to %s in %s'
% (", ".join([x.lang for x in removing]), page),
createneed=False)
@@ -1920,8 +1919,8 @@
['y', 'n', 'b', 'g', 'a'])
if answer == 'b':
webbrowser.open("http://%s%s" % (
- page.site().hostname(),
- page.site().nice_get_address(page.title())
+ page.site.hostname(),
+ page.site.nice_get_address(page.title())
))
pywikibot.input(u"Press Enter when finished in browser.")
return True
@@ -2024,29 +2023,29 @@
# This assumes that there is only one interwiki link per language.
linkedPagesDict = {}
for linkedPage in linkedPages:
- linkedPagesDict[linkedPage.site()] = linkedPage
+ linkedPagesDict[linkedPage.site] = linkedPage
for expectedPage in expectedPages - linkedPages:
if expectedPage != page:
try:
- linkedPage = linkedPagesDict[expectedPage.site()]
+ linkedPage = linkedPagesDict[expectedPage.site]
pywikibot.output(
u"WARNING: %s: %s does not link to %s but to %s"
- % (page.site().family.name,
+ % (page.site.family.name,
page, expectedPage, linkedPage))
except KeyError:
pywikibot.output(
u"WARNING: %s: %s does not link to %s"
- % (page.site().family.name,
+ % (page.site.family.name,
page, expectedPage))
# Check for superfluous links
for linkedPage in linkedPages:
if linkedPage not in expectedPages:
# Check whether there is an alternative page on that language.
# In this case, it was already reported above.
- if linkedPage.site() not in expectedSites:
+ if linkedPage.site not in expectedSites:
pywikibot.output(
u"WARNING: %s: %s links to incorrect %s"
- % (page.site().family.name,
+ % (page.site.family.name,
page, linkedPage))
except (socket.error, IOError):
pywikibot.output(u'ERROR: could not report backlinks')
@@ -2137,7 +2136,7 @@
if page.namespace() == 10:
loc = None
try:
- tmpl, loc = moved_links[page.site().lang]
+ tmpl, loc = moved_links[page.site.lang]
del tmpl
except KeyError:
pass
@@ -2148,7 +2147,7 @@
if self.generateUntil:
until = self.generateUntil
- if page.site().lang not in page.site().family.nocapitalize:
+ if page.site.lang not in page.site.family.nocapitalize:
until = until[0].upper()+until[1:]
if page.title(withNamespace=False) > until:
raise StopIteration
@@ -2342,13 +2341,13 @@
def botMayEdit (page):
tmpl = []
try:
- tmpl, loc = moved_links[page.site().lang]
+ tmpl, loc = moved_links[page.site.lang]
except KeyError:
pass
if type(tmpl) != list:
tmpl = [tmpl]
try:
- tmpl += ignoreTemplates[page.site().lang]
+ tmpl += ignoreTemplates[page.site.lang]
except KeyError:
pass
tmpl += ignoreTemplates['_default']
@@ -2367,7 +2366,7 @@
for page, pagelist in hints.iteritems():
# The WarnfileReader gives us a list of pagelinks, but titletranslate.py expects a list of strings, so we convert it back.
# TODO: This is a quite ugly hack, in the future we should maybe make titletranslate expect a list of pagelinks.
- hintStrings = ['%s:%s' % (hintedPage.site().language(), hintedPage.title()) for hintedPage in pagelist]
+ hintStrings = ['%s:%s' % (hintedPage.site.language(), hintedPage.title()) for hintedPage in pagelist]
bot.add(page, hints = hintStrings)
def main():
http://www.mediawiki.org/wiki/Special:Code/pywikipedia/10184
Revision: 10184
Author: xqt
Date: 2012-05-05 16:39:51 +0000 (Sat, 05 May 2012)
Log Message:
-----------
update property decorator from rewrite r8238, __call__() method for backwards compatibility
Modified Paths:
--------------
trunk/pywikipedia/wikipedia.py
Modified: trunk/pywikipedia/wikipedia.py
===================================================================
--- trunk/pywikipedia/wikipedia.py 2012-05-05 09:45:27 UTC (rev 10183)
+++ trunk/pywikipedia/wikipedia.py 2012-05-05 16:39:51 UTC (rev 10184)
@@ -427,6 +427,7 @@
)
raise
+ @property
def site(self):
"""Return the Site object for the wiki on which this Page resides."""
return self._site
@@ -4986,6 +4987,17 @@
if not language[0].upper() + language[1:] in self.namespaces():
self._validlanguages.append(language)
+ def __call__(self):
+ """Since the Page.site() method has a property decorator, return the
+ site object for backwards-compatibility if Page.site() call is still
+ used instead of Page.site as recommended.
+
+ """
+## # DEPRECATED warning. Should be uncommented if scripts are actualized
+## pywikibot.output('Page.site() method is DEPRECATED, '
+## 'use Page.site instead.')
+ return self
+
@property
def family(self):
"""The Family object for this Site's wiki family."""