jenkins-bot has submitted this change and it was merged.
Change subject: Enable logging on travis-ci builds
......................................................................
Enable logging on travis-ci builds
The logfilesize should be sufficient for all logging to be captured
in a single file, but in the off chance that it overflows, also
list the log files so the size can be increased when required.
Bug 56964
Change-Id: If19e82f0c447c6e99959211203b91bbee936cb8c
---
M .travis.yml
1 file changed, 9 insertions(+), 0 deletions(-)
Approvals:
John Vandenberg: Looks good to me, but someone else must approve
Merlijn van Deen: Looks good to me, approved
jenkins-bot: Verified
diff --git a/.travis.yml b/.travis.yml
index 9a89d86..5f12f0f 100644
--- a/.travis.yml
+++ b/.travis.yml
@@ -13,6 +13,10 @@
- echo "family = 'wikipedia'" >> ~/.pywikibot/user-config.py
- echo "usernames['wikipedia']['en'] = 'Pywikibot-test'" >> ~/.pywikibot/user-config.py
- echo "password_file = os.path.expanduser('~/.pywikibot/passwordfile')" >> ~/.pywikibot/user-config.py
+ - echo "debug_log.append('')" >> ~/.pywikibot/user-config.py
+ - echo "log.append('*')" >> ~/.pywikibot/user-config.py
+ - echo "logfilename = 'tests.log'" >> ~/.pywikibot/user-config.py
+ - echo "logfilesize = 10000" >> ~/.pywikibot/user-config.py
- touch ~/.pywikibot/passwordfile
- echo "('Pywikibot-test', '"$USER_PASSWORD"')" > ~/.pywikibot/passwordfile
@@ -23,6 +27,11 @@
script:
- python setup.py test
+ - echo '============='
+ - echo '==== LOG ===='
+ - ls ~/.pywikibot/logs/*
+ - echo '============='
+ - sed -e "s/lgpassword=\([^&]*\)/lgpassword=xxxxxxx/g" ~/.pywikibot/logs/tests.log
env:
global:
--
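The sed step in the diff above redacts login passwords before the log is echoed into the build output. A minimal sketch of that substitution, run against a hypothetical sample request line rather than a real tests.log entry:

```shell
# Hypothetical sample of a logged API request line (illustrative only,
# not taken from an actual pywikibot log).
line='action=login&lgname=Pywikibot-test&lgpassword=s3cret&format=json'

# The same substitution as the .travis.yml step: replace everything
# between 'lgpassword=' and the next '&' with a fixed placeholder.
printf '%s\n' "$line" | sed -e "s/lgpassword=\([^&]*\)/lgpassword=xxxxxxx/g"
# prints: action=login&lgname=Pywikibot-test&lgpassword=xxxxxxx&format=json
```

Because the pattern stops at the next `&`, the rest of the query string is left intact while the password value itself never reaches the public build log.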
To view, visit https://gerrit.wikimedia.org/r/144655
To unsubscribe, visit https://gerrit.wikimedia.org/r/settings
Gerrit-MessageType: merged
Gerrit-Change-Id: If19e82f0c447c6e99959211203b91bbee936cb8c
Gerrit-PatchSet: 1
Gerrit-Project: pywikibot/core
Gerrit-Branch: master
Gerrit-Owner: John Vandenberg <jayvdb(a)gmail.com>
Gerrit-Reviewer: Addshore <addshorewiki(a)gmail.com>
Gerrit-Reviewer: Jeroen De Dauw <jeroendedauw(a)gmail.com>
Gerrit-Reviewer: John Vandenberg <jayvdb(a)gmail.com>
Gerrit-Reviewer: Ladsgroup <ladsgroup(a)gmail.com>
Gerrit-Reviewer: Merlijn van Deen <valhallasw(a)arctus.nl>
Gerrit-Reviewer: jenkins-bot <>
jenkins-bot has submitted this change and it was merged.
Change subject: Docstring fixes
......................................................................
Docstring fixes
This changeset resolves all known docstring warnings in page.py,
including pep257 warnings as well as line-length violations.
Several docstrings were altered so they used similar language
to other docstrings.
@return has been added liberally, as the first line of a
docstring often contained detailed return information.
Colons have been added where missing.
Change-Id: I35d28dbd4ad87b50f4e93cf6dc460c0fb59ea57a
---
M pywikibot/page.py
1 file changed, 488 insertions(+), 157 deletions(-)
Approvals:
Xqt: Looks good to me, approved
jenkins-bot: Verified
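The conventions described in the commit message can be illustrated with a short sketch. The function below is hypothetical (it is not part of pywikibot); only the docstring layout — one-sentence pep257 summary ending in a period, then epytext @param/@type/@return fields — reflects the style the diff applies:

```python
def namespace_prefix(title):
    """Return the namespace prefix of a page title.

    Titles without a colon have no namespace prefix.

    @param title: full page title, e.g. 'Talk:Foo'
    @type title: str

    @return: str or None
    """
    # partition() keeps everything before the first colon as the prefix
    prefix, sep, _ = title.partition(':')
    return prefix if sep else None
```

This is the same pattern visible in the hunks that follow, e.g. `namespace()` gaining an explicit `@return: int` line.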
diff --git a/pywikibot/page.py b/pywikibot/page.py
index bb2caf0..c45d5b4 100644
--- a/pywikibot/page.py
+++ b/pywikibot/page.py
@@ -1,6 +1,13 @@
# -*- coding: utf-8 -*-
"""
-Objects representing various types of MediaWiki pages.
+Objects representing various types of MediaWiki pages, including Wikibase pages.
+
+This module also includes objects:
+* Link: an internal or interwiki link in wikitext.
+* Revision: a single change to a wiki page.
+* Property: a type of semantic data.
+* Claim: an instance of a semantic assertion.
+
"""
#
# (C) Pywikibot team, 2008-2014
@@ -46,7 +53,7 @@
class Page(pywikibot.UnicodeMixin, ComparableMixin):
- """Page: A MediaWiki page
+ """Page: A MediaWiki page.
This object only implements internally methods that do not require
reading from or writing to the wiki. All other methods are delegated
@@ -123,6 +130,7 @@
def namespace(self):
"""Return the number of the namespace of the page.
+ @return: int
"""
return self._link.namespace
@@ -219,6 +227,7 @@
return self._link.section
def __unicode__(self):
+ """Return a unicode string representation."""
return self.title(asLink=True, forceInterwiki=True)
def __repr__(self):
@@ -228,16 +237,22 @@
def _cmpkey(self):
"""
+ Key for comparison of Page objects.
+
Page objects are "equal" if and only if they are on the same site
and have the same normalized title, including section if any.
- Page objects are sortable by site, namespace then title."""
+ Page objects are sortable by site, namespace then title.
+ """
return (self.site, self.namespace(), self.title())
def __hash__(self):
- # Pseudo method that makes it possible to store Page objects as keys
- # in hash-tables. This relies on the fact that the string
- # representation of an instance can not change after the construction.
+ """
+ A stable identifier to be used as a key in hash-tables.
+
+ This relies on the fact that the string
+ representation of an instance can not change after the construction.
+ """
return hash(unicode(self))
def autoFormat(self):
@@ -278,10 +293,10 @@
@exception SectionError The section does not exist on a page with
a # link
- @param force reload all page attributes, including errors.
- @param get_redirect return the redirect text, do not follow the
+ @param force: reload all page attributes, including errors.
+ @param get_redirect: return the redirect text, do not follow the
redirect, do not raise an exception.
- @param sysop if the user has a sysop account, use it to
+ @param sysop: if the user has a sysop account, use it to
retrieve this page
"""
@@ -366,7 +381,10 @@
@property
def text(self):
- """Return the current (edited) wikitext, loading it if necessary."""
+ """Return the current (edited) wikitext, loading it if necessary.
+
+ @return: unicode
+ """
if not hasattr(self, '_text') or self._text is None:
try:
self._text = self.get(get_redirect=True)
@@ -377,29 +395,38 @@
@text.setter
def text(self, value):
- """Update the edited wikitext"""
+ """Update the current (edited) wikitext.
+
+ @param value: New value or None
+ @type value: basestring
+ """
self._text = None if value is None else unicode(value)
@text.deleter
def text(self):
- """Delete the edited wikitext"""
+ """Delete the current (edited) wikitext."""
if hasattr(self, "_text"):
del self._text
def preloadText(self):
- """Return the text returned by EditFormPreloadText See API module "info".
+ """The text returned by EditFormPreloadText.
+
+ See API module "info".
Application: on Wikisource wikis, text can be preloaded even if
a page does not exist, if an Index page is present.
+ @return: unicode
"""
self.site.loadpageinfo(self, preload=True)
return self._preloadedtext
def properties(self, force=False):
"""
- Returns the various page properties stored for a page
+ Return the properties of the page.
+
@param force: force updating from the live site
+
@return: dict
"""
if not hasattr(self, '_pageprops') or force:
@@ -409,9 +436,10 @@
def defaultsort(self, force=False):
"""
- Returns the value of {{DEFAULTSORT:}} magic word or None if no
- defaultsort has been defined.
+ Extract value of the {{DEFAULTSORT:}} magic word from the page.
+
@param force: force updating from the live site
+
@return: unicode or None
"""
return self.properties(force=force).get('defaultsort')
@@ -423,8 +451,8 @@
@param force: force updating from the live site
@param includecomments: Also strip comments if includecomments
parameter is not True.
- @return: unicode or None
+ @return: unicode or None
"""
if not hasattr(self, '_expanded_text') or (
self._expanded_text is None) or force:
@@ -437,6 +465,7 @@
def userName(self):
"""Return name or IP address of last user to edit page.
+ @return: unicode
"""
rev = self.latestRevision()
if rev not in self._revisions:
@@ -446,6 +475,7 @@
def isIpEdit(self):
"""Return True if last editor was unregistered.
+ @return: bool
"""
rev = self.latestRevision()
if rev not in self._revisions:
@@ -455,14 +485,15 @@
def lastNonBotUser(self):
"""Return name or IP address of last human/non-bot user to edit page.
- Returns the most recent human editor out of the last revisions
- If it was not able to retrieve a human user returns None.
- If the edit was done by a bot which is no longer flagged as 'bot',
- i.e. which is not returned by Site.botusers(), it will be returned
- as a non-bot edit.
+ Determine the most recent human editor out of the last revisions.
+ If it was not able to retrieve a human user, returns None.
+ If the edit was done by a bot which is no longer flagged as 'bot',
+ i.e. which is not returned by Site.botusers(), it will be returned
+ as a non-bot edit.
+
+ @return: unicode
"""
-
if hasattr(self, '_lastNonBotUser'):
return self._lastNonBotUser
@@ -479,6 +510,7 @@
def editTime(self):
"""Return timestamp of last revision to page.
+ @return: pywikibot.Timestamp
"""
rev = self.latestRevision()
if rev not in self._revisions:
@@ -486,7 +518,10 @@
return self._revisions[rev].timestamp
def previousRevision(self):
- """Return the revision id for the previous revision of this Page."""
+ """Return the revision id for the previous revision of this Page.
+
+ @return: long
+ """
self.getVersionHistory(total=2)
revkey = sorted(self._revisions, reverse=True)[1]
return revkey
@@ -497,6 +532,7 @@
If the title includes a section, return False if this section isn't
found.
+ @return: bool
"""
return self.site.page_exists(self)
@@ -505,9 +541,16 @@
return self.site.page_isredirect(self)
def isStaticRedirect(self, force=False):
- """Return True if this is a redirect containing the magic word
- __STATICREDIRECT__, False if not or not existing.
+ """
+ Determine whether the page is a static redirect.
+ A static redirect must be a valid redirect, and contain the magic word
+ __STATICREDIRECT__.
+
+ @param force: Bypass local caching
+ @type force: bool
+
+ @return: bool
"""
found = False
if self.isRedirectPage():
@@ -521,8 +564,10 @@
return found
def isCategoryRedirect(self):
- """Return True if this is a category redirect page, False otherwise."""
+ """Return True if this is a category redirect page, False otherwise.
+ @return: bool
+ """
if not self.isCategory():
return False
if not hasattr(self, "_catredirect"):
@@ -551,7 +596,10 @@
return bool(self._catredirect)
def getCategoryRedirectTarget(self):
- """If this is a category redirect, return the target category title."""
+ """If this is a category redirect, return the target category title.
+
+ @return: Category
+ """
if self.isCategoryRedirect():
return Category(Link(self._catredirect, self.site))
raise pywikibot.IsNotRedirectPage(self.title())
@@ -562,6 +610,7 @@
Character count ignores language links and category links.
Can raise the same exceptions as get().
+ @return: bool
"""
txt = self.get()
txt = pywikibot.removeLanguageLinks(txt, site=self.site)
@@ -580,8 +629,7 @@
otherwise, returns the associated talk page. The returned page need
not actually exist on the wiki.
- Returns None if self is a special page.
-
+ @return: Page or None if self is a special page.
"""
ns = self.namespace()
if ns < 0: # Special page
@@ -624,8 +672,8 @@
Template:Disambig is always assumed to be default, and will be
appended regardless of its existence.
+ @return: bool
"""
-
if self.site.hasExtension('Disambiguator', False):
# If the Disambiguator extension is loaded, use it
return 'disambiguation' in self.properties()
@@ -767,21 +815,25 @@
)
def protection(self):
- """Return a dictionary reflecting page protections"""
+ """Return a dictionary reflecting page protections.
+
+ @return: dict
+ """
return self.site.page_restrictions(self)
def canBeEdited(self):
- """Return bool indicating whether this page can be edited.
+ """Determine whether the page may be edited.
This returns True if and only if:
- page is unprotected, and bot has an account for this site, or
- page is protected, and bot has a sysop account for this site.
+ @return: bool
"""
return self.site.page_can_be_edited(self)
def botMayEdit(self):
- """Return True if this page allows bots to edit it.
+ """Determine whether the active bot is allowed to edit the page.
This will be True if the page doesn't contain {{bots}} or
{{nobots}}, or it contains them and the active bot is allowed to
@@ -793,6 +845,7 @@
to override this by setting ignore_bot_templates=True in
user-config.py, or using page.put(force=True).
+ @return: bool
"""
# TODO: move this to Site object?
if config.ignore_bot_templates: # Check the "master ignore switch"
@@ -887,6 +940,7 @@
def _save(self, comment, minor, watchval, botflag, async, callback,
**kwargs):
+ """Helper function for save()."""
err = None
link = self.title(asLink=True)
if config.cosmetic_changes:
@@ -983,14 +1037,16 @@
"""Add or remove this page to/from bot account's watchlist.
@param unwatch: True to unwatch, False (default) to watch.
- @return: True if successful, False otherwise.
+ @type unwatch: bool
+ @return: bool; True if successful, False otherwise.
"""
return self.site.watchpage(self, unwatch)
def purge(self, **kwargs):
"""Purge the server's cache for this page.
+ @return: bool
"""
return self.site.purgepages([self], **kwargs)
@@ -1004,12 +1060,16 @@
external links are omitted.
@param namespaces: only iterate links in these namespaces
+ @type namespaces: int, or list of ints
@param step: limit each API call to this number of pages
+ @type step: int
@param total: iterate no more than this number of pages in total
+ @type total: int
@param content: if True, retrieve the content of the current version
of each linked page (default False)
- @return: a generator that yields Page objects.
+ @type content: bool
+ @return: a generator that yields Page objects.
"""
return self.site.pagelinks(self, namespaces=namespaces, step=step,
total=total, content=content)
@@ -1020,8 +1080,9 @@
@param expand: if True (default), include interwiki links found in
templates transcluded onto this page; if False, only iterate
interwiki links found in this page's own wikitext
- @return: a generator that yields Link objects
+ @type expand: bool
+ @return: a generator that yields Link objects
"""
# This function does not exist in the API, so it has to be
# implemented by screen-scraping
@@ -1052,7 +1113,9 @@
@param include_obsolete: if true, return even Link objects whose site
is obsolete
+ @type include_obsolete: bool
+ @return: list of Link objects.
"""
# Note: We preload a list of *all* langlinks, including links to
# obsolete sites, and store that in self._langlinks. We then filter
@@ -1073,8 +1136,9 @@
@param total: iterate no more than this number of pages in total
@param include_obsolete: if true, yield even Link object whose site
is obsolete
- @return: a generator that yields Link objects.
+ @type include_obsolete: bool
+ @return: a generator that yields Link objects.
"""
if hasattr(self, '_langlinks'):
return iter(self.langlinks(include_obsolete=include_obsolete))
@@ -1087,7 +1151,8 @@
def data_item(self):
"""
- Convinience function to get the Wikibase item of a page
+ Convenience function to get the Wikibase item of a page.
+
@return: ItemPage
"""
return ItemPage.fromPage(self)
@@ -1095,6 +1160,7 @@
@deprecate_arg('tllimit', None)
@deprecated("Page.templates()")
def getTemplates(self):
+ """DEPRECATED. Use templates()."""
return self.templates()
def templates(self, content=False):
@@ -1106,7 +1172,7 @@
@param content: if True, retrieve the content of the current version
of each template (default False)
-
+ @type content: bool
"""
# Data might have been preloaded
if not hasattr(self, '_templates'):
@@ -1125,6 +1191,7 @@
@param total: iterate no more than this number of pages in total
@param content: if True, retrieve the content of the current version
of each template (default False)
+ @type content: bool
"""
if hasattr(self, '_templates'):
@@ -1226,8 +1293,9 @@
return self.site.page_extlinks(self, step=step, total=total)
def coordinates(self, primary_only=False):
- """Return a list of Coordinate objects for points
- on the page using [[mw:Extension:GeoData]]
+ """Return a list of Coordinate objects for points on the page.
+
+ Uses [[mw:Extension:GeoData]]
@param primary_only: Only return the coordinate indicated to be primary
@return: A list of Coordinate objects
@@ -1246,6 +1314,7 @@
If this page is not a redirect page, will raise an IsNotRedirectPage
exception. This method also can raise a NoPage exception.
+ @return: Page
"""
return self.site.getredirtarget(self)
@@ -1281,7 +1350,6 @@
def getVersionHistoryTable(self, forceReload=False, reverseOrder=False,
step=None, total=None):
"""Return the version history as a wiki table."""
-
result = '{| class="wikitable"\n'
result += '! oldid || date/time || username || edit summary\n'
for oldid, time, username, summary \
@@ -1502,9 +1570,9 @@
(equivalent to set both edit and move to '')
@param reason: Edit summary.
@param prompt: If true, ask user for confirmation.
- @param expiry: When the block should expire. This expiry will be applied
- to all protections. If None, 'infinite', 'indefinite', 'never', or ''
- is given, there is no expiry.
+ @param expiry: When the block should expire. This expiry will be
+ applied to all protections.
+ None, 'infinite', 'indefinite', 'never', and '' mean no expiry.
@type expiry: pywikibot.Timestamp, string in GNU timestamp format
(including ISO 8601).
"""
@@ -1626,7 +1694,10 @@
return False
def isFlowPage(self):
- """Whether the given title is a Flow page"""
+ """Whether the given title is a Flow page.
+
+ @return: bool
+ """
if not self.site.hasExtension('Flow', False):
return False
if not hasattr(self, '_flowinfo'):
@@ -1637,7 +1708,7 @@
@deprecated("Site.encoding()")
def encoding(self):
- """DEPRECATED: use Site.encoding() instead"""
+ """DEPRECATED: use self.site.encoding instead."""
return self.site.encoding()
@deprecated("Page.title(withNamespace=False)")
@@ -1703,8 +1774,10 @@
usingPages : Iterate Pages on which the image is displayed.
"""
+
@deprecate_arg("insite", None)
def __init__(self, source, title=u""):
+ """Constructor."""
Page.__init__(self, source, title, 6)
if self.namespace() != 6:
raise ValueError(u"'%s' is not in the image namespace!" % title)
@@ -1732,11 +1805,17 @@
@deprecated("fileIsShared")
def fileIsOnCommons(self):
- """Return True if the image is stored on Wikimedia Commons"""
+ """DEPRECATED. Check if the image is stored on Wikimedia Commons.
+
+ @return: bool
+ """
return self.fileIsShared()
def fileIsShared(self):
- """Return True if image is stored on any known shared repository."""
+ """Check if the image is stored on any known shared repository.
+
+ @return: bool
+ """
# as of now, the only known repositories are commons and wikitravel
# TODO: put the URLs to family file
if not self.site.has_image_repository:
@@ -1803,11 +1882,14 @@
class Category(Page):
- """A page in the Category: namespace"""
+ """A page in the Category: namespace."""
@deprecate_arg("insite", None)
def __init__(self, source, title=u"", sortKey=None):
- """All parameters are the same as for Page() constructor.
+ """
+ Constructor.
+
+ All parameters are the same as for Page() constructor.
"""
Page.__init__(self, source, title, ns=14)
@@ -1900,7 +1982,7 @@
starttime=None, endtime=None, startsort=None,
endsort=None):
"""
- Yields all articles in the current category.
+ Yield all articles in the current category.
By default, yields all *pages* in the category that are not
subcategories!
@@ -1975,7 +2057,6 @@
def members(self, recurse=False, namespaces=None, step=None, total=None,
content=False):
"""Yield all category contents (subcats, pages, and files)."""
-
for member in self.site.categorymembers(
self, namespaces, step=step, total=total, content=content):
yield member
@@ -2099,32 +2180,35 @@
@property
def categoryinfo(self):
- """return a dict containing category content values:
+ """Return a dict containing information about the category.
+
+ The dict contains values for:
Numbers of pages, subcategories, files, and total contents.
+ @return: dict
"""
return self.site.categoryinfo(self)
# ### DEPRECATED METHODS ####
@deprecated("list(Category.subcategories(...))")
def subcategoriesList(self, recurse=False):
- """DEPRECATED: Equivalent to list(self.subcategories(...))"""
+ """DEPRECATED: Equivalent to list(self.subcategories(...))."""
return sorted(list(set(self.subcategories(recurse))))
@deprecated("list(Category.articles(...))")
def articlesList(self, recurse=False):
- """DEPRECATED: equivalent to list(self.articles(...))"""
+ """DEPRECATED: equivalent to list(self.articles(...))."""
return sorted(list(set(self.articles(recurse))))
@deprecated("Category.categories()")
def supercategories(self):
- """DEPRECATED: equivalent to self.categories()"""
+ """DEPRECATED: equivalent to self.categories()."""
return self.categories()
@deprecated("list(Category.categories(...))")
def supercategoriesList(self):
- """DEPRECATED: equivalent to list(self.categories(...))"""
+ """DEPRECATED: equivalent to list(self.categories(...))."""
return sorted(list(set(self.categories())))
@@ -2140,12 +2224,15 @@
class User(Page):
"""A class that represents a Wiki user.
+
+ This class also represents the Wiki page User:<username>
"""
@deprecate_arg("site", "source")
@deprecate_arg("name", "title")
def __init__(self, source, title=u''):
"""Initializer for a User object.
+
All parameters are the same as for Page() constructor.
"""
if len(title) > 1 and title[0] == u'#':
@@ -2164,12 +2251,21 @@
"This is an autoblock ID, you can only use to unblock it.")
def name(self):
+ """
+ The username.
+
+ @return: unicode
+ """
return self.username
@property
def username(self):
- """ Convenience method that returns the title of the page with
- namespace prefix omitted, aka the username, as a Unicode string.
+ """ The username.
+
+ Convenience method that returns the title of the page with
+ namespace prefix omitted, which is the username.
+
+ @return: unicode
"""
if self._isAutoblock:
return u'#' + self.title(withNamespace=False)
@@ -2177,11 +2273,18 @@
return self.title(withNamespace=False)
def isRegistered(self, force=False):
- """ Return True if a user with this name is registered on this site,
- False otherwise.
+ """ Determine if the user is registered on the site.
+
+ It is possible to have a page named User:xyz and not have
+ a corresponding user with username xyz.
+
+ The page does not need to exist for this method to return
+ True.
@param force: if True, forces reloading the data from API
@type force: bool
+
+ @return: bool
"""
if self.isAnonymous():
return False
@@ -2189,14 +2292,19 @@
return self.getprops(force).get('missing') is None
def isAnonymous(self):
+ """ Determine if the user is editing as an IP address.
+
+ @return: bool
+ """
return ip_regexp.match(self.username) is not None
def getprops(self, force=False):
- """ Return a Dictionary that contains user's properties. Use cached
- values if already called before, otherwise fetch data from the API.
+ """ Return a properties about the user.
@param force: if True, forces reloading the data from API
@type force: bool
+
+ @return: dict
"""
if force:
del self._userprops
@@ -2211,11 +2319,12 @@
@deprecated('User.registration()')
def registrationTime(self, force=False):
- """ Return registration date for this user, as a long in
- MediaWiki's internal timestamp format, or 0 if the date is unknown.
+ """ DEPRECATED. Fetch registration date for this user.
@param force: if True, forces reloading the data from API
@type force: bool
+
+ @return: long (MediaWiki's internal timestamp format) or 0
"""
if self.registration():
return long(self.registration().strftime('%Y%m%d%H%M%S'))
@@ -2223,22 +2332,26 @@
return 0
def registration(self, force=False):
- """ Return registration date for this user as a pywikibot.Timestamp
- object, or None if the date is unknown.
+ """ Fetch registration date for this user.
@param force: if True, forces reloading the data from API
@type force: bool
+
+ @return: pywikibot.Timestamp or None
"""
reg = self.getprops(force).get('registration')
if reg:
return pywikibot.Timestamp.fromISOformat(reg)
def editCount(self, force=False):
- """ Return edit count for this user as int. This is always 0 for
- 'anonymous' users.
+ """ Return edit count for a registered user.
+
+ Always returns 0 for 'anonymous' users.
@param force: if True, forces reloading the data from API
@type force: bool
+
+ @return: long
"""
if 'editcount' in self.getprops(force):
return self.getprops()['editcount']
@@ -2246,28 +2359,34 @@
return 0
def isBlocked(self, force=False):
- """ Return True if this user is currently blocked, False otherwise.
+ """ Determine whether the user is currently blocked.
@param force: if True, forces reloading the data from API
@type force: bool
+
+ @return: bool
"""
return 'blockedby' in self.getprops(force)
def isEmailable(self, force=False):
- """ Return True if emails can be send to this user through MediaWiki,
- False otherwise.
+ """ Determine whether emails may be send to this user through MediaWiki.
@param force: if True, forces reloading the data from API
@type force: bool
+
+ @return: bool
"""
return 'emailable' in self.getprops(force)
def groups(self, force=False):
- """ Return a list of groups to wich this user belongs. The return value
- is guaranteed to be a list object, possibly empty.
+ """ Return a list of groups to which this user belongs.
+
+ The list of groups may be empty.
@param force: if True, forces reloading the data from API
@type force: bool
+
+ @return: list
"""
if 'groups' in self.getprops(force):
return self.getprops()['groups']
@@ -2275,8 +2394,7 @@
return []
def getUserPage(self, subpage=u''):
- """ Return a pywikibot.Page object corresponding to this user's main
- page, or a subpage of it if subpage is set.
+ """ Return a Page object relative to this user's main page.
@param subpage: subpage part to be appended to the main
page title (optional)
@@ -2292,8 +2410,7 @@
return Page(Link(self.title() + subpage, self.site))
def getUserTalkPage(self, subpage=u''):
- """ Return a pywikibot.Page object corresponding to this user's main
- talk page, or a subpage of it if subpage is set.
+ """ Return a Page object relative to this user's main talk page.
@param subpage: subpage part to be appended to the main
talk page title (optional)
@@ -2311,6 +2428,7 @@
def sendMail(self, subject, text, ccme=False):
""" Send an email to this user via MediaWiki's email interface.
+
Return True on success, False otherwise.
This method can raise an UserActionRefuse exception in case this user
doesn't allow sending email to him or the currently logged in bot
@@ -2354,7 +2472,8 @@
def block(self, expiry, reason, anononly=True, nocreate=True,
autoblock=True, noemail=False, reblock=False):
"""
- Blocks a user
+ Block user.
+
@param expiry: When the block should expire
@type expiry: pywikibot.Timestamp|str
@param reason: Block reason
@@ -2383,8 +2502,9 @@
@deprecated("contributions")
@deprecate_arg("limit", "total") # To be consistent with rest of framework
def editedPages(self, total=500):
- """ Deprecated function that wraps 'contributions' for backwards
- compatibility. Yields pywikibot.Page objects that this user has
+ """ DEPRECATED. Use contributions().
+
+ Yields pywikibot.Page objects that this user has
edited, with an upper bound of 'total'. Pages returned are not
guaranteed to be unique.
@@ -2397,8 +2517,9 @@
@deprecate_arg("limit", "total") # To be consistent with rest of framework
@deprecate_arg("namespace", "namespaces")
def contributions(self, total=500, namespaces=[]):
- """ Yield tuples describing this user edits with an upper bound of
- 'limit'. Each tuple is composed of a pywikibot.Page object,
+ """ Yield tuples describing this user edits.
+
+ Each tuple is composed of a pywikibot.Page object,
the revision id (int), the edit timestamp (as a pywikibot.Timestamp
object), and the comment (unicode).
Pages returned are not guaranteed to be unique.
@@ -2420,6 +2541,7 @@
@deprecate_arg("number", "total")
def uploadedImages(self, total=10):
""" Yield tuples describing files uploaded by this user.
+
Each tuple is composed of a pywikibot.Page, the timestamp (str in
ISO8601 format), comment (unicode) and a bool for pageid > 0.
Pages returned are not guaranteed to be unique.
@@ -2442,9 +2564,12 @@
"""
The base page for the Wikibase extension.
- There really should be no need to call this directly
+
+ There should be no need to instantiate this directly.
"""
+
def __init__(self, site, title=u"", **kwargs):
+ """ Constructor. """
if not isinstance(site, pywikibot.site.DataSite):
raise TypeError("site must be a pywikibot.site.DataSite object")
Page.__init__(self, site, title, **kwargs)
@@ -2452,6 +2577,7 @@
self._isredir = False # Wikibase pages cannot be a redirect
def title(self, **kwargs):
+ """ Page title. """
if self.namespace() == 0:
self._link._text = self.getID()
del self._link._title
@@ -2459,13 +2585,16 @@
@deprecated("_defined_by")
def __defined_by(self, singular=False):
+ """ DEPRECATED. """
return self._defined_by(singular=singular)
def _defined_by(self, singular=False):
"""
- returns the parameters needed by the API to identify an item.
+ Internal function to provide the API parameters to identify the entity.
+
Once an item's "p/q##" is looked up, that will be used for all future
requests.
+
@param singular: Whether the parameter names should use the singular
form
@type singular: bool
@@ -2495,6 +2624,11 @@
return params
def exists(self):
+ """
+ Determine if an entity exists in the data repository.
+
+ @return: bool
+ """
if not hasattr(self, '_content'):
try:
self.get()
@@ -2505,9 +2639,11 @@
def get(self, force=False, *args):
"""
- Fetches all page data, and caches it
- force will override caching
- args can be used to specify custom props.
+ Fetch all page data, and cache it.
+
+ @param force: override caching
+ @type force: bool
+ @param args: may be used to specify custom props.
"""
if force or not hasattr(self, '_content'):
data = self.repo.loadcontent(self._defined_by(), *args)
@@ -2546,10 +2682,12 @@
def getID(self, numeric=False, force=False):
"""
- @param numeric Strip the first letter and return an int
- @type numeric bool
- @param force Force an update of new data
- @type force bool
+ Get the entity identifier.
+
+ @param numeric: Strip the first letter and return an int
+ @type numeric: bool
+ @param force: Force an update of new data
+ @type force: bool
"""
if not hasattr(self, 'id') or force:
self.get(force=force)
@@ -2558,16 +2696,23 @@
return self.id
def latestRevision(self):
+ """
+ Get the revision identifier for the most recent revision of the entity.
+
+ @return: long
+ """
if not hasattr(self, 'lastrevid'):
self.get()
return self.lastrevid
def __normalizeLanguages(self, data):
"""
- Helper function to convert any site objects
- into the language they may represent.
- @param data The dict to check
- @type data dict
+ Helper function to replace site objects with their language codes.
+
+ @param data: The dict to check
+ @type data: dict
+
+ @return: dict
"""
for key in data:
if isinstance(key, pywikibot.site.BaseSite):
@@ -2577,8 +2722,10 @@
def getdbName(self, site):
"""
- Helper function to normalize site
- objects into dbnames
+ Helper function to obtain a dbName for a Site.
+
+ @param site: The site to look up.
+ @type site: Site
"""
if isinstance(site, pywikibot.site.BaseSite):
return site.dbName()
@@ -2586,14 +2733,16 @@
def editEntity(self, data, **kwargs):
"""
- Enables updating of entities through wbeditentity
+ Edit an entity using Wikibase wbeditentity API.
+
This function is wrapped around by:
*editLabels
*editDescriptions
*editAliases
*ItemPage.setSitelinks
- @param data Data to be saved
- @type data dict
+
+ @param data: Data to be saved
+ @type data: dict
"""
if hasattr(self, 'lastrevid'):
baserevid = self.lastrevid
@@ -2605,6 +2754,8 @@
def editLabels(self, labels, **kwargs):
"""
+ Edit entity labels.
+
Labels should be a dict, with the key
as a language or a site object. The
value should be the string to set it to.
@@ -2618,6 +2769,8 @@
def editDescriptions(self, descriptions, **kwargs):
"""
+ Edit entity descriptions.
+
Descriptions should be a dict, with the key
as a language or a site object. The
value should be the string to set it to.
@@ -2631,6 +2784,8 @@
def editAliases(self, aliases, **kwargs):
"""
+ Edit entity aliases.
+
Aliases should be a dict, with the key
as a language or a site object. The
value should be a list of strings.
@@ -2643,9 +2798,20 @@
class ItemPage(WikibasePage):
+
+ """ A Wikibase item.
+
+ A Wikibase item may be defined by either a 'Q' id (qid),
+ or by a site & title.
+
+ If an item is defined by site & title, once an item's qid has
+ been looked up, the item is then defined by the qid.
+ """
+
def __init__(self, site, title=None):
"""
- defined by qid XOR site AND title
+ Constructor.
+
@param site: data repository
@type site: pywikibot.site.DataSite
@param title: id number of item, "Q###"
@@ -2656,7 +2822,8 @@
@classmethod
def fromPage(cls, page):
"""
- Get the ItemPage based on a Page that links to it
+ Get the ItemPage for a Page that links to it.
+
@param page: Page
@return: ItemPage
"""
@@ -2674,9 +2841,11 @@
def get(self, force=False, *args):
"""
- Fetches all page data, and caches it
- force will override caching
- args are the values of props
+ Fetch all item data, and cache it.
+
+ @param force: override caching
+ @type force: bool
+ @param args: values of props
"""
super(ItemPage, self).get(force=force, *args)
@@ -2706,7 +2875,8 @@
def iterlinks(self, family=None):
"""
- Iterates through all the sitelinks
+ Iterate through all the sitelinks.
+
@param family: string/Family object which represents what family of
links to iterate
@type family: str|pywikibot.family.Family
@@ -2725,10 +2895,15 @@
def getSitelink(self, site, force=False):
"""
- Returns the title (unicode string) for the specific site
- site is a pywikibot.Site or database name
- force will override caching
- If the item doesn't have that language, raise NoPage
+ Return the title for the specific site.
+
+ If the item doesn't have that language, raise NoPage.
+
+ @param site: Site to find the linked page of.
+ @type site: pywikibot.Site or database name
+ @param force: override caching
+
+ @return: unicode
"""
if force or not hasattr(self, '_content'):
self.get(force=force)
@@ -2740,6 +2915,8 @@
def setSitelink(self, sitelink, **kwargs):
"""
+ Set a sitelink. Calls setSitelinks().
+
A sitelink can either be a Page object,
or a {'site':dbname,'title':title} dictionary.
"""
@@ -2747,13 +2924,16 @@
def removeSitelink(self, site, **kwargs):
"""
- A site can either be a Site object,
- or it can be a dbName.
+ Remove a sitelink.
+
+ A site can either be a Site object, or it can be a dbName.
"""
self.removeSitelinks([site], **kwargs)
def removeSitelinks(self, sites, **kwargs):
"""
+ Remove sitelinks.
+
Sites should be a list, with values either
being Site objects, or dbNames.
"""
@@ -2765,11 +2945,12 @@
def setSitelinks(self, sitelinks, **kwargs):
"""
+ Set sitelinks.
+
Sitelinks should be a list. Each item in the
list can either be a Page object, or a dict
with a value for 'site' and 'title'.
"""
-
data = {}
for obj in sitelinks:
if isinstance(obj, Page):
@@ -2784,18 +2965,20 @@
def addClaim(self, claim, bot=True, **kwargs):
"""
- Adds the claim to the item
- @param claim The claim to add
- @type claim Claim
- @param bot Whether to flag as bot (if possible)
- @type bot bool
+ Add a claim to the item.
+
+ @param claim: The claim to add
+ @type claim: Claim
+ @param bot: Whether to flag as bot (if possible)
+ @type bot: bool
"""
self.repo.addClaim(self, claim, bot=bot, **kwargs)
claim.on_item = self
def removeClaims(self, claims, **kwargs):
"""
- Removes the claims from the item
+ Remove the claims from the item.
+
@type claims: list
"""
@@ -2807,7 +2990,8 @@
def mergeInto(self, item, **kwargs):
"""
- Merges the item into another item
+ Merge the item into another item.
+
@param item: The item to merge into
@type item: pywikibot.ItemPage
"""
@@ -2815,6 +2999,19 @@
class Property():
+
+ """
+ A Wikibase property.
+
+ While every Wikibase property has a Page on the data repository,
+ this object is for when the property is used as part of another concept
+ where the property is not _the_ Page of the property.
+
+ For example, a claim on an ItemPage has many property attributes, and so
+ it subclasses this Property class, but a claim does not have Page like
+ behaviour and semantics.
+ """
+
types = {'wikibase-item': ItemPage,
'string': basestring,
'commonsMedia': ImagePage,
@@ -2825,13 +3022,16 @@
}
def __init__(self, site, id=None):
+ """Constructor."""
self.repo = site
self.id = id.upper()
@property
def type(self):
"""
- Return the type of this property
+ Return the type of this property.
+
+ @return: str
"""
if not hasattr(self, '_type'):
self._type = self.repo.getPropertyType(self)
@@ -2840,7 +3040,8 @@
@deprecated("Property.type")
def getType(self):
"""
- Return the type of this property
+ Return the type of this property.
+
It returns 'globecoordinate' for type 'globe-coordinate'
in order to be backwards compatible. See
https://gerrit.wikimedia.org/r/#/c/135405/ for background.
@@ -2851,6 +3052,12 @@
return self._type
def getID(self, numeric=False):
+ """
+ Get the identifier of this property.
+
+ @param numeric: Strip the first letter and return an int
+ @type numeric: bool
+ """
if numeric:
return int(self.id[1:])
else:
@@ -2860,13 +3067,16 @@
class PropertyPage(WikibasePage, Property):
"""
- Any page in the property namespace
+ A Wikibase entity in the property namespace.
+
Should be created as:
PropertyPage(DataSite, 'Property:P21')
"""
def __init__(self, source, title=u""):
"""
+ Constructor.
+
@param source: data repository property is on
@type source: pywikibot.site.DataSite
@param title: page name of property, like "Property:P##"
@@ -2878,14 +3088,20 @@
raise ValueError(u"'%s' is not a property page!" % self.title())
def get(self, force=False, *args):
+ """
+ Fetch the property entity, and cache it.
+
+ @param force: override caching
+ @param args: values of props
+ """
if force or not hasattr(self, '_content'):
WikibasePage.get(self, force=force, *args)
self.type = self._content['datatype']
def newClaim(self, *args, **kwargs):
"""
- Convenicence function to create a new claim object
- for a specific property
+ Convenience function to create a new claim object for this property.
+
@return: Claim
"""
return Claim(self.site, self.getID(), *args, **kwargs)
@@ -2894,9 +3110,13 @@
class QueryPage(WikibasePage):
"""
- For future usage, not implemented yet
+ A Wikibase Query entity.
+
+ For future usage, not implemented yet.
"""
+
def __init__(self, site, title):
+ """Constructor."""
WikibasePage.__init__(self, site, title, ns=122)
self.id = self.title(withNamespace=False).upper()
if not self.id.startswith(u'U'):
@@ -2906,11 +3126,16 @@
class Claim(Property):
"""
+ A Claim on a Wikibase entity.
+
Claims are standard claims as well as references.
"""
+
def __init__(self, site, pid, snak=None, hash=None, isReference=False,
isQualifier=False):
"""
+ Constructor.
+
Defined by the "snak" value, supplemented by site + pid
@param site: repository the claim is on
@@ -2938,8 +3163,12 @@
@staticmethod
def fromJSON(site, data):
"""
- Creates the claim object from JSON returned
- in the API call.
+ Create a claim object from JSON returned in the API call.
+
+ @param data: JSON containing claim data
+ @type data: dict
+
+ @return: Claim
"""
claim = Claim(site, data['mainsnak']['property'])
if 'id' in data:
@@ -2980,9 +3209,13 @@
@staticmethod
def referenceFromJSON(site, data):
"""
+ Create a dict of claims from reference JSON returned in the API call.
+
Reference objects are represented a
bit differently, and require some
more handling.
+
+ @return: dict
"""
source = collections.defaultdict(list)
for prop in list(data['snaks'].values()):
@@ -2995,17 +3228,26 @@
@staticmethod
def qualifierFromJSON(site, data):
"""
+ Create a Claim for a qualifier from JSON.
+
Qualifier objects are represented a bit
differently like references, but I'm not
sure if this even requires it's own function.
+
+ @return: Claim
"""
wrap = {'mainsnak': data}
return Claim.fromJSON(site, wrap)
def setTarget(self, value):
"""
- Sets the target to the passed value.
- There should be checks to ensure type compliance
+ Set the target value in the local object.
+
+ @param value: The new target value.
+ @type value: object
+
+ Raises ValueError if value is not of the type
+ required for the Claim type.
"""
value_class = self.types[self.type]
if not isinstance(value, value_class):
@@ -3015,7 +3257,12 @@
def changeTarget(self, value=None, snaktype='value', **kwargs):
"""
- This actually saves the new target.
+ Set the target value in the data repository.
+
+ @param value: The new target value.
+ @type value: object
+ @param snaktype: The new snaktype.
+ @type snaktype: str ('value', 'somevalue', or 'novalue')
"""
if value:
self.setTarget(value)
@@ -3027,19 +3274,28 @@
def getTarget(self):
"""
- Returns object that the property is associated with.
+ Return the target value of this Claim.
+
None is returned if no target is set
+
+ @return: object
"""
return self.target
def getSnakType(self):
"""
- Returns the "snaktype"
- Can be "value", "somevalue" or "novalue"
+ Return the "snaktype".
+
+ @return: str ('value', 'somevalue' or 'novalue')
"""
return self.snaktype
def setSnakType(self, value):
+ """Set the "snaktype".
+
+ @param value: Type of snak
+ @type value: str ('value', 'somevalue', or 'novalue')
+ """
if value in ['value', 'somevalue', 'novalue']:
self.snaktype = value
else:
@@ -3047,17 +3303,21 @@
"snaktype must be 'value', 'somevalue', or 'novalue'.")
def getRank(self):
+ """Return the rank of the Claim."""
return self.rank
def setRank(self):
"""
+ Set the rank of the Claim.
+
Has not been implemented in the Wikibase API yet
"""
raise NotImplementedError
def changeSnakType(self, value=None, **kwargs):
"""
- This actually saves the new snakvalue.
+ Save the new snakvalue.
+
TODO: Is this function really needed?
"""
if value:
@@ -3066,13 +3326,16 @@
def getSources(self):
"""
- Returns a list of sources. Each source is a list of Claims.
+ Return a list of sources, each being a list of Claims.
+
+ @return: list
"""
return self.sources
def addSource(self, claim, **kwargs):
"""
- Adds the claim as a source.
+ Add the claim as a source.
+
@param claim: the claim to add
@type claim: pywikibot.Claim
"""
@@ -3080,7 +3343,8 @@
def addSources(self, claims, **kwargs):
"""
- Adds the claims as one source.
+ Add the claims as one source.
+
@param claims: the claims to add
@type claims: list of pywikibot.Claim
"""
@@ -3094,7 +3358,8 @@
def removeSource(self, source, **kwargs):
"""
- Removes the source.
+ Remove the source. Calls removeSources().
+
@param source: the source to remove
@type source: pywikibot.Claim
"""
@@ -3102,7 +3367,8 @@
def removeSources(self, sources, **kwargs):
"""
- Removes the individual sources.
+ Remove the sources.
+
@param sources: the sources to remove
@type sources: list of pywikibot.Claim
"""
@@ -3113,7 +3379,7 @@
self.sources.remove(source_dict)
def addQualifier(self, qualifier, **kwargs):
- """Add the given qualifier
+ """Add the given qualifier.
@param qualifier: the qualifier to add
@type qualifier: Claim
@@ -3125,7 +3391,9 @@
def _formatDataValue(self):
"""
- Format the target into the proper JSON datavalue that Wikibase wants
+ Format the target into the proper JSON datavalue that Wikibase wants.
+
+ @return: dict
"""
if self.type == 'wikibase-item':
value = {'entity-type': 'item',
@@ -3145,9 +3413,13 @@
class Revision(object):
"""A structure holding information about a single revision of a Page."""
+
def __init__(self, revid, timestamp, user, anon=False, comment=u"",
text=None, minor=False, rollbacktoken=None):
- """All parameters correspond to object attributes (e.g., revid
+ """
+ Constructor.
+
+ All parameters correspond to object attributes (e.g., revid
parameter is stored as self.revid)
@param revid: Revision id number
@@ -3178,7 +3450,7 @@
class Link(ComparableMixin):
- """A MediaWiki link (local or interwiki)
+ """A MediaWiki link (local or interwiki).
Has the following attributes:
@@ -3192,6 +3464,7 @@
following a '|' character inside the link
"""
+
illegal_titles_pattern = re.compile(
# Matching titles will be held as illegal.
r'''[\x00-\x1f\x23\x3c\x3e\x5b\x5d\x7b\x7c\x7d\x7f]'''
@@ -3205,7 +3478,7 @@
)
def __init__(self, text, source=None, defaultNamespace=0):
- """Constructor
+ """Constructor.
@param text: the link text (everything appearing between [[ and ]]
on a wiki page)
@@ -3264,6 +3537,7 @@
self._text = t
def __repr__(self):
+ """Return a more complete string representation."""
return "pywikibot.page.Link(%r, %r)" % (self.title, self.site)
def parse_site(self):
@@ -3307,8 +3581,10 @@
return (fam.name, code) # text before : doesn't match any known prefix
def parse(self):
- """Parse text; called internally when accessing attributes"""
+ """Parse wikitext of the link.
+ Called internally when accessing attributes.
+ """
self._site = self._source
self._namespace = self._defaultns
t = self._text
@@ -3422,30 +3698,50 @@
@property
def site(self):
+ """Return the site of the link.
+
+ @return: unicode
+ """
if not hasattr(self, "_site"):
self.parse()
return self._site
@property
def namespace(self):
+ """Return the namespace of the link.
+
+ @return: unicode
+ """
if not hasattr(self, "_namespace"):
self.parse()
return self._namespace
@property
def title(self):
+ """Return the title of the link.
+
+ @return: unicode
+ """
if not hasattr(self, "_title"):
self.parse()
return self._title
@property
def section(self):
+ """Return the section of the link.
+
+ @return: unicode
+ """
if not hasattr(self, "_section"):
self.parse()
return self._section
@property
def anchor(self):
+ """Return the anchor of the link.
+
+ @return: unicode
+ """
if not hasattr(self, "_anchor"):
self.parse()
return self._anchor
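The site, namespace, title, section and anchor properties above all share one idiom: parse the link text on first access, and use attribute presence as the cache marker. A minimal self-contained sketch of that pattern (hypothetical, heavily simplified parsing, not the real pywikibot implementation):

```python
class LazyLink:
    """Sketch of the lazy-parse idiom used by pywikibot.page.Link."""

    def __init__(self, text):
        self._text = text

    def parse(self):
        # The real Link parses full wikilink syntax; here we only
        # split off an optional '#section' suffix for illustration.
        title, _, section = self._text.partition('#')
        self._title = title
        self._section = section or None

    @property
    def title(self):
        # Parse on first access; the attribute doubles as the cache.
        if not hasattr(self, '_title'):
            self.parse()
        return self._title

    @property
    def section(self):
        if not hasattr(self, '_section'):
            self.parse()
        return self._section
```

Whichever property is touched first triggers the single parse(); every later access is a plain attribute read.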
@@ -3487,10 +3783,13 @@
title)
def __str__(self):
+ """Return a string representation."""
return self.astext().encode("ascii", "backslashreplace")
def _cmpkey(self):
"""
+ Key for comparison of Link objects.
+
Link objects are "equal" if and only if they are on the same site
and have the same normalized title, including section if any.
@@ -3499,9 +3798,14 @@
return (self.site, self.namespace, self.title)
def __unicode__(self):
+ """Return a unicode string representation.
+
+ @return: unicode
+ """
return self.astext()
def __hash__(self):
+ """A stable identifier to be used as a key in hash-tables."""
return hash(u'%s:%s:%s' % (self.site.family.name,
self.site.code,
self.title))
@@ -3510,9 +3814,14 @@
def fromPage(page, source=None):
"""
Create a Link to a Page.
- @param source: Link from site source
- """
+ @param page: target Page
+ @type page: Page
+ @param source: Link from site source
+ @type source: Site
+
+ @return: Link
+ """
link = Link.__new__(Link)
link._site = page.site
link._section = page.section()
@@ -3529,7 +3838,17 @@
def langlinkUnsafe(lang, title, source):
"""
Create a "lang:title" Link linked from source.
+
Assumes that the lang & title come clean, no checks are made.
+
+ @param lang: target site code (language)
+ @type lang: str
+ @param title: target page title
+ @type title: unicode
+ @param source: Link from site source
+ @type source: Site
+
+ @return: Link
"""
link = Link.__new__(Link)
if source.family.interwiki_forward:
@@ -3560,7 +3879,13 @@
def html2unicode(text, ignore=None):
- """Return text, replacing HTML entities by equivalent unicode characters."""
+ """Replace HTML entities with equivalent unicode.
+
+ @param ignore: HTML entities to ignore
+ @type ignore: list of int
+
+ @return: unicode
+ """
if ignore is None:
ignore = []
# This regular expression will match any decimal and hexadecimal entity and
@@ -3654,13 +3979,19 @@
def unicode2html(x, encoding):
"""
-Ensure unicode string is encodable, or else convert to ASCII for HTML.
+ Convert unicode string to requested HTML encoding.
-Arguments are a unicode string and an encoding. Attempt to encode the
-string into the desired format; if that doesn't work, encode the unicode
-into html &#; entities. If it does work, return it unchanged.
+ Attempt to encode the
+ string into the desired format; if that doesn't work, encode the unicode
+ into html &#; entities. If it does work, return it unchanged.
-"""
+ @param x: String to update
+ @type x: unicode
+ @param encoding: Encoding to use
+ @type encoding: str
+
+ @return: str
+ """
try:
x.encode(encoding)
except UnicodeError:
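The hunk above documents unicode2html's behaviour: try the requested encoding, and fall back to HTML &#...; entities only when the string cannot be represented. A minimal sketch of that encode-or-fall-back pattern (the fallback via xmlcharrefreplace is an assumption for illustration; the real function may differ in detail):

```python
def unicode2html_sketch(x, encoding):
    """Return x unchanged if encodable, else with HTML numeric entities."""
    try:
        x.encode(encoding)
    except UnicodeError:
        # Not representable in the target encoding: replace the
        # offending characters with &#...; numeric entities.
        x = x.encode('ascii', 'xmlcharrefreplace').decode('ascii')
    return x
```

UnicodeError covers UnicodeEncodeError, so any character outside the target charset triggers the entity fallback while plain strings pass through untouched.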
--
To view, visit https://gerrit.wikimedia.org/r/145759
To unsubscribe, visit https://gerrit.wikimedia.org/r/settings
Gerrit-MessageType: merged
Gerrit-Change-Id: I35d28dbd4ad87b50f4e93cf6dc460c0fb59ea57a
Gerrit-PatchSet: 4
Gerrit-Project: pywikibot/core
Gerrit-Branch: master
Gerrit-Owner: John Vandenberg <jayvdb(a)gmail.com>
Gerrit-Reviewer: Hashar <hashar(a)free.fr>
Gerrit-Reviewer: John Vandenberg <jayvdb(a)gmail.com>
Gerrit-Reviewer: Ladsgroup <ladsgroup(a)gmail.com>
Gerrit-Reviewer: Merlijn van Deen <valhallasw(a)arctus.nl>
Gerrit-Reviewer: Mpaa <mpaa.wiki(a)gmail.com>
Gerrit-Reviewer: Xqt <info(a)gno.de>
Gerrit-Reviewer: jenkins-bot <>
jenkins-bot has submitted this change and it was merged.
Change subject: Do not repeat the login process
......................................................................
Do not repeat the login process
Log calls to site.login when loginstatus is IN_PROGRESS.
If site.logged_in() is True, set the loginstatus and
do not re-attempt to log in.
Change-Id: Icb4c717201c661eed8d4a13eeaf2bbfb1f58aef0
---
M pywikibot/site.py
1 file changed, 20 insertions(+), 0 deletions(-)
Approvals:
John Vandenberg: Looks good to me, but someone else must approve
Xqt: Looks good to me, approved
jenkins-bot: Verified
diff --git a/pywikibot/site.py b/pywikibot/site.py
index 385ed16..cd2a41d 100644
--- a/pywikibot/site.py
+++ b/pywikibot/site.py
@@ -743,6 +743,26 @@
def login(self, sysop=False):
"""Log the user in if not already logged in."""
+ # TODO: this should include an assert that loginstatus
+ # is not already IN_PROGRESS, however the
+ # login status may be left 'IN_PROGRESS' because
+ # of exceptions or if the first method of login
+ # (below) is successful. Instead, log the problem,
+ # to be increased to 'warning' level once majority
+ # of issues are resolved.
+ if self._loginstatus == LoginStatus.IN_PROGRESS:
+ pywikibot.log(
+ u'%r.login(%r) called when a previous login was in progress.'
+ % (self, sysop)
+ )
+ # There are several ways that the site may already be
+ # logged in, and we do not need to hit the server again.
+ # logged_in() is False if _userinfo exists, which means this
+ # will have no effect for the invocation from api.py
+ if self.logged_in(sysop):
+ self._loginstatus = (LoginStatus.AS_SYSOP
+ if sysop else LoginStatus.AS_USER)
+ return
# check whether a login cookie already exists for this user
self._loginstatus = LoginStatus.IN_PROGRESS
if hasattr(self, "_userinfo"):
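The guard added above makes login() idempotent: if the session is already authenticated, record the status and skip the server round-trip. A toy model of that control flow (a hypothetical fake, not the real pywikibot.site.APISite):

```python
class LoginStatus:
    # Constants mirror pywikibot.site.LoginStatus.
    NOT_ATTEMPTED = -3
    IN_PROGRESS = -2
    NOT_LOGGED_IN = -1
    AS_USER = 0
    AS_SYSOP = 1


class FakeSite:
    """Minimal site whose login() short-circuits when already logged in."""

    def __init__(self):
        self._loginstatus = LoginStatus.NOT_LOGGED_IN
        self.server_logins = 0   # counts simulated API round-trips

    def logged_in(self, sysop=False):
        return self._loginstatus >= LoginStatus.AS_USER

    def login(self, sysop=False):
        # The guard from the change: already logged in, so just
        # record the status and return without hitting the server.
        if self.logged_in(sysop):
            self._loginstatus = (LoginStatus.AS_SYSOP
                                 if sysop else LoginStatus.AS_USER)
            return
        self._loginstatus = LoginStatus.IN_PROGRESS
        self.server_logins += 1  # stands in for the real login request
        self._loginstatus = LoginStatus.AS_USER
```

Calling login() twice performs only one "server" login; the second call takes the early-return path.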
--
To view, visit https://gerrit.wikimedia.org/r/144248
Gerrit-MessageType: merged
Gerrit-Change-Id: Icb4c717201c661eed8d4a13eeaf2bbfb1f58aef0
Gerrit-PatchSet: 1
Gerrit-Project: pywikibot/core
Gerrit-Branch: master
Gerrit-Owner: John Vandenberg <jayvdb(a)gmail.com>
Gerrit-Reviewer: John Vandenberg <jayvdb(a)gmail.com>
Gerrit-Reviewer: Ladsgroup <ladsgroup(a)gmail.com>
Gerrit-Reviewer: Merlijn van Deen <valhallasw(a)arctus.nl>
Gerrit-Reviewer: Mpaa <mpaa.wiki(a)gmail.com>
Gerrit-Reviewer: Xqt <info(a)gno.de>
Gerrit-Reviewer: jenkins-bot <>
jenkins-bot has submitted this change and it was merged.
Change subject: Remove site NotImplementedYet methods
......................................................................
Remove site NotImplementedYet methods
These methods are largely copied from compat and rely on attributes
and data files which do not exist.
They have been unneeded for years and have only seen maintenance edits
to 'fix' the structure of the code as other methods change, pepifying, etc.
If the algorithms are needed again, they can be found in compat.
Change-Id: Iad18a29b9b02bdb4164b3d810fe3dd482280765b
---
M pywikibot/site.py
1 file changed, 0 insertions(+), 45 deletions(-)
Approvals:
Xqt: Looks good to me, approved
jenkins-bot: Verified
diff --git a/pywikibot/site.py b/pywikibot/site.py
index fb51fdf..93c9eba 100644
--- a/pywikibot/site.py
+++ b/pywikibot/site.py
@@ -4123,48 +4123,3 @@
def newimages(self, *args, **kwargs):
raise NotImplementedError
-
-
-# ### METHODS NOT IMPLEMENTED YET ####
-class NotImplementedYet:
-
- # TODO: is this needed any more? can it be obtained from the http module?
- def cookies(self, sysop=False):
- """Return a string containing the user's current cookies."""
- self._loadCookies(sysop=sysop)
- index = self._userIndex(sysop)
- return self._cookies[index]
-
- def _loadCookies(self, sysop=False):
- """Retrieve session cookies for login"""
- index = self._userIndex(sysop)
- if self._cookies[index] is not None:
- return
- try:
- if sysop:
- try:
- username = config.sysopnames[self.family.name][self.code]
- except KeyError:
- raise NoUsername("""\
-You tried to perform an action that requires admin privileges, but you haven't
-entered your sysop name in your user-config.py. Please add
-sysopnames['%s']['%s']='name' to your user-config.py"""
- % (self.family.name, self.code))
- else:
- username = pywikibot.config2.usernames[self.family.name
- ][self.code]
- except KeyError:
- self._cookies[index] = None
- self._isLoggedIn[index] = False
- else:
- tmp = '%s-%s-%s-login.data' % (self.family.name, self.code,
- username)
- fn = config.datafilepath('login-data', tmp)
- if not os.path.exists(fn):
- self._cookies[index] = None
- self._isLoggedIn[index] = False
- else:
- f = open(fn)
- self._cookies[index] = '; '.join([x.strip()
- for x in f.readlines()])
- f.close()
--
To view, visit https://gerrit.wikimedia.org/r/145880
Gerrit-MessageType: merged
Gerrit-Change-Id: Iad18a29b9b02bdb4164b3d810fe3dd482280765b
Gerrit-PatchSet: 1
Gerrit-Project: pywikibot/core
Gerrit-Branch: master
Gerrit-Owner: John Vandenberg <jayvdb(a)gmail.com>
Gerrit-Reviewer: Ladsgroup <ladsgroup(a)gmail.com>
Gerrit-Reviewer: Merlijn van Deen <valhallasw(a)arctus.nl>
Gerrit-Reviewer: Xqt <info(a)gno.de>
Gerrit-Reviewer: jenkins-bot <>
jenkins-bot has submitted this change and it was merged.
Change subject: Docstring fixes
......................................................................
Docstring fixes
This change addresses all but three docstring format issues.
Not all missing docstrings have been added.
- Colons added in @param and @type.
- Replaced 'pywikibot.Claim' with 'Claim' in @type.
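The convention this change converges on can be shown with a small hypothetical example (illustrative only, not a function from site.py): each parameter gets a colon-terminated @param tag plus a matching @type tag, and class names in @type are written bare (Claim, not pywikibot.Claim).

```python
def remove_claims(claims, summary=None):
    """Remove claims from an entity.

    @param claims: Claims to be removed
    @type claims: list of Claim
    @param summary: Edit summary
    @type summary: unicode
    """
```

Keeping the tags machine-readable like this is what lets epydoc-style tooling (and docstring linters) check the fields automatically.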
Change-Id: I83a1a672de9415a9fbc3baab906cee78e85a44b9
---
M pywikibot/site.py
1 file changed, 160 insertions(+), 124 deletions(-)
Approvals:
John Vandenberg: Looks good to me, but someone else must approve
Xqt: Looks good to me, approved
jenkins-bot: Verified
diff --git a/pywikibot/site.py b/pywikibot/site.py
index fb51fdf..5d962cb 100644
--- a/pywikibot/site.py
+++ b/pywikibot/site.py
@@ -1,7 +1,9 @@
# -*- coding: utf-8 -*-
"""
-Objects representing MediaWiki sites (wikis) and families (groups of wikis
-on the same topic in different languages).
+Objects representing MediaWiki sites (wikis).
+
+This module also includes functions to load families, which are
+groups of wikis on the same topic in different languages.
"""
#
# (C) Pywikibot team, 2008-2014
@@ -64,6 +66,7 @@
>>> LoginStatus.name(0)
'AS_USER'
"""
+
NOT_ATTEMPTED = -3
IN_PROGRESS = -2
NOT_LOGGED_IN = -1
@@ -121,11 +124,11 @@
class BaseSite(object):
"""Site methods that are independent of the communication interface."""
- # to implement a specific interface, define a Site class that inherits
- # from this
def __init__(self, code, fam=None, user=None, sysop=None):
"""
+ Constructor.
+
@param code: the site's language code
@type code: str
@param fam: wiki family name (optional)
@@ -179,16 +182,13 @@
@property
def throttle(self):
"""Return this Site's throttle. Initialize a new one if needed."""
-
if not hasattr(self, "_throttle"):
self._throttle = Throttle(self, multiplydelay=True)
-
return self._throttle
@property
def family(self):
"""The Family object for this Site's wiki family."""
-
return self.__family
@property
@@ -212,7 +212,6 @@
def __cmp__(self, other):
"""Perform equality and inequality tests on Site objects."""
-
if not isinstance(other, BaseSite):
return 1
if self.family == other.family:
@@ -220,7 +219,7 @@
return cmp(self.family.name, other.family.name)
def __getstate__(self):
- """ Remove Lock based classes before pickling """
+ """ Remove Lock based classes before pickling. """
new = self.__dict__.copy()
del new['_pagemutex']
if '_throttle' in new:
@@ -228,13 +227,12 @@
return new
def __setstate__(self, attrs):
- """ Restore things removed in __getstate__ """
+ """ Restore things removed in __getstate__. """
self.__dict__.update(attrs)
self._pagemutex = threading.Lock()
def user(self):
"""Return the currently-logged in bot user, or None."""
-
if self.logged_in(True):
return self._username[True]
elif self.logged_in(False):
@@ -244,8 +242,7 @@
return self._username[sysop]
def __getattr__(self, attr):
- """Calls to methods not defined in this object are passed to Family."""
-
+        """Delegate undefined method calls to the Family object."""
if hasattr(self.__class__, attr):
return getattr(self.__class__, attr)
try:
@@ -260,7 +257,6 @@
def sitename(self):
"""Return string representing this Site's name and code."""
-
return self.family.name + ':' + self.code
__str__ = sitename
@@ -273,19 +269,16 @@
def languages(self):
"""Return list of all valid language codes for this site's Family."""
-
return list(self.family.langs.keys())
def validLanguageLinks(self):
"""Return list of language codes that can be used in interwiki links."""
-
nsnames = [name for name in self.namespaces().values()]
return [lang for lang in self.languages()
if lang[:1].upper() + lang[1:] not in nsnames]
def ns_index(self, namespace):
"""Given a namespace name, return its int index, or None if invalid."""
-
for ns in self.namespaces():
if namespace.lower() in [name.lower()
for name in self.namespaces()[ns]]:
@@ -295,7 +288,6 @@
def namespaces(self):
"""Return dict of valid namespaces on this wiki."""
-
return self._namespaces
def ns_normalize(self, value):
@@ -321,12 +313,10 @@
def pagenamecodes(self, default=True):
"""Return list of localized PAGENAME tags for the site."""
-
return [u"PAGENAME"]
def pagename2codes(self, default=True):
"""Return list of localized PAGENAMEE tags for the site."""
-
return [u"PAGENAMEE"]
def lock_page(self, page, block=True):
@@ -366,7 +356,6 @@
def disambcategory(self):
"""Return Category in which disambig pages are listed."""
-
try:
name = '%s:%s' % (self.namespace(14),
self.family.disambcatname[self.code])
@@ -377,10 +366,14 @@
@deprecated("pywikibot.Link")
def linkto(self, title, othersite=None):
- """Return unicode string in the form of a wikilink to 'title'
+ """DEPRECATED. Return a wikilink to a page.
- Use optional Site argument 'othersite' to generate an interwiki link.
+ @param title: Title of the page to link to
+ @type title: unicode
+ @param othersite: Generate an interwiki link for use on this site.
+ @type othersite: Site (optional)
+ @return: unicode
"""
return pywikibot.Link(title, self).astext(othersite)
@@ -414,12 +407,17 @@
# title1 and title2 may be unequal but still identify the same page,
# if they use different aliases for the same namespace
- def valid_namespace(text, number):
- """Return True if text is a valid alias for namespace with given
- number.
+ def valid_namespace(alias, ns):
+ """Determine if a string is a valid alias for a namespace.
+ @param alias: namespace alias
+ @type alias: unicode
+ @param ns: namespace
+ @type ns: int
+
+ @return: bool
"""
- for alias in self.namespace(number, all=True):
+ for text in self.namespace(ns, all=True):
if text.lower() == alias.lower():
return True
return False
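The rename in this hunk only clarifies the roles of the two names (the old code had alias and text swapped, but the comparison is symmetric, so behaviour is unchanged). The helper reduces to a case-insensitive membership test; a standalone sketch, with ns_aliases standing in for self.namespace(ns, all=True):

```python
def valid_namespace(alias, ns_aliases):
    """Return True if alias matches any namespace alias, ignoring case."""
    return any(text.lower() == alias.lower() for text in ns_aliases)
```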
@@ -480,12 +478,10 @@
def category_on_one_line(self):
"""Return True if this site wants all category links on one line."""
-
return self.code in self.family.category_on_one_line
def interwiki_putfirst(self):
"""Return list of language codes for ordering of interwiki links."""
-
return self.family.interwiki_putfirst.get(self.code, None)
def interwiki_putfirst_doubled(self, list_of_links):
@@ -509,7 +505,6 @@
def getSite(self, code):
"""Return Site object for language 'code' in this Family."""
-
return pywikibot.Site(code=code, fam=self.family, user=self.user())
# deprecated methods for backwards-compatibility
@@ -517,12 +512,11 @@
@deprecated("family attribute")
def fam(self):
"""Return Family object for this Site."""
-
return self.family
@deprecated("urllib.urlencode()")
def urlEncode(self, query):
- """DEPRECATED"""
+ """DEPRECATED."""
return urllib.urlencode(query)
@deprecated("pywikibot.comms.http.request")
@@ -544,31 +538,27 @@
@deprecated()
def postForm(self, address, predata, sysop=False, cookies=None):
- """DEPRECATED"""
+ """DEPRECATED."""
return self.getUrl(address, data=predata)
@deprecated()
def postData(self, address, data, contentType=None, sysop=False,
compress=True, cookies=None):
- """DEPRECATED"""
+ """DEPRECATED."""
return self.getUrl(address, data=data)
def must_be(group=None, right=None):
- """ Decorator to require a certain user status. For now, only the values
- group = 'user' and group = 'sysop' are supported. The right property
- will be ignored for now.
+ """ Decorator to require a certain user status when method is called.
- @param group: the group the logged in user should belong to
- this parameter can be overridden by
- keyword argument 'as_group'
- legal values: 'user' and 'sysop'
- @param right: the rights the logged in user should have
- not supported yet and thus ignored.
- @returns: a decorator to make sure the requirement is statisfied when
- the decorated function is called. The function can be called
- with as_group='sysop' to override the group set in the
- decorator.
+ @param group: The group the logged in user should belong to;
+ this parameter can be overridden by
+ keyword argument 'as_group'.
+ @type group: str ('user' or 'sysop')
+ @param right: The rights the logged in user should have.
+ Not supported yet and thus ignored.
+
+ @return: method decorator
"""
def decorator(fn):
def callee(self, *args, **kwargs):
@@ -618,6 +608,7 @@
Do not use directly; use pywikibot.Site function.
"""
+
# Site methods from version 1.0 (as these are implemented in this file,
# or declared deprecated/obsolete, they will be removed from this list)
#########
@@ -640,6 +631,7 @@
#
def __init__(self, code, fam=None, user=None, sysop=None):
+ """ Constructor. """
BaseSite.__init__(self, code, fam, user, sysop)
self._namespaces = {
# These are the MediaWiki built-in names, which always work.
@@ -730,12 +722,16 @@
return gen
def logged_in(self, sysop=False):
- """Return True if logged in with the user specified in user-config.py
- (or the sysop user specified if the sysop parameter is True).
+ """Verify the bot is logged into the site as the expected user.
+
+ The expected usernames are those provided as either the user or sysop
+ parameter at instantiation.
@param sysop: if True, test if user is logged in as the sysop user
instead of the normal user.
+ @type sysop: bool
+ @return: bool
"""
if not hasattr(self, "_userinfo"):
return False
@@ -757,6 +753,11 @@
DEPRECATED (use .user() method instead)
+ @param sysop: if True, test if user is logged in as the sysop user
+ instead of the normal user.
+ @type sysop: bool
+
+ @return: bool
"""
return self.logged_in(sysop) and self.user()
@@ -790,6 +791,10 @@
forceLogin = login # alias for backward-compatibility
def logout(self):
+ """ Log out of the site and load details for the logged out user.
+
+ Also logs out of the global account if linked to the user.
+ """
uirequest = api.Request(site=self, action="logout")
uirequest.submit()
self._loginstatus = LoginStatus.NOT_LOGGED_IN
@@ -832,8 +837,7 @@
userinfo = property(fget=getuserinfo, doc=getuserinfo.__doc__)
def getglobaluserinfo(self):
- """Retrieve globaluserinfo from site and store in _globaluserinfo
- attribute.
+ """Retrieve globaluserinfo from site and cache it.
self._globaluserinfo will be a dict with the following keys and values:
@@ -874,11 +878,9 @@
self.login(sysop)
return 'blockinfo' in self._userinfo
+ @deprecated('is_blocked()')
def isBlocked(self, sysop=False):
- """Deprecated synonym for is_blocked"""
- pywikibot.debug(
- u"Site method 'isBlocked' should be changed to 'is_blocked'",
- _logger)
+ """DEPRECATED."""
return self.is_blocked(sysop)
def checkBlocks(self, sysop=False):
@@ -903,7 +905,7 @@
@deprecated("Site.has_right()")
def isAllowed(self, right, sysop=False):
- """Deprecated; retained for backwards-compatibility"""
+ """DEPRECATED."""
return self.has_right(right, sysop)
def has_group(self, group, sysop=False):
@@ -924,13 +926,16 @@
return 'hasmsg' in self._userinfo
def mediawiki_messages(self, keys):
- """Return the MediaWiki message text for each 'key' in keys in a dict:
- -. dict['key'] = text message
+ """Fetch the text of a set of MediaWiki messages.
- keys='*' or ['*'] will return all messages
+ If keys is '*' or ['*'], all messages will be fetched.
+ The returned dict uses each key to store the associated message.
+ @param keys: MediaWiki messages to fetch
+ @type keys: set of str, '*' or ['*']
+
+ @return: dict
"""
-
if not all(_key in self._msgcache for _key in keys):
msg_query = api.QueryGenerator(
site=self,
@@ -959,17 +964,32 @@
return dict((_key, self._msgcache[_key]) for _key in keys)
def mediawiki_message(self, key):
- """Return the MediaWiki message text for key 'key' """
+ """Fetch the text for a MediaWiki message.
+
+ @param key: name of MediaWiki message
+ @type key: str
+
+ @return: unicode
+ """
return self.mediawiki_messages([key])[key]
def has_mediawiki_message(self, key):
- """Return True if this site defines a MediaWiki message for 'key' """
+ """Determine if the site defines a MediaWiki message.
+
+ @param key: name of MediaWiki message
+ @type key: str
+
+ @return: bool
+ """
return self.has_all_mediawiki_messages([key])
def has_all_mediawiki_messages(self, keys):
- """Return True if this site defines MediaWiki messages for all 'keys';
- False otherwise.
+ """Confirm that the site defines a set of MediaWiki messages.
+ @param keys: names of MediaWiki messages
+ @type keys: set of str
+
+ @return: bool
"""
try:
self.mediawiki_messages(keys)
@@ -979,9 +999,12 @@
@property
def months_names(self):
- """Return a zero-indexed list of (month name, abbreviation) tuples,
- ordered by month in calendar, in original site language.
+ """Obtain month names from the site messages.
+ The list is zero-indexed, ordered by month in calendar, and should
+ be in the original site language.
+
+ @return: list of tuples (month name, abbreviation)
"""
if hasattr(self, "_months_names"):
return self._months_names
@@ -1002,15 +1025,16 @@
return self._months_names
def list_to_text(self, args):
- """Join a list of strings together into a human-readable
- list. The MediaWiki message 'and' is used as separator
+ """Convert a list of strings into human-readable text.
+
+ The MediaWiki message 'and' is used as separator
between the last two arguments.
If present, other arguments are joined using a comma.
@param args: text to be expanded
@type args: iterable
- @return: unicode
+ @return: unicode
"""
if not args:
return u''
@@ -1033,7 +1057,8 @@
return msgs['comma-separator'].join(args[:-2] + [concat.join(args[-2:])])
def expand_text(self, text, title=None, includecomments=None):
- """ Parse the given text for preprocessing and rendering
+ """ Parse the given text for preprocessing and rendering.
+
e.g. expand templates and strip comments if includecomments
parameter is not True. Keeps text inside
<nowiki></nowiki> tags unchanged etc. Can be used to parse
@@ -1192,10 +1217,12 @@
def hasExtension(self, name, unknown=NotImplementedError):
""" Determine whether extension `name` is loaded.
- @param name The extension to check for
- @param unknown The value to return if the site does not list loaded
- extensions. Valid values are an exception to raise,
- True or False. Default: NotImplementedError
+ @param name: The extension to check for
+ @param unknown: The value to return if the site does not list loaded
+ extensions. Valid values are an exception to raise,
+ True or False. Default: NotImplementedError
+
+ @return: bool
"""
if not hasattr(self, '_extensions'):
self._getsiteinfo()
@@ -1213,56 +1240,50 @@
@property
def siteinfo(self):
"""Site information dict."""
-
if not hasattr(self, "_siteinfo"):
self._getsiteinfo()
return self._siteinfo
def case(self):
"""Return this site's capitalization rule."""
-
return self.siteinfo['case']
def dbName(self):
"""Return this site's internal id."""
-
return self.siteinfo['wikiid']
def language(self):
"""Return the code for the language of this Site."""
-
return self.siteinfo['lang']
lang = property(fget=language, doc=language.__doc__)
@property
def has_image_repository(self):
- """Return True if site has a shared image repository like Commons"""
+ """Return True if site has a shared image repository like Commons."""
code, fam = self.shared_image_repository()
return bool(code or fam)
@property
def has_data_repository(self):
- """Return True if site has a shared data repository like Wikidata"""
+ """Return True if site has a shared data repository like Wikidata."""
code, fam = self.shared_data_repository()
return bool(code or fam)
@property
def has_transcluded_data(self):
- """Return True if site has a shared data repository like Wikidata"""
+ """Return True if site has a shared data repository like Wikidata."""
code, fam = self.shared_data_repository(True)
return bool(code or fam)
def image_repository(self):
"""Return Site object for image repository e.g. commons."""
-
code, fam = self.shared_image_repository()
if bool(code or fam):
return pywikibot.Site(code, fam, self.username())
def data_repository(self):
"""Return Site object for data repository e.g. Wikidata."""
-
code, fam = self.shared_data_repository()
if bool(code or fam):
return pywikibot.Site(code, fam, self.username(),
@@ -1283,7 +1304,6 @@
def namespaces(self):
"""Return dict of valid namespaces on this wiki."""
-
if not hasattr(self, "_siteinfo"):
self._getsiteinfo()
return self._namespaces
@@ -1300,7 +1320,7 @@
return self.namespaces()[num][0]
def live_version(self, force=False):
- """Return the 'real' version number found on [[Special:Version]]
+ """Return the 'real' version number found on [[Special:Version]].
Return value is a tuple (int, int, str) of the major and minor
version numbers and any other text contained in the version.
@@ -1317,7 +1337,7 @@
return (0, 0, 0)
def loadpageinfo(self, page, preload=False):
- """Load page info from api and save in page attributes"""
+ """Load page info from api and store in page attributes."""
title = page.title(withSection=False)
inprop = 'protection'
if preload:
@@ -1337,8 +1357,7 @@
api.update_page(page, pageitem)
def loadcoordinfo(self, page):
- """Load [[mw:Extension:GeoData]] info"""
- # prop=coordinates&titles=Wikimedia Foundation&format=jsonfm&coprop=type|name|dim|country|region&coprimary=all
+ """Load [[mw:Extension:GeoData]] info."""
title = page.title(withSection=False)
query = self._generator(api.PropertyGenerator,
type_arg="coordinates",
@@ -1370,7 +1389,7 @@
api.update_page(page, pageitem)
def loadimageinfo(self, page, history=False):
- """Load image info from api and save in page attributes
+ """Load image info from api and save in page attributes.
@param history: if true, return the image's version history
@@ -1396,8 +1415,9 @@
def loadflowinfo(self, page):
"""
- Loads Flow-related information about a given page
- Assumes that the Flow extension is installed
+ Load Flow-related information about a given page.
+
+ FIXME: Assumes that the Flow extension is installed.
"""
title = page.title(withSection=False)
query = self._generator(api.PropertyGenerator,
@@ -1419,7 +1439,7 @@
return page._pageid > 0
def page_restrictions(self, page):
- """Return a dictionary reflecting page protections"""
+ """Return a dictionary reflecting page protections."""
if not self.page_exists(page):
raise NoPage(page)
if not hasattr(page, "_protection"):
@@ -1428,10 +1448,13 @@
def page_can_be_edited(self, page):
"""
- Returns True if and only if:
+ Determine if the page can be edited.
+
+ Return True if and only if:
- page is unprotected, and bot has an account for this site, or
- page is protected, and bot has a sysop account for this site.
+ @return: bool
"""
rest = self.page_restrictions(page)
sysop_protected = "edit" in rest and rest['edit'][0] == 'sysop'
@@ -1657,7 +1680,6 @@
withTemplateInclusion=True, onlyTemplateInclusion=False,
namespaces=None, step=None, total=None, content=False):
"""Convenience method combining pagebacklinks and page_embeddedin."""
-
if onlyTemplateInclusion:
return self.page_embeddedin(page, namespaces=namespaces,
filterRedirects=filterRedirects,
@@ -2184,7 +2206,7 @@
@deprecated("Site.allcategories()")
def categories(self, number=10, repeat=False):
- """Deprecated; retained for backwards-compatibility"""
+ """DEPRECATED."""
if repeat:
limit = None
else:
@@ -2193,7 +2215,6 @@
def isBot(self, username):
"""Return True if username is a bot user."""
-
return username in [userdata['name'] for userdata in self.botusers()]
def botusers(self, step=None, total=None):
@@ -2206,7 +2227,6 @@
present.
"""
-
if not hasattr(self, "_bots"):
self._bots = {}
@@ -2728,6 +2748,8 @@
@deprecated("Site.randompages()")
def randompage(self, redirect=False):
"""
+ DEPRECATED.
+
@param redirect: Return a random redirect page
@return: pywikibot.Page
"""
@@ -3654,11 +3676,11 @@
class DataSite(APISite):
def __getattr__(self, attr):
- """Calls to methods get_info, get_sitelinks, get_aliases, get_labels,
- get_descriptions, get_urls
+ """Provide data access methods.
+ Methods provided are get_info, get_sitelinks, get_aliases,
+ get_labels, get_descriptions, and get_urls.
"""
-
if hasattr(self.__class__, attr):
return getattr(self.__class__, attr)
if attr.startswith("get_"):
@@ -3680,7 +3702,7 @@
@deprecated("pywikibot.PropertyPage")
def _get_propertyitem(self, props, source, **params):
- """Generic method to get the data for multiple Wikibase items"""
+ """Generic method to get the data for multiple Wikibase items."""
wbdata = self.get_item(source, props=props, **params)
assert props in wbdata, \
"API wbgetentities response lacks %s key" % props
@@ -3688,7 +3710,7 @@
@deprecated("pywikibot.WikibasePage")
def get_item(self, source, **params):
- """Get the data for multiple Wikibase items"""
+ """Get the data for multiple Wikibase items."""
if isinstance(source, int) or \
isinstance(source, basestring) and source.isdigit():
ids = 'q' + str(source)
@@ -3709,13 +3731,16 @@
def loadcontent(self, identification, *props):
"""
+ Fetch the current content of a Wikibase item.
+
This is called loadcontent since
wbgetentities does not support fetching old
revisions. Eventually this will get replaced by
an actual loadrevisions.
- @param identification Parameters used to identify the page(s)
- @type identification dict
- @param props the optional properties to fetch.
+
+ @param identification: Parameters used to identify the page(s)
+ @type identification: dict
+ @param props: the optional properties to fetch.
"""
params = dict(**identification)
params['action'] = 'wbgetentities'
@@ -3754,6 +3779,8 @@
def getPropertyType(self, prop):
"""
+ Obtain the type of a property.
+
This is used specifically because we can cache
the value for a much longer time (near infinite).
"""
@@ -3830,8 +3857,12 @@
@must_be(group='user')
def changeClaimTarget(self, claim, snaktype='value', bot=True, **kwargs):
"""
- Sets the claim target to whatever claim.target is
- An optional snaktype lets you set a novalue or somevalue.
+ Set the claim target to the value of claim.target.
+
+ @param claim: The source of the claim target value
+ @type claim: Claim
+ @param snaktype: An optional snaktype. Default: 'value'
+ @type snaktype: str ('value', 'novalue' or 'somevalue')
"""
if claim.isReference or claim.isQualifier:
raise NotImplementedError
@@ -3860,12 +3891,13 @@
def editSource(self, claim, source, new=False, bot=True, **kwargs):
"""
Create/Edit a source.
- @param claim A Claim object to add the source to
- @type claim pywikibot.Claim
- @param source A Claim object to be used as a source
- @type source pywikibot.Claim
- @param new Whether to create a new one if the "source" already exists
- @type new bool
+
+ @param claim: A Claim object to add the source to
+ @type claim: Claim
+ @param source: A Claim object to be used as a source
+ @type source: Claim
+ @param new: Whether to create a new one if the "source" already exists
+ @type new: bool
"""
if claim.isReference or claim.isQualifier:
raise ValueError("The claim cannot have a source.")
@@ -3927,12 +3959,12 @@
@must_be(group='user')
def editQualifier(self, claim, qualifier, new=False, bot=True, **kwargs):
"""
- Create/Edit a qualifier
+ Create/Edit a qualifier.
@param claim: A Claim object to add the qualifier to
- @type claim: pywikibot.Claim
+ @type claim: Claim
@param qualifier: A Claim object to be used as a qualifier
- @type qualifier: pywikibot.Claim
+ @type qualifier: Claim
"""
if claim.isReference or claim.isQualifier:
raise ValueError("The claim cannot have a qualifier.")
@@ -3981,11 +4013,12 @@
@must_be(group='user')
def removeSources(self, claim, sources, bot=True, **kwargs):
"""
- Removes sources.
- @param claim A Claim object to remove the sources from
- @type claim pywikibot.Claim
- @param sources A list of Claim objects that are sources
- @type sources pywikibot.Claim
+ Remove sources.
+
+ @param claim: A Claim object to remove the sources from
+ @type claim: Claim
+ @param sources: A list of Claim objects that are sources
+ @type sources: list of Claim
"""
params = dict(action='wbremovereferences')
if bot:
@@ -4003,7 +4036,8 @@
def linkTitles(self, page1, page2, bot=True):
"""
- Link two pages together
+ Link two pages together.
+
@param page1: First page to link
@type page1: pywikibot.Page
@param page2: Second page to link
@@ -4027,7 +4061,8 @@
def mergeItems(self, fromItem, toItem, **kwargs):
"""
- Merge two items together
+ Merge two items together.
+
@param fromItem: Item to merge from
@type fromItem: pywikibot.ItemPage
@param toItem: Item to merge into
@@ -4049,7 +4084,8 @@
def createNewItemFromPage(self, page, bot=True, **kwargs):
"""
- Create a new Wikibase item for a provided page
+ Create a new Wikibase item for a provided page.
+
@param page: page to fetch links from
@type page: pywikibot.Page
@param bot: whether to mark the edit as bot
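The docstring convention applied throughout this change (one-line summary ending in a period, blank line, then epytext @param/@type/@return fields) can be illustrated with a small standalone sketch. This is a hypothetical, simplified analogue of Site.list_to_text: the real method pulls its separators from the localized MediaWiki messages 'and' and 'comma-separator', while the hard-coded separator and concat defaults here are illustrative assumptions.

```python
def list_to_text(args, separator=', ', concat=' and '):
    """Convert a list of strings into human-readable text.

    The last two items are joined with concat; any earlier
    items are joined with separator.

    @param args: text to be joined
    @type args: iterable of str
    @param separator: joiner for all but the last two items
    @type separator: str
    @param concat: joiner for the last two items
    @type concat: str

    @return: str
    """
    args = list(args)
    if not args:
        return ''
    if len(args) == 1:
        return args[0]
    return separator.join(args[:-2] + [concat.join(args[-2:])])

list_to_text(['a', 'b', 'c'])  # 'a, b and c'
```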
--
To view, visit https://gerrit.wikimedia.org/r/145762
To unsubscribe, visit https://gerrit.wikimedia.org/r/settings
Gerrit-MessageType: merged
Gerrit-Change-Id: I83a1a672de9415a9fbc3baab906cee78e85a44b9
Gerrit-PatchSet: 3
Gerrit-Project: pywikibot/core
Gerrit-Branch: master
Gerrit-Owner: John Vandenberg <jayvdb(a)gmail.com>
Gerrit-Reviewer: Hashar <hashar(a)free.fr>
Gerrit-Reviewer: John Vandenberg <jayvdb(a)gmail.com>
Gerrit-Reviewer: Ladsgroup <ladsgroup(a)gmail.com>
Gerrit-Reviewer: Merlijn van Deen <valhallasw(a)arctus.nl>
Gerrit-Reviewer: Mpaa <mpaa.wiki(a)gmail.com>
Gerrit-Reviewer: Xqt <info(a)gno.de>
Gerrit-Reviewer: jenkins-bot <>
jenkins-bot has submitted this change and it was merged.
Change subject: Remove unnecessary commented out code re eo.wp
......................................................................
Remove unnecessary commented out code re eo.wp
Since 2010, the HTML &#039; (') is ignored for all sites
with a comment that indicates mediawiki bug 24093 is the reason.
Before that, eo.wp had &#039; on its ignore list with
a comment indicating this diff was the reason:
https://eo.wikipedia.org/w/index.php?title=Liberec&diff=next&oldid=2320801
When this hex code was added to the global ignore list, the eo.wp
rule was commented out.
https://mediawiki.org/wiki/Special:Code/pywikipedia/8314
This changeset removes the commented out old rule, as it is
unlikely to be reinstated or needed. Documentation of these
rules should be on the wiki, and test cases should exist to
ensure site-specific issues raised in the past are not regressed.
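The effect of keeping codepoint 39 on the global ignore list can be sketched with a small standalone resolver. This is a hypothetical helper, not the pywikibot implementation (the real code delegates to pywikibot.html2unicode): entities on the ignore list stay literal in the wikitext, everything else is decoded.

```python
import re
from html import unescape


def resolve_entities(text, ignore=()):
    """Decode HTML character references, except codepoints in ignore."""
    def repl(match):
        entity = match.group(0)
        decoded = unescape(entity)
        if decoded == entity:
            # Not a recognized entity; leave it alone.
            return entity
        if len(decoded) == 1 and ord(decoded) in ignore:
            # Keep the raw entity, e.g. &#039; when 39 is ignored.
            return entity
        return decoded
    return re.sub(r'&#?\w+;', repl, text)


# With 39 ignored, &#039; stays literal while other entities decode:
resolve_entities("Fran&ccedil;ois&#039;s caf&eacute;", ignore=[39])
```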
Change-Id: Idec1eb9af527d6a812631dd13a9b1fcf151f03f6
---
M scripts/cosmetic_changes.py
1 file changed, 1 insertion(+), 4 deletions(-)
Approvals:
John Vandenberg: Looks good to me, but someone else must approve
Xqt: Looks good to me, approved
jenkins-bot: Verified
diff --git a/scripts/cosmetic_changes.py b/scripts/cosmetic_changes.py
index 33ae271..472535b 100755
--- a/scripts/cosmetic_changes.py
+++ b/scripts/cosmetic_changes.py
@@ -515,7 +515,7 @@
def resolveHtmlEntities(self, text):
ignore = [
38, # Ampersand (&)
- 39, # Bugzilla 24093
+ 39, # Single quotation mark (') - Bugzilla 24093
60, # Less than (<)
62, # Great than (>)
91, # Opening bracket - sometimes used intentionally inside links
@@ -526,9 +526,6 @@
8206, # left-to-right mark (&ltr;)
8207, # right-to-left mark (&rtl;)
]
- # ignore &#039; see https://eo.wikipedia.org/w/index.php?title=Liberec&diff=next&oldid=2320801
- #if self.site.lang == 'eo':
- # ignore += [39]
if self.template:
ignore += [58]
text = pywikibot.html2unicode(text, ignore=ignore)
--
To view, visit https://gerrit.wikimedia.org/r/145865
To unsubscribe, visit https://gerrit.wikimedia.org/r/settings
Gerrit-MessageType: merged
Gerrit-Change-Id: Idec1eb9af527d6a812631dd13a9b1fcf151f03f6
Gerrit-PatchSet: 2
Gerrit-Project: pywikibot/core
Gerrit-Branch: master
Gerrit-Owner: John Vandenberg <jayvdb(a)gmail.com>
Gerrit-Reviewer: John Vandenberg <jayvdb(a)gmail.com>
Gerrit-Reviewer: Ladsgroup <ladsgroup(a)gmail.com>
Gerrit-Reviewer: Merlijn van Deen <valhallasw(a)arctus.nl>
Gerrit-Reviewer: Xqt <info(a)gno.de>
Gerrit-Reviewer: jenkins-bot <>