jenkins-bot merged this change.


Approvals:
  Dvorapa: Looks good to me, approved
  jenkins-bot: Verified
component: fix docstrings to avoid warnings

Use fully specified type names to avoid many warnings
when building documentation.

Bug: T187009
Change-Id: I352c3aab964446777b5bf8142422b2efc65ec467
---
M pywikibot/__init__.py
M pywikibot/bot.py
M pywikibot/data/api.py
M pywikibot/family.py
M pywikibot/flow.py
M pywikibot/interwiki_graph.py
M pywikibot/login.py
M pywikibot/page.py
M pywikibot/site.py
M pywikibot/site_detect.py
M scripts/archivebot.py
M scripts/checkimages.py
M scripts/data_ingestion.py
M scripts/flickrripper.py
14 files changed, 112 insertions(+), 95 deletions(-)
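
The convention applied throughout this change can be sketched with a minimal, hypothetical docstring (the function and its fields are illustrative, not part of pywikibot): epytext fields such as @type, @rtype, and @raises name each type with its full module path, so the documentation builder can resolve the cross-reference instead of emitting an unknown-target warning for a bare name like `Page`.

```python
def load_page(site, title):
    """Fetch a page from a site (hypothetical example).

    @param site: the wiki to read from
    @type site: pywikibot.site.APISite
    @param title: normalized page title
    @type title: str
    @return: the requested page
    @rtype: pywikibot.page.Page
    @raises pywikibot.exceptions.NoPage: the page does not exist
    """
    return (site, title)  # placeholder body; only the docstring matters here


# A fully specified name such as 'pywikibot.page.Page' is unambiguous,
# whereas a bare 'Page' forces the doc builder to guess (and warn).
assert 'pywikibot.page.Page' in load_page.__doc__
assert 'pywikibot.exceptions.NoPage' in load_page.__doc__
```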

diff --git a/pywikibot/__init__.py b/pywikibot/__init__.py
index 6fe82cc..1f5baa7 100644
--- a/pywikibot/__init__.py
+++ b/pywikibot/__init__.py
@@ -1166,7 +1166,7 @@
Site helper method.
@param url: The site URL to get code and family
@type url: str
- @raises SiteDefinitionError: Unknown URL
+ @raises pywikibot.exceptions.SiteDefinitionError: Unknown URL
"""
if url not in _url_cache:
matched_sites = []
@@ -1202,7 +1202,7 @@
@param code: language code (override config.mylang)
@type code: str
@param fam: family name or object (override config.family)
- @type fam: str or Family
+ @type fam: str or pywikibot.family.Family
@param user: bot user name to use on this site (override config.usernames)
@type user: str
@param sysop: sysop user to use on this site (override config.sysopnames)
@@ -1216,7 +1216,7 @@
@rtype: pywikibot.site.APISite
@raises ValueError: URL and pair of code and family given
@raises ValueError: Invalid interface name
- @raises SiteDefinitionError: Unknown URL
+ @raises pywikibot.exceptions.SiteDefinitionError: Unknown URL
"""
_logger = 'wiki'

diff --git a/pywikibot/bot.py b/pywikibot/bot.py
index db38a0d..6029cc1 100644
--- a/pywikibot/bot.py
+++ b/pywikibot/bot.py
@@ -726,12 +726,12 @@

@param old_link: The old link which is searched. The label and section
are ignored.
- @type old_link: Link or Page
+ @type old_link: pywikibot.page.Link or pywikibot.page.Page
@param new_link: The new link with which it should be replaced.
Depending on the replacement mode it'll use this link's label and
section. If False it'll unlink all and the attributes beginning
with allow_replace are ignored.
- @type new_link: Link or Page or False
+ @type new_link: pywikibot.page.Link or pywikibot.page.Page or False
@param default: The default answer as the shortcut
@type default: None or str
@param automatic_quit: Add an option to quit and raise a
@@ -1181,7 +1181,8 @@
Get the current value of an option.

@param option: key defined in OptionHandler.availableOptions
- @raise Error: No valid option is given with option parameter
+ @raise pywikibot.exceptions.Error: No valid option is given with
+ option parameter
"""
try:
return self.options.get(option, self.availableOptions[option])
@@ -2028,7 +2029,7 @@
Edit entity with data provided, with user confirmation as required.

@param item: page to be edited
- @type item: ItemPage
+ @type item: pywikibot.page.ItemPage
@param data: data to be saved, or None if the diff should be created
automatically
@param ignore_save_related_errors: Ignore save related errors and
diff --git a/pywikibot/data/api.py b/pywikibot/data/api.py
index 5e6b1a6..05b15ca 100644
--- a/pywikibot/data/api.py
+++ b/pywikibot/data/api.py
@@ -873,7 +873,7 @@
If a site is given, the module and param must be given too.

@param site: The associated site
- @type site: APISite
+ @type site: pywikibot.site.APISite
@param module: The module name which is used by paraminfo. (Ignored
when site is None)
@type module: str
@@ -898,7 +898,7 @@
unless there had been invalid names and a KeyError was thrown.

@param site: The associated site
- @type site: APISite
+ @type site: pywikibot.site.APISite
@param module: The module name which is used by paraminfo.
@type module: str
@param param: The parameter name inside the module. That parameter must
@@ -3293,7 +3293,7 @@
"""Update attributes of Page object page, based on query data in pagedict.

@param page: object to be updated
- @type page: Page
+ @type page: pywikibot.page.Page
@param pagedict: the contents of a "page" element of a query response
@type pagedict: dict
@param props: the property names which resulted in pagedict. If a missing
@@ -3301,8 +3301,9 @@
property which would make the value present must be in the props
parameter.
@type props: iterable of string
- @raises InvalidTitle: Page title is invalid
- @raises UnsupportedPage: Page with namespace < 0 is not supported yet
+ @raises pywikibot.exceptions.InvalidTitle: Page title is invalid
+ @raises pywikibot.exceptions.UnsupportedPage: Page with namespace < 0
+ is not supported yet
"""
_update_pageid(page, pagedict)
_update_contentmodel(page, pagedict)
diff --git a/pywikibot/family.py b/pywikibot/family.py
index f12ef93..9eeccc6 100644
--- a/pywikibot/family.py
+++ b/pywikibot/family.py
@@ -986,7 +986,7 @@
@param fam: family name (if omitted, uses the configured default)
@type fam: str
@return: a Family instance configured for the named family.
- @raises UnknownFamily: family not known
+ @raises pywikibot.exceptions.UnknownFamily: family not known
"""
if fam is None:
fam = config.family
diff --git a/pywikibot/flow.py b/pywikibot/flow.py
index 99a8d28..12a1388 100644
--- a/pywikibot/flow.py
+++ b/pywikibot/flow.py
@@ -38,7 +38,7 @@
"""Initializer.

@param source: A Flow-enabled site or a Link or Page on such a site
- @type source: Site, Link, or Page
+ @type source: Site, pywikibot.page.Link, or pywikibot.page.Page
@param title: normalized title of the page
@type title: str

diff --git a/pywikibot/interwiki_graph.py b/pywikibot/interwiki_graph.py
index 1f6583a..ead4cf3 100644
--- a/pywikibot/interwiki_graph.py
+++ b/pywikibot/interwiki_graph.py
@@ -67,7 +67,7 @@
"""Initializer.

@param originPage: the page on the 'origin' wiki
- @type originPage: Page
+ @type originPage: pywikibot.page.Page
"""
# Remember the "origin page"
self._origin = origin
@@ -131,7 +131,7 @@
"""Initializer.

@param subject: page data to graph
- @type subject: Subject
+ @type subject: pywikibot.interwiki_graph.Subject

@raises GraphImpossible: pydot is not installed
"""
@@ -255,7 +255,7 @@
Create a filename that is unique for the page.

@param page: page used to create the new filename
- @type page: Page
+ @type page: pywikibot.page.Page
@param extension: file extension
@type extension: str
@return: filename of <family>-<lang>-<page>.<ext>
diff --git a/pywikibot/login.py b/pywikibot/login.py
index 5d0f51c..ef65e03 100644
--- a/pywikibot/login.py
+++ b/pywikibot/login.py
@@ -75,7 +75,8 @@
@param password: password to use
@type password: basestring

- @raises NoUsername: No username is configured for the requested site.
+ @raises pywikibot.exceptions.NoUsername: No username is configured
+ for the requested site.
"""
site = self.site = site or pywikibot.Site()
if not user:
@@ -104,7 +105,8 @@

@see: U{https://www.mediawiki.org/wiki/API:Users}

- @raises NoUsername: Username doesn't exist in user list.
+ @raises pywikibot.exceptions.NoUsername: Username doesn't exist in
+ user list.
"""
# convert any Special:BotPassword usernames to main account equivalent
main_username = self.username
@@ -275,7 +277,8 @@
using unified login
@type autocreate: bool

- @raises NoUsername: Username is not recognised by the site.
+ @raises pywikibot.exceptions.NoUsername: Username is not recognised by
+ the site.
"""
if not self.password:
# First check that the username exists,
@@ -371,7 +374,8 @@
@param password: consumer secret
@type password: basestring

- @raises NoUsername: No username is configured for the requested site.
+ @raises pywikibot.exceptions.NoUsername: No username is configured
+ for the requested site.
@raises OAuthImpossible: mwoauth isn't installed
"""
if isinstance(mwoauth, ImportError):
diff --git a/pywikibot/page.py b/pywikibot/page.py
index aa64aad..6e1bcd3 100644
--- a/pywikibot/page.py
+++ b/pywikibot/page.py
@@ -184,7 +184,8 @@
wikitext, URLs, or another non-normalized source.

@param source: the source of the page
- @type source: BaseLink (or subclass), Page (or subclass), or Site
+ @type source: pywikibot.page.BaseLink (or subclass),
+ pywikibot.page.Page (or subclass), or pywikibot.site.BaseSite
@param title: normalized title of the page; required if source is a
Site, ignored otherwise
@type title: str
@@ -468,12 +469,12 @@
retrieved yet, or if force is True. This can raise the following
exceptions that should be caught by the calling code:

- @exception NoPage: The page does not exist
- @exception IsRedirectPage: The page is a redirect. The argument of the
- exception is the title of the page it
- redirects to.
- @exception SectionError: The section does not exist on a page with
- a # link
+ @exception pywikibot.exceptions.NoPage: The page does not exist
+ @exception pywikibot.exceptions.IsRedirectPage: The page is a redirect.
+ The argument of the exception is the title of the page it
+ redirects to.
+ @exception pywikibot.exceptions.SectionError: The section does not
+ exist on a page with a # link

@param force: reload all page attributes, including errors.
@param get_redirect: return the redirect text, do not follow the
@@ -883,7 +884,7 @@
"""
If this is a category redirect, return the target category title.

- @rtype: Category
+ @rtype: pywikibot.page.Category
"""
if self.isCategoryRedirect():
return Category(Link(self._catredirect, self.site))
@@ -1561,7 +1562,7 @@
"""
Convenience function to get the Wikibase item of a page.

- @rtype: ItemPage
+ @rtype: pywikibot.page.ItemPage
"""
return ItemPage.fromPage(self)

@@ -1676,7 +1677,7 @@
Uses the MediaWiki extension PageImages.

@return: A FilePage object
- @rtype: FilePage
+ @rtype: pywikibot.page.FilePage
"""
if not hasattr(self, '_pageimage'):
self._pageimage = None
@@ -1705,8 +1706,8 @@
If this page was not moved, it will raise a NoPage exception.
This method also works if the source was already deleted.

- @rtype: pywikibot.Page
- @raises NoPage: this page was not moved
+ @rtype: pywikibot.page.Page
+ @raises pywikibot.exceptions.NoPage: this page was not moved
"""
try:
return self.moved_target()
@@ -1720,8 +1721,8 @@
If this page was not moved, it will raise a NoMoveTarget exception.
This method also works if the source was already deleted.

- @rtype: pywikibot.Page
- @raises NoMoveTarget: this page was not moved
+ @rtype: pywikibot.page.Page
+ @raises pywikibot.exceptions.NoMoveTarget: this page was not moved
"""
gen = iter(self.site.logevents(logtype='move', page=self, total=1))
try:
@@ -2137,9 +2138,9 @@
Remove page from oldCat and add it to newCat.

@param old_cat: category to be removed
- @type old_cat: Category
+ @type old_cat: pywikibot.page.Category
@param new_cat: category to be added, if any
- @type new_cat: Category or None
+ @type new_cat: pywikibot.page.Category or None

@param summary: string to use as an edit summary

@@ -2359,7 +2360,7 @@
@return: a list of tuples with one tuple for each template invocation
in the page, with the template Page as the first entry and a list
of parameters as the second entry.
- @rtype: list of (Page, list)
+ @rtype: list of (pywikibot.page.Page, list)
"""
# WARNING: may not return all templates used in particularly
# intricate cases such as template substitution
@@ -3037,7 +3038,7 @@
Copy text of category page to a new page. Does not move contents.

@param cat: New category title (without namespace) or Category object
- @type cat: str or Category
+ @type cat: str or pywikibot.page.Category
@param message: message to use for category creation message
If two %s are provided in message, will be replaced
by (self.title, authorsList)
@@ -3814,7 +3815,7 @@
initialisation logic.

@param site: Wikibase data site
- @type site: DataSite
+ @type site: pywikibot.site.DataSite
@param title: normalized title of the page
@type title: str
@kwarg ns: namespace
@@ -4311,7 +4312,7 @@
Add a claim to the entity.

@param claim: The claim to add
- @type claim: Claim
+ @type claim: pywikibot.page.Claim
@param bot: Whether to flag as bot (if possible)
@type bot: bool
@keyword asynchronous: if True, launch a separate thread to add claim
@@ -4504,14 +4505,16 @@
Get the ItemPage for a Page that links to it.

@param page: Page to look for corresponding data item
- @type page: pywikibot.Page
+ @type page: pywikibot.page.Page
@param lazy_load: Do not raise NoPage if either page or corresponding
ItemPage does not exist.
@type lazy_load: bool
- @rtype: ItemPage
+ @rtype: pywikibot.page.ItemPage

- @raise NoPage: There is no corresponding ItemPage for the page
- @raise WikiBaseError: The site of the page has no data repository.
+ @raise pywikibot.exceptions.NoPage: There is no corresponding
+ ItemPage for the page
+ @raise pywikibot.exceptions.WikiBaseError: The site of the page
+ has no data repository.
"""
if hasattr(page, '_item'):
return page._item
@@ -4549,11 +4552,11 @@
@type uri: basestring
@param lazy_load: Do not raise NoPage if ItemPage does not exist.
@type lazy_load: bool
- @rtype: ItemPage
+ @rtype: pywikibot.page.ItemPage

@raise TypeError: Site is not a valid DataSite.
@raise ValueError: Site does not match the base of the provided uri.
- @raise NoPage: Uri points to non-existent item.
+ @raise pywikibot.exceptions.NoPage: Uri points to non-existent item.
"""
if not isinstance(site, DataSite):
raise TypeError('{0} is not a data repository.'.format(site))
@@ -4761,7 +4764,7 @@
Merge the item into another item.

@param item: The item to merge into
- @type item: ItemPage
+ @type item: pywikibot.page.ItemPage
"""
data = self.repo.mergeItems(from_item=self, to_item=item, **kwargs)
if not data.get('success', 0):
@@ -4780,7 +4783,7 @@
You need to define an extra argument to make this work, like save=True

@param target_page: target of the redirect, this argument is required.
- @type target_page: ItemPage or string
+ @type target_page: pywikibot.page.ItemPage or string
@param force: if true, it sets the redirect target even the page
is not redirect.
@type force: bool
@@ -4985,7 +4988,7 @@
"""
Helper function to create a new claim object for this property.

- @rtype: Claim
+ @rtype: pywikibot.page.Claim
"""
# todo: raise when self.id is -1
return Claim(self.site, self.getID(), datatype=self.type,
@@ -5099,7 +5102,7 @@
"""
Create an independent copy of this object.

- @rtype: Claim
+ @rtype: pywikibot.page.Claim
"""
is_qualifier = self.isQualifier
is_reference = self.isReference
@@ -5121,7 +5124,7 @@
@param data: JSON containing claim data
@type data: dict

- @rtype: Claim
+ @rtype: pywikibot.page.Claim
"""
claim = cls(site, data['mainsnak']['property'],
datatype=data['mainsnak'].get('datatype', None))
@@ -5191,7 +5194,7 @@
differently like references, but I'm not
sure if this even requires it's own function.

- @rtype: Claim
+ @rtype: pywikibot.page.Claim
"""
claim = cls.fromJSON(site, {'mainsnak': data,
'hash': data.get('hash')})
@@ -5404,7 +5407,7 @@
"""Add the given qualifier.

@param qualifier: the qualifier to add
- @type qualifier: Claim
+ @type qualifier: pywikibot.page.Claim
"""
if qualifier.on_item is not None:
raise ValueError(
@@ -5423,7 +5426,7 @@
Remove the qualifier. Call removeQualifiers().

@param qualifier: the qualifier to remove
- @type qualifier: Claim
+ @type qualifier: pywikibot.page.Claim
"""
self.removeQualifiers([qualifier], **kwargs)

@@ -5894,10 +5897,10 @@
"""
Create a BaseLink to a Page.

- @param page: target Page
- @type page: Page
+ @param page: target pywikibot.page.Page
+ @type page: pywikibot.page.Page

- @rtype: BaseLink
+ @rtype: pywikibot.page.BaseLink
"""
title = page.title(with_ns=False,
allow_interwiki=False,
@@ -6259,11 +6262,11 @@
Create a Link to a Page.

@param page: target Page
- @type page: Page
+ @type page: pywikibot.page.Page
@param source: Link from site source
@param source: Site

- @rtype: Link
+ @rtype: pywikibot.page.Link
"""
base_link = BaseLink.fromPage(page)
link = cls.__new__(cls)
@@ -6291,7 +6294,7 @@
@param source: Link from site source
@param source: Site

- @rtype: Link
+ @rtype: pywikibot.page.Link
"""
link = cls.__new__(cls)
if source.family.interwiki_forward:
@@ -6434,7 +6437,7 @@
@param site: The Wikibase site
@type site: pywikibot.site.DataSite

- @rtype: SiteLink
+ @rtype: pywikibot.page.SiteLink
"""
sl = cls(data['title'], data['site'])
repo = site or sl.site.data_repository()
@@ -6475,8 +6478,8 @@
Get the SiteLink with the given key.

@param key: site key as Site instance or db key
- @type key: pywikibot.Site or str
- @rtype: SiteLink
+ @type key: pywikibot.site.BaseSite or str
+ @rtype: pywikibot.page.SiteLink
"""
if isinstance(key, pywikibot.site.BaseSite):
key = key.dbName()
@@ -6490,7 +6493,7 @@
@type key: pywikibot.Site or str
@param val: page name as a string or JSON containing SiteLink data
@type val: dict or str
- @rtype: SiteLink
+ @rtype: pywikibot.page.SiteLink
"""
if isinstance(val, UnicodeType):
val = SiteLink(val, key)
diff --git a/pywikibot/site.py b/pywikibot/site.py
index ef2ed92..fac1d66 100644
--- a/pywikibot/site.py
+++ b/pywikibot/site.py
@@ -663,7 +663,7 @@
Create an empty uninitialized interwiki map for the given site.

@param site: Given site for which interwiki map is to be created
- @type site: APISite
+ @type site: pywikibot.site.APISite
"""
super(_InterwikiMap, self).__init__()
self._site = site
@@ -726,7 +726,7 @@
@param code: the site's language code
@type code: str
@param fam: wiki family name (optional)
- @type fam: str or Family
+ @type fam: str or pywikibot.family.Family
@param user: bot user name (optional)
@type user: str
@param sysop: sysop account user name (optional)
@@ -948,8 +948,8 @@
"""
Return the site for a corresponding interwiki prefix.

- @raises SiteDefinitionError: if the url given in the interwiki table
- doesn't match any of the existing families.
+ @raises pywikibot.exceptions.SiteDefinitionError: if the url given in
+ the interwiki table doesn't match any of the existing families.
@raises KeyError: if the prefix is not an interwiki prefix.
"""
return self._interwikimap[prefix].site
@@ -985,8 +985,8 @@
link. So if that link also contains an interwiki link it does follow
it as long as it's a local link.

- @raises SiteDefinitionError: if the url given in the interwiki table
- doesn't match any of the existing families.
+ @raises pywikibot.exceptions.SiteDefinitionError: if the url given in
+ the interwiki table doesn't match any of the existing families.
@raises KeyError: if the prefix is not an interwiki prefix.
"""
return self._interwikimap[prefix].local
@@ -2041,7 +2041,8 @@
using unified login
@type autocreate: bool

- @raises NoUsername: Username is not recognised by the site.
+ @raises pywikibot.exceptions.NoUsername: Username is not recognised
+ by the site.
@see: U{https://www.mediawiki.org/wiki/API:Login}
"""
# TODO: this should include an assert that loginstatus
@@ -2233,7 +2234,8 @@

@param sysop: If true, log in to sysop account (if available)
@type sysop: bool
- @raises UserBlocked: The logged in user/sysop account is blocked.
+ @raises pywikibot.exceptions.UserBlocked: The logged in user/sysop
+ account is blocked.
"""
if self.is_blocked(sysop):
# User blocked
@@ -2802,7 +2804,7 @@
Return the data repository connected to this site.

@return: The data repository if one is connected or None otherwise.
- @rtype: DataSite or None
+ @rtype: pywikibot.site.DataSite or None
"""
def handle_warning(mod, warning):
return (mod == 'query' and re.match(
@@ -2845,7 +2847,8 @@
for this site object.
@rtype: pywikibot.Page or None

- @raises UnknownExtension: site has no wikibase extension
+ @raises pywikibot.exceptions.UnknownExtension: site has no wikibase
+ extension
@raises NotimplementedError: method not implemented for a wikibase site
"""
if not self.has_data_repository:
@@ -3047,8 +3050,8 @@
@type image: pywikibot.FilePage
@param total: iterate no more than this number of pages in total.
@raises TypeError: input page is not a FilePage.
- @raises SiteDefinitionError: Site could not be defined for a returned
- entry in API response.
+ @raises pywikibot.exceptions.SiteDefinitionError: Site could not be
+ defined for a returned entry in API response.
"""
if not isinstance(page, pywikibot.FilePage):
raise TypeError('Page %s must be a FilePage.' % page)
@@ -3195,11 +3198,12 @@
@return: redirect target of page
@rtype: pywikibot.Page

- @raises IsNotRedirectPage: page is not a redirect
+ @raises pywikibot.exceptions.IsNotRedirectPage: page is not a redirect
@raises RuntimeError: no redirects found
- @raises CircularRedirect: page is a circular redirect
- @raises InterwikiRedirectPage: the redirect target is
- on another site
+ @raises pywikibot.exceptions.CircularRedirect: page is a circular
+ redirect
+ @raises pywikibot.exceptions.InterwikiRedirectPage: the redirect
+ target is on another site
"""
if not self.page_isredirect(page):
raise IsNotRedirectPage(page)
@@ -4670,7 +4674,7 @@
@param user: only iterate entries that match this user name
@type user: basestring
@param page: only iterate entries affecting this page
- @type page: Page or basestring
+ @type page: pywikibot.page.Page or basestring
@param namespace: namespace(s) to retrieve logevents from
@type namespace: int or Namespace or an iterable of them
@note: due to an API limitation, if namespace param contains multiple
@@ -4938,7 +4942,8 @@
@type total: int
@param top_only: if True, iterate only edits which are the latest
revision (default: False)
- @raises Error: either user or userprefix must be non-empty
+ @raises pywikibot.exceptions.Error: either user or userprefix must be
+ non-empty
@raises KeyError: a namespace identifier was not resolved
@raises TypeError: a namespace identifier has an inappropriate
type such as NoneType or bool
@@ -5298,10 +5303,11 @@
@type undo: int
@return: True if edit succeeded, False if it failed
@rtype: bool
- @raises Error: No text to be saved
- @raises NoPage: recreate is disabled and page does not exist
- @raises CaptchaError: config.solve_captcha is False and saving
- the page requires solving a captcha
+ @raises pywikibot.exceptions.Error: No text to be saved
+ @raises pywikibot.exceptions.NoPage: recreate is disabled and page does
+ not exist
+ @raises pywikibot.exceptions.CaptchaError: config.solve_captcha is
+ False and saving the page requires solving a captcha
"""
basetimestamp = True
text_overrides = self._ep_text_overrides.intersection(kwargs.keys())
@@ -5771,7 +5777,7 @@
@see: U{https://www.mediawiki.org/wiki/API:Delete}

@param page: Page to be deleted.
- @type page: Page
+ @type page: pywikibot.page.Page
@param reason: Deletion reason.
@type reason: basestring

@@ -7872,7 +7878,7 @@
is this site by definition.

@return: this Site object
- @rtype: DataSite
+ @rtype: pywikibot.site.DataSite
"""
return self

diff --git a/pywikibot/site_detect.py b/pywikibot/site_detect.py
index c85d984..71567e6 100644
--- a/pywikibot/site_detect.py
+++ b/pywikibot/site_detect.py
@@ -45,7 +45,8 @@
"""
Initializer.

- @raises ServerError: a server error occurred while loading the site
+ @raises pywikibot.exceptions.ServerError: a server error occurred
+ while loading the site
@raises Timeout: a timeout occurred while loading the site
@raises RuntimeError: Version not found or version less than 1.14
"""
diff --git a/scripts/archivebot.py b/scripts/archivebot.py
index 86b54a3..ad075c1 100755
--- a/scripts/archivebot.py
+++ b/scripts/archivebot.py
@@ -280,7 +280,7 @@
case-insensitivity depending on the namespace.

@param tpl_page: The template page
- @type tpl_page: Page
+ @type tpl_page: pywikibot.page.Page
"""
ns = tpl_page.site.namespaces[tpl_page.namespace()]
marker = '?' if ns.id == 10 else ''
diff --git a/scripts/checkimages.py b/scripts/checkimages.py
index e63139a..8668778 100755
--- a/scripts/checkimages.py
+++ b/scripts/checkimages.py
@@ -737,7 +737,7 @@
@param listGiven: a list of tuples which hold seconds and FilePage
@type listGiven: list
@return: the most used or oldest image
- @rtype: FilePage
+ @rtype: pywikibot.page.FilePage
"""
# find the most used image
inx_found = None # index of found image
diff --git a/scripts/data_ingestion.py b/scripts/data_ingestion.py
index ed8977f..886de71 100755
--- a/scripts/data_ingestion.py
+++ b/scripts/data_ingestion.py
@@ -53,7 +53,7 @@
from the title & template
@type metadata: dict
@param site: target site
- @type site: APISite
+ @type site: pywikibot.site.APISite

"""
self.URL = URL
@@ -161,7 +161,8 @@
Use None to determine the site from the pages treated.
Defaults to 'deprecated_default_commons' to use Wikimedia Commons
for backwards compatibility reasons. Deprecated.
- @type site: APISite, 'deprecated_default_commons' or None
+ @type site: pywikibot.site.APISite, 'deprecated_default_commons' or
+ None
"""
if site == 'deprecated_default_commons':
warn("site='deprecated_default_commons' is deprecated; "
diff --git a/scripts/flickrripper.py b/scripts/flickrripper.py
index d3a2ce2..fe50dcf 100755
--- a/scripts/flickrripper.py
+++ b/scripts/flickrripper.py
@@ -130,7 +130,7 @@
@type photo: io.BytesIO
@param site: Site to search for duplicates.
Defaults to using Wikimedia Commons if not supplied.
- @type site: APISite or None
+ @type site: pywikibot.site.APISite or None
"""
if not site:
site = pywikibot.Site('commons', 'commons')

To view, visit change 552684.

Gerrit-Project: pywikibot/core
Gerrit-Branch: master
Gerrit-MessageType: merged
Gerrit-Change-Id: I352c3aab964446777b5bf8142422b2efc65ec467
Gerrit-Change-Number: 552684
Gerrit-PatchSet: 7
Gerrit-Owner: Mph8318 <mph5094@gmail.com>
Gerrit-Reviewer: D3r1ck01 <xsavitar.wiki@aol.com>
Gerrit-Reviewer: Dvorapa <dvorapa@seznam.cz>
Gerrit-Reviewer: Framawiki <framawiki@tools.wmflabs.org>
Gerrit-Reviewer: Morgan11235 <morgan11235@hotmail.com>
Gerrit-Reviewer: Siebrand <siebrand@kitano.nl>
Gerrit-Reviewer: Zoranzoki21 <zorandori4444@gmail.com>
Gerrit-Reviewer: jenkins-bot (75)
Gerrit-CC: Welcome, new contributor! <ssethi@wikimedia.org>
Gerrit-CC: Xqt <info@gno.de>