jenkins-bot has submitted this change and it was merged.
Change subject: Increase limits in QueryGenerator when data are sparse
......................................................................
Increase limits in QueryGenerator when data are sparse
In QueryGenerator's query loop, instead of only narrowing down the limit
passed to each api request.submit(), increase the limit whenever a
request.submit() returns nothing but "query-continue" data.
This reduces the number of queries needed when the data are "sparse".
See the bug for an example, with data, illustrating the background of this change.
Bug: 72209
Change-Id: I800a3399a991f5c51d23ed38b8636f09b704a9c6
---
M pywikibot/data/api.py
1 file changed, 31 insertions(+), 5 deletions(-)
Approvals:
John Vandenberg: Looks good to me, approved
XZise: Looks good to me, but someone else must approve
jenkins-bot: Verified
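The change amounts to adaptively doubling the per-request limit while a
response carries only a continuation token, and shrinking back to "remaining
items" as soon as real data arrives again. Below is a minimal standalone
sketch of that strategy, not the actual QueryGenerator code; the fetch
callback and its (items, continue_token) return shape are assumptions made
for illustration, standing in for request.submit().

    def sparse_query(fetch, wanted, query_limit):
        """Yield up to `wanted` items from a sparse, continuable backend."""
        count = 0
        new_limit = None
        previous_result_had_data = True
        continue_token = None
        while count < wanted:
            if previous_result_had_data:
                # Normal case: never ask for more than is still needed.
                new_limit = min(query_limit, wanted - count)
            else:
                # Only "query-continue" came back: grow the limit to get
                # past the sparse region faster, capped at query_limit.
                new_limit = min(new_limit * 2, query_limit)
            items, continue_token = fetch(limit=new_limit, cont=continue_token)
            previous_result_had_data = bool(items)
            for item in items:
                yield item
                count += 1
                if count >= wanted:
                    return
            if continue_token is None:
                return

    if __name__ == "__main__":
        # Toy backend: 1000 slots, only every 50th slot holds an item.
        data = [i if i % 50 == 0 else None for i in range(1000)]

        def fetch(limit, cont):
            start = cont or 0
            end = min(start + limit, len(data))
            chunk = [x for x in data[start:end] if x is not None]
            return chunk, (end if end < len(data) else None)

        print(list(sparse_query(fetch, wanted=5, query_limit=500)))

With the toy backend above this prints [0, 50, 100, 150, 200], reaching each
sparse hit in far fewer round trips than a fixed small per-request limit would.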
diff --git a/pywikibot/data/api.py b/pywikibot/data/api.py
index 5357cc5..9603381 100644
--- a/pywikibot/data/api.py
+++ b/pywikibot/data/api.py
@@ -565,10 +565,6 @@
rawdata = http.request(
self.site, uri, method="POST",
headers=headers, body=body)
-
-# import traceback
-# traceback.print_stack()
-# print rawdata
except Server504Error:
pywikibot.log(u"Caught HTTP 504 error; retrying")
self.wait()
@@ -1005,13 +1001,24 @@
Continues response as needed until limit (if any) is reached.
"""
+ previous_result_had_data = True
+ prev_limit = new_limit = None
+
count = 0
while True:
if self.query_limit is not None:
+ prev_limit = new_limit
if self.limit is None:
new_limit = self.query_limit
elif self.limit > 0:
- new_limit = min(self.query_limit, self.limit - count)
+ if previous_result_had_data:
+ # self.resultkey in data in last request.submit()
+ new_limit = min(self.query_limit, self.limit - count)
+ else:
+ # only "query-continue" returned. See Bug 72209.
+ # increase new_limit to advance faster until new
+ # useful data are found again.
+ new_limit = min(new_limit * 2, self.query_limit)
else:
new_limit = None
@@ -1025,6 +1032,20 @@
new_limit = min(new_limit, self.api_limit // 10, 250)
if new_limit is not None:
self.request[self.prefix + "limit"] = str(new_limit)
+ if prev_limit != new_limit:
+ pywikibot.debug(
+ u"%s: query_limit: %s, api_limit: %s, "
+ u"limit: %s, new_limit: %s, count: %s"
+ % (self.__class__.__name__,
+ self.query_limit, self.api_limit,
+ self.limit, new_limit, count),
+ _logger)
+ pywikibot.debug(
+ u"%s: %s: %s"
+ % (self.__class__.__name__,
+ self.prefix + "limit",
+ self.request[self.prefix + "limit"]),
+ _logger)
if not hasattr(self, "data"):
self.data = self.request.submit()
if not self.data or not isinstance(self.data, dict):
@@ -1085,12 +1106,17 @@
# note: self.limit could be -1
if self.limit and self.limit > 0 and count >= self.limit:
return
+ # self.resultkey in data in last request.submit()
+ previous_result_had_data = True
else:
# if query-continue is present, self.resultkey might not have been
# fetched yet
if "query-continue" not in self.data:
# No results.
return
+ # self.resultkey not in data in last request.submit()
+ # only "query-continue" was retrieved.
+ previous_result_had_data = False
if self.module == "random" and self.limit:
# "random" module does not return "query-continue"
# now we loop for a new random query
--
To view, visit https://gerrit.wikimedia.org/r/167438
To unsubscribe, visit https://gerrit.wikimedia.org/r/settings
Gerrit-MessageType: merged
Gerrit-Change-Id: I800a3399a991f5c51d23ed38b8636f09b704a9c6
Gerrit-PatchSet: 3
Gerrit-Project: pywikibot/core
Gerrit-Branch: master
Gerrit-Owner: Mpaa <mpaa.wiki(a)gmail.com>
Gerrit-Reviewer: John Vandenberg <jayvdb(a)gmail.com>
Gerrit-Reviewer: Ladsgroup <ladsgroup(a)gmail.com>
Gerrit-Reviewer: Merlijn van Deen <valhallasw(a)arctus.nl>
Gerrit-Reviewer: XZise <CommodoreFabianus(a)gmx.de>
Gerrit-Reviewer: jenkins-bot <>
XZise has submitted this change and it was merged.
Change subject: Docstring fixes in scripts
......................................................................
Docstring fixes in scripts
Change-Id: Ic5a25fac9592fead9a6d8b0748bf13947ef7f2c7
---
M scripts/__init__.py
M scripts/category_redirect.py
M scripts/clean_sandbox.py
M scripts/commons_link.py
M scripts/commonscat.py
M scripts/cosmetic_changes.py
M scripts/create_categories.py
M scripts/data_ingestion.py
M scripts/delete.py
M scripts/disambredir.py
M scripts/editarticle.py
M scripts/featured.py
M scripts/fixing_redirects.py
M scripts/flickrripper.py
M scripts/freebasemappingupload.py
M scripts/image.py
M scripts/imagerecat.py
M scripts/imagetransfer.py
M scripts/imageuncat.py
M scripts/isbn.py
M scripts/lonelypages.py
M scripts/misspelling.py
M scripts/movepages.py
M scripts/noreferences.py
M scripts/nowcommons.py
M scripts/redirect.py
M scripts/reflinks.py
M scripts/replace.py
M scripts/replicate_wiki.py
M scripts/revertbot.py
M scripts/script_wui.py
M scripts/selflink.py
M scripts/solve_disambiguation.py
M scripts/spamremove.py
M scripts/template.py
M scripts/templatecount.py
M scripts/touch.py
M scripts/transferbot.py
M scripts/unlink.py
M scripts/unusedfiles.py
M scripts/upload.py
M scripts/version.py
M scripts/weblinkchecker.py
M scripts/welcome.py
M tox.ini
45 files changed, 417 insertions(+), 196 deletions(-)
Approvals:
XZise: Looks good to me, approved
diff --git a/scripts/__init__.py b/scripts/__init__.py
index c47da75..6eb2664 100644
--- a/scripts/__init__.py
+++ b/scripts/__init__.py
@@ -1 +1 @@
-# THIS DIRECTORY IS TO HOLD BOT SCRIPTS FOR THE NEW FRAMEWORK
+""" THIS DIRECTORY IS TO HOLD BOT SCRIPTS FOR THE NEW FRAMEWORK. """
diff --git a/scripts/category_redirect.py b/scripts/category_redirect.py
index cb5d321..f6274af 100755
--- a/scripts/category_redirect.py
+++ b/scripts/category_redirect.py
@@ -1,6 +1,6 @@
#!/usr/bin/python
# -*- coding: utf-8 -*-
-"""This bot will move pages out of redirected categories
+"""This bot will move pages out of redirected categories.
Usage: category_redirect.py [options]
@@ -36,7 +36,10 @@
class CategoryRedirectBot(object):
+ """Page category update bot."""
+
def __init__(self):
+ """Constructor."""
self.cooldown = 7 # days
self.site = pywikibot.Site()
self.catprefix = self.site.namespace(14) + ":"
diff --git a/scripts/clean_sandbox.py b/scripts/clean_sandbox.py
index ee1666d..09503eb 100755
--- a/scripts/clean_sandbox.py
+++ b/scripts/clean_sandbox.py
@@ -1,8 +1,7 @@
#!/usr/bin/python
# -*- coding: utf-8 -*-
"""
-This bot cleans a (user) sandbox by replacing the current contents with
-predefined text.
+This bot resets a (user) sandbox with predefined text.
This script understands the following command-line arguments:
@@ -137,6 +136,9 @@
class SandboxBot(Bot):
+
+ """Sandbox reset bot."""
+
availableOptions = {
'hours': 1,
'no_repeat': True,
@@ -149,6 +151,7 @@
}
def __init__(self, **kwargs):
+ """Constructor."""
super(SandboxBot, self).__init__(**kwargs)
if self.getOption('delay') is None:
d = min(15, max(5, int(self.getOption('hours') * 60)))
@@ -177,6 +180,7 @@
sys.exit(0)
def run(self):
+ """Run bot."""
self.site.login()
while True:
wait = False
diff --git a/scripts/commons_link.py b/scripts/commons_link.py
index 690d1e1..fae8aa5 100644
--- a/scripts/commons_link.py
+++ b/scripts/commons_link.py
@@ -43,6 +43,9 @@
class CommonsLinkBot(Bot):
+
+ """Commons linking bot."""
+
def __init__(self, generator, **kwargs):
self.availableOptions.update({
'action': None,
diff --git a/scripts/commonscat.py b/scripts/commonscat.py
index 760a5cc..1d89b7c 100755
--- a/scripts/commonscat.py
+++ b/scripts/commonscat.py
@@ -2,6 +2,7 @@
# -*- coding: utf-8 -*-
"""
With this tool you can add the template {{commonscat}} to categories.
+
The tool works by following the interwiki links. If the template is present on
another langauge page, the bot will use it.
@@ -236,6 +237,8 @@
class CommonscatBot(Bot):
+ """Commons categorisation bot."""
+
def __init__(self, generator, always, summary=None):
super(CommonscatBot, self).__init__(always=always)
self.generator = generator
@@ -262,6 +265,7 @@
@classmethod
def getCommonscatTemplate(self, code=None):
"""Get the template name of a site. Expects the site code.
+
Return as tuple containing the primary template and it's alternatives
"""
@@ -271,7 +275,7 @@
return commonscatTemplates[u'_default']
def skipPage(self, page):
- """Do we want to skip this page?"""
+ """Determine if the page should be skipped."""
if page.site.code in ignoreTemplates:
templatesInThePage = page.templates()
templatesWithParams = page.templatesWithParams()
@@ -288,7 +292,10 @@
return False
def addCommonscat(self, page):
- """Take a page. Go to all the interwiki page looking for a commonscat
+ """
+ Add CommonsCat template to page.
+
+ Take a page. Go to all the interwiki page looking for a commonscat
template. When all the interwiki's links are checked and a proper
category is found add it to the page.
@@ -418,7 +425,10 @@
return u''
def getCommonscatLink(self, wikipediaPage=None):
- """Go through the page and return a tuple of (<templatename>, <target>)"""
+ """Find CommonsCat template on page.
+
+ @rtype: tuple of (<templatename>, <target>)
+ """
primaryCommonscat, commonscatAlternatives = self.getCommonscatTemplate(
wikipediaPage.site.code)
commonscatTemplate = u''
@@ -444,7 +454,8 @@
return None
def checkCommonscatLink(self, name=""):
- """ This function will return the name of a valid commons category
+ """ Return the name of a valid commons category.
+
If the page is a redirect this function tries to follow it.
If the page doesnt exists the function will return an empty string
diff --git a/scripts/cosmetic_changes.py b/scripts/cosmetic_changes.py
index 1c827fb..e02e7ca 100755
--- a/scripts/cosmetic_changes.py
+++ b/scripts/cosmetic_changes.py
@@ -1,9 +1,9 @@
#!/usr/bin/python
# -*- coding: utf-8 -*-
"""
-This module can do slight modifications to a wiki page source code such that
-the code looks cleaner. The changes are not supposed to change the look of the
-rendered wiki page.
+This module can do slight modifications to tidy a wiki page's source code.
+
+The changes are not supposed to change the look of the rendered wiki page.
The following parameters are supported:
@@ -157,6 +157,8 @@
class CosmeticChangesToolkit:
+ """Cosmetic changes toolkit."""
+
def __init__(self, site, debug=False, redirect=False, namespace=None,
pageTitle=None, ignore=CANCEL_ALL):
self.site = site
@@ -241,6 +243,7 @@
def fixSelfInterwiki(self, text):
"""
Interwiki links to the site itself are displayed like local links.
+
Remove their language code prefix.
"""
if not self.talkpage and pywikibot.calledModuleName() != 'interwiki':
@@ -251,6 +254,8 @@
def standardizePageFooter(self, text):
"""
+ Standardize page footer.
+
Makes sure that interwiki links, categories and star templates are
put to the correct position and into the right order. This combines the
old instances standardizeInterwiki and standardizeCategories
@@ -359,7 +364,7 @@
return text
def translateAndCapitalizeNamespaces(self, text):
- """Makes sure that localized namespace names are used."""
+ """Use localized namespace names."""
# arz uses english stylish codes
if self.site.sitename() == 'wikipedia:arz':
return text
@@ -404,7 +409,7 @@
return text
def translateMagicWords(self, text):
- """Makes sure that localized namespace names are used."""
+ """Use localized magic words."""
# not wanted at ru
# arz uses english stylish codes
if self.site.code not in ['arz', 'ru']:
@@ -587,10 +592,13 @@
def removeNonBreakingSpaceBeforePercent(self, text):
"""
+ Insert a non-breaking space between number and percent sign.
+
Newer MediaWiki versions automatically place a non-breaking space in
front of a percent sign, so it is no longer required to place it
manually.
+ FIXME: which version should this be run on?
"""
text = textlib.replaceExcept(text, r'(\d) %', r'\1 %',
['timeline'])
@@ -598,8 +606,8 @@
def cleanUpSectionHeaders(self, text):
"""
- For better readability of section header source code, puts a space
- between the equal signs and the title.
+ Add a space between the equal signs and the section title.
+
Example: ==Section title== becomes == Section title ==
NOTE: This space is recommended in the syntax help on the English and
@@ -614,8 +622,7 @@
def putSpacesInLists(self, text):
"""
- For better readability of bullet list and enumeration wiki source code,
- puts a space between the * or # and the text.
+ Add a space between the * or # and the text.
NOTE: This space is recommended in the syntax help on the English,
German, and French Wikipedia. It might be that it is not wanted on other
@@ -884,6 +891,9 @@
class CosmeticChangesBot(Bot):
+
+ """Cosmetic changes bot."""
+
def __init__(self, generator, **kwargs):
self.availableOptions.update({
'async': False,
diff --git a/scripts/create_categories.py b/scripts/create_categories.py
index 568219c..f6ef329 100755
--- a/scripts/create_categories.py
+++ b/scripts/create_categories.py
@@ -37,6 +37,9 @@
class CreateCategoriesBot(Bot):
+
+ """Category creator bot."""
+
def __init__(self, generator, parent, basename, **kwargs):
super(CreateCategoriesBot, self).__init__(**kwargs)
self.generator = generator
diff --git a/scripts/data_ingestion.py b/scripts/data_ingestion.py
index 70856eb..72e22f5 100755
--- a/scripts/data_ingestion.py
+++ b/scripts/data_ingestion.py
@@ -1,6 +1,6 @@
#!/usr/bin/python
# -*- coding: utf-8 -*-
-"""A generic bot to do data ingestion (batch uploading) to Commons"""
+"""A generic bot to do data ingestion (batch uploading) to Commons."""
#
# (C) Pywikibot team, 2013
#
@@ -42,6 +42,7 @@
"""
def __init__(self, URL, metadata):
+ """Constructor."""
self.URL = URL
self.metadata = metadata
self.metadata["_url"] = URL
@@ -66,7 +67,9 @@
def findDuplicateImages(self,
site=pywikibot.Site(u'commons', u'commons')):
"""
- Takes the photo, calculates the SHA1 hash and asks the MediaWiki api
+ Find duplicates of the photo.
+
+ Calculates the SHA1 hash and asks the MediaWiki api
for a list of duplicates.
TODO: Add exception handling, fix site thing
@@ -77,8 +80,12 @@
def getTitle(self, fmt):
"""
- Given a format string with %(name)s entries, returns the string
- formatted with metadata
+ Populate format string with %(name)s entries using metadata.
+
+ @param fmt: format string
+ @type fmt: unicode
+ @return: formatted string
+ @rtype: unicode
"""
return fmt % self.metadata
@@ -102,6 +109,7 @@
def CSVReader(fileobj, urlcolumn, *args, **kwargs):
+ """CSV reader."""
import csv
reader = csv.DictReader(fileobj, *args, **kwargs)
@@ -110,6 +118,9 @@
class DataIngestionBot:
+
+ """Data ingestion bot."""
+
def __init__(self, reader, titlefmt, pagefmt,
site=pywikibot.Site(u'commons', u'commons')):
self.reader = reader
diff --git a/scripts/delete.py b/scripts/delete.py
index 2b5c329..4f1e367 100644
--- a/scripts/delete.py
+++ b/scripts/delete.py
@@ -1,6 +1,7 @@
# -*- coding: utf-8 -*-
"""
This script can be used to delete and undelete pages en masse.
+
Of course, you will need an admin account on the relevant wiki.
These command line parameters can be used to specify which pages to work on:
@@ -43,6 +44,7 @@
class DeletionRobot(Bot):
+
""" This robot allows deletion of pages en masse. """
def __init__(self, generator, summary, **kwargs):
@@ -63,9 +65,10 @@
self.summary = summary
def run(self):
- """ Start the robot's action:
- Loop through everything in the page generator and delete it.
+ """
+ Run bot.
+ Loop through everything in the page generator and delete it.
"""
for page in self.generator:
self.current_page = page
diff --git a/scripts/disambredir.py b/scripts/disambredir.py
index 3413dde..9e30414 100644
--- a/scripts/disambredir.py
+++ b/scripts/disambredir.py
@@ -1,8 +1,7 @@
#!/usr/bin/python
# -*- coding: utf-8 -*-
"""
-Goes through the disambiguation pages, checks their links, and asks for
-each link that goes to a redirect page whether it should be replaced.
+User assisted updating redirect links on disambiguation pages.
Usage:
python disambredir.py [start]
diff --git a/scripts/editarticle.py b/scripts/editarticle.py
index d9ffe13..680c9b5 100755
--- a/scripts/editarticle.py
+++ b/scripts/editarticle.py
@@ -28,6 +28,9 @@
class ArticleEditor(object):
+
+ """Edit a wiki page."""
+
# join lines if line starts with this ones
# TODO: No apparent usage
# joinchars = string.letters + '[]' + string.digits
@@ -58,7 +61,7 @@
self.options.page = args[0]
def setpage(self):
- """Sets page and page title."""
+ """Set page and page title."""
site = pywikibot.Site()
pageTitle = self.options.page or pywikibot.input(u"Page to edit:")
self.page = pywikibot.Page(pywikibot.Link(pageTitle, site))
diff --git a/scripts/featured.py b/scripts/featured.py
index 7413860..270619c 100644
--- a/scripts/featured.py
+++ b/scripts/featured.py
@@ -1,6 +1,8 @@
#!/usr/bin/python
# -*- coding: utf-8 -*-
"""
+Manage featured/good article/list status template.
+
This script understands various command-line arguments:
Task commands:
@@ -205,6 +207,9 @@
class FeaturedBot(pywikibot.Bot):
+
+ """Featured article bot."""
+
# Bot configuration.
# Only the keys of the dict can be passed as init options
# The values are the default values
@@ -247,7 +252,7 @@
self.tasks = ['featured']
def itersites(self, task):
- """generator for site codes to be processed."""
+ """Generator for site codes to be processed."""
def _generator():
if task == 'good':
item_no = good_name['wikidata'][1]
@@ -500,7 +505,6 @@
remember the page in the cache dict.
"""
-
tosite = self.site
if fromsite.code not in self.cache:
self.cache[fromsite.code] = {}
@@ -532,7 +536,6 @@
def add_template(self, source, dest, task, fromsite):
"""Place or remove the Link_GA/FA template on/from a page."""
-
def compile_link(site, templates):
"""compile one link template list."""
findtemplate = '(%s)' % '|'.join(templates)
diff --git a/scripts/fixing_redirects.py b/scripts/fixing_redirects.py
index 8a7fa0f..10de73d 100644
--- a/scripts/fixing_redirects.py
+++ b/scripts/fixing_redirects.py
@@ -1,8 +1,7 @@
#!/usr/bin/python
# -*- coding: utf-8 -*-
"""
-This script has the intention to correct all redirect
-links in featured pages or only one page of each wiki.
+Correct all redirect links in featured pages or only one page of each wiki.
Can be using with:
&params;
diff --git a/scripts/flickrripper.py b/scripts/flickrripper.py
index eaff57e..0f5302d 100644
--- a/scripts/flickrripper.py
+++ b/scripts/flickrripper.py
@@ -1,7 +1,7 @@
#!/usr/bin/python
# -*- coding: utf-8 -*-
"""
-Tool to copy a flickr stream to Commons
+Tool to copy a flickr stream to Commons.
# Get a set to work on (start with just a username).
# * Make it possible to delimit the set (from/to)
@@ -81,7 +81,7 @@
def getPhoto(flickr=None, photo_id=''):
"""
- Get the photo info and the photo sizes so we can use these later on
+ Get the photo info and the photo sizes so we can use these later on.
TODO: Add exception handling
@@ -100,7 +100,7 @@
def isAllowedLicense(photoInfo=None):
"""
- Check if the image contains the right license
+ Check if the image contains the right license.
TODO: Maybe add more licenses
"""
@@ -133,7 +133,9 @@
def findDuplicateImages(photo=None,
site=pywikibot.Site(u'commons', u'commons')):
- """ Take the photo, calculate the SHA1 hash and ask the MediaWiki api
+ """ Find duplicate images.
+
+ Take the photo, calculate the SHA1 hash and ask the MediaWiki api
for a list of duplicates.
TODO: Add exception handling, fix site thing
@@ -155,7 +157,7 @@
def getFlinfoDescription(photo_id=0):
"""
- Get the description from http://wikipedia.ramselehof.de/flinfo.php
+ Get the description from http://wikipedia.ramselehof.de/flinfo.php.
TODO: Add exception handling, try a couple of times
"""
@@ -168,8 +170,9 @@
def getFilename(photoInfo=None, site=None, project=u'Flickr'):
- """ Build a good filename for the upload based on the username and the
- title. Prevents naming collisions.
+ """ Build a good filename for the upload based on the username and title.
+
+ Prevents naming collisions.
"""
if not site:
@@ -209,8 +212,9 @@
def cleanUpTitle(title):
- """ Clean up the title of a potential MediaWiki page. Otherwise the title of
- the page might not be allowed by the software.
+ """ Clean up the title of a potential MediaWiki page.
+
+ Otherwise the title of the page might not be allowed by the software.
"""
title = title.strip()
@@ -236,8 +240,9 @@
def buildDescription(flinfoDescription=u'', flickrreview=False, reviewer=u'',
override=u'', addCategory=u'', removeCategories=False):
- """ Build the final description for the image. The description is based on
- the info from flickrinfo and improved.
+ """ Build the final description for the image.
+
+ The description is based on the info from flickrinfo and improved.
"""
description = u'== {{int:filedesc}} ==\n%s' % flinfoDescription
@@ -324,6 +329,7 @@
""" The user dialog. """
def __init__(self, photoDescription, photo, filename):
+ """Constructor."""
self.root = Tk()
# "%dx%d%+d%+d" % (width, height, xoffset, yoffset)
self.root.geometry("%ix%i+10-10" % (config.tkhorsize, config.tkvertsize))
@@ -396,9 +402,10 @@
self.root.destroy()
def run(self):
- """ Activate the dialog and return the new name and if the image is
- skipped.
+ """ Activate the dialog.
+ @return: new description, name, and if the image is skipped
+ @rtype: tuple of (unicode, unicode, bool)
"""
self.root.mainloop()
return self.photoDescription, self.filename, self.skip
@@ -504,7 +511,7 @@
def usage():
"""
- Print usage information
+ Print usage information.
TODO : Need more.
"""
diff --git a/scripts/freebasemappingupload.py b/scripts/freebasemappingupload.py
index b021abd..bbacec6 100644
--- a/scripts/freebasemappingupload.py
+++ b/scripts/freebasemappingupload.py
@@ -1,7 +1,8 @@
#!/usr/bin/python
# -*- coding: utf-8 -*-
"""
-Script to upload the mappings of Freebase to Wikidata
+Script to upload the mappings of Freebase to Wikidata.
+
Can be easily adapted to upload other String identifiers as well
This bot needs the dump from
@@ -31,6 +32,8 @@
class FreebaseMapperRobot:
+ """Freebase Mapping bot."""
+
def __init__(self, filename):
self.repo = pywikibot.Site('wikidata', 'wikidata').data_repository()
self.filename = filename
diff --git a/scripts/image.py b/scripts/image.py
index 68eb89f..6053795 100644
--- a/scripts/image.py
+++ b/scripts/image.py
@@ -1,7 +1,6 @@
# -*- coding: utf-8 -*-
"""
-This script can be used to change one image to another or remove an image
-entirely.
+This script can be used to change one image to another or remove an image.
Syntax: python image.py image_name [new_image_name]
@@ -49,10 +48,7 @@
class ImageRobot(Bot):
- """
- This bot will load all pages yielded by a file links image page generator and
- replace or remove all occurences of the old image.
- """
+ """This bot will replace or remove all occurences of an old image."""
# Summary messages for replacing images
msg_replace = {
diff --git a/scripts/imagerecat.py b/scripts/imagerecat.py
index 8a8558d..f9bcb12 100644
--- a/scripts/imagerecat.py
+++ b/scripts/imagerecat.py
@@ -74,8 +74,9 @@
def categorizeImages(generator, onlyFilter, onlyUncat):
- """ Loop over all images in generator and try to categorize them. Get
- category suggestions from CommonSense.
+ """ Loop over all images in generator and try to categorize them.
+
+ Get category suggestions from CommonSense.
"""
for page in generator:
@@ -112,8 +113,9 @@
def getCommonshelperCats(imagepage):
- """ Get category suggestions from CommonSense. Parse them and return a list
- of suggestions.
+ """ Get category suggestions from CommonSense.
+
+ @rtype: list of unicode
"""
commonshelperCats = []
@@ -212,8 +214,9 @@
def getOpenStreetMap(latitude, longitude):
"""
- Get the result from https://nominatim.openstreetmap.org/reverse
- and put it in a list of tuples to play around with
+ Get the result from https://nominatim.openstreetmap.org/reverse .
+
+ @rtype: list of tuples
"""
result = []
gotInfo = False
@@ -246,7 +249,7 @@
def getCategoryByName(name, parent=u'', grandparent=u''):
-
+ """Get category by name."""
if not parent == u'':
workname = name.strip() + u',_' + parent.strip()
workcat = pywikibot.Category(pywikibot.Site(u'commons', u'commons'), workname)
@@ -336,6 +339,7 @@
def filterCountries(categories):
""" Try to filter out ...by country categories.
+
First make a list of any ...by country categories and try to find some
countries. If a by country category has a subcategoy containing one of the
countries found, add it. The ...by country categories remain in the set and
diff --git a/scripts/imagetransfer.py b/scripts/imagetransfer.py
index 587502b..10758cf 100644
--- a/scripts/imagetransfer.py
+++ b/scripts/imagetransfer.py
@@ -151,6 +151,9 @@
class ImageTransferBot:
+
+ """Image transfer bot."""
+
def __init__(self, generator, targetSite=None, interwiki=False,
keep_name=False, ignore_warning=False):
self.generator = generator
@@ -160,11 +163,10 @@
self.ignore_warning = ignore_warning
def transferImage(self, sourceImagePage):
- """Get a wikilink to an image, download it and its description,
- and upload it to another wikipedia.
- Return the filename which was used to upload the image
- This function is used by imagetransfer.py and by copy_table.py
+ """
+ Download image and its description, and upload it to another site.
+ @return: the filename which was used to upload the image
"""
sourceSite = sourceImagePage.site
url = sourceImagePage.fileUrl().encode('utf-8')
diff --git a/scripts/imageuncat.py b/scripts/imageuncat.py
index cc2aa9b..36bacd0 100755
--- a/scripts/imageuncat.py
+++ b/scripts/imageuncat.py
@@ -2,6 +2,7 @@
# -*- coding: utf-8 -*-
"""
Program to add uncat template to images without categories at commons.
+
See imagerecat.py (still working on that one) to add these images to categories.
"""
@@ -1235,12 +1236,11 @@
def uploadedYesterday(site):
- '''
+ """
Return a pagegenerator containing all the pictures uploaded yesterday.
+
Should probably copied to somewhere else
-
- '''
-
+ """
today = pywikibot.Timestamp.utcnow()
yesterday = today + timedelta(days=-1)
@@ -1249,11 +1249,12 @@
def recentChanges(site=None, delay=0, block=70):
- '''
+ """
Return a pagegenerator containing all the images edited in a certain timespan.
+
The delay is the amount of minutes to wait and the block is the timespan to return images in.
Should probably be copied to somewhere else
- '''
+ """
rcstart = site.getcurrenttime() + timedelta(minutes=-(delay + block))
rcend = site.getcurrenttime() + timedelta(minutes=-delay)
@@ -1267,13 +1268,13 @@
def isUncat(page):
- '''
- Do we want to skip this page?
+ """
+ Do we want to skip this page.
If we found a category which is not in the ignore list it means
that the page is categorized so skip the page.
If we found a template which is in the ignore list, skip the page.
- '''
+ """
pywikibot.output(u'Working on ' + page.title())
for category in page.categories():
@@ -1298,9 +1299,12 @@
def addUncat(page):
- '''
- Add the uncat template to the page
- '''
+ """
+ Add the uncat template to the page.
+
+ @param page: Page to be modified
+ @rtype: Page
+ """
newtext = page.get() + puttext
pywikibot.showDiff(page.get(), newtext)
try:
diff --git a/scripts/isbn.py b/scripts/isbn.py
index 2a22aaa..c9f14d6 100755
--- a/scripts/isbn.py
+++ b/scripts/isbn.py
@@ -2,8 +2,7 @@
# -*- coding: utf-8 -*-
"""
-This script goes over multiple pages of the home wiki, and reports invalid
-ISBN numbers.
+This script reports and fixes invalid ISBN numbers.
Additionally, it can convert all ISBN-10 codes to the ISBN-13 format, and
correct the ISBN format by placing hyphens.
@@ -1160,16 +1159,16 @@
class InvalidIsbnException(pywikibot.Error):
+
"""Invalid ISBN."""
class ISBN:
- """
- Abstract superclass
- """
+
+ """Abstract superclass."""
def format(self):
- """Puts hyphens into this ISBN number."""
+ """Put hyphens into this ISBN number."""
result = ''
rest = ''
for digit in self.digits():
@@ -1209,6 +1208,9 @@
class ISBN13(ISBN):
+
+ """ISBN 13."""
+
def __init__(self, code, checksumMissing=False):
self.code = code
if checksumMissing:
@@ -1219,7 +1221,7 @@
return ['978', '979']
def digits(self):
- """Returns a list of the digits in the ISBN code."""
+ """Return a list of the digits in the ISBN code."""
result = []
for c in self.code:
if c.isdigit():
@@ -1248,6 +1250,9 @@
class ISBN10(ISBN):
+
+ """ISBN 10."""
+
def __init__(self, code):
self.code = code
self.checkValidity()
@@ -1256,7 +1261,7 @@
return []
def digits(self):
- """Returns a list of the digits and Xs in the ISBN code."""
+ """Return a list of the digits and Xs in the ISBN code."""
result = []
for c in self.code:
if c.isdigit() or c in 'Xx':
@@ -1267,10 +1272,7 @@
return result
def checkChecksum(self):
- """
- Raises an InvalidIsbnException if the checksum shows that the
- ISBN is incorrect.
- """
+ """Raise an InvalidIsbnException if the ISBN checksum is incorrect."""
# See https://en.wikipedia.org/wiki/ISBN#Check_digit_in_ISBN_10
sum = 0
for i in range(0, 9):
@@ -1297,8 +1299,9 @@
def toISBN13(self):
"""
- Creates a 13-digit ISBN from this 10-digit ISBN by prefixing the GS1
- prefix '978' and recalculating the checksum.
+ Create a 13-digit ISBN from this 10-digit ISBN.
+
+ Adds the GS1 prefix '978' and recalculates the checksum.
The hyphenation structure is taken from the format of the original
ISBN number.
"""
@@ -1317,6 +1320,7 @@
def getIsbn(code):
+ """Return an ISBN object for the code."""
try:
i = ISBN13(code)
except InvalidIsbnException as e13:
@@ -1341,6 +1345,7 @@
def hyphenateIsbnNumbers(text):
+ """Helper function to hyphenate an ISBN."""
isbnR = re.compile(r'(?<=ISBN )(?P<code>[\d\-]+[\dXx])')
text = isbnR.sub(_hyphenateIsbnNumber, text)
return text
@@ -1359,6 +1364,7 @@
def convertIsbn10toIsbn13(text):
+ """Helper function to convert ISBN 10 to ISBN 13."""
isbnR = re.compile(r'(?<=ISBN )(?P<code>[\d\-]+[Xx]?)')
text = isbnR.sub(_isbn10toIsbn13, text)
return text
@@ -1366,6 +1372,8 @@
class IsbnBot(Bot):
+ """ISBN bot."""
+
def __init__(self, generator, **kwargs):
self.availableOptions.update({
'to13': False,
diff --git a/scripts/lonelypages.py b/scripts/lonelypages.py
index 89695bc..af36aa1 100644
--- a/scripts/lonelypages.py
+++ b/scripts/lonelypages.py
@@ -1,9 +1,7 @@
#!/usr/bin/python
# -*- coding: utf-8 -*-
"""
-This is a script written to add the template "orphan" to the pages that aren't
-linked by other pages. It can give some strange Errors sometime, I hope that
-all of them are fixed in this version.
+This is a script written to add the template "orphan" to pages.
These command line parameters can be used to specify which pages to work on:
@@ -76,6 +74,9 @@
class LonelyPagesBot(Bot):
+
+ """Orphan page tagging bot."""
+
def __init__(self, generator, **kwargs):
self.availableOptions.update({
'enablePage': None, # Check if someone set an enablePage or not
diff --git a/scripts/misspelling.py b/scripts/misspelling.py
index 2865c81..a18a7e4 100644
--- a/scripts/misspelling.py
+++ b/scripts/misspelling.py
@@ -1,8 +1,8 @@
# -*- coding: utf-8 -*-
"""
-This script works similar to solve_disambiguation.py. It is supposed to fix
-links that contain common spelling mistakes. This is only possible on wikis
-that have a template for these misspellings.
+This script fixes links that contain common spelling mistakes.
+
+This is only possible on wikis that have a template for these misspellings.
Command line options:
@@ -45,6 +45,8 @@
class MisspellingRobot(DisambiguationRobot):
+ """Spelling bot."""
+
misspellingTemplate = {
'da': None, # uses simple redirects
'de': u'Falschschreibung',
diff --git a/scripts/movepages.py b/scripts/movepages.py
index 7c2c6ac..d5fc6ec 100644
--- a/scripts/movepages.py
+++ b/scripts/movepages.py
@@ -53,6 +53,9 @@
class MovePagesBot(Bot):
+
+ """Page move bot."""
+
def __init__(self, generator, **kwargs):
self.availableOptions.update({
'prefix': None,
diff --git a/scripts/noreferences.py b/scripts/noreferences.py
index 0c5ff7c..3ca9711 100755
--- a/scripts/noreferences.py
+++ b/scripts/noreferences.py
@@ -1,8 +1,9 @@
#!/usr/bin/python
# -*- coding: utf-8 -*-
-
"""
-This script goes over multiple pages, searches for pages where <references />
+This script adds a missing references section to pages.
+
+It goes over multiple pages, searches for pages where <references />
is missing although a <ref> tag is present, and in that case adds a new
references section.
@@ -426,12 +427,15 @@
"""
Generator which will yield Pages that might lack a references tag.
+
These pages will be retrieved from a local XML dump file
(pages-articles or pages-meta-current).
"""
def __init__(self, xmlFilename):
"""
+ Constructor.
+
Arguments:
* xmlFilename - The dump's path, either absolute or relative
"""
@@ -452,7 +456,10 @@
class NoReferencesBot(Bot):
+ """References section bot."""
+
def __init__(self, generator, **kwargs):
+ """Constructor."""
self.availableOptions.update({
'verbose': True,
})
@@ -478,7 +485,7 @@
self.referencesText = u'<references />'
def lacksReferences(self, text):
- """Checks whether or not the page is lacking a references tag."""
+ """Check whether or not the page is lacking a references tag."""
oldTextCleaned = textlib.removeDisabledParts(text)
if self.referencesR.search(oldTextCleaned) or \
self.referencesTagR.search(oldTextCleaned):
@@ -503,8 +510,9 @@
def addReferences(self, oldText):
"""
- Tries to add a references tag into an existing section where it fits
- into. If there is no such section, creates a new section containing
+ Add a references tag into an existing section where it fits into.
+
+ If there is no such section, creates a new section containing
the references tag.
* Returns : The modified pagetext
diff --git a/scripts/nowcommons.py b/scripts/nowcommons.py
index dc16bd0..b3494e4 100644
--- a/scripts/nowcommons.py
+++ b/scripts/nowcommons.py
@@ -1,8 +1,9 @@
#!/usr/bin/python
# -*- coding: utf-8 -*-
"""
-Script to delete files that are also present on Wikimedia Commons on a local
-wiki. Do not run this script on Wikimedia Commons itself. It works based on
+Script to delete files that are also present on Wikimedia Commons.
+
+Do not run this script on Wikimedia Commons itself. It works based on
a given array of templates defined below.
Files are downloaded and compared. If the files match, it can be deleted on
@@ -182,6 +183,9 @@
class NowCommonsDeleteBot(Bot):
+
+ """Bot to delete migrated files."""
+
def __init__(self, **kwargs):
self.availableOptions.update({
'replace': False,
diff --git a/scripts/redirect.py b/scripts/redirect.py
index 2f3d087..254d9a2 100755
--- a/scripts/redirect.py
+++ b/scripts/redirect.py
@@ -1,9 +1,10 @@
#! /usr/bin/python
# -*- coding: utf-8 -*-
"""
-Script to resolve double redirects, and to delete broken redirects. Requires
-access to MediaWiki's maintenance pages or to a XML dump file. Delete
-function requires adminship.
+Script to resolve double redirects, and to delete broken redirects.
+
+Requires access to MediaWiki's maintenance pages or to a XML dump file.
+Delete function requires adminship.
Syntax:
@@ -76,6 +77,9 @@
class RedirectGenerator:
+
+ """Redirect generator."""
+
def __init__(self, xmlFilename=None, namespaces=[], offset=-1,
use_move_log=False, use_api=False, start=None, until=None,
number=None, step=None):
@@ -94,6 +98,8 @@
def get_redirects_from_dump(self, alsoGetPageTitles=False):
"""
+ Extract redirects from dump.
+
Load a local XML dump file, look at all pages which have the
redirect flag set, and find out where they're pointing at. Return
a dictionary where the redirect names are the keys and the redirect
@@ -180,10 +186,7 @@
yield p
def _next_redirect_group(self):
- """
- Return a generator that retrieves pageids from the API 500 at a time
- and yields them as a list
- """
+ """Generator that yields batches of 500 redirects as a list."""
apiQ = []
for page in self.get_redirect_pages_via_api():
apiQ.append(str(page._pageid))
@@ -195,7 +198,8 @@
def get_redirects_via_api(self, maxlen=8):
"""
- Return a generator that yields tuples of data about redirect Pages:
+ Return a generator that yields tuples of data about redirect Pages.
+
0 - page title of a redirect page
1 - type of redirect:
0 - broken redirect, target page title missing
@@ -331,7 +335,6 @@
def get_moved_pages_redirects(self):
"""Generate redirects to recently-moved pages."""
# this will run forever, until user interrupts it
-
if self.offset <= 0:
self.offset = 1
start = (datetime.datetime.utcnow() -
@@ -367,6 +370,9 @@
class RedirectRobot(Bot):
+
+ """Redirect bot."""
+
def __init__(self, action, generator, **kwargs):
self.availableOptions.update({
'number': None,
@@ -380,10 +386,12 @@
self._valid_template = None
def has_valid_template(self, twtitle):
- """Check whether a template from translatewiki.net does exist on real
- wiki. We assume we are always working on self.site
+ """
+ Check whether a template from translatewiki.net exists on the wiki.
- @param twtitle - a sting which is the i18n key
+ We assume we are always working on self.site
+
+ @param twtitle - a string which is the i18n key
"""
if self._valid_template is None:
diff --git a/scripts/reflinks.py b/scripts/reflinks.py
index 2404093..6fcc135 100644
--- a/scripts/reflinks.py
+++ b/scripts/reflinks.py
@@ -1,5 +1,7 @@
# -*- coding: utf-8 -*-
"""
+Fetch and add titles for bare links in references.
+
This bot will search for references which are only made of a link without title,
(i.e. <ref>[https://www.google.fr/]</ref> or <ref>https://www.google.fr/</ref>)
and will fetch the html title from the link to use it as the title of the wiki
@@ -181,7 +183,7 @@
class XmlDumpPageGenerator:
- """Xml generator that yiels pages containing bare references."""
+ """Xml generator that yields pages containing bare references."""
def __init__(self, xmlFilename, xmlStart, namespaces):
self.xmlStart = xmlStart
@@ -268,9 +270,10 @@
# TODO : remove HTML when both opening and closing tags are included
def avoid_uppercase(self):
- """ If title has more than 6 characters and has 60% of uppercase
- characters, capitalize() it
+ """
+ Convert to title()-case if title is 70% uppercase characters.
+ Skip title that has less than 6 characters.
"""
if len(self.title) <= 6:
return
@@ -289,10 +292,12 @@
class DuplicateReferences:
- """ When some references are duplicated in an article,
- name the first, and remove the content of the others
+ """Helper to de-duplicate references in text.
+ When some references are duplicated in an article,
+ name the first, and remove the content of the others
"""
+
def __init__(self):
# Match references
self.REFS = re.compile(
@@ -400,6 +405,8 @@
class ReferencesRobot(Bot):
+ """References bot."""
+
def __init__(self, generator, **kwargs):
"""- generator : Page generator."""
self.availableOptions.update({
@@ -463,7 +470,8 @@
def getPDFTitle(self, ref, f):
""" Use pdfinfo to retrieve title from a PDF.
- Unix-only, I'm afraid.
+
+ FIXME: Unix-only, I'm afraid.
"""
pywikibot.output(u'PDF file.')
diff --git a/scripts/replace.py b/scripts/replace.py
index da332cf..af28780 100755
--- a/scripts/replace.py
+++ b/scripts/replace.py
@@ -1,9 +1,10 @@
#!/usr/bin/python
# -*- coding: utf-8 -*-
-"""
-This bot will make direct text replacements. It will retrieve information on
-which pages might need changes either from an XML dump or a text file, or only
-change a single page.
+r"""
+This bot will make direct text replacements.
+
+It will retrieve information on which pages might need changes either from
+an XML dump or a text file, or only change a single page.
These command line parameters can be used to specify which pages to work on:
@@ -83,7 +84,7 @@
other: First argument is the old text, second argument is the new
text. If the -regex argument is given, the first argument
will be regarded as a regular expression, and the second
- argument might contain expressions like \\1 or \g<name>.
+ argument might contain expressions like \1 or \g<name>.
It is possible to introduce more than one pair of old text
and replacement.
@@ -93,7 +94,7 @@
new syntax, e.g. {{Stub}}, download an XML dump file (pages-articles) from
https://download.wikimedia.org, then use this command:
- python replace.py -xml -regex "{{msg:(.*?)}}" "{{\\1}}"
+ python replace.py -xml -regex "{{msg:(.*?)}}" "{{\1}}"
If you have a dump called foobar.xml and want to fix typos in articles, e.g.
Errror -> Error, use this:
@@ -161,8 +162,8 @@
* exceptions - A dictionary which defines when to ignore an
occurence. See docu of the ReplaceRobot
constructor below.
-
"""
+
def __init__(self, xmlFilename, xmlStart, replacements, exceptions):
self.xmlFilename = xmlFilename
self.replacements = replacements
@@ -232,6 +233,8 @@
acceptall=False, allowoverlap=False, recursive=False,
addedCat=None, sleep=None, summary='', site=None):
"""
+ Constructor.
+
Arguments:
* generator - A generator that yields Page objects.
* replacements - A list of 2-tuples of original text (as a
@@ -295,8 +298,9 @@
def isTextExcepted(self, original_text):
"""
- Iff one of the exceptions applies for the given page contents,
- returns True.
+ Return True iff one of the exceptions applies for the given text.
+
+ @rtype: bool
"""
if "text-contains" in self.exceptions:
for exc in self.exceptions['text-contains']:
@@ -306,8 +310,9 @@
def doReplacements(self, original_text):
"""
- Returns the text which is generated by applying all replacements to
- the given text.
+ Apply all replacements to the given text.
+
+ @rtype: unicode
"""
new_text = original_text
exceptions = []
@@ -324,7 +329,7 @@
return new_text
def run(self):
- """Starts the bot."""
+ """Start the bot."""
# Run the generator which will yield Pages which might need to be
# changed.
for page in self.generator:
@@ -426,6 +431,7 @@
def prepareRegexForMySQL(pattern):
+ """Convert regex to MySQL syntax."""
pattern = pattern.replace('\s', '[:space:]')
pattern = pattern.replace('\d', '[:digit:]')
pattern = pattern.replace('\w', '[:alnum:]')
diff --git a/scripts/replicate_wiki.py b/scripts/replicate_wiki.py
index e7976da..f417541 100644
--- a/scripts/replicate_wiki.py
+++ b/scripts/replicate_wiki.py
@@ -1,8 +1,7 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
"""
-This bot replicates all pages (from specific namespaces) in a wiki to a second
-wiki within one family.
+This bot replicates pages in a wiki to a second wiki within one family.
Example:
python replicate_wiki.py [-r] -ns 10 -f wikipedia -o nl li fy
diff --git a/scripts/revertbot.py b/scripts/revertbot.py
index 7c28df2..9a2741f 100644
--- a/scripts/revertbot.py
+++ b/scripts/revertbot.py
@@ -123,6 +123,8 @@
class myRevertBot(BaseRevertBot):
+ """Example revert bot."""
+
def callback(self, item):
if 'top' in item:
page = pywikibot.Page(self.site, item['title'])
diff --git a/scripts/script_wui.py b/scripts/script_wui.py
index d3a2c82..730312d 100755
--- a/scripts/script_wui.py
+++ b/scripts/script_wui.py
@@ -1,8 +1,9 @@
#!/usr/bin/python
# -*- coding: utf-8 -*-
"""
-Bot which runs python framework scripts as (sub-)bot and provides a
-WikiUserInterface (WUI) with Lua support for bot operators.
+Bot which runs python framework scripts as (sub-)bot.
+
+It provides a WikiUserInterface (WUI) with Lua support for bot operators.
This script needs external libraries (see imports and comments there)
in order to run properly. Most of them can be checked-out at:
@@ -116,6 +117,9 @@
class ScriptWUIBot(pywikibot.botirc.IRCBot):
+
+ """WikiUserInterface bot."""
+
def __init__(self, *arg):
pywikibot.output(u'\03{lightgreen}* Initialization of bot\03{default}')
@@ -205,6 +209,7 @@
# Define a function for the thread
def main_script(page, rev=None, params=None):
+ """Main thread."""
# http://opensourcehacker.com/2011/02/23/temporarily-capturing-python-logging…
# https://docs.python.org/release/2.6/library/logging.html
from io import StringIO
@@ -261,6 +266,7 @@
def wiki_logger(buffer, page, rev=None):
+ """Log to wiki."""
# (might be a problem here for TS and SGE, output string has another encoding)
#buffer = buffer.decode(config.console_encoding)
buffer = re.sub("\03\{(.*?)\}(.*?)\03\{default\}", "\g<2>", buffer)
diff --git a/scripts/selflink.py b/scripts/selflink.py
index fdbc920..cb21cc9 100644
--- a/scripts/selflink.py
+++ b/scripts/selflink.py
@@ -2,8 +2,7 @@
# -*- coding: utf-8 -*-
"""
-This bot goes over multiple pages of the site, searches for selflinks, and
-allows removing them.
+This bot searches for selflinks and allows removing them.
These command line parameters can be used to specify which pages to work on:
@@ -34,6 +33,8 @@
class SelflinkBot(Bot):
+ """Self-link removal bot."""
+
def __init__(self, generator, **kwargs):
super(SelflinkBot, self).__init__(**kwargs)
self.generator = generator
diff --git a/scripts/solve_disambiguation.py b/scripts/solve_disambiguation.py
index 1c70083..e79eab8 100644
--- a/scripts/solve_disambiguation.py
+++ b/scripts/solve_disambiguation.py
@@ -1,6 +1,6 @@
#!/usr/bin/python
# -*- coding: utf-8 -*-
-"""
+u"""
Script to help a human solve disambiguations by presenting a set of options.
Specify the disambiguation page on the command line.
@@ -353,6 +353,9 @@
class ReferringPageGeneratorWithIgnore:
+
+ """Referring Page generator, with an ignore manager."""
+
def __init__(self, disambPage, primary=False, minimum=0):
self.disambPage = disambPage
# if run with the -primary argument, enable the ignore manager
@@ -390,11 +393,14 @@
class PrimaryIgnoreManager(object):
"""
+ Primary ignore manager.
+
If run with the -primary argument, reads from a file which pages should
not be worked on; these are the ones where the user pressed n last time.
If run without the -primary argument, doesn't ignore any pages.
"""
+
def __init__(self, disambPage, enabled=False):
self.disambPage = disambPage
self.enabled = enabled
@@ -438,6 +444,8 @@
class DisambiguationRobot(Bot):
+ """Disambiguation bot."""
+
ignore_contents = {
'de': (u'{{[Ii]nuse}}',
u'{{[Ll]öschen}}',
@@ -478,13 +486,15 @@
self.setupRegexes()
def checkContents(self, text):
- '''
+ """
+ Check if the text matches any of the ignore regexes.
+
For a given text, returns False if none of the regular
expressions given in the dictionary at the top of this class
matches a substring of the text.
Otherwise returns the substring which is matched by one of
the regular expressions.
- '''
+ """
for ig in self.ignore_contents_regexes:
match = ig.search(text)
if match:
@@ -531,6 +541,8 @@
def treat(self, refPage, disambPage):
"""
+ Treat a page.
+
Parameters:
disambPage - The disambiguation page or redirect we don't want
anything to link to
diff --git a/scripts/spamremove.py b/scripts/spamremove.py
index f4156bf..17cb25f 100755
--- a/scripts/spamremove.py
+++ b/scripts/spamremove.py
@@ -3,6 +3,7 @@
"""
Script to remove links that are being or have been spammed.
+
Usage:
spamremove.py www.spammedsite.com
diff --git a/scripts/template.py b/scripts/template.py
index bd3fc40..b6389d8 100755
--- a/scripts/template.py
+++ b/scripts/template.py
@@ -1,8 +1,9 @@
#!/usr/bin/python
# -*- coding: utf-8 -*-
"""
-Very simple script to replace a template with another one,
-and to convert the old MediaWiki boilerplate format to the new template format.
+Very simple script to replace a template with another one.
+
+It also converts the old MediaWiki boilerplate format to the new template format.
Syntax: python template.py [-remove] [xml[:filename]] oldTemplate [newTemplate]
@@ -24,7 +25,7 @@
the same effect.
-xml retrieve information from a local dump
- (https://download.wikimedia.org). If this argument isn\'t given,
+ (https://download.wikimedia.org). If this argument isn't given,
info will be loaded from the maintenance page of the live wiki.
argument can also be given as "-xml:filename.xml".
@@ -115,8 +116,9 @@
def UserEditFilterGenerator(generator, username, timestamp=None, skip=False):
"""
- Generator which will yield Pages depending of user:username is an Author of
- that page (only looks at the last 100 editors).
+ Generator which will yield Pages modified by username.
+
+ It only looks at the last 100 editors.
If timestamp is set in MediaWiki format JJJJMMDDhhmmss, older edits are
ignored
If skip is set, pages edited by the given user are ignored otherwise only
@@ -145,13 +147,16 @@
class XmlDumpTemplatePageGenerator:
"""
- Generator which will yield Pages to pages that might contain the chosen
- template. These pages will be retrieved from a local XML dump file
- (cur table).
+ Generator which yield Pages that transclude a template.
+
+ These pages will be retrieved from a local XML dump file
+ (cur table), and may not still transclude the template.
"""
def __init__(self, templates, xmlfilename):
"""
+ Constructor.
+
Arguments:
* templateNames - A list of Page object representing the searched
templates
@@ -189,12 +194,8 @@
class TemplateRobot(Bot):
- """
- This bot will load all pages yielded by a page generator and replace or
- remove all occurences of the old template, or substitute them with the
- template's text.
+ """This bot will replace, remove or subst all occurences of a template."""
- """
def __init__(self, generator, templates, **kwargs):
"""
Constructor.
diff --git a/scripts/templatecount.py b/scripts/templatecount.py
index b26c37d..fd67c21 100644
--- a/scripts/templatecount.py
+++ b/scripts/templatecount.py
@@ -1,8 +1,9 @@
#!/usr/bin/python
# -*- coding: utf-8 -*-
"""
-This script will display the list of pages transcluding a given list of
-templates. It can also be used to simply count the number of pages (rather than
+This script will display the list of pages transcluding a given list of templates.
+
+It can also be used to simply count the number of pages (rather than
listing each individually).
Syntax: python templatecount.py command [arguments]
@@ -46,6 +47,8 @@
class TemplateCountRobot:
+ """Template count bot."""
+
@staticmethod
def countTemplates(templates, namespaces):
templateDict = TemplateCountRobot.template_dict(templates, namespaces)
diff --git a/scripts/touch.py b/scripts/touch.py
index 3d696a6..8048a57 100755
--- a/scripts/touch.py
+++ b/scripts/touch.py
@@ -2,8 +2,9 @@
# -*- coding: utf-8 -*-
"""
-This bot goes over multiple pages of a wiki, and edits them without
-changing. This is for example used to get category links in templates
+This bot goes over multiple pages of a wiki, and edits them without changes.
+
+This is for example used to get category links in templates
working.
This script understands various command-line arguments:
@@ -31,6 +32,8 @@
class TouchBot(pywikibot.Bot):
+ """Page touch bot."""
+
def __init__(self, generator, **kwargs):
self.availableOptions.update({
'redir': False, # include redirect pages
diff --git a/scripts/transferbot.py b/scripts/transferbot.py
index 594ba59..599bc91 100644
--- a/scripts/transferbot.py
+++ b/scripts/transferbot.py
@@ -2,8 +2,9 @@
# -*- coding: utf-8 -*-
"""
-This script transfers pages from a source wiki to a target wiki. It also
-copies edit history to a subpage.
+This script transfers pages from a source wiki to a target wiki.
+
+It also copies edit history to a subpage.
-tolang: The target site code.
diff --git a/scripts/unlink.py b/scripts/unlink.py
index ee0796a..a94ff9f 100755
--- a/scripts/unlink.py
+++ b/scripts/unlink.py
@@ -36,6 +36,8 @@
class UnlinkBot(Bot):
+ """Page unlinking bot."""
+
def __init__(self, pageToUnlink, **kwargs):
self.availableOptions.update({
'namespaces': [],
@@ -73,7 +75,7 @@
def handleNextLink(self, text, match, context=100):
"""
- Returns a tuple (text, jumpToBeginning).
+ Return a tuple (text, jumpToBeginning).
text is the unicode string after the current link has been processed.
jumpToBeginning is a boolean which specifies if the cursor position
diff --git a/scripts/unusedfiles.py b/scripts/unusedfiles.py
index bf17f78..611bd90 100644
--- a/scripts/unusedfiles.py
+++ b/scripts/unusedfiles.py
@@ -1,8 +1,7 @@
#!/usr/bin/python
# -*- coding: utf-8 -*-
"""
-This bot appends some text to all unused images and other text to the
-respective uploaders.
+This bot appends some text to all unused images and notifies uploaders.
Parameters:
@@ -51,6 +50,8 @@
class UnusedFilesBot(Bot):
+ """Unused files bot."""
+
def __init__(self, site, **kwargs):
super(UnusedFilesBot, self).__init__(**kwargs)
self.site = site
diff --git a/scripts/upload.py b/scripts/upload.py
index 22d1f12..69127b0 100755
--- a/scripts/upload.py
+++ b/scripts/upload.py
@@ -59,11 +59,16 @@
class UploadRobot:
+
+ """Upload bot."""
+
def __init__(self, url, urlEncoding=None, description=u'',
useFilename=None, keepFilename=False,
verifyDescription=True, ignoreWarning=False,
targetSite=None, uploadByUrl=False, aborts=[], chunk_size=0):
"""
+ Constructor.
+
@param ignoreWarning: Set this to True if you want to upload even if
another file would be overwritten or another mistake would be
risked.
@@ -294,7 +299,7 @@
return filename # data['filename']
def run(self):
-
+ """Run bot."""
# early check that upload is enabled
if self.targetSite.is_uploaddisabled():
pywikibot.error(
diff --git a/scripts/version.py b/scripts/version.py
index 3dc2b2d..d211884 100755
--- a/scripts/version.py
+++ b/scripts/version.py
@@ -1,6 +1,6 @@
#!/usr/bin/python
# -*- coding: utf-8 -*-
-""" Script to determine the Pywikibot version (tag, revision and date) """
+""" Script to determine the Pywikibot version (tag, revision and date). """
#
# (C) Merlijn 'valhallasw' van Deen, 2007-2008
# (C) xqt, 2010-2014
diff --git a/scripts/weblinkchecker.py b/scripts/weblinkchecker.py
index 062f883..5c2a0e9 100644
--- a/scripts/weblinkchecker.py
+++ b/scripts/weblinkchecker.py
@@ -1,7 +1,8 @@
# -*- coding: utf-8 -*-
"""
-This bot is used for checking external links found at the wiki. It checks
-several pages at once, with a limit set by the config variable
+This bot is used for checking external links found at the wiki.
+
+It checks several pages at once, with a limit set by the config variable
max_external_links, which defaults to 50.
The bot won't change any wiki pages, it will only report dead links such that
@@ -149,6 +150,11 @@
def weblinksIn(text, withoutBracketed=False, onlyBracketed=False):
+ """
+ Yield web links from text.
+
+ TODO: move to textlib
+ """
text = textlib.removeDisabledParts(text)
# MediaWiki parses templates before parsing external links. Thus, there
@@ -229,6 +235,8 @@
class LinkChecker(object):
"""
+ Check links.
+
Given a HTTP URL, tries to load the page from the Internet and checks if it
is still online.
@@ -239,9 +247,12 @@
correctly! (This will give a Socket Error)
"""
+
def __init__(self, url, redirectChain=[], serverEncoding=None,
HTTPignore=[]):
"""
+ Constructor.
+
redirectChain is a list of redirects which were resolved by
resolveRedirect(). This is needed to detect redirect loops.
"""
@@ -322,12 +333,12 @@
def resolveRedirect(self, useHEAD=False):
"""
- Requests the header from the server. If the page is an HTTP redirect,
- returns the redirect target URL as a string. Otherwise returns None.
+ Return the redirect target URL as a string, if it is a HTTP redirect.
If useHEAD is true, uses the HTTP HEAD method, which saves bandwidth
by not downloading the body. Otherwise, the HTTP GET method is used.
+ @rtype: unicode or None
"""
conn = self.getConnection()
try:
@@ -387,8 +398,9 @@
def check(self, useHEAD=False):
"""
- Returns True and the server status message if the page is alive.
- Otherwise returns false
+ Return True and the server status message if the page is alive.
+
+ @rtype: tuple of (bool, unicode)
"""
try:
wasRedirected = self.resolveRedirect(useHEAD=useHEAD)
@@ -480,10 +492,11 @@
class LinkCheckThread(threading.Thread):
- """ A thread responsible for checking one URL. After checking the page, it
- will die.
+ """ A thread responsible for checking one URL.
+ After checking the page, it will die.
"""
+
def __init__(self, page, url, history, HTTPignore, day):
threading.Thread.__init__(self)
self.page = page
@@ -515,7 +528,10 @@
class History:
- """ Store previously found dead links. The URLs are dictionary keys, and
+ """
+ Store previously found dead links.
+
+ The URLs are dictionary keys, and
values are lists of tuples where each tuple represents one time the URL was
found dead. Tuples have the form (title, date, error) where title is the
wiki page where the URL was found, date is an instance of time, and error is
@@ -551,7 +567,7 @@
self.historyDict = {}
def log(self, url, error, containingPage, archiveURL):
- """Logs an error report to a text file in the deadlinks subdirectory."""
+ """Log an error report to a text file in the deadlinks subdirectory."""
if archiveURL:
errorReport = u'* %s ([%s archive])\n' % (url, archiveURL)
else:
@@ -579,7 +595,7 @@
archiveURL)
def setLinkDead(self, url, error, page, day):
- """Adds the fact that the link was found dead to the .dat file."""
+ """Add the fact that the link was found dead to the .dat file."""
self.semaphore.acquire()
now = time.time()
if url in self.historyDict:
@@ -604,8 +620,11 @@
def setLinkAlive(self, url):
"""
- If the link was previously found dead, removes it from the .dat file
- and returns True, else returns False.
+ Record that the link is now alive.
+
+ If link was previously found dead, remove it from the .dat file.
+
+ @return: True if previously found dead, else returns False.
"""
if url in self.historyDict:
self.semaphore.acquire()
@@ -628,8 +647,9 @@
class DeadLinkReportThread(threading.Thread):
"""
- A Thread that is responsible for posting error reports on talk pages. There
- will only be one DeadLinkReportThread, and it is using a semaphore to make
+ A Thread that is responsible for posting error reports on talk pages.
+
+ There is only one DeadLinkReportThread, and it is using a semaphore to make
sure that two LinkCheckerThreads can not access the queue at the same time.
"""
@@ -641,10 +661,7 @@
self.killed = False
def report(self, url, errorReport, containingPage, archiveURL):
- """ Tries to add an error report to the talk page belonging to the page
- containing the dead link.
-
- """
+ """Report error on talk page of the page containing the dead link."""
self.semaphore.acquire()
self.queue.append((url, errorReport, containingPage, archiveURL))
self.semaphore.release()
@@ -727,10 +744,11 @@
class WeblinkCheckerRobot:
"""
- Bot which will use several LinkCheckThreads at once to search for dead
- weblinks on pages provided by the given generator.
+ Bot which will search for dead weblinks.
+ It uses several LinkCheckThreads at once to process pages from generator.
"""
+
def __init__(self, generator, HTTPignore=None, day=7):
self.generator = generator
if config.report_dead_links_on_talk:
@@ -777,6 +795,7 @@
def RepeatPageGenerator():
+ """Generator for pages in History."""
history = History(None)
pageTitles = set()
for value in history.historyDict.values():
@@ -788,6 +807,12 @@
def countLinkCheckThreads():
+ """
+ Count LinkCheckThread threads.
+
+ @return: number of LinkCheckThread threads
+ @rtype: int
+ """
i = 0
for thread in threading.enumerate():
if isinstance(thread, LinkCheckThread):
diff --git a/scripts/welcome.py b/scripts/welcome.py
index e0805ce..63a9764 100644
--- a/scripts/welcome.py
+++ b/scripts/welcome.py
@@ -1,11 +1,10 @@
# -*- coding: utf-8 -*-
-"""
-Script to welcome new users. This script works out of the box for Wikis that
+u"""
+Script to welcome new users.
+
+This script works out of the box for Wikis that
have been defined in the script. It is currently used on the Dutch, Norwegian,
Albanian, Italian Wikipedia, Wikimedia Commons and English Wikiquote.
-
-Note: You can download the latest version available
-from here: https://www.mediawiki.org/wiki/Manual:Pywikibot/welcome.py
Ensure you have community support before running this bot!
@@ -402,14 +401,13 @@
class FilenameNotSet(pywikibot.Error):
+
"""An exception indicating that a signature filename was not specifed."""
class Global(object):
- """Container class for global settings.
- Use of globals outside of this is to be avoided.
- """
+ """Container class for global settings."""
attachEditCount = 1 # number of edits that an user required to be welcomed
dumpToLog = 15 # number of users that are required to add the log :)
@@ -437,7 +435,6 @@
def __init__(self):
"""Constructor."""
-
self.site = pywikibot.Site()
self.check_managed_sites()
self.bname = dict()
@@ -856,6 +853,7 @@
def showStatus(n=0):
+ """Output colorized status."""
staColor = {
0: 'lightpurple',
1: 'lightaqua',
@@ -877,7 +875,7 @@
def load_word_function(raw):
- """ This is a function used to load the badword and the whitelist."""
+ """Load the badword list and the whitelist."""
page = re.compile(r"(?:\"|\')(.*?)(?:\"|\')(?:, |\))", re.UNICODE)
list_loaded = page.findall(raw)
if len(list_loaded) == 0:
diff --git a/tox.ini b/tox.ini
index 359851b..e873031 100644
--- a/tox.ini
+++ b/tox.ini
@@ -53,13 +53,58 @@
./pywikibot/data/api.py \
./pywikibot/userinterfaces/transliteration.py \
./pywikibot/userinterfaces/terminal_interface.py \
+ ./scripts/__init__.py \
+ ./scripts/basic.py \
./scripts/category.py \
+ ./scripts/category_redirect.py \
./scripts/claimit.py \
+ ./scripts/clean_sandbox.py \
+ ./scripts/commons_link.py \
+ ./scripts/commonscat.py \
./scripts/coordinate_import.py \
+ ./scripts/cosmetic_changes.py \
+ ./scripts/create_categories.py \
+ ./scripts/data_ingestion.py \
+ ./scripts/delete.py \
+ ./scripts/editarticle.py \
+ ./scripts/flickrripper.py \
+ ./scripts/freebasemappingupload.py \
./scripts/harvest_template.py \
./scripts/illustrate_wikidata.py \
+ ./scripts/image.py \
+ ./scripts/imagerecat.py \
+ ./scripts/imagetransfer.py \
+ ./scripts/imageuncat.py \
+ ./scripts/isbn.py \
+ ./scripts/listpages.py \
+ ./scripts/login.py \
+ ./scripts/lonelypages.py \
./scripts/newitem.py \
+ ./scripts/misspelling.py \
+ ./scripts/movepages.py \
+ ./scripts/noreferences.py \
+ ./scripts/nowcommons.py \
./scripts/pagefromfile.py \
+ ./scripts/protect.py \
+ ./scripts/redirect.py \
+ ./scripts/reflinks.py \
+ ./scripts/replace.py \
+ ./scripts/replicate_wiki.py \
+ ./scripts/revertbot.py \
+ ./scripts/script_wui.py \
+ ./scripts/selflink.py \
+ ./scripts/shell.py \
+ ./scripts/spamremove.py \
+ ./scripts/template.py \
+ ./scripts/templatecount.py \
+ ./scripts/touch.py \
+ ./scripts/transferbot.py \
+ ./scripts/unlink.py \
+ ./scripts/unusedfiles.py \
+ ./scripts/version.py \
+ ./scripts/watchlist.py \
+ ./scripts/weblinkchecker.py \
+ ./scripts/welcome.py \
./tests/aspects.py \
./tests/deprecation_tests.py \
./tests/api_tests.py \
--
To view, visit https://gerrit.wikimedia.org/r/165453
To unsubscribe, visit https://gerrit.wikimedia.org/r/settings
Gerrit-MessageType: merged
Gerrit-Change-Id: Ic5a25fac9592fead9a6d8b0748bf13947ef7f2c7
Gerrit-PatchSet: 4
Gerrit-Project: pywikibot/core
Gerrit-Branch: master
Gerrit-Owner: John Vandenberg <jayvdb(a)gmail.com>
Gerrit-Reviewer: John Vandenberg <jayvdb(a)gmail.com>
Gerrit-Reviewer: Ladsgroup <ladsgroup(a)gmail.com>
Gerrit-Reviewer: Merlijn van Deen <valhallasw(a)arctus.nl>
Gerrit-Reviewer: Mpaa <mpaa.wiki(a)gmail.com>
Gerrit-Reviewer: XZise <CommodoreFabianus(a)gmx.de>
Gerrit-Reviewer: jenkins-bot <>
jenkins-bot has submitted this change and it was merged.
Change subject: Decode __file__ using filesystem encoding
......................................................................
Decode __file__ using filesystem encoding
If the Python packages loaded are in a non-ASCII-named directory,
pywikibot on Python 2.x fails with a UnicodeDecodeError.
Decode the __file__ values with sys.getfilesystemencoding().
Bug: 69476
Change-Id: I74c8d08b48b4f03ef38ccdbfce8ff3e5886289d5
---
M pywikibot/version.py
1 file changed, 3 insertions(+), 0 deletions(-)
Approvals:
Mpaa: Looks good to me, approved
jenkins-bot: Verified
diff --git a/pywikibot/version.py b/pywikibot/version.py
index 7e232a1..e82f0c5 100644
--- a/pywikibot/version.py
+++ b/pywikibot/version.py
@@ -346,6 +346,9 @@
if '__init__.py' in path:
path = path[0:path.index('__init__.py')]
+ if sys.version_info[0] == 2:
+ path = path.decode(sys.getfilesystemencoding())
+
info['path'] = path
assert(path not in paths)
paths[path] = name
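A minimal sketch (not part of the patch) of the decoding this change adds; the helper name is invented for illustration:

import os
import sys


def decode_path(path):
    """Return path as text, decoding byte strings with the filesystem encoding."""
    if isinstance(path, bytes):
        # On Python 2, module __file__ values are byte strings; decoding them
        # avoids UnicodeDecodeError for packages in non-ASCII directories.
        return path.decode(sys.getfilesystemencoding())
    return path


print(decode_path(os.__file__))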
--
To view, visit https://gerrit.wikimedia.org/r/167011
To unsubscribe, visit https://gerrit.wikimedia.org/r/settings
Gerrit-MessageType: merged
Gerrit-Change-Id: I74c8d08b48b4f03ef38ccdbfce8ff3e5886289d5
Gerrit-PatchSet: 2
Gerrit-Project: pywikibot/core
Gerrit-Branch: master
Gerrit-Owner: John Vandenberg <jayvdb(a)gmail.com>
Gerrit-Reviewer: Ladsgroup <ladsgroup(a)gmail.com>
Gerrit-Reviewer: Merlijn van Deen <valhallasw(a)arctus.nl>
Gerrit-Reviewer: Mpaa <mpaa.wiki(a)gmail.com>
Gerrit-Reviewer: jenkins-bot <>
XZise has submitted this change and it was merged.
Change subject: Use sensible default for config.mylang
......................................................................
Use sensible default for config.mylang
If the user has not altered user-config.mylang, it is set to
'language' and pywikibot.Site() raises a confusing exception:
UnknownSite: Language language does not exist in family wikipedia
Detect this situation, report a warning, and set the default to
family 'test' and mylang 'test', which is test.wikipedia.org.
Change-Id: I3b890039381c7f8d06dfe6c0d83140ec5f2f3ed6
---
M pywikibot/config2.py
1 file changed, 7 insertions(+), 0 deletions(-)
Approvals:
XZise: Looks good to me, approved
diff --git a/pywikibot/config2.py b/pywikibot/config2.py
index 4e01416..55453dc 100644
--- a/pywikibot/config2.py
+++ b/pywikibot/config2.py
@@ -50,6 +50,8 @@
family = 'wikipedia'
# The language code of the site we're working on.
mylang = 'language'
+# If family and mylang are not modified from the above, the default is changed
+# to test:test, which is test.wikipedia.org, at the end of this module.
# The dictionary usernames should contain a username for each site where you
# have a bot account. Please set your usernames by adding such lines to your
@@ -831,6 +833,11 @@
elif transliteration_target in ('None', 'none'):
transliteration_target = None
+# Fix up default site
+if family == 'wikipedia' and mylang == 'language':
+ print("WARNING: family and mylang are not set.\n"
+ "Defaulting to family='test' and mylang='test'.")
+ family = mylang = 'test'
#
# When called as main program, list all configuration variables
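For anyone who runs into the new warning, the usual remedy is to set both values explicitly in user-config.py; a minimal sketch (the site and username below are examples only):

# user-config.py -- minimal example; choose the family/language you work on.
family = 'wikipedia'
mylang = 'en'

# Optional: a bot username for that site, as described in config2.py.
# pywikibot provides the usernames dictionary when it executes this file.
usernames['wikipedia']['en'] = u'ExampleBot'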
--
To view, visit https://gerrit.wikimedia.org/r/167571
To unsubscribe, visit https://gerrit.wikimedia.org/r/settings
Gerrit-MessageType: merged
Gerrit-Change-Id: I3b890039381c7f8d06dfe6c0d83140ec5f2f3ed6
Gerrit-PatchSet: 1
Gerrit-Project: pywikibot/core
Gerrit-Branch: master
Gerrit-Owner: John Vandenberg <jayvdb(a)gmail.com>
Gerrit-Reviewer: Ladsgroup <ladsgroup(a)gmail.com>
Gerrit-Reviewer: Merlijn van Deen <valhallasw(a)arctus.nl>
Gerrit-Reviewer: XZise <CommodoreFabianus(a)gmx.de>
Gerrit-Reviewer: jenkins-bot <>
XZise has submitted this change and it was merged.
Change subject: Standardise script main()
......................................................................
Standardise script main()
Allow scripts to be invoked with command line options from within
Python, such as <module>.main('-simulate').
Use the new handle_args, and process global arguments before local arguments.
Also add docstrings to main() functions.
Not ready to be standardised:
makecat
casechecker
data_ingestion
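The shape introduced across the scripts looks roughly like the sketch below; the '-always' option and the page-title output are placeholders for whatever an individual script actually does:

import pywikibot
from pywikibot import pagegenerators


def main(*args):
    """
    Process command line arguments and invoke bot.

    If args is an empty list, sys.argv is used.
    """
    options = {}
    # Global options such as -simulate or -lang are handled first and removed.
    local_args = pywikibot.handle_args(args)
    genFactory = pagegenerators.GeneratorFactory()
    for arg in local_args:
        if arg == '-always':
            options['always'] = True
        else:
            genFactory.handleArg(arg)
    gen = genFactory.getCombinedGenerator()
    if not gen:
        pywikibot.showHelp()
        return
    for page in gen:
        pywikibot.output(page.title())


if __name__ == '__main__':
    main()

# With this shape a script can also be driven from Python, e.g.:
#   import scripts.basic
#   scripts.basic.main('-simulate', '-page:Sandbox')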
Change-Id: I828ba3cf155fb6d3e052564d20768fdf9c3acc95
---
M scripts/add_text.py
M scripts/archivebot.py
M scripts/basic.py
M scripts/blockpageschecker.py
M scripts/blockreview.py
M scripts/capitalize_redirects.py
M scripts/catall.py
M scripts/category.py
M scripts/category_redirect.py
M scripts/cfd.py
M scripts/checkimages.py
M scripts/claimit.py
M scripts/clean_sandbox.py
M scripts/commons_link.py
M scripts/commonscat.py
M scripts/coordinate_import.py
M scripts/cosmetic_changes.py
M scripts/create_categories.py
M scripts/delete.py
M scripts/disambredir.py
M scripts/editarticle.py
M scripts/featured.py
M scripts/fixing_redirects.py
M scripts/flickrripper.py
M scripts/freebasemappingupload.py
M scripts/harvest_template.py
M scripts/illustrate_wikidata.py
M scripts/image.py
M scripts/imagerecat.py
M scripts/imagetransfer.py
M scripts/imageuncat.py
M scripts/interwiki.py
M scripts/isbn.py
M scripts/listpages.py
M scripts/login.py
M scripts/lonelypages.py
M scripts/misspelling.py
M scripts/movepages.py
M scripts/newitem.py
M scripts/noreferences.py
M scripts/nowcommons.py
M scripts/pagefromfile.py
M scripts/protect.py
M scripts/redirect.py
M scripts/reflinks.py
M scripts/replace.py
M scripts/revertbot.py
M scripts/script_wui.py
M scripts/selflink.py
M scripts/solve_disambiguation.py
M scripts/spamremove.py
M scripts/template.py
M scripts/templatecount.py
M scripts/touch.py
M scripts/transferbot.py
M scripts/unlink.py
M scripts/unusedfiles.py
M scripts/upload.py
M scripts/watchlist.py
M scripts/weblinkchecker.py
M scripts/welcome.py
M tox.ini
62 files changed, 603 insertions(+), 140 deletions(-)
Approvals:
John Vandenberg: Looks good to me, but someone else must approve
XZise: Looks good to me, approved
diff --git a/scripts/add_text.py b/scripts/add_text.py
index 1d104e1..8979dba 100644
--- a/scripts/add_text.py
+++ b/scripts/add_text.py
@@ -275,7 +275,15 @@
return (text, newtext, always)
-def main():
+def main(*args):
+ """
+ Process command line arguments and invoke bot.
+
+ If args is an empty list, sys.argv is used.
+
+ @param args: command line arguments
+ @type args: list of unicode
+ """
# If none, the var is setted only for check purpose.
summary = None
addText = None
@@ -292,7 +300,7 @@
up = False
# Process global args and prepare generator args parser
- local_args = pywikibot.handleArgs()
+ local_args = pywikibot.handle_args(args)
genFactory = pagegenerators.GeneratorFactory()
# Loading the arguments
diff --git a/scripts/archivebot.py b/scripts/archivebot.py
index 015f924..63f5090 100644
--- a/scripts/archivebot.py
+++ b/scripts/archivebot.py
@@ -524,7 +524,15 @@
self.page.update(comment)
-def main():
+def main(*args):
+ """
+ Process command line arguments and invoke bot.
+
+ If args is an empty list, sys.argv is used.
+
+ @param args: command line arguments
+ @type args: list of unicode
+ """
filename = None
pagename = None
namespace = None
@@ -537,7 +545,7 @@
if arg.startswith(name):
yield arg[len(name) + 1:]
- for arg in pywikibot.handleArgs():
+ for arg in pywikibot.handle_args(args):
for v in if_arg_value(arg, '-file'):
filename = v
for v in if_arg_value(arg, '-locale'):
diff --git a/scripts/basic.py b/scripts/basic.py
index 99f83f1..c1af240 100755
--- a/scripts/basic.py
+++ b/scripts/basic.py
@@ -136,10 +136,17 @@
return False
-def main():
- """ Process command line arguments and invoke BasicBot. """
+def main(*args):
+ """
+ Process command line arguments and invoke bot.
+
+ If args is an empty list, sys.argv is used.
+
+ @param args: command line arguments
+ @type args: list of unicode
+ """
# Process global arguments to determine desired site
- local_args = pywikibot.handleArgs()
+ local_args = pywikibot.handle_args(args)
# This factory is responsible for processing command line arguments
# that are also used by other scripts and that determine on which pages
diff --git a/scripts/blockpageschecker.py b/scripts/blockpageschecker.py
index 70c5fdf..fd548af 100755
--- a/scripts/blockpageschecker.py
+++ b/scripts/blockpageschecker.py
@@ -205,8 +205,15 @@
editor.edit(page.text)
-def main():
- """Main Function."""
+def main(*args):
+ """
+ Process command line arguments and perform task.
+
+ If args is an empty list, sys.argv is used.
+
+ @param args: command line arguments
+ @type args: list of unicode
+ """
# Loading the comments
global categoryToCheck, project_inserted
# always, define a generator to understand if the user sets one,
@@ -223,7 +230,7 @@
errorCount = 0
# Process global args and prepare generator args parser
- local_args = pywikibot.handleArgs()
+ local_args = pywikibot.handle_args(args)
genFactory = pagegenerators.GeneratorFactory()
# Process local args
diff --git a/scripts/blockreview.py b/scripts/blockreview.py
index f184f57..f3f9c98 100644
--- a/scripts/blockreview.py
+++ b/scripts/blockreview.py
@@ -303,11 +303,19 @@
return True
-def main():
+def main(*args):
+ """
+ Process command line arguments and invoke bot.
+
+ If args is an empty list, sys.argv is used.
+
+ @param args: command line arguments
+ @type args: list of unicode
+ """
show = False
# Parse command line arguments
- for arg in pywikibot.handleArgs():
+ if pywikibot.handle_args(args):
show = True
if not show:
diff --git a/scripts/capitalize_redirects.py b/scripts/capitalize_redirects.py
index 0c4c7a9..d646b38 100644
--- a/scripts/capitalize_redirects.py
+++ b/scripts/capitalize_redirects.py
@@ -87,10 +87,18 @@
pywikibot.output(u"An error occurred, skipping...")
-def main():
+def main(*args):
+ """
+ Process command line arguments and invoke bot.
+
+ If args is an empty list, sys.argv is used.
+
+ @param args: command line arguments
+ @type args: list of unicode
+ """
options = {}
- local_args = pywikibot.handleArgs()
+ local_args = pywikibot.handle_args(args)
genFactory = pagegenerators.GeneratorFactory()
for arg in local_args:
diff --git a/scripts/catall.py b/scripts/catall.py
index 9c176a9..dbd1705 100755
--- a/scripts/catall.py
+++ b/scripts/catall.py
@@ -72,11 +72,19 @@
comment=i18n.twtranslate(site.code, 'catall-changing'))
-def main():
+def main(*args):
+ """
+ Process command line arguments and perform task.
+
+ If args is an empty list, sys.argv is used.
+
+ @param args: command line arguments
+ @type args: list of unicode
+ """
docorrections = True
start = 'A'
- local_args = pywikibot.handleArgs()
+ local_args = pywikibot.handle_args(args)
for arg in local_args:
if arg == '-onlynew':
diff --git a/scripts/category.py b/scripts/category.py
index 429fc8a..ce88daa 100755
--- a/scripts/category.py
+++ b/scripts/category.py
@@ -1034,6 +1034,14 @@
def main(*args):
+ """
+ Process command line arguments and invoke bot.
+
+ If args is an empty list, sys.argv is used.
+
+ @param args: command line arguments
+ @type args: list of unicode
+ """
fromGiven = False
toGiven = False
batchMode = False
@@ -1053,7 +1061,7 @@
depth = 5
# Process global args and prepare generator args parser
- local_args = pywikibot.handleArgs(*args)
+ local_args = pywikibot.handle_args(args)
genFactory = pagegenerators.GeneratorFactory()
# The generator gives the pages that should be worked upon.
diff --git a/scripts/category_redirect.py b/scripts/category_redirect.py
index b6e946f..cb5d321 100755
--- a/scripts/category_redirect.py
+++ b/scripts/category_redirect.py
@@ -416,7 +416,15 @@
def main(*args):
- a = pywikibot.handleArgs(*args)
+ """
+ Process command line arguments and invoke bot.
+
+ If args is an empty list, sys.argv is used.
+
+ @param args: command line arguments
+ @type args: list of unicode
+ """
+ a = pywikibot.handle_args(args)
if len(a) == 1:
raise RuntimeError('Unrecognized argument "%s"' % a[0])
elif a:
diff --git a/scripts/cfd.py b/scripts/cfd.py
index ab2b0e9..c29e19d 100644
--- a/scripts/cfd.py
+++ b/scripts/cfd.py
@@ -54,8 +54,16 @@
return self.result
-def main():
- pywikibot.handleArgs()
+def main(*args):
+ """
+ Process command line arguments and perform task.
+
+ If args is an empty list, sys.argv is used.
+
+ @param args: command line arguments
+ @type args: list of unicode
+ """
+ pywikibot.handle_args(args)
page = pywikibot.Page(pywikibot.Site(), cfdPage)
diff --git a/scripts/checkimages.py b/scripts/checkimages.py
index 980f3ce..ff56c81 100644
--- a/scripts/checkimages.py
+++ b/scripts/checkimages.py
@@ -1761,8 +1761,15 @@
return True
-def main():
- """Main function."""
+def main(*args):
+ """
+ Process command line arguments and invoke bot.
+
+ If args is an empty list, sys.argv is used.
+
+ @param args: command line arguments
+ @type args: list of unicode
+ """
# Command line configurable parameters
repeat = True # Restart after having check all the images?
limit = 80 # How many images check?
@@ -1781,7 +1788,7 @@
generator = None
# Here below there are the parameters.
- for arg in pywikibot.handleArgs():
+ for arg in pywikibot.handle_args(args):
if arg.startswith('-limit'):
if len(arg) == 7:
limit = int(pywikibot.input(
diff --git a/scripts/claimit.py b/scripts/claimit.py
index d389c93..108f2f7 100755
--- a/scripts/claimit.py
+++ b/scripts/claimit.py
@@ -149,12 +149,20 @@
return True
-def main():
+def main(*args):
+ """
+ Process command line arguments and invoke bot.
+
+ If args is an empty list, sys.argv is used.
+
+ @param args: command line arguments
+ @type args: list of unicode
+ """
exists_arg = ''
commandline_claims = list()
# Process global args and prepare generator args parser
- local_args = pywikibot.handleArgs()
+ local_args = pywikibot.handle_args(args)
gen = pagegenerators.GeneratorFactory()
for arg in local_args:
diff --git a/scripts/clean_sandbox.py b/scripts/clean_sandbox.py
index 052ec14..ee1666d 100755
--- a/scripts/clean_sandbox.py
+++ b/scripts/clean_sandbox.py
@@ -272,9 +272,17 @@
time.sleep(self.getOption('hours') * 60 * 60)
-def main():
+def main(*args):
+ """
+ Process command line arguments and invoke bot.
+
+ If args is an empty list, sys.argv is used.
+
+ @param args: command line arguments
+ @type args: list of unicode
+ """
opts = {}
- for arg in pywikibot.handleArgs():
+ for arg in pywikibot.handle_args(args):
if arg.startswith('-hours:'):
opts['hours'] = float(arg[7:])
opts['no_repeat'] = False
diff --git a/scripts/commons_link.py b/scripts/commons_link.py
index 45887c0..690d1e1 100644
--- a/scripts/commons_link.py
+++ b/scripts/commons_link.py
@@ -107,10 +107,18 @@
pywikibot.output(u'Page %s is locked' % page.title())
-def main():
+def main(*args):
+ """
+ Process command line arguments and invoke bot.
+
+ If args is an empty list, sys.argv is used.
+
+ @param args: command line arguments
+ @type args: list of unicode
+ """
options = {}
- local_args = pywikibot.handleArgs()
+ local_args = pywikibot.handle_args(args)
genFactory = pagegenerators.GeneratorFactory()
for arg in local_args:
diff --git a/scripts/commonscat.py b/scripts/commonscat.py
index 5029830..760a5cc 100755
--- a/scripts/commonscat.py
+++ b/scripts/commonscat.py
@@ -498,9 +498,14 @@
return u''
-def main():
- """ Parse the command line arguments and get a pagegenerator to work on.
- Iterate through all the pages.
+def main(*args):
+ """
+ Process command line arguments and invoke bot.
+
+ If args is an empty list, sys.argv is used.
+
+ @param args: command line arguments
+ @type args: list of unicode
"""
summary = None
generator = None
@@ -510,7 +515,7 @@
ns.append(14)
# Process global args and prepare generator args parser
- local_args = pywikibot.handleArgs()
+ local_args = pywikibot.handle_args(args)
genFactory = pagegenerators.GeneratorFactory()
for arg in local_args:
diff --git a/scripts/coordinate_import.py b/scripts/coordinate_import.py
index 71cef9c..0febc70 100644
--- a/scripts/coordinate_import.py
+++ b/scripts/coordinate_import.py
@@ -101,9 +101,17 @@
pywikibot.output(u'Skipping unsupported globe: %s' % e.args)
-def main():
+def main(*args):
+ """
+ Process command line arguments and invoke bot.
+
+ If args is an empty list, sys.argv is used.
+
+ @param args: command line arguments
+ @type args: list of unicode
+ """
# Process global args and prepare generator args parser
- local_args = pywikibot.handleArgs()
+ local_args = pywikibot.handle_args(args)
gen = pagegenerators.GeneratorFactory()
for arg in local_args:
diff --git a/scripts/cosmetic_changes.py b/scripts/cosmetic_changes.py
index e758452..1c827fb 100755
--- a/scripts/cosmetic_changes.py
+++ b/scripts/cosmetic_changes.py
@@ -919,12 +919,20 @@
% page.title(asLink=True))
-def main():
+def main(*args):
+ """
+ Process command line arguments and invoke bot.
+
+ If args is an empty list, sys.argv is used.
+
+ @param args: command line arguments
+ @type args: list of unicode
+ """
answer = 'y'
options = {}
# Process global args and prepare generator args parser
- local_args = pywikibot.handleArgs()
+ local_args = pywikibot.handle_args(args)
genFactory = pagegenerators.GeneratorFactory()
for arg in local_args:
@@ -967,7 +975,4 @@
pywikibot.showHelp()
if __name__ == "__main__":
- try:
- main()
- finally:
- pywikibot.stopme()
+ main()
diff --git a/scripts/create_categories.py b/scripts/create_categories.py
index 7294a57..568219c 100755
--- a/scripts/create_categories.py
+++ b/scripts/create_categories.py
@@ -72,14 +72,21 @@
self.create_category(page)
-def main():
- """Main loop. Get a generator and options."""
+def main(*args):
+ """
+ Process command line arguments and invoke bot.
+
+ If args is an empty list, sys.argv is used.
+
+ @param args: command line arguments
+ @type args: list of unicode
+ """
parent = None
basename = None
options = {}
# Process global args and prepare generator args parser
- local_args = pywikibot.handleArgs()
+ local_args = pywikibot.handle_args(args)
genFactory = pagegenerators.GeneratorFactory()
for arg in local_args:
diff --git a/scripts/delete.py b/scripts/delete.py
index 2efe8e0..2b5c329 100644
--- a/scripts/delete.py
+++ b/scripts/delete.py
@@ -75,14 +75,22 @@
page.delete(self.summary, not self.getOption('always'))
-def main():
+def main(*args):
+ """
+ Process command line arguments and invoke bot.
+
+ If args is an empty list, sys.argv is used.
+
+ @param args: command line arguments
+ @type args: list of unicode
+ """
pageName = ''
summary = None
generator = None
options = {}
# read command line parameters
- local_args = pywikibot.handleArgs()
+ local_args = pywikibot.handle_args(args)
genFactory = pagegenerators.GeneratorFactory()
mysite = pywikibot.Site()
diff --git a/scripts/disambredir.py b/scripts/disambredir.py
index 4cbc419..3413dde 100644
--- a/scripts/disambredir.py
+++ b/scripts/disambredir.py
@@ -149,8 +149,16 @@
page.put(text, comment)
-def main():
- local_args = pywikibot.handleArgs()
+def main(*args):
+ """
+ Process command line arguments and invoke bot.
+
+ If args is an empty list, sys.argv is used.
+
+ @param args: command line arguments
+ @type args: list of unicode
+ """
+ local_args = pywikibot.handle_args(args)
generator = None
start = local_args[0] if local_args else '!'
diff --git a/scripts/editarticle.py b/scripts/editarticle.py
index 9e4e16a..d9ffe13 100755
--- a/scripts/editarticle.py
+++ b/scripts/editarticle.py
@@ -40,7 +40,7 @@
def set_options(self, *args):
"""Parse commandline and set options attribute."""
my_args = []
- for arg in pywikibot.handleArgs(*args):
+ for arg in pywikibot.handle_args(args):
my_args.append(arg)
parser = optparse.OptionParser()
parser.add_option("-r", "--edit_redirect", action="store_true",
@@ -97,6 +97,14 @@
def main(*args):
+ """
+ Process command line arguments and invoke bot.
+
+ If args is an empty list, sys.argv is used.
+
+ @param args: command line arguments
+ @type args: list of unicode
+ """
app = ArticleEditor(*args)
app.run()
diff --git a/scripts/featured.py b/scripts/featured.py
index 59a4d4b..7413860 100644
--- a/scripts/featured.py
+++ b/scripts/featured.py
@@ -598,8 +598,16 @@
def main(*args):
+ """
+ Process command line arguments and invoke bot.
+
+ If args is an empty list, sys.argv is used.
+
+ @param args: command line arguments
+ @type args: list of unicode
+ """
options = {}
- for arg in pywikibot.handleArgs():
+ for arg in pywikibot.handle_args(args):
if arg.startswith('-fromlang:'):
options[arg[1:9]] = arg[10:].split(",")
elif arg.startswith('-after:'):
@@ -617,7 +625,4 @@
if __name__ == "__main__":
- try:
- main()
- finally:
- pywikibot.stopme()
+ main()
diff --git a/scripts/fixing_redirects.py b/scripts/fixing_redirects.py
index 3e13a6e..8a7fa0f 100644
--- a/scripts/fixing_redirects.py
+++ b/scripts/fixing_redirects.py
@@ -187,12 +187,20 @@
pywikibot.error('unable to put %s' % page)
-def main():
+def main(*args):
+ """
+ Process command line arguments and invoke bot.
+
+ If args is an empty list, sys.argv is used.
+
+ @param args: command line arguments
+ @type args: list of unicode
+ """
featured = False
gen = None
# Process global args and prepare generator args parser
- local_args = pywikibot.handleArgs()
+ local_args = pywikibot.handle_args(args)
genFactory = pagegenerators.GeneratorFactory()
for arg in local_args:
diff --git a/scripts/flickrripper.py b/scripts/flickrripper.py
index 46d3475..eaff57e 100644
--- a/scripts/flickrripper.py
+++ b/scripts/flickrripper.py
@@ -517,7 +517,15 @@
return
-def main():
+def main(*args):
+ """
+ Process command line arguments and invoke bot.
+
+ If args is an empty list, sys.argv is used.
+
+ @param args: command line arguments
+ @type args: list of unicode
+ """
# Get the api key
if not config.flickr['api_key']:
pywikibot.output('Flickr api key not found! Get yourself an api key')
@@ -567,7 +575,7 @@
# Should be renamed to overrideLicense or something like that
override = u''
- for arg in pywikibot.handleArgs():
+ for arg in pywikibot.handle_args(args):
if arg.startswith('-group_id'):
if len(arg) == 9:
group_id = pywikibot.input(u'What is the group_id of the pool?')
diff --git a/scripts/freebasemappingupload.py b/scripts/freebasemappingupload.py
index 59be5ce..b021abd 100644
--- a/scripts/freebasemappingupload.py
+++ b/scripts/freebasemappingupload.py
@@ -96,9 +96,17 @@
pywikibot.output('Claim added!')
-def main():
+def main(*args):
+ """
+ Process command line arguments and invoke bot.
+
+ If args is an empty list, sys.argv is used.
+
+ @param args: command line arguments
+ @type args: list of unicode
+ """
filename = 'fb2w.nt.gz' # Default filename
- for arg in pywikibot.handleArgs():
+ for arg in pywikibot.handle_args(args):
if arg.startswith('-filename'):
filename = arg[11:]
bot = FreebaseMapperRobot(filename)
diff --git a/scripts/harvest_template.py b/scripts/harvest_template.py
index cdb1275..60b5b2f 100755
--- a/scripts/harvest_template.py
+++ b/scripts/harvest_template.py
@@ -180,12 +180,20 @@
claim.addSource(source, bot=True)
-def main():
+def main(*args):
+ """
+ Process command line arguments and invoke bot.
+
+ If args is an empty list, sys.argv is used.
+
+ @param args: command line arguments
+ @type args: list of unicode
+ """
commandline_arguments = list()
template_title = u''
# Process global args and prepare generator args parser
- local_args = pywikibot.handleArgs()
+ local_args = pywikibot.handle_args(args)
gen = pg.GeneratorFactory()
for arg in local_args:
diff --git a/scripts/illustrate_wikidata.py b/scripts/illustrate_wikidata.py
index 0a4dc1c..3d9b737 100644
--- a/scripts/illustrate_wikidata.py
+++ b/scripts/illustrate_wikidata.py
@@ -86,9 +86,17 @@
newclaim.addSource(source, bot=True)
-def main():
+def main(*args):
+ """
+ Process command line arguments and invoke bot.
+
+ If args is an empty list, sys.argv is used.
+
+ @param args: command line arguments
+ @type args: list of unicode
+ """
# Process global args and prepare generator args parser
- local_args = pywikibot.handleArgs()
+ local_args = pywikibot.handle_args(args)
gen = pg.GeneratorFactory()
wdproperty = u'P18'
diff --git a/scripts/image.py b/scripts/image.py
index 1bcb42b..68eb89f 100644
--- a/scripts/image.py
+++ b/scripts/image.py
@@ -169,12 +169,20 @@
replaceBot.run()
-def main():
+def main(*args):
+ """
+ Process command line arguments and invoke bot.
+
+ If args is an empty list, sys.argv is used.
+
+ @param args: command line arguments
+ @type args: list of unicode
+ """
old_image = None
new_image = None
options = {}
- for arg in pywikibot.handleArgs():
+ for arg in pywikibot.handle_args(args):
if arg == '-always':
options['always'] = True
elif arg == '-loose':
diff --git a/scripts/imagerecat.py b/scripts/imagerecat.py
index 295bb7f..8a8558d 100644
--- a/scripts/imagerecat.py
+++ b/scripts/imagerecat.py
@@ -440,14 +440,21 @@
return result
-def main():
- """Main loop. Get a generator and options. Work on all images in the generator."""
+def main(*args):
+ """
+ Process command line arguments and invoke bot.
+
+ If args is an empty list, sys.argv is used.
+
+ @param args: command line arguments
+ @type args: list of unicode
+ """
generator = None
onlyFilter = False
onlyUncat = False
# Process global args and prepare generator args parser
- local_args = pywikibot.handleArgs()
+ local_args = pywikibot.handle_args(args)
genFactory = pagegenerators.GeneratorFactory()
global search_wikis
diff --git a/scripts/imagetransfer.py b/scripts/imagetransfer.py
index f26e254..587502b 100644
--- a/scripts/imagetransfer.py
+++ b/scripts/imagetransfer.py
@@ -298,7 +298,15 @@
pywikibot.output(u'No such image number.')
-def main():
+def main(*args):
+ """
+ Process command line arguments and invoke bot.
+
+ If args is an empty list, sys.argv is used.
+
+ @param args: command line arguments
+ @type args: list of unicode
+ """
pageTitle = None
gen = None
@@ -307,7 +315,7 @@
targetLang = None
targetFamily = None
- local_args = pywikibot.handleArgs()
+ local_args = pywikibot.handle_args(args)
for arg in local_args:
if arg == '-interwiki':
diff --git a/scripts/imageuncat.py b/scripts/imageuncat.py
index 0bcb3d9..cc2aa9b 100755
--- a/scripts/imageuncat.py
+++ b/scripts/imageuncat.py
@@ -1315,12 +1315,17 @@
def main(*args):
- '''
- Grab a bunch of images and tag them if they are not categorized.
- '''
+ """
+ Process command line arguments and invoke bot.
+
+ If args is an empty list, sys.argv is used.
+
+ @param args: command line arguments
+ @type args: list of unicode
+ """
generator = None
- local_args = pywikibot.handleArgs(*args)
+ local_args = pywikibot.handle_args(args)
# use the default imagerepository normally commons
site = pywikibot.Site().image_repository()
diff --git a/scripts/interwiki.py b/scripts/interwiki.py
index f415cb7..61bc6b0 100755
--- a/scripts/interwiki.py
+++ b/scripts/interwiki.py
@@ -2417,7 +2417,15 @@
bot.add(page, hints=hintStrings)
-def main():
+def main(*args):
+ """
+ Process command line arguments and invoke bot.
+
+ If args is an empty list, sys.argv is used.
+
+ @param args: command line arguments
+ @type args: list of unicode
+ """
singlePageTitle = ''
opthintsonly = False
# Which namespaces should be processed?
@@ -2436,7 +2444,7 @@
newPages = None
# Process global args and prepare generator args parser
- local_args = pywikibot.handleArgs()
+ local_args = pywikibot.handle_args(args)
genFactory = pagegenerators.GeneratorFactory()
for arg in local_args:
diff --git a/scripts/isbn.py b/scripts/isbn.py
index ac67dac..2a22aaa 100755
--- a/scripts/isbn.py
+++ b/scripts/isbn.py
@@ -1417,11 +1417,19 @@
self.treat(page)
-def main():
+def main(*args):
+ """
+ Process command line arguments and invoke bot.
+
+ If args is an empty list, sys.argv is used.
+
+ @param args: command line arguments
+ @type args: list of unicode
+ """
options = {}
# Process global args and prepare generator args parser
- local_args = pywikibot.handleArgs()
+ local_args = pywikibot.handle_args(args)
genFactory = pagegenerators.GeneratorFactory()
for arg in local_args:
diff --git a/scripts/listpages.py b/scripts/listpages.py
index e17c686..0580a54 100644
--- a/scripts/listpages.py
+++ b/scripts/listpages.py
@@ -142,7 +142,14 @@
def main(*args):
- """Main function."""
+ """
+ Process command line arguments and invoke bot.
+
+ If args is an empty list, sys.argv is used.
+
+ @param args: command line arguments
+ @type args: list of unicode
+ """
gen = None
notitle = False
fmt = '1'
@@ -150,7 +157,7 @@
page_get = False
# Process global args and prepare generator args parser
- local_args = pywikibot.handleArgs(*args)
+ local_args = pywikibot.handle_args(args)
genFactory = GeneratorFactory()
for arg in local_args:
diff --git a/scripts/login.py b/scripts/login.py
index adb7a12..71cdbb2 100755
--- a/scripts/login.py
+++ b/scripts/login.py
@@ -62,11 +62,19 @@
def main(*args):
+ """
+ Process command line arguments and invoke bot.
+
+ If args is an empty list, sys.argv is used.
+
+ @param args: command line arguments
+ @type args: list of unicode
+ """
password = None
sysop = False
logall = False
logout = False
- for arg in pywikibot.handleArgs(*args):
+ for arg in pywikibot.handle_args(args):
if arg.startswith("-pass"):
if len(arg) == 5:
password = pywikibot.input(u'Password for all accounts (no characters will be shown):',
@@ -112,5 +120,7 @@
except SiteDefinitionError:
pywikibot.output(u'%s.%s is not a valid site, please remove it'
u' from your config' % (lang, familyName))
+
+
if __name__ == "__main__":
main()
diff --git a/scripts/lonelypages.py b/scripts/lonelypages.py
index f956d2a..89695bc 100644
--- a/scripts/lonelypages.py
+++ b/scripts/lonelypages.py
@@ -189,10 +189,18 @@
self.userPut(page, oldtxt, newtxt, comment=self.comment)
-def main():
+def main(*args):
+ """
+ Process command line arguments and invoke bot.
+
+ If args is an empty list, sys.argv is used.
+
+ @param args: command line arguments
+ @type args: list of unicode
+ """
options = {}
- local_args = pywikibot.handleArgs()
+ local_args = pywikibot.handle_args(args)
genFactory = pagegenerators.GeneratorFactory()
site = pywikibot.Site()
diff --git a/scripts/misspelling.py b/scripts/misspelling.py
index ec9644b..2865c81 100644
--- a/scripts/misspelling.py
+++ b/scripts/misspelling.py
@@ -127,14 +127,22 @@
{'page': disambPage.title()})
-def main():
+def main(*args):
+ """
+ Process command line arguments and invoke bot.
+
+ If args is an empty list, sys.argv is used.
+
+ @param args: command line arguments
+ @type args: list of unicode
+ """
# the option that's always selected when the bot wonders what to do with
# a link. If it's None, the user is prompted (default behaviour).
always = None
main_only = False
firstPageTitle = None
- for arg in pywikibot.handleArgs():
+ for arg in pywikibot.handle_args(args):
if arg.startswith('-always:'):
always = arg[8:]
elif arg.startswith('-start'):
diff --git a/scripts/movepages.py b/scripts/movepages.py
index 5af0ac3..7c2c6ac 100644
--- a/scripts/movepages.py
+++ b/scripts/movepages.py
@@ -197,14 +197,22 @@
self.treat(page)
-def main():
+def main(*args):
+ """
+ Process command line arguments and invoke bot.
+
+ If args is an empty list, sys.argv is used.
+
+ @param args: command line arguments
+ @type args: list of unicode
+ """
gen = None
oldName = None
options = {}
fromToPairs = []
# Process global args and prepare generator args parser
- local_args = pywikibot.handleArgs()
+ local_args = pywikibot.handle_args(args)
genFactory = pagegenerators.GeneratorFactory()
for arg in local_args:
diff --git a/scripts/newitem.py b/scripts/newitem.py
index b6ea11f..a756921 100644
--- a/scripts/newitem.py
+++ b/scripts/newitem.py
@@ -118,9 +118,17 @@
page.put(page.text)
-def main():
+def main(*args):
+ """
+ Process command line arguments and invoke bot.
+
+ If args is an empty list, sys.argv is used.
+
+ @param args: command line arguments
+ @type args: list of unicode
+ """
# Process global args and prepare generator args parser
- local_args = pywikibot.handleArgs()
+ local_args = pywikibot.handle_args(args)
gen = pagegenerators.GeneratorFactory()
options = {}
diff --git a/scripts/noreferences.py b/scripts/noreferences.py
index 8fece93..0c5ff7c 100755
--- a/scripts/noreferences.py
+++ b/scripts/noreferences.py
@@ -650,11 +650,19 @@
pywikibot.output(u'Skipping %s (locked page)' % page.title())
-def main():
+def main(*args):
+ """
+ Process command line arguments and invoke bot.
+
+ If args is an empty list, sys.argv is used.
+
+ @param args: command line arguments
+ @type args: list of unicode
+ """
options = {}
# Process global args and prepare generator args parser
- local_args = pywikibot.handleArgs()
+ local_args = pywikibot.handle_args(args)
genFactory = pagegenerators.GeneratorFactory()
for arg in local_args:
diff --git a/scripts/nowcommons.py b/scripts/nowcommons.py
index ef6d436..dc16bd0 100644
--- a/scripts/nowcommons.py
+++ b/scripts/nowcommons.py
@@ -429,10 +429,18 @@
continue
-def main():
+def main(*args):
+ """
+ Process command line arguments and invoke bot.
+
+ If args is an empty list, sys.argv is used.
+
+ @param args: command line arguments
+ @type args: list of unicode
+ """
options = {}
- for arg in pywikibot.handleArgs():
+ for arg in pywikibot.handle_args(args):
if arg.startswith('-') and \
arg[1:] in ('always', 'replace', 'replaceloose', 'replaceonly'):
options[arg[1:]] = True
diff --git a/scripts/pagefromfile.py b/scripts/pagefromfile.py
index dc50d89..cf067ba 100644
--- a/scripts/pagefromfile.py
+++ b/scripts/pagefromfile.py
@@ -241,8 +241,15 @@
return location.end(), title, contents
-def main():
- """Main function."""
+def main(*args):
+ """
+ Process command line arguments and invoke bot.
+
+ If args is an empty list, sys.argv is used.
+
+ @param args: command line arguments
+ @type args: list of unicode
+ """
# Adapt these to the file you are using. 'pageStartMarker' and
# 'pageEndMarker' are the beginning and end of each entry. Take text that
# should be included and does not occur elsewhere in the text.
@@ -257,7 +264,7 @@
include = False
notitle = False
- for arg in pywikibot.handleArgs():
+ for arg in pywikibot.handle_args(args):
if arg.startswith("-start:"):
pageStartMarker = arg[7:]
elif arg.startswith("-end:"):
diff --git a/scripts/protect.py b/scripts/protect.py
index 7a20ed1..0a4279f 100644
--- a/scripts/protect.py
+++ b/scripts/protect.py
@@ -138,6 +138,14 @@
def main(*args):
+ """
+ Process command line arguments and invoke bot.
+
+ If args is an empty list, sys.argv is used.
+
+ @param args: command line arguments
+ @type args: list of unicode
+ """
options = {}
message_properties = {}
generator = None
@@ -152,7 +160,7 @@
}
# read command line parameters
- local_args = pywikibot.handleArgs(*args)
+ local_args = pywikibot.handle_args(args)
genFactory = pagegenerators.GeneratorFactory()
site = pywikibot.Site()
diff --git a/scripts/redirect.py b/scripts/redirect.py
index d6eb487..2f3d087 100755
--- a/scripts/redirect.py
+++ b/scripts/redirect.py
@@ -711,6 +711,14 @@
def main(*args):
+ """
+ Process command line arguments and invoke bot.
+
+ If args is an empty list, sys.argv is used.
+
+ @param args: command line arguments
+ @type args: list of unicode
+ """
options = {}
# what the bot should do (either resolve double redirs, or delete broken
# redirs)
@@ -731,7 +739,7 @@
until = ''
number = None
step = None
- for arg in pywikibot.handleArgs(*args):
+ for arg in pywikibot.handle_args(args):
if arg == 'double' or arg == 'do':
action = 'double'
elif arg == 'broken' or arg == 'br':
diff --git a/scripts/reflinks.py b/scripts/reflinks.py
index f0b7d61..2404093 100644
--- a/scripts/reflinks.py
+++ b/scripts/reflinks.py
@@ -768,14 +768,22 @@
return
-def main():
+def main(*args):
+ """
+ Process command line arguments and invoke bot.
+
+ If args is an empty list, sys.argv is used.
+
+ @param args: command line arguments
+ @type args: list of unicode
+ """
xmlFilename = None
options = {}
namespaces = []
generator = None
# Process global args and prepare generator args parser
- local_args = pywikibot.handleArgs()
+ local_args = pywikibot.handle_args(args)
genFactory = pagegenerators.GeneratorFactory()
for arg in local_args:
diff --git a/scripts/replace.py b/scripts/replace.py
index 5c9e750..da332cf 100755
--- a/scripts/replace.py
+++ b/scripts/replace.py
@@ -438,6 +438,14 @@
def main(*args):
+ """
+ Process command line arguments and invoke bot.
+
+ If args is an empty list, sys.argv is used.
+
+ @param args: command line arguments
+ @type args: list of unicode
+ """
add_cat = None
gen = None
# summary message
@@ -485,7 +493,7 @@
# Read commandline parameters.
- local_args = pywikibot.handleArgs(*args)
+ local_args = pywikibot.handle_args(args)
genFactory = pagegenerators.GeneratorFactory()
for arg in local_args:
diff --git a/scripts/revertbot.py b/scripts/revertbot.py
index 1b6d54e..7c28df2 100644
--- a/scripts/revertbot.py
+++ b/scripts/revertbot.py
@@ -132,10 +132,18 @@
return False
-def main():
+def main(*args):
+ """
+ Process command line arguments and invoke bot.
+
+ If args is an empty list, sys.argv is used.
+
+ @param args: command line arguments
+ @type args: list of unicode
+ """
user = None
rollback = False
- for arg in pywikibot.handleArgs():
+ for arg in pywikibot.handle_args(args):
if arg.startswith('-username'):
if len(arg) == 9:
user = pywikibot.input(
diff --git a/scripts/script_wui.py b/scripts/script_wui.py
index 9d0aab2..d3a2c82 100755
--- a/scripts/script_wui.py
+++ b/scripts/script_wui.py
@@ -274,10 +274,18 @@
# comment = pywikibot.translate(self.site.lang, bot_config['msg']))
-def main():
+def main(*args):
+ """
+ Process command line arguments and invoke bot.
+
+ If args is an empty list, sys.argv is used.
+
+ @param args: command line arguments
+ @type args: list of unicode
+ """
global __simulate, __sys_argv
- for arg in pywikibot.handleArgs():
+ for arg in pywikibot.handle_args(args):
pywikibot.showHelp('script_wui')
return
@@ -295,5 +303,4 @@
raise
if __name__ == "__main__":
- # run bot
main()
diff --git a/scripts/selflink.py b/scripts/selflink.py
index 8ce58e2..fdbc920 100644
--- a/scripts/selflink.py
+++ b/scripts/selflink.py
@@ -183,11 +183,19 @@
self.treat(page)
-def main():
+def main(*args):
+ """
+ Process command line arguments and invoke bot.
+
+ If args is an empty list, sys.argv is used.
+
+ @param args: command line arguments
+ @type args: list of unicode
+ """
# Page generator
gen = None
# Process global args and prepare generator args parser
- local_args = pywikibot.handleArgs()
+ local_args = pywikibot.handle_args(args)
genFactory = GeneratorFactory()
botArgs = {}
@@ -207,7 +215,4 @@
bot.run()
if __name__ == "__main__":
- try:
- main()
- finally:
- pywikibot.stopme()
+ main()
diff --git a/scripts/solve_disambiguation.py b/scripts/solve_disambiguation.py
index c6d86a2..1c70083 100644
--- a/scripts/solve_disambiguation.py
+++ b/scripts/solve_disambiguation.py
@@ -1006,6 +1006,14 @@
def main(*args):
+ """
+ Process command line arguments and invoke bot.
+
+ If args is an empty list, sys.argv is used.
+
+ @param args: command line arguments
+ @type args: list of unicode
+ """
# the option that's always selected when the bot wonders what to do with
# a link. If it's None, the user is prompted (default behaviour).
always = None
@@ -1020,7 +1028,7 @@
# For sorting the linked pages, case can be ignored
minimum = 0
- local_args = pywikibot.handleArgs(*args)
+ local_args = pywikibot.handle_args(args)
for arg in local_args:
if arg.startswith('-primary:'):
diff --git a/scripts/spamremove.py b/scripts/spamremove.py
index 57c1b53..f4156bf 100755
--- a/scripts/spamremove.py
+++ b/scripts/spamremove.py
@@ -37,11 +37,19 @@
from pywikibot.editor import TextEditor
-def main():
+def main(*args):
+ """
+ Process command line arguments and perform task.
+
+ If args is an empty list, sys.argv is used.
+
+ @param args: command line arguments
+ @type args: list of unicode
+ """
always = False
namespaces = []
spamSite = ''
- for arg in pywikibot.handleArgs():
+ for arg in pywikibot.handle_args(args):
if arg == "-always":
always = True
elif arg.startswith('-namespace:'):
diff --git a/scripts/template.py b/scripts/template.py
index 48763cc..bd3fc40 100755
--- a/scripts/template.py
+++ b/scripts/template.py
@@ -292,6 +292,14 @@
def main(*args):
+ """
+ Process command line arguments and invoke bot.
+
+ If args is an empty list, sys.argv is used.
+
+ @param args: command line arguments
+ @type args: list of unicode
+ """
templateNames = []
templates = {}
options = {}
@@ -302,7 +310,7 @@
timestamp = None
# read command line parameters
- local_args = pywikibot.handleArgs(*args)
+ local_args = pywikibot.handle_args(args)
genFactory = pagegenerators.GeneratorFactory()
for arg in local_args:
if arg == '-remove':
diff --git a/scripts/templatecount.py b/scripts/templatecount.py
index cd2aa01..b26c37d 100644
--- a/scripts/templatecount.py
+++ b/scripts/templatecount.py
@@ -107,12 +107,20 @@
yield template, transcludingArray
-def main():
+def main(*args):
+ """
+ Process command line arguments and invoke bot.
+
+ If args is an empty list, sys.argv is used.
+
+ @param args: command line arguments
+ @type args: list of unicode
+ """
operation = None
argsList = []
namespaces = []
- for arg in pywikibot.handleArgs():
+ for arg in pywikibot.handle_args(args):
if arg in ('-count', '-list'):
operation = arg[1:]
elif arg.startswith('-namespace:'):
diff --git a/scripts/touch.py b/scripts/touch.py
index 4caabe1..3d696a6 100755
--- a/scripts/touch.py
+++ b/scripts/touch.py
@@ -68,11 +68,19 @@
def main(*args):
+ """
+ Process command line arguments and invoke bot.
+
+ If args is an empty list, sys.argv is used.
+
+ @param args: command line arguments
+ @type args: list of unicode
+ """
gen = None
options = {}
# Process global args and prepare generator args parser
- local_args = pywikibot.handleArgs(*args)
+ local_args = pywikibot.handle_args(args)
genFactory = pagegenerators.GeneratorFactory()
for arg in local_args:
diff --git a/scripts/transferbot.py b/scripts/transferbot.py
index 12590c1..594ba59 100644
--- a/scripts/transferbot.py
+++ b/scripts/transferbot.py
@@ -77,8 +77,16 @@
pass
-def main():
- tohandle = pywikibot.handleArgs()
+def main(*args):
+ """
+ Process command line arguments and invoke bot.
+
+ If args is an empty list, sys.argv is used.
+
+ @param args: command line arguments
+ @type args: list of unicode
+ """
+ local_args = pywikibot.handle_args(args)
fromsite = pywikibot.Site()
tolang = fromsite.code
@@ -89,7 +97,7 @@
genFactory = pagegenerators.GeneratorFactory()
- for arg in tohandle:
+ for arg in local_args:
if genFactory.handleArg(arg):
gen_args.append(arg)
continue
diff --git a/scripts/unlink.py b/scripts/unlink.py
index b9525ce..ee0796a 100755
--- a/scripts/unlink.py
+++ b/scripts/unlink.py
@@ -164,13 +164,21 @@
pywikibot.output(u"Page %s is locked?!" % page.title(asLink=True))
-def main():
+def main(*args):
+ """
+ Process command line arguments and invoke bot.
+
+ If args is an empty list, sys.argv is used.
+
+ @param args: command line arguments
+ @type args: list of unicode
+ """
# This temporary string is used to read the title
# of the page that should be unlinked.
page_title = None
options = {}
- for arg in pywikibot.handleArgs():
+ for arg in pywikibot.handle_args(args):
if arg.startswith('-namespace:'):
if 'namespaces' not in options:
options['namespaces'] = []
diff --git a/scripts/unusedfiles.py b/scripts/unusedfiles.py
index 3125e95..bf17f78 100644
--- a/scripts/unusedfiles.py
+++ b/scripts/unusedfiles.py
@@ -103,10 +103,18 @@
self.userPut(page, oldtext, text, comment=self.summary)
-def main():
+def main(*args):
+ """
+ Process command line arguments and invoke bot.
+
+ If args is an empty list, sys.argv is used.
+
+ @param args: command line arguments
+ @type args: list of unicode
+ """
options = {}
- for arg in pywikibot.handleArgs():
+ for arg in pywikibot.handle_args(args):
if arg == '-always':
options['always'] = True
diff --git a/scripts/upload.py b/scripts/upload.py
index aab2f62..22d1f12 100755
--- a/scripts/upload.py
+++ b/scripts/upload.py
@@ -319,6 +319,14 @@
def main(*args):
+ """
+ Process command line arguments and invoke bot.
+
+ If args is an empty list, sys.argv is used.
+
+ @param args: command line arguments
+ @type args: list of unicode
+ """
url = u''
description = []
keepFilename = False
@@ -331,7 +339,7 @@
# process all global bot args
# returns a list of non-global args, i.e. args for upload.py
- for arg in pywikibot.handleArgs(*args):
+ for arg in pywikibot.handle_args(args):
if arg:
if arg.startswith('-keep'):
keepFilename = True
diff --git a/scripts/watchlist.py b/scripts/watchlist.py
index 140b858..2414c82 100755
--- a/scripts/watchlist.py
+++ b/scripts/watchlist.py
@@ -133,13 +133,19 @@
refresh(pywikibot.Site(lang, family))
-def main():
- """ Script entry point. """
- local_args = pywikibot.handleArgs()
+def main(*args):
+ """
+ Process command line arguments and invoke bot.
+
+ If args is an empty list, sys.argv is used.
+
+ @param args: command line arguments
+ @type args: list of unicode
+ """
all = False
new = False
sysop = False
- for arg in local_args:
+ for arg in pywikibot.handle_args(args):
if arg in ('-all', '-update'):
all = True
elif arg == '-new':
@@ -160,7 +166,4 @@
pywikibot.output(pageName, toStdout=True)
if __name__ == "__main__":
- try:
- main()
- finally:
- pywikibot.stopme()
+ main()
diff --git a/scripts/weblinkchecker.py b/scripts/weblinkchecker.py
index 80d7415..062f883 100644
--- a/scripts/weblinkchecker.py
+++ b/scripts/weblinkchecker.py
@@ -801,7 +801,15 @@
return c.check()
-def main():
+def main(*args):
+ """
+ Process command line arguments and invoke bot.
+
+ If args is an empty list, sys.argv is used.
+
+ @param args: command line arguments
+ @type args: list of unicode
+ """
gen = None
xmlFilename = None
# Which namespaces should be processed?
@@ -811,7 +819,7 @@
day = 7
# Process global args and prepare generator args parser
- local_args = pywikibot.handleArgs()
+ local_args = pywikibot.handle_args(args)
genFactory = pagegenerators.GeneratorFactory()
for arg in local_args:
diff --git a/scripts/welcome.py b/scripts/welcome.py
index 3ae71b0..e0805ce 100644
--- a/scripts/welcome.py
+++ b/scripts/welcome.py
@@ -887,8 +887,16 @@
globalvar = Global()
-def main():
- for arg in pywikibot.handleArgs():
+def main(*args):
+ """
+ Process command line arguments and invoke bot.
+
+ If args is an empty list, sys.argv is used.
+
+ @param args: command line arguments
+ @type args: list of unicode
+ """
+ for arg in pywikibot.handle_args(args):
if arg.startswith('-edit'):
if len(arg) == 5:
globalvar.attachEditCount = int(pywikibot.input(
diff --git a/tox.ini b/tox.ini
index f722b37..359851b 100644
--- a/tox.ini
+++ b/tox.ini
@@ -53,6 +53,7 @@
./pywikibot/data/api.py \
./pywikibot/userinterfaces/transliteration.py \
./pywikibot/userinterfaces/terminal_interface.py \
+ ./scripts/category.py \
./scripts/claimit.py \
./scripts/coordinate_import.py \
./scripts/harvest_template.py \
--
To view, visit https://gerrit.wikimedia.org/r/165452
To unsubscribe, visit https://gerrit.wikimedia.org/r/settings
Gerrit-MessageType: merged
Gerrit-Change-Id: I828ba3cf155fb6d3e052564d20768fdf9c3acc95
Gerrit-PatchSet: 4
Gerrit-Project: pywikibot/core
Gerrit-Branch: master
Gerrit-Owner: John Vandenberg <jayvdb(a)gmail.com>
Gerrit-Reviewer: John Vandenberg <jayvdb(a)gmail.com>
Gerrit-Reviewer: Ladsgroup <ladsgroup(a)gmail.com>
Gerrit-Reviewer: Merlijn van Deen <valhallasw(a)arctus.nl>
Gerrit-Reviewer: XZise <CommodoreFabianus(a)gmx.de>
Gerrit-Reviewer: jenkins-bot <>
jenkins-bot has submitted this change and it was merged.
Change subject: Check httplib2.__version__ exists
......................................................................
Check httplib2.__version__ exists
Python 3.4+ will load externals/httplib2 when there is no system
package of that name, even if the directory is empty because the
user has not cloned the git submodules.
Bug: 72249
Change-Id: Ica1fb38d40f8c5e6e9d458b4c0ea2008df4b65c5
---
M pwb.py
1 file changed, 13 insertions(+), 2 deletions(-)
Approvals:
XZise: Looks good to me, approved
jenkins-bot: Verified
diff --git a/pwb.py b/pwb.py
index 66b3aa0..befa539 100644
--- a/pwb.py
+++ b/pwb.py
@@ -101,11 +101,21 @@
# try importing the known externals, and raise an error if they are not found
try:
import httplib2
+ if not hasattr(httplib2, '__version__'):
+ print("httplib2 import problem: httplib2.__version__ does not exist.")
+ if sys.version_info > (3, 3):
+ print("Python 3.4+ has probably loaded externals/httplib2 "
+ "although it doesnt have an __init__.py.")
+ httplib2 = None
except ImportError as e:
print("ImportError: %s" % e)
+ httplib2 = None
+
+if not httplib2:
print("Python module httplib2 >= 0.6.0 is required.")
print("Did you clone without --recursive?\n"
- "Try running 'git submodule update --init'.")
+ "Try running 'git submodule update --init' "
+ "or 'pip install httplib2'.")
sys.exit(1)
# httplib2 0.6.0 was released with __version__ as '$Rev$'
@@ -117,7 +127,8 @@
print("Python module httplib2 (%s) needs to be 0.6.0 or greater." %
httplib2.__file__)
print("Did you clone without --recursive?\n"
- "Try running 'git submodule update --init'.")
+ "Try running 'git submodule update --init' "
+ "or 'pip install --upgrade httplib2'.")
sys.exit(1)
del httplib2
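The two-step guard can be read as the following consolidated sketch (the error text and the 0.6.0 floor come from pwb.py; everything else is illustrative):

import sys

try:
    import httplib2
    if not hasattr(httplib2, '__version__'):
        # An empty externals/httplib2 directory is importable as a namespace
        # package on Python 3.4+, so the module exists but carries nothing.
        httplib2 = None
except ImportError:
    httplib2 = None

if httplib2 is None:
    sys.exit("Python module httplib2 >= 0.6.0 is required. Run "
             "'git submodule update --init' or 'pip install httplib2'.")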
--
To view, visit https://gerrit.wikimedia.org/r/167533
To unsubscribe, visit https://gerrit.wikimedia.org/r/settings
Gerrit-MessageType: merged
Gerrit-Change-Id: Ica1fb38d40f8c5e6e9d458b4c0ea2008df4b65c5
Gerrit-PatchSet: 1
Gerrit-Project: pywikibot/core
Gerrit-Branch: master
Gerrit-Owner: John Vandenberg <jayvdb(a)gmail.com>
Gerrit-Reviewer: Ladsgroup <ladsgroup(a)gmail.com>
Gerrit-Reviewer: Merlijn van Deen <valhallasw(a)arctus.nl>
Gerrit-Reviewer: XZise <CommodoreFabianus(a)gmx.de>
Gerrit-Reviewer: jenkins-bot <>
jenkins-bot has submitted this change and it was merged.
Change subject: api.py: RandomPageGenerator always returns 10 pages
......................................................................
api.py: RandomPageGenerator always returns 10 pages
In the QueryGenerator loop, self.data needs to be deleted at each cycle to
allow fresh data to be retrieved.
Bug: 72244
Change-Id: I78c32be606934f1abd51038e7150bd3a31b03462
---
M pywikibot/data/api.py
1 file changed, 1 insertion(+), 0 deletions(-)
Approvals:
John Vandenberg: Looks good to me, approved
jenkins-bot: Verified
diff --git a/pywikibot/data/api.py b/pywikibot/data/api.py
index 5357cc5..fff6959 100644
--- a/pywikibot/data/api.py
+++ b/pywikibot/data/api.py
@@ -1094,6 +1094,7 @@
if self.module == "random" and self.limit:
# "random" module does not return "query-continue"
# now we loop for a new random query
+ del self.data # a new request is needed
continue
if "query-continue" not in self.data:
return
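A toy illustration (not the pywikibot implementation) of why the cached response has to be dropped; the fake request below stands in for api.Request:

import random


class FakeRandomRequest(object):

    """Stand-in for api.Request: each submit() returns a fresh batch."""

    def submit(self):
        return {'query': {'random': [random.randint(1, 10 ** 6)
                                     for _ in range(10)]}}


def random_pages(limit):
    """Mimic the fixed loop: discard cached data so each cycle re-submits."""
    request = FakeRandomRequest()
    count = 0
    data = None
    while True:
        if data is None:
            data = request.submit()
        for item in data['query']['random']:
            yield item
            count += 1
            if count >= limit:
                return
        # 'random' returns no "query-continue"; dropping the cached result
        # (the equivalent of 'del self.data') forces a new request next cycle.
        data = None


pages = list(random_pages(30))
print(len(pages), len(set(pages)))  # 30 items, almost certainly more than 10 distinct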
--
To view, visit https://gerrit.wikimedia.org/r/167518
To unsubscribe, visit https://gerrit.wikimedia.org/r/settings
Gerrit-MessageType: merged
Gerrit-Change-Id: I78c32be606934f1abd51038e7150bd3a31b03462
Gerrit-PatchSet: 2
Gerrit-Project: pywikibot/core
Gerrit-Branch: master
Gerrit-Owner: Mpaa <mpaa.wiki(a)gmail.com>
Gerrit-Reviewer: John Vandenberg <jayvdb(a)gmail.com>
Gerrit-Reviewer: Ladsgroup <ladsgroup(a)gmail.com>
Gerrit-Reviewer: Merlijn van Deen <valhallasw(a)arctus.nl>
Gerrit-Reviewer: jenkins-bot <>