jenkins-bot has submitted this change and it was merged.
Change subject: MySQLPageGenerator: fall back to .my.cnf if config.db_hostname etc. are not set
......................................................................
MySQLPageGenerator: fall back to .my.cnf if config.db_hostname etc. are not set
Code courtesy: jayvdb (John Vandenberg)
Bug: T75147
Change-Id: I3e028d31e3ff7950f1a006ec5579558719a6736d
---
M pywikibot/config2.py
1 file changed, 1 insertion(+), 1 deletion(-)
Approvals:
John Vandenberg: Looks good to me, approved
Eranroz: Looks good to me, but someone else must approve
jenkins-bot: Verified
diff --git a/pywikibot/config2.py b/pywikibot/config2.py
index 6e1a27e..6fea03a 100644
--- a/pywikibot/config2.py
+++ b/pywikibot/config2.py
@@ -554,7 +554,7 @@
db_username = ''
db_password = ''
db_name_format = '{0}'
-db_connect_file = None
+db_connect_file = os.path.expanduser('~/.my.cnf')
# ############# SEARCH ENGINE SETTINGS ##############
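For context, this is roughly how such a fallback is consumed on the
client side. A minimal sketch, not the actual MySQLPageGenerator code,
assuming a MySQLdb/pymysql-style connector; the database name
'enwiki_p' is only illustrative:

    import MySQLdb  # pymysql accepts the same read_default_file option

    def connect(config):
        # Explicit config values win; otherwise the connector reads host,
        # user and password from the defaults file (now '~/.my.cnf').
        kwargs = {'db': 'enwiki_p',
                  'read_default_file': config.db_connect_file}
        if config.db_hostname:
            kwargs.update(host=config.db_hostname,
                          user=config.db_username,
                          passwd=config.db_password)
        return MySQLdb.connect(**kwargs)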
--
To view, visit https://gerrit.wikimedia.org/r/185618
To unsubscribe, visit https://gerrit.wikimedia.org/r/settings
Gerrit-MessageType: merged
Gerrit-Change-Id: I3e028d31e3ff7950f1a006ec5579558719a6736d
Gerrit-PatchSet: 2
Gerrit-Project: pywikibot/core
Gerrit-Branch: master
Gerrit-Owner: 01tonythomas <01tonythomas(a)gmail.com>
Gerrit-Reviewer: 01tonythomas <01tonythomas(a)gmail.com>
Gerrit-Reviewer: Eranroz <eranroz89(a)gmail.com>
Gerrit-Reviewer: John Vandenberg <jayvdb(a)gmail.com>
Gerrit-Reviewer: Ladsgroup <ladsgroup(a)gmail.com>
Gerrit-Reviewer: Merlijn van Deen <valhallasw(a)arctus.nl>
Gerrit-Reviewer: XZise <CommodoreFabianus(a)gmx.de>
Gerrit-Reviewer: jenkins-bot <>
jenkins-bot has submitted this change and it was merged.
Change subject: update version number
......................................................................
update version number
Change-Id: I2175a71304dc22ba43cb27b3f5f04fb8fb355b71
---
M family.py
1 file changed, 1 insertion(+), 1 deletion(-)
Approvals:
Xqt: Looks good to me, approved
jenkins-bot: Verified
diff --git a/family.py b/family.py
index d20c078..2730ae9 100644
--- a/family.py
+++ b/family.py
@@ -4963,7 +4963,7 @@
# Don't use this, use versionnumber() instead. This only exists
# to not break family files.
# Here we return the latest mw release of wikimedia projects
- return '1.25wmf13'
+ return '1.25wmf14'
def shared_image_repository(self, code):
return ('commons', 'commons')
--
To view, visit https://gerrit.wikimedia.org/r/185612
To unsubscribe, visit https://gerrit.wikimedia.org/r/settings
Gerrit-MessageType: merged
Gerrit-Change-Id: I2175a71304dc22ba43cb27b3f5f04fb8fb355b71
Gerrit-PatchSet: 1
Gerrit-Project: pywikibot/compat
Gerrit-Branch: master
Gerrit-Owner: Xqt <info(a)gno.de>
Gerrit-Reviewer: Xqt <info(a)gno.de>
Gerrit-Reviewer: jenkins-bot <>
jenkins-bot has submitted this change and it was merged.
Change subject: [IMPROV] Thread tests: Simplified intersections
......................................................................
[IMPROV] Thread tests: Simplified intersections
Instead of using itertools.product, the tests now use sets and their
'intersection' method.
Change-Id: I162573d122e09aba2b3dd0f2a12f78fb2b3018bf
---
M tests/thread_tests.py
1 file changed, 2 insertions(+), 12 deletions(-)
Approvals:
John Vandenberg: Looks good to me, approved
Mpaa: Looks good to me, but someone else must approve
jenkins-bot: Verified
diff --git a/tests/thread_tests.py b/tests/thread_tests.py
index 3222a8f..79a67a4 100644
--- a/tests/thread_tests.py
+++ b/tests/thread_tests.py
@@ -7,12 +7,6 @@
#
__version__ = '$Id$'
-import itertools
-import sys
-
-if sys.version_info[0] == 2:
- from future_builtins import filter
-
from tests.aspects import unittest, TestCase
from pywikibot.tools import ThreadedGenerator, intersect_generators
@@ -50,17 +44,13 @@
# first otherwise the generator is empty the second time.
datasets = [list(gen) for gen in gens]
- itertools_result = set(
- [item[0] for item in filter(
- lambda lst: all([x == lst[0] for x in lst]),
- itertools.product(*datasets))
- ])
+ set_result = set(datasets[0]).intersection(*datasets[1:])
result = list(intersect_generators(datasets))
self.assertEqual(len(set(result)), len(result))
- self.assertCountEqual(result, itertools_result)
+ self.assertCountEqual(result, set_result)
class BasicGeneratorIntersectTestCase(GeneratorIntersectTestCase):
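To see why the two computations agree, here is a small standalone
comparison of the old itertools.product approach and the new set
intersection (the sample lists are made up):

    import itertools

    datasets = [[1, 2, 3, 2], [2, 3, 4], [5, 3, 2]]

    # Old approach: build the cross product and keep tuples whose
    # elements all agree, then collect the agreed value.
    product_result = set(item[0] for item in itertools.product(*datasets)
                         if all(x == item[0] for x in item))

    # New approach: plain set intersection.
    set_result = set(datasets[0]).intersection(*datasets[1:])

    assert product_result == set_result == set([2, 3])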
--
To view, visit https://gerrit.wikimedia.org/r/185185
To unsubscribe, visit https://gerrit.wikimedia.org/r/settings
Gerrit-MessageType: merged
Gerrit-Change-Id: I162573d122e09aba2b3dd0f2a12f78fb2b3018bf
Gerrit-PatchSet: 2
Gerrit-Project: pywikibot/core
Gerrit-Branch: master
Gerrit-Owner: XZise <CommodoreFabianus(a)gmx.de>
Gerrit-Reviewer: John Vandenberg <jayvdb(a)gmail.com>
Gerrit-Reviewer: Ladsgroup <ladsgroup(a)gmail.com>
Gerrit-Reviewer: Merlijn van Deen <valhallasw(a)arctus.nl>
Gerrit-Reviewer: Mpaa <mpaa.wiki(a)gmail.com>
Gerrit-Reviewer: XZise <CommodoreFabianus(a)gmx.de>
Gerrit-Reviewer: jenkins-bot <>
jenkins-bot has submitted this change and it was merged.
Change subject: [FEAT] Request: Support booleans
......................................................................
[FEAT] Request: Support booleans
This allows parameter values to be booleans: an argument set to True is
added with an empty string as its value, and an argument set to False
is omitted from the request. This only applies to a plain bool, or to a
list or tuple of length 1 containing a bool.
It changes how the values are stored internally: instead of being cast
to strings when the value is defined (which loses the information of
whether it was a bool or not), all values are stored as they were given
(possibly wrapped in a list if they don't support iteration) and are
converted to strings only when the HTTP URI is built.
It also adds an 'OptionSet', a mapping of option names to booleans
which is rendered as a list of pipe-separated entries.
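In other words (a standalone sketch of the serialisation rule, not the
Request implementation itself): a single True becomes a parameter with
an empty value, a single False drops the parameter, and everything else
is joined with '|':

    def encode_params(params):
        encoded = {}
        for key, values in params.items():
            if not isinstance(values, (list, tuple)):
                values = [values]
            if len(values) == 1 and isinstance(values[0], bool):
                if values[0] is False:
                    continue        # False booleans are left out of the URI
                values = ['']       # True: parameter with an empty value
            encoded[key] = '|'.join(str(v) for v in values)
        return encoded

    print(encode_params({'action': 'query', 'indexpageids': True,
                         'redirects': False, 'prop': ['info', 'revisions']}))
    # -> action=query&indexpageids=&prop=info|revisions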
Change-Id: I1a61095168544334a0876aff927c4aa26e5fa125
---
M pywikibot/data/api.py
M pywikibot/site.py
M tests/api_tests.py
3 files changed, 312 insertions(+), 85 deletions(-)
Approvals:
John Vandenberg: Looks good to me, approved
jenkins-bot: Verified
diff --git a/pywikibot/data/api.py b/pywikibot/data/api.py
index 04245ce..f979b81 100644
--- a/pywikibot/data/api.py
+++ b/pywikibot/data/api.py
@@ -465,6 +465,194 @@
return self._with_limits
+class OptionSet(MutableMapping):
+
+ """
+ A class to store a set of options which can be either enabled or not.
+
+ If it is instantiated with the associated site, module and parameter it
+ will only allow valid names as options. If instantiated 'lazy loaded' it
+    won't check whether the names are valid until the site has been set
+    (which isn't required, but recommended). The site can only be set
+    once: after it has been set to a non-None value, any further attempt
+    to set it (even to None) will fail.
+ """
+
+ def __init__(self, site=None, module=None, param=None, dict=None):
+ """
+ Constructor.
+
+ If a site is given, the module and param must be given too.
+
+ @param site: The associated site
+ @type site: APISite
+ @param module: The module name which is used by paraminfo. (Ignored
+ when site is None)
+ @type module: string
+ @param param: The parameter name inside the module. That parameter must
+ have a 'type' entry. (Ignored when site is None)
+ @type param: string
+ @param dict: The initializing dict which is used for L{from_dict}.
+ @type dict: dict
+ """
+ self._site_set = False
+ self._enabled = set()
+ self._disabled = set()
+ self._set_site(site, module, param)
+ if dict:
+ self.from_dict(dict)
+
+ def _set_site(self, site, module, param, clear_invalid=False):
+ """
+ Set the site and valid names.
+
+        As soon as the site has been set to a non-None value, any
+        subsequent call will fail, unless invalid names were present and
+        a KeyError was thrown.
+
+ @param site: The associated site
+ @type site: APISite
+ @param module: The module name which is used by paraminfo.
+ @type module: string
+ @param param: The parameter name inside the module. That parameter must
+ have a 'type' entry.
+ @type param: string
+ @param clear_invalid: Instead of throwing a KeyError, invalid names are
+ silently removed from the options (disabled by default).
+ @type clear_invalid: bool
+ """
+ if self._site_set:
+ raise TypeError('The site can not be set multiple times.')
+ # If the entries written to this are valid, it will never be
+ # overwritten
+ self._valid_enable = set()
+ self._valid_disable = set()
+ if site is None:
+ return
+ for type in site._paraminfo.parameter(module, param)['type']:
+ if type[0] == '!':
+ self._valid_disable.add(type[1:])
+ else:
+ self._valid_enable.add(type)
+ if clear_invalid:
+ self._enabled &= self._valid_enable
+ self._disabled &= self._valid_disable
+ else:
+ invalid_names = ((self._enabled - self._valid_enable) |
+ (self._disabled - self._valid_disable))
+ if invalid_names:
+ raise KeyError(u'OptionSet already contains invalid name(s) '
+ u'"{0}"'.format('", "'.join(invalid_names)))
+ self._site_set = True
+
+ def from_dict(self, dict):
+ """
+ Load options from the dict.
+
+ The options are not cleared before. If changes have been made
+ previously, but only the dict values should be applied it needs to be
+ cleared first.
+
+ @param dict: A dictionary containing for each entry either the value
+ False, True or None. The names must be valid depending on whether
+ they enable or disable the option. All names with the value None
+            can be in either of the lists.
+ @type dict: dict (keys are strings, values are bool/None)
+ """
+ enabled = set()
+ disabled = set()
+ removed = set()
+ for name, value in dict.items():
+ if value is True:
+ enabled.add(name)
+ elif value is False:
+ disabled.add(name)
+ elif value is None:
+ removed.add(name)
+ else:
+ raise ValueError(u'Dict contains invalid value "{0}"'.format(
+ value))
+ invalid_names = (
+ (enabled - self._valid_enable) | (disabled - self._valid_disable) |
+ (removed - self._valid_enable - self._valid_disable)
+ )
+ if invalid_names and self._site_set:
+ raise ValueError(u'Dict contains invalid name(s) "{0}"'.format(
+ '", "'.join(invalid_names)))
+ self._enabled = enabled | (self._enabled - disabled - removed)
+ self._disabled = disabled | (self._disabled - enabled - removed)
+
+ def clear(self):
+ """Clear all enabled and disabled options."""
+ self._enabled.clear()
+ self._disabled.clear()
+
+ def __setitem__(self, name, value):
+ """Set option to enabled, disabled or neither."""
+ if value is True:
+ if self._site_set and name not in self._valid_enable:
+ raise KeyError(u'Invalid name "{0}"'.format(name))
+ self._enabled.add(name)
+ self._disabled.discard(name)
+ elif value is False:
+ if self._site_set and name not in self._valid_disable:
+ raise KeyError(u'Invalid name "{0}"'.format(name))
+ self._disabled.add(name)
+ self._enabled.discard(name)
+ elif value is None:
+ if self._site_set and (name not in self._valid_enable or
+ name not in self._valid_disable):
+ raise KeyError(u'Invalid name "{0}"'.format(name))
+ self._enabled.discard(name)
+ self._disabled.discard(name)
+ else:
+ raise ValueError(u'Invalid value "{0}"'.format(value))
+
+ def __getitem__(self, name):
+ """
+ Return whether the option is enabled.
+
+ @return: If the name has been set it returns whether it is enabled.
+ Otherwise it returns None. If the site has been set it raises a
+ KeyError if the name is invalid. Otherwise it might return a value
+ even though the name might be invalid.
+ @rtype: bool/None
+ """
+ if name in self._enabled:
+ return True
+ elif name in self._disabled:
+ return False
+ elif (self._site_set or name in self._valid_enable or
+ name in self._valid_disable):
+ return None
+ else:
+ raise KeyError(u'Invalid name "{0}"'.format(name))
+
+ def __delitem__(self, name):
+ """Remove the item by setting it to None."""
+ self[name] = None
+
+ def __contains__(self, name):
+ """Return True if option has been set."""
+ return name in self._enabled or name in self._disabled
+
+ def __iter__(self):
+ """Iterate over each enabled and disabled option."""
+ for enabled in self._enabled:
+ yield enabled
+ for disabled in self._disabled:
+ yield disabled
+
+ def api_iter(self):
+ """Iterate over each option as they appear in the URL."""
+ for enabled in self._enabled:
+ yield enabled
+ for disabled in self._disabled:
+ yield '!{0}'.format(disabled)
+
+ def __len__(self):
+ """Return the number of enabled and disabled options."""
+ return len(self._enabled) + len(self._disabled)
+
+
class TimeoutError(Error):
"""API request failed with a timeout error."""
@@ -665,14 +853,16 @@
if isinstance(value, unicode):
value = value.split("|")
- try:
- iter(value)
- values = value
- except TypeError:
- # convert any non-iterable value into a single-element list
- values = [value]
-
- self._params[key] = [self._format_value(item) for item in values]
+ if hasattr(value, 'api_iter'):
+ self._params[key] = value
+ else:
+ try:
+ iter(value)
+ except TypeError:
+ # convert any non-iterable value into a single-element list
+ self._params[key] = [value]
+ else:
+ self._params[key] = list(value)
def __delitem__(self, key):
del self._params[key]
@@ -782,8 +972,19 @@
@rtype: dict with values of either str or bytes
"""
params = {}
- for key, value in self._params.items():
- value = u"|".join(value)
+ for key, values in self._params.items():
+ try:
+ iterator = values.api_iter()
+ except AttributeError:
+ if len(values) == 1:
+ value = values[0]
+ if value is True:
+ values = ['']
+ elif value is False:
+ # False booleans are not in the http URI
+ continue
+ iterator = iter(values)
+ value = u'|'.join(self._format_value(value) for value in iterator)
# If the value is encodable as ascii, do not encode it.
# This means that any value which can be encoded as ascii
# is presumed to be ascii, and servers using a site encoding
@@ -1307,7 +1508,7 @@
raise Error("%s: No query module name found in arguments."
% self.__class__.__name__)
- kwargs["indexpageids"] = "" # always ask for list of pageids
+ kwargs['indexpageids'] = True # always ask for list of pageids
if MediaWikiVersion(self.site.version()) < MediaWikiVersion('1.21'):
self.continue_name = 'query-continue'
self.continue_update = self._query_continue
@@ -1315,7 +1516,7 @@
self.continue_name = 'continue'
self.continue_update = self._continue
# Explicitly enable the simplified continuation
- kwargs['continue'] = ''
+ kwargs['continue'] = True
self.request = Request(**kwargs)
# This forces all paraminfo for all query modules to be bulk loaded.
diff --git a/pywikibot/site.py b/pywikibot/site.py
index 9e0c75a..5454267 100644
--- a/pywikibot/site.py
+++ b/pywikibot/site.py
@@ -1177,10 +1177,10 @@
site=self._site,
action='query',
meta='siteinfo',
- siprop='|'.join(props))
+ siprop=props)
# With 1.25wmf5 it'll require continue or rawcontinue. As we don't
# continue anyway we just always use continue.
- request['continue'] = ''
+ request['continue'] = True
# warnings are handled later
request._warning_handler = warn_handler
data = request.submit()
@@ -2891,11 +2891,10 @@
else:
pltitle = page.title(withSection=False).encode(self.encoding())
plargs['titles'] = pltitle
- if follow_redirects:
- plargs['redirects'] = ''
plgen = self._generator(api.PageGenerator, type_arg="links",
namespaces=namespaces, step=step, total=total,
- g_content=content, **plargs)
+ g_content=content, redirects=follow_redirects,
+ **plargs)
return plgen
@deprecate_arg("withSortKey", None) # Sortkey doesn't work with generator
@@ -3342,11 +3341,9 @@
raise Error("alllinks: unique and fromids cannot both be True.")
algen = self._generator(api.ListGenerator, type_arg="alllinks",
namespaces=namespace, alfrom=start,
- step=step, total=total)
+ step=step, total=total, alunique=unique)
if prefix:
algen.request["alprefix"] = prefix
- if unique:
- algen.request["alunique"] = ""
if fromids:
algen.request["alprop"] = "title|ids"
for link in algen:
@@ -3658,7 +3655,7 @@
rcprop="user|comment|timestamp|title|ids"
"|sizes|redirect|loginfo|flags",
namespaces=namespaces, step=step,
- total=total)
+ total=total, rctoponly=topOnly)
if start is not None:
rcgen.request["rcstart"] = start
if end is not None:
@@ -3670,12 +3667,10 @@
pywikibot.warning(
u"recentchanges: pagelist option is disabled; ignoring.")
else:
- rcgen.request["rctitles"] = u"|".join(p.title(withSection=False)
- for p in pagelist)
+ rcgen.request["rctitles"] = (p.title(withSection=False)
+ for p in pagelist)
if changetype:
rcgen.request["rctype"] = changetype
- if topOnly:
- rcgen.request["rctoponly"] = ""
filters = {'minor': showMinor,
'bot': showBot,
'anon': showAnon,
@@ -3685,12 +3680,8 @@
self.has_right('patrol') or self.has_right('patrolmarks')):
rcgen.request['rcprop'] += ['patrolled']
filters['patrolled'] = showPatrolled
- rcshow = []
- for item in filters:
- if filters[item] is not None:
- rcshow.append(filters[item] and item or ("!" + item))
- if rcshow:
- rcgen.request["rcshow"] = "|".join(rcshow)
+ rcgen.request['rcshow'] = api.OptionSet(self, 'recentchanges', 'show',
+ filters)
if user:
rcgen.request['rcuser'] = user
@@ -3737,8 +3728,8 @@
gsrsearch=searchstring, gsrwhat=where,
namespaces=namespaces, step=step,
total=total, g_content=content)
- if getredirects and MediaWikiVersion(self.version()) < MediaWikiVersion('1.23'):
- srgen.request["gsrredirects"] = ""
+ if MediaWikiVersion(self.version()) < MediaWikiVersion('1.23'):
+ srgen.request['gsrredirects'] = getredirects
return srgen
def usercontribs(self, user=None, userprefix=None, start=None, end=None,
@@ -3776,7 +3767,7 @@
ucgen = self._generator(api.ListGenerator, type_arg="usercontribs",
ucprop="ids|title|timestamp|comment|flags",
namespaces=namespaces, step=step,
- total=total)
+ total=total, uctoponly=top_only)
if user:
ucgen.request["ucuser"] = user
if userprefix:
@@ -3787,10 +3778,9 @@
ucgen.request["ucend"] = str(end)
if reverse:
ucgen.request["ucdir"] = "newer"
- if showMinor is not None:
- ucgen.request["ucshow"] = showMinor and "minor" or "!minor"
- if top_only:
- ucgen.request["uctoponly"] = ""
+ option_set = api.OptionSet(self, 'usercontribs', 'show')
+ option_set['minor'] = showMinor
+ ucgen.request['ucshow'] = option_set
return ucgen
def watchlist_revs(self, start=None, end=None, reverse=False,
@@ -3834,12 +3824,8 @@
filters = {'minor': showMinor,
'bot': showBot,
'anon': showAnon}
- wlshow = []
- for item in filters:
- if filters[item] is not None:
- wlshow.append(filters[item] and item or ("!" + item))
- if wlshow:
- wlgen.request["wlshow"] = "|".join(wlshow)
+ wlgen.request['wlshow'] = api.OptionSet(self, 'watchlist', 'show',
+ filters)
return wlgen
# TODO: T75370
@@ -3959,9 +3945,7 @@
"""
rngen = self._generator(api.PageGenerator, type_arg="random",
namespaces=namespaces, step=step, total=total,
- g_content=content)
- if redirects:
- rngen.request["grnredirect"] = ""
+ g_content=content, grnredirect=redirects)
return rngen
# Catalog of editpage error codes, for use in generating messages.
@@ -4028,23 +4012,15 @@
self.lock_page(page)
params = dict(action="edit",
title=page.title(withSection=False),
- text=text, token=token, summary=summary)
- if bot:
- params['bot'] = ""
+ text=text, token=token, summary=summary, bot=bot,
+ recreate=recreate, createonly=createonly,
+ nocreate=nocreate, minor=minor,
+ notminor=not minor and notminor)
if lastrev is not None:
if lastrev not in page._revisions:
self.loadrevisions(page)
params['basetimestamp'] = page._revisions[lastrev].timestamp
- if minor:
- params['minor'] = ""
- elif notminor:
- params['notminor'] = ""
- if recreate:
- params['recreate'] = ""
- if createonly:
- params['createonly'] = ""
- if nocreate:
- params['nocreate'] = ""
+
watch_items = set(["watch", "unwatch", "preferences", "nochange"])
if watch in watch_items:
if MediaWikiVersion(self.version()) < MediaWikiVersion("1.16"):
@@ -4052,7 +4028,7 @@
pywikibot.warning(u'The watch value {0} is not supported '
'by {1}'.format(watch, self))
else:
- params[watch] = ""
+ params[watch] = True
else:
params['watchlist'] = watch
elif watch:
@@ -4198,12 +4174,9 @@
token = self.tokens['move']
self.lock_page(page)
req = api.Request(site=self, action="move", to=newtitle,
- token=token, reason=summary)
+ token=token, reason=summary, movetalk=movetalk,
+ noredirect=noredirect)
req['from'] = oldtitle # "from" is a python keyword
- if movetalk:
- req['movetalk'] = ""
- if noredirect:
- req['noredirect'] = ""
try:
result = req.submit()
pywikibot.debug(u"movepage response: %s" % result,
@@ -4575,17 +4548,10 @@
token = self.tokens['block']
req = api.Request(site=self, action='block', user=user.username,
- expiry=expiry, reason=reason, token=token)
- if anononly:
- req['anononly'] = ''
- if nocreate:
- req['nocreate'] = ''
- if autoblock:
- req['autoblock'] = ''
- if noemail:
- req['noemail'] = ''
- if reblock:
- req['reblock'] = ''
+ expiry=expiry, reason=reason, token=token,
+ anononly=anononly, nocreate=nocreate,
+ autoblock=autoblock, noemail=noemail,
+ reblock=reblock)
data = req.submit()
return data
@@ -4617,9 +4583,7 @@
"""
token = self.tokens['watch']
req = api.Request(action="watch", token=token,
- title=page.title(withSection=False))
- if unwatch:
- req["unwatch"] = ""
+ title=page.title(withSection=False), unwatch=unwatch)
result = req.submit()
if "watch" not in result:
pywikibot.error(u"watchpage: Unexpected API response:\n%s" % result)
@@ -4801,8 +4765,9 @@
f.seek(offset)
chunk = f.read(chunk_size)
req = api.Request(site=self, action='upload', token=token,
- stash='1', offset=offset, filesize=filesize,
+ stash=True, offset=offset, filesize=filesize,
filename=file_page_title,
+ ignorewarnings=ignore_warnings,
mime_params={}, throttle=throttle)
req.mime_params['chunk'] = (chunk,
("application", "octet-stream"),
@@ -4858,10 +4823,8 @@
filename=file_page_title,
url=source_url, comment=comment, text=text)
if not result:
- if watch:
- req["watch"] = ""
- if ignore_warnings:
- req["ignorewarnings"] = ""
+ req['watch'] = watch
+ req['ignorewarnings'] = ignore_warnings
try:
result = req.submit()
self._uploaddisabled = False
diff --git a/tests/api_tests.py b/tests/api_tests.py
index c417356..fae6537 100644
--- a/tests/api_tests.py
+++ b/tests/api_tests.py
@@ -243,6 +243,69 @@
self.assertIn('revisions', pi.prefixes)
+class TestOptionSet(TestCase):
+
+ """OptionSet class test class."""
+
+ family = 'wikipedia'
+ code = 'en'
+
+ def test_non_lazy_load(self):
+ """Test OptionSet with initialised site."""
+ options = api.OptionSet(self.get_site(), 'recentchanges', 'show')
+ self.assertRaises(KeyError, options.__setitem__, 'invalid_name', True)
+ self.assertRaises(ValueError, options.__setitem__, 'anon', 'invalid_value')
+ options['anon'] = True
+ self.assertCountEqual(['anon'], options._enabled)
+ self.assertEqual(set(), options._disabled)
+ self.assertEqual(1, len(options))
+ self.assertEqual(['anon'], list(options))
+ self.assertEqual(['anon'], list(options.api_iter()))
+ options['bot'] = False
+ self.assertCountEqual(['anon'], options._enabled)
+ self.assertCountEqual(['bot'], options._disabled)
+ self.assertEqual(2, len(options))
+ self.assertEqual(['anon', 'bot'], list(options))
+ self.assertEqual(['anon', '!bot'], list(options.api_iter()))
+ options.clear()
+ self.assertEqual(set(), options._enabled)
+ self.assertEqual(set(), options._disabled)
+ self.assertEqual(0, len(options))
+ self.assertEqual([], list(options))
+ self.assertEqual([], list(options.api_iter()))
+
+ def test_lazy_load(self):
+ """Test OptionSet with delayed site initialisation."""
+ options = api.OptionSet()
+ options['invalid_name'] = True
+ options['anon'] = True
+ self.assertIn('invalid_name', options._enabled)
+ self.assertEqual(2, len(options))
+ self.assertRaises(KeyError, options._set_site, self.get_site(),
+ 'recentchanges', 'show')
+ self.assertEqual(2, len(options))
+ options._set_site(self.get_site(), 'recentchanges', 'show', True)
+ self.assertEqual(1, len(options))
+ self.assertRaises(TypeError, options._set_site, self.get_site(),
+ 'recentchanges', 'show')
+
+
+class TestDryOptionSet(DefaultDrySiteTestCase):
+
+ """OptionSet class test class."""
+
+ def test_mutable_mapping(self):
+ """Test keys, values and items from MutableMapping."""
+ options = api.OptionSet()
+ options['a'] = True
+ options['b'] = False
+ options['c'] = None
+ self.assertCountEqual(['a', 'b'], list(options.keys()))
+ self.assertCountEqual([True, False], list(options.values()))
+ self.assertEqual(set(), set(options.values()) - set([True, False]))
+ self.assertCountEqual([('a', True), ('b', False)], list(options.items()))
+
+
class TestDryPageGenerator(TestCase):
"""Dry API PageGenerator object test class."""
--
To view, visit https://gerrit.wikimedia.org/r/184643
To unsubscribe, visit https://gerrit.wikimedia.org/r/settings
Gerrit-MessageType: merged
Gerrit-Change-Id: I1a61095168544334a0876aff927c4aa26e5fa125
Gerrit-PatchSet: 8
Gerrit-Project: pywikibot/core
Gerrit-Branch: master
Gerrit-Owner: XZise <CommodoreFabianus(a)gmx.de>
Gerrit-Reviewer: John Vandenberg <jayvdb(a)gmail.com>
Gerrit-Reviewer: Ladsgroup <ladsgroup(a)gmail.com>
Gerrit-Reviewer: Merlijn van Deen <valhallasw(a)arctus.nl>
Gerrit-Reviewer: XZise <CommodoreFabianus(a)gmx.de>
Gerrit-Reviewer: jenkins-bot <>
Build Update for wikimedia/pywikibot-core
-------------------------------------
Build: #1922
Status: Still Failing
Duration: 43 minutes and 38 seconds
Commit: a7004c9 (master)
Author: Fabian Neundorf
Message: [FIX] Flake8 on Python 3 fixes
This also changes how match_images in the match_images script returns
the match value: instead of ranging from 100 to 0 it now ranges from
1.0 to 0.0, so the complete script uses that range. In theory this is
a breaking change, but because the name was pep8ified when the script
was converted to core a few days ago it was broken anyway (and the old
non-pep8 function could be added back to support the old 0-100 range
if necessary). As a result, two images are now only reported as the
same when they match at 80 % and not at 0.8 %.
This also renames the job from flake83 to flake8-py3.
Change-Id: I95ed0de30d2993d41a70c9b47b76c6461f40e8a0
View the changeset: https://github.com/wikimedia/pywikibot-core/compare/feb0aa48647f...a7004c9d…
View the full build log and details: https://travis-ci.org/wikimedia/pywikibot-core/builds/46923811
--
You can configure recipients for build notifications in your .travis.yml file. See http://docs.travis-ci.com/user/notifications
jenkins-bot has submitted this change and it was merged.
Change subject: [FIX] Flake8 on Python 3 fixes
......................................................................
[FIX] Flake8 on Python 3 fixes
This also changes how match_images in the match_images script returns
the match value: instead of ranging from 100 to 0 it now ranges from
1.0 to 0.0, so the complete script uses that range. In theory this is
a breaking change, but because the name was pep8ified when the script
was converted to core a few days ago it was broken anyway (and the old
non-pep8 function could be added back to support the old 0-100 range
if necessary). As a result, two images are now only reported as the
same when they match at 80 % and not at 0.8 %.
This also renames the job from flake83 to flake8-py3.
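A quick illustration (sample values only) of why the scores had to be
rescaled: str.format's '%' conversion multiplies by 100 itself, so both
the new output lines and the hard-coded 'averageScore > 0.8' threshold
expect a fraction between 0.0 and 1.0:

    score = 0.8                               # new range: a fraction
    print('Average {0:>7.2%}'.format(score))  # Average  80.00%
    print('Average {0:>7.2%}'.format(80))     # Average 8000.00% (old 0-100 range)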
Change-Id: I95ed0de30d2993d41a70c9b47b76c6461f40e8a0
---
M scripts/match_images.py
M scripts/upload.py
M tox.ini
3 files changed, 15 insertions(+), 13 deletions(-)
Approvals:
Ricordisamoa: Looks good to me, approved
jenkins-bot: Verified
diff --git a/scripts/match_images.py b/scripts/match_images.py
index 304583b..55076e1 100644
--- a/scripts/match_images.py
+++ b/scripts/match_images.py
@@ -87,21 +87,23 @@
averageScore = (wholeScore + topleftScore + toprightScore +
bottomleftScore + bottomrightScore + centerScore) / 6
- print (u'Whole image ' + str(wholeScore))
- print (u'Top left of image ' + str(topleftScore))
- print (u'Top right of image ' + str(toprightScore))
- print (u'Bottom left of image ' + str(bottomleftScore))
- print (u'Bottom right of image ' + str(bottomrightScore))
- print (u'Center of image ' + str(centerScore))
- print (u' -------------')
- print (u'Average ' + str(averageScore))
+ pywikibot.output('Whole image {0:>7.2%}\n'
+ 'Top left of image {1:>7.2%}\n'
+ 'Top right of image {2:>7.2%}\n'
+ 'Bottom left of image {3:>7.2%}\n'
+ 'Bottom right of image {4:>7.2%}\n'
+ 'Center of image {5:>7.2%}\n'
+ ' -------\n'
+ 'Average {6:>7.2%}'.format(
+ wholeScore, topleftScore, toprightScore, bottomleftScore,
+ bottomrightScore, centerScore, averageScore))
# Hard coded at 80%, change this later on.
if averageScore > 0.8:
- print (u'We have a match!')
+ pywikibot.output('We have a match!')
return True
else:
- print (u'Not the same.')
+ pywikibot.output('Not the same.')
return False
@@ -133,7 +135,7 @@
if totalPixels == 0:
return 0
- return totalMatch / totalPixels * 100
+ return totalMatch / totalPixels
def main(*args):
diff --git a/scripts/upload.py b/scripts/upload.py
index e3daa44..b02068b 100755
--- a/scripts/upload.py
+++ b/scripts/upload.py
@@ -57,6 +57,7 @@
if sys.version_info[0] > 2:
from urllib.parse import urlparse
from urllib.request import URLopener
+ basestring = (str,)
else:
from urlparse import urlparse
from urllib import URLopener
diff --git a/tox.ini b/tox.ini
index eb8cf2c..ae60d99 100644
--- a/tox.ini
+++ b/tox.ini
@@ -18,12 +18,11 @@
deps = flake8
flake8-docstrings
-[testenv:flake83]
+[testenv:flake8-py3]
commands = flake8 --ignore=D102,D103,E122,E127,E241 {posargs}
basepython = python3
deps = flake8
flake8-docstrings
-
[testenv:flake8-docstrings]
commands = flake8 {posargs}
--
To view, visit https://gerrit.wikimedia.org/r/184781
To unsubscribe, visit https://gerrit.wikimedia.org/r/settings
Gerrit-MessageType: merged
Gerrit-Change-Id: I95ed0de30d2993d41a70c9b47b76c6461f40e8a0
Gerrit-PatchSet: 2
Gerrit-Project: pywikibot/core
Gerrit-Branch: master
Gerrit-Owner: XZise <CommodoreFabianus(a)gmx.de>
Gerrit-Reviewer: John Vandenberg <jayvdb(a)gmail.com>
Gerrit-Reviewer: Ladsgroup <ladsgroup(a)gmail.com>
Gerrit-Reviewer: Merlijn van Deen <valhallasw(a)arctus.nl>
Gerrit-Reviewer: Ricordisamoa <ricordisamoa(a)openmailbox.org>
Gerrit-Reviewer: XZise <CommodoreFabianus(a)gmx.de>
Gerrit-Reviewer: jenkins-bot <>