jenkins-bot has submitted this change and it was merged. ( https://gerrit.wikimedia.org/r/538006 )
Change subject: [bugfix] Implement deletedrevisions api call
......................................................................
[bugfix] Implement deletedrevisions api call
deletedrevs is deprecated and might be removed.
site changes:
- With MediaWiki 1.25 a 'deletedrevisions' PropertyGenerator is used
  to get deleted revisions.
- All API parameters can be used as keyword options.
- The revids parameter is available too.
- page was renamed to titles. Both titles and revids are allowed to
  be an iterable instead of a single item.
- All properties can be set with the prop parameter if properties
  other than the defaults should be shown. If the content of a
  revision is requested, 'content' is added to the given properties
  or to the defaults.
- Fall back to the 'deletedrevs' ListGenerator for mw_version prior
  to 1.25.
- Take into account that the two request types return different dicts.
Test changes:
- Replace "page" with "titles".
- Use subTest to separate tests inside a test method.
- The generator must be started to trigger the AssertionError,
  since we now have a real generator instead of an iterable.
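The prop translation mentioned above can be sketched as a standalone function (an illustrative sketch only; the shipped code lives in pywikibot/site.py and compares MediaWikiVersion objects rather than plain version strings):

```python
def translate_props(props, mw_version):
    """Translate 'deletedrevisions' props to legacy 'deletedrevs' props.

    Hypothetical standalone sketch; plain string version comparison is
    an assumption and only works for versions like '1.23'..'1.25'.
    """
    if isinstance(props, str):
        props = props.split('|')
    if mw_version >= '1.25':
        # 'deletedrevisions' accepts the new prop names as-is.
        return props
    old_props = []
    for item in props:
        if item == 'ids':
            old_props += ['revid', 'parentid']
        elif item == 'flags':
            old_props.append('minor')
        elif item != 'timestamp':  # timestamp is implicit in deletedrevs
            old_props.append(item)
        if item == 'content' and mw_version < '1.24':
            # very old wikis also need an undelete token
            old_props.append('token')
    return old_props
```

For example, `translate_props('ids|content', '1.23')` yields `['revid', 'parentid', 'content', 'token']`.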
Bug: T75370
Change-Id: I4c47abbc9ba0fcb03c20bcd4486020ac829a5b2b
---
M pywikibot/site.py
M tests/site_tests.py
2 files changed, 182 insertions(+), 91 deletions(-)
Approvals:
Huji: Looks good to me, approved
jenkins-bot: Verified
diff --git a/pywikibot/site.py b/pywikibot/site.py
index a55a66c..d1b6518 100644
--- a/pywikibot/site.py
+++ b/pywikibot/site.py
@@ -5015,69 +5015,129 @@
filters)
return wlgen
- # TODO: T75370
- @deprecated_args(step=None, get_text='content')
- def deletedrevs(self, page, start=None, end=None, reverse=False,
- content=False, total=None):
+ @deprecated_args(step=None, get_text='content', page='titles',
+ limit='total')
+ def deletedrevs(self, titles=None, start=None, end=None, reverse=False,
+ content=False, total=None, **kwargs):
"""Iterate deleted revisions.
Each value returned by the iterator will be a dict containing the
'title' and 'ns' keys for a particular Page and a 'revisions' key
whose value is a list of revisions in the same format as
- recentchanges (plus a 'content' element if requested). If get_text
- is true, the toplevel dict will contain a 'token' key as well.
+ recentchanges, plus a 'content' element with key '*' when the
+ 'content' parameter is set. For older wikis a 'token' key is also
+ returned with the content request.
- @see: U{https://www.mediawiki.org/wiki/API:Deletedrevs}
+ @see: U{https://www.mediawiki.org/wiki/API:Deletedrevisions}
- @param page: The page to check for deleted revisions
+ @param titles: The page titles to check for deleted revisions
+ @type titles: str (multiple titles delimited with '|')
+ or pywikibot.Page or typing.Iterable[pywikibot.Page]
+ or typing.Iterable[str]
+ @keyword revids: Get revisions by their ID
+
+ @note: either titles or revids must be set, but not both
+
@param start: Iterate revisions starting at this Timestamp
@param end: Iterate revisions ending at this Timestamp
@param reverse: Iterate oldest revisions first (default: newest)
@type reverse: bool
- @param content: If True, retrieve the content of each revision and
- an undelete token
+ @param content: If True, retrieve the content of each revision
+ @param total: number of revisions to retrieve
+ @keyword user: List revisions by this user
+ @keyword excludeuser: Exclude revisions by this user
+ @keyword tag: Only list revisions tagged with this tag
+ @keyword prop: Which properties to get. Defaults are ids, user,
+ comment, flags and timestamp
"""
+ def handle_props(props):
+ """Translate deletedrev props to deletedrevisions props."""
+ if isinstance(props, UnicodeType):
+ props = props.split('|')
+ if self.mw_version >= '1.25':
+ return props
+
+ old_props = []
+ for item in props:
+ if item == 'ids':
+ old_props += ['revid', 'parentid']
+ elif item == 'flags':
+ old_props.append('minor')
+ elif item == 'timestamp':
+ pass
+ else:
+ old_props.append(item)
+ if item == 'content' and self.mw_version < '1.24':
+ old_props.append('token')
+ return old_props
+
if start and end:
self.assert_valid_iter_params('deletedrevs', start, end, reverse)
if not self.logged_in():
self.login()
+
+ err = ('deletedrevs: User:{} not authorized to '
+ .format(self.user()))
if 'deletedhistory' not in self.userinfo['rights']:
- try:
- self.login(True)
- except NoUsername:
- pass
- if 'deletedhistory' not in self.userinfo['rights']:
- raise Error(
- 'deletedrevs: '
- 'User:%s not authorized to access deleted revisions.'
- % self.user())
+ raise Error(err + 'access deleted revisions.')
if content:
if 'undelete' not in self.userinfo['rights']:
- try:
- self.login(True)
- except NoUsername:
- pass
- if 'undelete' not in self.userinfo['rights']:
- raise Error(
- 'deletedrevs: '
- 'User:%s not authorized to view deleted content.'
- % self.user())
+ raise Error(err + 'view deleted content.')
- drgen = self._generator(api.ListGenerator, type_arg='deletedrevs',
- titles=page.title(with_section=False),
- drprop='revid|user|comment|minor',
- total=total)
+ revids = kwargs.pop('revids', None)
+ if not (bool(titles) ^ (revids is not None)):
+ raise Error('deletedrevs: either "titles" or "revids" parameter '
+ 'must be given.')
+ if revids and self.mw_version < '1.25':
+ raise NotImplementedError(
+ 'deletedrevs: "revid" is not implemented with MediaWiki {}'
+ .format(self.mw_version))
+
+ if self.mw_version >= '1.25':
+ pre = 'drv'
+ type_arg = 'deletedrevisions'
+ generator = api.PropertyGenerator
+ else:
+ pre = 'dr'
+ type_arg = 'deletedrevs'
+ generator = api.ListGenerator
+
+ gen = self._generator(generator, type_arg=type_arg,
+ titles=titles, revids=revids,
+ total=total)
+
+ gen.request[pre + 'start'] = start
+ gen.request[pre + 'end'] = end
+
+ # handle properties
+ prop = kwargs.pop('prop',
+ ['ids', 'user', 'comment', 'flags', 'timestamp'])
if content:
- drgen.request['drprop'] = (drgen.request['drprop']
- + ['content', 'token'])
- if start is not None:
- drgen.request['drstart'] = start
- if end is not None:
- drgen.request['drend'] = end
+ prop.append('content')
+ prop = handle_props(prop)
+ gen.request[pre + 'prop'] = prop
+
+ # handle other parameters like user
+ for k, v in kwargs.items():
+ gen.request[pre + k] = v
+
if reverse:
- drgen.request['drdir'] = 'newer'
- return drgen
+ gen.request[pre + 'dir'] = 'newer'
+
+ if self.mw_version < '1.25':
+ # yield from gen
+ for data in gen:
+ yield data
+ else:
+ # The dict result is different for both generators
+ for data in gen:
+ try:
+ data['revisions'] = data.pop('deletedrevisions')
+ except KeyError:
+ pass
+ else:
+ yield data
def users(self, usernames):
"""Iterate info about a list of users by name or IP.
diff --git a/tests/site_tests.py b/tests/site_tests.py
index d4bf58c..4c84f3b 100644
--- a/tests/site_tests.py
+++ b/tests/site_tests.py
@@ -1885,66 +1885,97 @@
"""Test the site.deletedrevs() method."""
mysite = self.get_site()
mainpage = self.get_mainpage()
- gen = mysite.deletedrevs(total=10, page=mainpage)
+ gen = mysite.deletedrevs(total=10, titles=mainpage)
+
for dr in gen:
break
else:
self.skipTest(
'{0} contains no deleted revisions.'.format(mainpage))
self.assertLessEqual(len(dr['revisions']), 10)
- self.assertTrue(all(isinstance(rev, dict)
- for rev in dr['revisions']))
- for item in mysite.deletedrevs(start='2008-10-11T01:02:03Z',
- page=mainpage, total=5):
- for rev in item['revisions']:
- self.assertIsInstance(rev, dict)
- self.assertLessEqual(rev['timestamp'], '2008-10-11T01:02:03Z')
- for item in mysite.deletedrevs(end='2008-04-01T02:03:04Z',
- page=mainpage, total=5):
- for rev in item['revisions']:
- self.assertIsInstance(rev, dict)
- self.assertGreaterEqual(rev['timestamp'],
- '2008-10-11T02:03:04Z')
- for item in mysite.deletedrevs(start='2008-10-11T03:05:07Z',
- page=mainpage, total=5,
- reverse=True):
- for rev in item['revisions']:
- self.assertIsInstance(rev, dict)
- self.assertGreaterEqual(rev['timestamp'],
- '2008-10-11T03:05:07Z')
- for item in mysite.deletedrevs(end='2008-10-11T04:06:08Z',
- page=mainpage, total=5,
- reverse=True):
- for rev in item['revisions']:
- self.assertIsInstance(rev, dict)
- self.assertLessEqual(rev['timestamp'], '2008-10-11T04:06:08Z')
- for item in mysite.deletedrevs(start='2008-10-13T11:59:59Z',
- end='2008-10-13T00:00:01Z',
- page=mainpage, total=5):
- for rev in item['revisions']:
- self.assertIsInstance(rev, dict)
- self.assertLessEqual(rev['timestamp'], '2008-10-13T11:59:59Z')
- self.assertGreaterEqual(rev['timestamp'],
- '2008-10-13T00:00:01Z')
- for item in mysite.deletedrevs(start='2008-10-15T06:00:01Z',
- end='2008-10-15T23:59:59Z',
- page=mainpage, reverse=True,
- total=5):
- for rev in item['revisions']:
- self.assertIsInstance(rev, dict)
- self.assertLessEqual(rev['timestamp'], '2008-10-15T23:59:59Z')
- self.assertGreaterEqual(rev['timestamp'],
- '2008-10-15T06:00:01Z')
+ self.assertTrue(all(isinstance(rev, dict) for rev in dr['revisions']))
+
+ with self.subTest(start='2008-10-11T01:02:03Z', reverse=False):
+ for item in mysite.deletedrevs(start='2008-10-11T01:02:03Z',
+ titles=mainpage, total=5):
+ for rev in item['revisions']:
+ self.assertIsInstance(rev, dict)
+ self.assertLessEqual(rev['timestamp'],
+ '2008-10-11T01:02:03Z')
+
+ with self.subTest(end='2008-04-01T02:03:04Z', reverse=False):
+ for item in mysite.deletedrevs(end='2008-04-01T02:03:04Z',
+ titles=mainpage, total=5):
+ for rev in item['revisions']:
+ self.assertIsInstance(rev, dict)
+ self.assertGreaterEqual(rev['timestamp'],
+ '2008-10-11T02:03:04Z')
+
+ with self.subTest(start='2008-10-11T03:05:07Z', reverse=True):
+ for item in mysite.deletedrevs(start='2008-10-11T03:05:07Z',
+ titles=mainpage, total=5,
+ reverse=True):
+ for rev in item['revisions']:
+ self.assertIsInstance(rev, dict)
+ self.assertGreaterEqual(rev['timestamp'],
+ '2008-10-11T03:05:07Z')
+
+ with self.subTest(end='2008-10-11T04:06:08Z', reverse=True):
+ for item in mysite.deletedrevs(end='2008-10-11T04:06:08Z',
+ titles=mainpage, total=5,
+ reverse=True):
+ for rev in item['revisions']:
+ self.assertIsInstance(rev, dict)
+ self.assertLessEqual(rev['timestamp'],
+ '2008-10-11T04:06:08Z')
+
+ with self.subTest(start='2008-10-13T11:59:59Z',
+ end='2008-10-13T00:00:01Z',
+ reverse=False):
+ for item in mysite.deletedrevs(start='2008-10-13T11:59:59Z',
+ end='2008-10-13T00:00:01Z',
+ titles=mainpage, total=5):
+ for rev in item['revisions']:
+ self.assertIsInstance(rev, dict)
+ self.assertLessEqual(rev['timestamp'],
+ '2008-10-13T11:59:59Z')
+ self.assertGreaterEqual(rev['timestamp'],
+ '2008-10-13T00:00:01Z')
+
+ with self.subTest(start='2008-10-15T06:00:01Z',
+ end='2008-10-15T23:59:59Z',
+ reverse=True):
+ for item in mysite.deletedrevs(start='2008-10-15T06:00:01Z',
+ end='2008-10-15T23:59:59Z',
+ titles=mainpage, total=5,
+ reverse=True):
+ for rev in item['revisions']:
+ self.assertIsInstance(rev, dict)
+ self.assertLessEqual(rev['timestamp'],
+ '2008-10-15T23:59:59Z')
+ self.assertGreaterEqual(rev['timestamp'],
+ '2008-10-15T06:00:01Z')
# start earlier than end
- self.assertRaises(AssertionError, mysite.deletedrevs,
- page=mainpage, start='2008-09-03T00:00:01Z',
- end='2008-09-03T23:59:59Z', total=5)
+ with self.subTest(start='2008-09-03T00:00:01Z',
+ end='2008-09-03T23:59:59Z',
+ reverse=False):
+ with self.assertRaises(AssertionError):
+ gen = mysite.deletedrevs(titles=mainpage,
+ start='2008-09-03T00:00:01Z',
+ end='2008-09-03T23:59:59Z', total=5)
+ next(gen)
+
# reverse: end earlier than start
- self.assertRaises(AssertionError, mysite.deletedrevs,
- page=mainpage, start='2008-09-03T23:59:59Z',
- end='2008-09-03T00:00:01Z', reverse=True,
- total=5)
+ with self.subTest(start='2008-09-03T23:59:59Z',
+ end='2008-09-03T00:00:01Z',
+ reverse=True):
+ with self.assertRaises(AssertionError):
+ gen = mysite.deletedrevs(titles=mainpage,
+ start='2008-09-03T23:59:59Z',
+ end='2008-09-03T00:00:01Z', total=5,
+ reverse=True)
+ next(gen)
class TestSiteSysopWrite(TestCase):
--
To view, visit https://gerrit.wikimedia.org/r/538006
To unsubscribe, or for help writing mail filters, visit https://gerrit.wikimedia.org/r/settings
Gerrit-Project: pywikibot/core
Gerrit-Branch: master
Gerrit-MessageType: merged
Gerrit-Change-Id: I4c47abbc9ba0fcb03c20bcd4486020ac829a5b2b
Gerrit-Change-Number: 538006
Gerrit-PatchSet: 16
Gerrit-Owner: Xqt <info(a)gno.de>
Gerrit-Reviewer: Huji <huji.huji(a)gmail.com>
Gerrit-Reviewer: Xqt <info(a)gno.de>
Gerrit-Reviewer: jenkins-bot (75)
jenkins-bot has submitted this change and it was merged. ( https://gerrit.wikimedia.org/r/538421 )
Change subject: site_tests.py: fix test_preload_templates_and_langlinks
......................................................................
site_tests.py: fix test_preload_templates_and_langlinks
The test was failing because not all pages are guaranteed to have
both templates and langlinks.
Filter the pages before the test and preload only pages with both.
Skip the test if no valid pages are available.
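The filtering step can be sketched with a stub page class (FakePage is hypothetical; the real code filters pywikibot.Page objects obtained from pagebacklinks()):

```python
class FakePage:
    """Minimal stand-in for pywikibot.Page (illustration only)."""

    def __init__(self, langlinks, templates):
        self._langlinks = langlinks
        self._templates = templates

    def langlinks(self):
        return self._langlinks

    def templates(self):
        return self._templates


def screen_pages(pages):
    """Keep only pages that have both langlinks and templates."""
    return [page for page in pages
            if page.langlinks() and page.templates()]
```

A page missing either attribute is dropped before preloading, so the assertions on `_templates` and `_langlinks` can no longer fail for pages that simply have none.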
Change-Id: Ic0145b51dc82e7b06b0d86db0acdaa9903b58527
---
M tests/site_tests.py
1 file changed, 16 insertions(+), 9 deletions(-)
Approvals:
Xqt: Looks good to me, approved
jenkins-bot: Verified
diff --git a/tests/site_tests.py b/tests/site_tests.py
index 73b2da3..fbc85d5 100644
--- a/tests/site_tests.py
+++ b/tests/site_tests.py
@@ -3255,7 +3255,6 @@
if count >= 6:
break
- @allowed_failure
def test_preload_templates_and_langlinks(self):
"""Test preloading templates and langlinks works."""
mysite = self.get_site()
@@ -3263,15 +3262,23 @@
count = 0
# Use backlinks, as any backlink has at least one link
links = mysite.pagebacklinks(mainpage, total=10)
+ # Screen pages before test;
+ # it is not guaranteed that all pages will have both.
+ links = [l for l in links if (l.langlinks() and l.templates())]
+ # Skip test if no valid pages are found.
+ if not links:
+ self.skipTest('No valid pages found to carry out test.')
+
for page in mysite.preloadpages(links, langlinks=True, templates=True):
- self.assertIsInstance(page, pywikibot.Page)
- self.assertIsInstance(page.exists(), bool)
- if page.exists():
- self.assertLength(page._revisions, 1)
- self.assertIsNotNone(page._revisions[page._revid].text)
- self.assertFalse(hasattr(page, '_pageprops'))
- self.assertTrue(hasattr(page, '_templates'))
- self.assertTrue(hasattr(page, '_langlinks'))
+ with self.subTest(page=page):
+ self.assertIsInstance(page, pywikibot.Page)
+ self.assertIsInstance(page.exists(), bool)
+ if page.exists():
+ self.assertLength(page._revisions, 1)
+ self.assertIsNotNone(page._revisions[page._revid].text)
+ self.assertFalse(hasattr(page, '_pageprops'))
+ self.assertTrue(hasattr(page, '_templates'))
+ self.assertTrue(hasattr(page, '_langlinks'))
count += 1
if count >= 6:
break
--
To view, visit https://gerrit.wikimedia.org/r/538421
Gerrit-Project: pywikibot/core
Gerrit-Branch: master
Gerrit-MessageType: merged
Gerrit-Change-Id: Ic0145b51dc82e7b06b0d86db0acdaa9903b58527
Gerrit-Change-Number: 538421
Gerrit-PatchSet: 2
Gerrit-Owner: Mpaa <mpaa.wiki(a)gmail.com>
Gerrit-Reviewer: Xqt <info(a)gno.de>
Gerrit-Reviewer: jenkins-bot (75)
jenkins-bot has submitted this change and it was merged. ( https://gerrit.wikimedia.org/r/538369 )
Change subject: [bugfix] Fix method call for assert_valid_iter_param
......................................................................
[bugfix] Fix method call for assert_valid_iter_param
- Also revise the assert_valid_iter_params test method because the
  required start/end order depends on the content: for timestamp
  values start must be later than end, whereas for titles the start
  value must be less than end.
- Instead of raising a pywikibot.Error it now raises an
  AssertionError if the order is wrong. This sounds like a breaking
  change, but assert_valid_iter_params should break the script anyway
  if the values are wrong.
- Add need_version 1.17 for the filearchive method.
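The new ordering rule can be sketched as a small predicate (a sketch mirroring the `reverse ^ is_ts` logic from the diff; the real method raises an AssertionError via `assert` instead of returning a bool):

```python
def valid_order(start, end, reverse, is_ts=True):
    """Return True if start/end are in a valid order.

    For timestamps (is_ts=True) iteration normally runs from newer to
    older, so start must be later than end; for titles it is the other
    way round. reverse flips the expectation in both cases.
    """
    if reverse ^ is_ts:
        low, high = end, start
    else:
        low, high = start, end
    return low < high
```

For timestamps with reverse=False, start='2008-09-03T23:59:59Z' and end='2008-09-03T00:00:01Z' is valid, while the swapped order is not.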
Bug: T233476
Change-Id: I25ed6568bc1bbe3c09f4471061c533080c345a89
---
M pywikibot/site.py
1 file changed, 31 insertions(+), 12 deletions(-)
Approvals:
Mpaa: Looks good to me, approved
jenkins-bot: Verified
diff --git a/pywikibot/site.py b/pywikibot/site.py
index 48ccbb5..1889462 100644
--- a/pywikibot/site.py
+++ b/pywikibot/site.py
@@ -21,6 +21,7 @@
import mimetypes
import os
import re
+from textwrap import fill
import threading
import time
import uuid
@@ -2270,16 +2271,33 @@
'articlepath must end with /$1'
return self.siteinfo['general']['articlepath'][:-2]
- def assert_valid_iter_params(self, msg_prefix, start, end, reverse):
- """Validate iterating API parameters."""
- if reverse:
- if end < start:
- raise Error(msg_prefix
- + ': end must be later than start'
- ' with reverse=True')
- elif start < end:
- raise Error(msg_prefix
- + ': start must be later than end with reverse=False')
+ def assert_valid_iter_params(self, msg_prefix, start, end, reverse,
+ is_ts=True):
+ """Validate iterating API parameters.
+
+ @param msg_prefix: The calling method name
+ @type msg_prefix: str
+ @param start: The start value to compare
+ @param end: The end value to compare
+ @param reverse: The reverse option
+ @type reverse: bool
+ @param is_ts: When comparing timestamps (is_ts=True) start is
+ usually greater than end; when comparing titles it is the other
+ way round.
+ @type is_ts: bool
+ @raises AssertionError: start/end values are in wrong order
+ """
+ if reverse ^ is_ts:
+ low, high = end, start
+ order = 'follow'
+ else:
+ low, high = start, end
+ order = 'precede'
+ msg = ('{method}: "start" must {order} "end" '
+ 'with reverse={reverse} and is_ts={is_ts} '
+ 'but "start" is "{start}" and "end" is "{end}".')
+ assert low < high, fill(msg.format(method=msg_prefix, order=order,
+ start=start, end=end,
+ reverse=reverse, is_ts=is_ts))
def has_right(self, right, sysop=False):
"""Return true if and only if the user has a specific right.
@@ -4470,6 +4488,7 @@
aigen.request['gaisha1base36'] = sha1base36
return aigen
+ @need_version('1.17')
@deprecated_args(limit='total') # ignore falimit setting
def filearchive(self, start=None, end=None, reverse=False, total=None,
**kwargs):
@@ -4489,8 +4508,8 @@
@keyword prop: Image information to get. Default is timestamp
"""
if start and end:
- self.assert_valid_iter_params(self, 'filearchive', start, end,
- reverse)
+ self.assert_valid_iter_params(
+ 'filearchive', start, end, reverse, is_ts=False)
fagen = self._generator(api.ListGenerator,
type_arg='filearchive',
fafrom=start,
--
To view, visit https://gerrit.wikimedia.org/r/538369
Gerrit-Project: pywikibot/core
Gerrit-Branch: master
Gerrit-MessageType: merged
Gerrit-Change-Id: I25ed6568bc1bbe3c09f4471061c533080c345a89
Gerrit-Change-Number: 538369
Gerrit-PatchSet: 5
Gerrit-Owner: Xqt <info(a)gno.de>
Gerrit-Reviewer: Mpaa <mpaa.wiki(a)gmail.com>
Gerrit-Reviewer: Multichill <maarten(a)mdammers.nl>
Gerrit-Reviewer: Xqt <info(a)gno.de>
Gerrit-Reviewer: jenkins-bot (75)
jenkins-bot has submitted this change and it was merged. ( https://gerrit.wikimedia.org/r/538386 )
Change subject: Remove list of wikis in documentation of welcome.py
......................................................................
Remove list of wikis in documentation of welcome.py
Change-Id: I2f2ca6e408aef2d62d2d804a1384d9dd19188030
---
M scripts/welcome.py
1 file changed, 2 insertions(+), 3 deletions(-)
Approvals:
Xqt: Looks good to me, approved
jenkins-bot: Verified
diff --git a/scripts/welcome.py b/scripts/welcome.py
index 9da4866..b4f6ee8 100755
--- a/scripts/welcome.py
+++ b/scripts/welcome.py
@@ -3,9 +3,8 @@
"""
Script to welcome new users.
-This script works out of the box for Wikis that
-have been defined in the script. It is currently used on the Dutch, Norwegian,
-Albanian, Italian Wikipedia, Wikimedia Commons and English Wikiquote.
+This script works out of the box for Wikis that have been
+defined in the script.
Ensure you have community support before running this bot!
--
To view, visit https://gerrit.wikimedia.org/r/538386
Gerrit-Project: pywikibot/core
Gerrit-Branch: master
Gerrit-MessageType: merged
Gerrit-Change-Id: I2f2ca6e408aef2d62d2d804a1384d9dd19188030
Gerrit-Change-Number: 538386
Gerrit-PatchSet: 4
Gerrit-Owner: Zoranzoki21 <zorandori4444(a)gmail.com>
Gerrit-Reviewer: D3r1ck01 <xsavitar.wiki(a)aol.com>
Gerrit-Reviewer: Xqt <info(a)gno.de>
Gerrit-Reviewer: Zoranzoki21 <zorandori4444(a)gmail.com>
Gerrit-Reviewer: jenkins-bot (75)
jenkins-bot has submitted this change and it was merged. ( https://gerrit.wikimedia.org/r/538383 )
Change subject: [tests] use subTest to detect failing tests more precisely
......................................................................
[tests] use subTest to detect failing tests more precisely
Bug: T233484
Change-Id: I9429388c1d948753f4c055b7a48be30f4dca492a
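The pattern from this change can be illustrated with a self-contained test (LangPrefixTests and its data are hypothetical, not from the repo):

```python
import unittest


class LangPrefixTests(unittest.TestCase):
    """Hypothetical test illustrating the subTest pattern."""

    langs = [{'prefix': 'en'}, {'prefix': 'de'}, {'prefix': 'fr'}]

    def test_prefixes(self):
        for lang in self.langs:
            # Each iteration is labelled, so a failing prefix is
            # reported individually instead of aborting the loop.
            with self.subTest(lang=lang['prefix']):
                self.assertEqual(len(lang['prefix']), 2)


suite = unittest.defaultTestLoader.loadTestsFromTestCase(LangPrefixTests)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

Without subTest, the first failing language would end the test method; with it, every language is checked and failures are reported with their `lang=` label.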
---
M tests/generate_family_files_tests.py
1 file changed, 3 insertions(+), 2 deletions(-)
Approvals:
Mpaa: Looks good to me, approved
jenkins-bot: Verified
diff --git a/tests/generate_family_files_tests.py b/tests/generate_family_files_tests.py
index 1c54bb8..7c4f008 100644
--- a/tests/generate_family_files_tests.py
+++ b/tests/generate_family_files_tests.py
@@ -61,8 +61,9 @@
self.assertIn(lang, self.generator_instance.wikis)
for i in range(10):
lang = choice(self.generator_instance.langs)
- site = Site(url=lang['url'])
- self.assertEqual(site.lang, lang['prefix'])
+ with self.subTest(lang=lang['prefix']):
+ site = Site(url=lang['url'])
+ self.assertEqual(site.lang, lang['prefix'])
if __name__ == '__main__': # pragma: no cover
--
To view, visit https://gerrit.wikimedia.org/r/538383
Gerrit-Project: pywikibot/core
Gerrit-Branch: master
Gerrit-MessageType: merged
Gerrit-Change-Id: I9429388c1d948753f4c055b7a48be30f4dca492a
Gerrit-Change-Number: 538383
Gerrit-PatchSet: 1
Gerrit-Owner: Xqt <info(a)gno.de>
Gerrit-Reviewer: Mpaa <mpaa.wiki(a)gmail.com>
Gerrit-Reviewer: jenkins-bot (75)