jenkins-bot has submitted this change and it was merged.
Change subject: Fix warning for intersect mode with only one gen
......................................................................
Fix warning for intersect mode with only one gen
The warning message was using pywikibot.input
instead of pywikibot.warning.
Change-Id: Iefc5c8b7885d73130066a6142b67e5ee7ecfa5a3
---
M pywikibot/pagegenerators.py
1 file changed, 2 insertions(+), 2 deletions(-)
Approvals:
XZise: Looks good to me, approved
jenkins-bot: Verified
diff --git a/pywikibot/pagegenerators.py b/pywikibot/pagegenerators.py
index 0f6e8e5..e07bf44 100644
--- a/pywikibot/pagegenerators.py
+++ b/pywikibot/pagegenerators.py
@@ -375,8 +375,8 @@
gensList = self.gens[0]
dupfiltergen = gensList
if self.intersect:
- pywikibot.input(u'Only one generator. '
- u'Param "-intersect" has no meaning or effect.')
+ pywikibot.warning(
+ '"-intersect" ignored as only one generator is specified.')
else:
if self.intersect:
gensList = intersect_generators(self.gens)
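The branch being fixed can be sketched in isolation. This is a minimal, self-contained stand-in for the pagegenerators logic (not the real pywikibot code): with a single generator, "-intersect" is a no-op and should only emit a warning, never prompt for input.

```python
import itertools
import warnings


def combine_generators(gens, intersect=False):
    """Loose sketch of the combined-generator logic this change touches."""
    if len(gens) == 1:
        if intersect:
            # The fix: warn instead of calling an input prompt.
            warnings.warn('"-intersect" ignored as only one generator '
                          'is specified.')
        return iter(gens[0])
    if intersect:
        # Naive intersection preserving the first generator's order.
        others = [set(g) for g in gens[1:]]
        return (item for item in gens[0]
                if all(item in o for o in others))
    return itertools.chain(*gens)
```

With two generators, intersect mode yields only common items; with one generator, the warning fires and the generator passes through unchanged.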
--
To view, visit https://gerrit.wikimedia.org/r/202933
To unsubscribe, visit https://gerrit.wikimedia.org/r/settings
Gerrit-MessageType: merged
Gerrit-Change-Id: Iefc5c8b7885d73130066a6142b67e5ee7ecfa5a3
Gerrit-PatchSet: 1
Gerrit-Project: pywikibot/core
Gerrit-Branch: master
Gerrit-Owner: John Vandenberg <jayvdb(a)gmail.com>
Gerrit-Reviewer: Ladsgroup <ladsgroup(a)gmail.com>
Gerrit-Reviewer: Merlijn van Deen <valhallasw(a)arctus.nl>
Gerrit-Reviewer: XZise <CommodoreFabianus(a)gmx.de>
Gerrit-Reviewer: jenkins-bot <>
jenkins-bot has submitted this change and it was merged.
Change subject: [FIX] site tests: Correct debug output
......................................................................
[FIX] site tests: Correct debug output
The patch 3fd46e9a assumed that the links were actual Link instances, but
they are actually Page instances, so the correct methods need to be used.
Change-Id: I7617e643d568973b76322fd3cfe97d0e930c25f3
---
M tests/site_tests.py
1 file changed, 1 insertion(+), 1 deletion(-)
Approvals:
John Vandenberg: Looks good to me, approved
jenkins-bot: Verified
diff --git a/tests/site_tests.py b/tests/site_tests.py
index b58bbd5..50ec5d5 100644
--- a/tests/site_tests.py
+++ b/tests/site_tests.py
@@ -374,7 +374,7 @@
print('FAILURE wrt T92856:')
print(u'Sym. difference: "{0}"'.format(
u'", "'.join(
- u'{0}@{1}'.format(link.namespace, link.title)
+ u'{0}@{1}'.format(link.namespace(), link.title(withNamespace=False))
for link in namespace_links ^ links)))
self.assertCountEqual(
set(mysite.pagelinks(mainpage, namespaces=[0, 1])) - links, [])
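The distinction behind this fix: on a Link, namespace and title behave like attributes, while on a Page they are methods. A minimal stand-in class (not the real pywikibot.Page; the "Talk:" prefix here is a hypothetical example) shows why the formatting line needed call parentheses:

```python
class FakePage:
    """Minimal Page-like stand-in: namespace() and title() are methods."""

    def __init__(self, ns, title):
        self._ns = ns
        self._title = title

    def namespace(self):
        return self._ns

    def title(self, withNamespace=True):
        # Hypothetical prefix handling, just to illustrate the keyword.
        if withNamespace and self._ns:
            return 'Talk:' + self._title
        return self._title


p = FakePage(1, 'Example')
# Calling the methods, as the corrected debug output does:
formatted = '{0}@{1}'.format(p.namespace(), p.title(withNamespace=False))
```

Without the parentheses, the format string would interpolate bound-method reprs instead of the namespace number and title.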
--
To view, visit https://gerrit.wikimedia.org/r/202696
Gerrit-MessageType: merged
Gerrit-Change-Id: I7617e643d568973b76322fd3cfe97d0e930c25f3
Gerrit-PatchSet: 1
Gerrit-Project: pywikibot/core
Gerrit-Branch: master
Gerrit-Owner: XZise <CommodoreFabianus(a)gmx.de>
Gerrit-Reviewer: John Vandenberg <jayvdb(a)gmail.com>
Gerrit-Reviewer: Ladsgroup <ladsgroup(a)gmail.com>
Gerrit-Reviewer: Merlijn van Deen <valhallasw(a)arctus.nl>
Gerrit-Reviewer: jenkins-bot <>
jenkins-bot has submitted this change and it was merged.
Change subject: [FIX] BasePage: Raise exception on redirect
......................................................................
[FIX] BasePage: Raise exception on redirect
The pageAPInfo method in compat raises NoPage and IsRedirectPage when
either applies to the page. The port in 7362b605 does raise NoPage
indirectly (as already existing methods handle that) but doesn't raise
IsRedirectPage (because a redirect page can have a latest revision).
Change-Id: Ifdd8f4bf1d0a635a998375119715cab6ffc1f8e1
---
M pywikibot/page.py
1 file changed, 2 insertions(+), 0 deletions(-)
Approvals:
John Vandenberg: Looks good to me, approved
jenkins-bot: Verified
diff --git a/pywikibot/page.py b/pywikibot/page.py
index 6ff6d3a..357ec3f 100644
--- a/pywikibot/page.py
+++ b/pywikibot/page.py
@@ -430,6 +430,8 @@
@deprecated('latest_revision_id')
def pageAPInfo(self):
"""Return the current revision id for this page."""
+ if self.isRedirectPage():
+ raise pywikibot.IsRedirectPage(self)
return self.latest_revision_id
@property
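The patched behaviour can be sketched with a stand-in class (names mirror the real API but this is not pywikibot itself): pageAPInfo() now checks for a redirect first and raises, matching compat, instead of silently returning the redirect's latest revision id.

```python
class IsRedirectPage(Exception):
    """Stand-in for pywikibot.IsRedirectPage."""


class FakeBasePage:
    """Sketch of BasePage after the fix."""

    def __init__(self, latest_revision_id, redirect=False):
        self._revid = latest_revision_id
        self._redirect = redirect

    def isRedirectPage(self):
        return self._redirect

    def pageAPInfo(self):
        # The added guard: redirects raise instead of returning a revid.
        if self.isRedirectPage():
            raise IsRedirectPage(self)
        return self._revid
```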
--
To view, visit https://gerrit.wikimedia.org/r/202439
Gerrit-MessageType: merged
Gerrit-Change-Id: Ifdd8f4bf1d0a635a998375119715cab6ffc1f8e1
Gerrit-PatchSet: 1
Gerrit-Project: pywikibot/core
Gerrit-Branch: master
Gerrit-Owner: XZise <CommodoreFabianus(a)gmx.de>
Gerrit-Reviewer: John Vandenberg <jayvdb(a)gmail.com>
Gerrit-Reviewer: Ladsgroup <ladsgroup(a)gmail.com>
Gerrit-Reviewer: Merlijn van Deen <valhallasw(a)arctus.nl>
Gerrit-Reviewer: jenkins-bot <>
jenkins-bot has submitted this change and it was merged.
Change subject: WDQ: yield ItemPages for DataSite
......................................................................
WDQ: yield ItemPages for DataSite
Add a mode in which WikidataQueryPageGenerator generates ItemPage
objects from the WikidataQuery if a DataSite is given as the site.
Change-Id: I04010c199bb346f54554567166b36a64d5eabec3
---
M pywikibot/pagegenerators.py
1 file changed, 4 insertions(+), 0 deletions(-)
Approvals:
John Vandenberg: Looks good to me, approved
jenkins-bot: Verified
diff --git a/pywikibot/pagegenerators.py b/pywikibot/pagegenerators.py
index 6f01605..68cae1b 100644
--- a/pywikibot/pagegenerators.py
+++ b/pywikibot/pagegenerators.py
@@ -2251,6 +2251,10 @@
pywikibot.output(u'retrieved %d items' % data[u'status'][u'items'])
for item in data[u'items']:
page = pywikibot.ItemPage(repo, u'Q{0}'.format(item))
+ if isinstance(site, pywikibot.site.DataSite):
+ yield page
+ continue
+
try:
link = page.getSitelink(site)
except pywikibot.NoPage:
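The new branch is a simple early-yield inside the loop: when the target site is the data repository itself, the item is yielded directly and sitelink resolution is skipped. A self-contained sketch (the helper names and the sitelink callback are illustrative, not pywikibot's API):

```python
def item_pages(item_ids, site_is_datasite, get_sitelink):
    """Sketch of the generator branch added by this change.

    get_sitelink returns the linked page title for an item id, or None
    when no sitelink exists (standing in for the NoPage case).
    """
    for item in item_ids:
        page = 'Q{0}'.format(item)
        if site_is_datasite:
            # New mode: yield the item itself for a DataSite.
            yield page
            continue
        link = get_sitelink(page)
        if link is not None:
            yield link
```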
--
To view, visit https://gerrit.wikimedia.org/r/130320
Gerrit-MessageType: merged
Gerrit-Change-Id: I04010c199bb346f54554567166b36a64d5eabec3
Gerrit-PatchSet: 3
Gerrit-Project: pywikibot/core
Gerrit-Branch: master
Gerrit-Owner: FelixReimann <felix(a)fex-it.de>
Gerrit-Reviewer: John Vandenberg <jayvdb(a)gmail.com>
Gerrit-Reviewer: Ladsgroup <ladsgroup(a)gmail.com>
Gerrit-Reviewer: Merlijn van Deen <valhallasw(a)arctus.nl>
Gerrit-Reviewer: Mpaa <mpaa.wiki(a)gmail.com>
Gerrit-Reviewer: Ricordisamoa <ricordisamoa(a)openmailbox.org>
Gerrit-Reviewer: Xqt <info(a)gno.de>
Gerrit-Reviewer: jenkins-bot <>
jenkins-bot has submitted this change and it was merged.
Change subject: [IMPROV] site tests: More debug output
......................................................................
[IMPROV] site tests: More debug output
To further hunt down T92856 the test now outputs the symmetric
difference. That way it's possible to determine whether the same page,
but with a different 'case', is in the other set. It also explicitly
states the namespaces it uses.
Change-Id: Ib55b4927712400fda0fb16f6af272da0f977e21f
---
M tests/site_tests.py
1 file changed, 7 insertions(+), 0 deletions(-)
Approvals:
John Vandenberg: Looks good to me, approved
jenkins-bot: Verified
diff --git a/tests/site_tests.py b/tests/site_tests.py
index 762f91f..c7b2c90 100644
--- a/tests/site_tests.py
+++ b/tests/site_tests.py
@@ -369,6 +369,13 @@
# TODO: There have been build failures because the following assertion
# wasn't true. Bug: T92856
# Example: https://travis-ci.org/wikimedia/pywikibot-core/jobs/54552081#L505
+ namespace_links = set(mysite.pagelinks(mainpage, namespaces=[0, 1]))
+ if namespace_links - links:
+ print('FAILURE wrt T92856:')
+ print(u'Sym. difference: "{0}"'.format(
+ u'", "'.join(
+ u'{0}@{1}'.format(link.namespace, link.title)
+ for link in namespace_links ^ links)))
self.assertCountEqual(
set(mysite.pagelinks(mainpage, namespaces=[0, 1])) - links, [])
for target in mysite.preloadpages(mysite.pagelinks(mainpage,
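What the added debug output computes: the symmetric difference (`^`) of the two link sets contains exactly the entries present in one set but not the other, which is what exposes a same-title-different-case mismatch. A tiny illustration with plain strings (hypothetical titles, not real wiki data):

```python
# Two link sets that should match but differ in the case of one title.
namespace_links = {'Main', 'talk:Main'}
links = {'Main', 'Talk:Main'}

# The symmetric difference isolates the mismatched pair; the common
# entry 'Main' drops out.
sym_diff = namespace_links ^ links
```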
--
To view, visit https://gerrit.wikimedia.org/r/198172
Gerrit-MessageType: merged
Gerrit-Change-Id: Ib55b4927712400fda0fb16f6af272da0f977e21f
Gerrit-PatchSet: 1
Gerrit-Project: pywikibot/core
Gerrit-Branch: master
Gerrit-Owner: XZise <CommodoreFabianus(a)gmx.de>
Gerrit-Reviewer: John Vandenberg <jayvdb(a)gmail.com>
Gerrit-Reviewer: Ladsgroup <ladsgroup(a)gmail.com>
Gerrit-Reviewer: Merlijn van Deen <valhallasw(a)arctus.nl>
Gerrit-Reviewer: jenkins-bot <>
jenkins-bot has submitted this change and it was merged.
Change subject: [IMPROV] Wikibase: Remove code duplication
......................................................................
[IMPROV] Wikibase: Remove code duplication
There was a bit of code duplication in the conversion into a difference.
It also iterated through a dictionary manually, although the 'in'
operator does the same.
Change-Id: If0bee33be087dd2e557d37ba59ac8938ca0b8d6c
---
M pywikibot/page.py
1 file changed, 29 insertions(+), 50 deletions(-)
Approvals:
Ricordisamoa: Looks good to me, approved
XZise: Looks good to me, but someone else must approve
jenkins-bot: Verified
diff --git a/pywikibot/page.py b/pywikibot/page.py
index 1ba050e..53dba3a 100644
--- a/pywikibot/page.py
+++ b/pywikibot/page.py
@@ -2834,30 +2834,27 @@
'descriptions': self.descriptions,
}
- def toJSON(self, diffto=None):
- labels = self._normalizeLanguages(self.labels).copy()
- if diffto and 'labels' in diffto:
- old = set(diffto['labels'].keys())
- new = set(labels.keys())
- for lang in old - new:
- labels[lang] = ''
- for lang in old.intersection(new):
- if labels[lang] == diffto['labels'][lang]['value']:
- del labels[lang]
- for lang in labels:
- labels[lang] = {'language': lang, 'value': labels[lang]}
+ def _diff_to(self, type_key, key_name, value_name, diffto, data):
+ assert(type_key not in data)
+ source = self._normalizeLanguages(getattr(self, type_key)).copy()
+ diffto = {} if not diffto else diffto.get(type_key, {})
+ new = set(source.keys())
+ for key in diffto:
+ if key in new:
+ if source[key] == diffto[key][value_name]:
+ del source[key]
+ else:
+ source[key] = ''
+ for key, value in source.items():
+ source[key] = {key_name: key, value_name: value}
+ if source:
+ data[type_key] = source
- descriptions = self._normalizeLanguages(self.descriptions).copy()
- if diffto and 'descriptions' in diffto:
- old = set(diffto['descriptions'].keys())
- new = set(descriptions.keys())
- for lang in old - new:
- descriptions[lang] = ''
- for lang in old.intersection(new):
- if descriptions[lang] == diffto['descriptions'][lang]['value']:
- del descriptions[lang]
- for lang in descriptions:
- descriptions[lang] = {'language': lang, 'value': descriptions[lang]}
+ def toJSON(self, diffto=None):
+ data = {}
+ self._diff_to('labels', 'language', 'value', diffto, data)
+
+ self._diff_to('descriptions', 'language', 'value', diffto, data)
aliases = self._normalizeLanguages(self.aliases).copy()
if diffto and 'aliases' in diffto:
@@ -2874,10 +2871,8 @@
if lang in aliases:
aliases[lang] = [{'language': lang, 'value': i} for i in strings]
- data = {}
- for key in ('labels', 'descriptions', 'aliases'):
- if len(locals()[key]) > 0:
- data[key] = locals()[key]
+ if aliases:
+ data['aliases'] = aliases
return data
def getID(self, numeric=False, force=False):
@@ -3176,17 +3171,7 @@
def toJSON(self, diffto=None):
data = super(ItemPage, self).toJSON(diffto=diffto)
- sitelinks = self.sitelinks.copy()
- if diffto and 'sitelinks' in diffto:
- old = set(diffto['sitelinks'].keys())
- new = set(sitelinks.keys())
- for dbName in old - new:
- sitelinks[dbName] = ''
- for dbName in old.intersection(new):
- if sitelinks[dbName] == diffto['sitelinks'][dbName]['title']:
- del sitelinks[dbName]
- for dbName in sitelinks:
- sitelinks[dbName] = {'site': dbName, 'title': sitelinks[dbName]}
+ self._diff_to('sitelinks', 'site', 'title', diffto, data)
claims = self.claims.copy()
for prop in claims.keys():
@@ -3198,17 +3183,12 @@
if diffto and 'claims' in diffto:
temp = {}
for prop in claims:
- for claim1 in claims[prop]:
- seen = False
- if prop in diffto['claims']:
- for claim2 in diffto['claims'][prop]:
- if claim2 == claim1:
- seen = True
- break
- if not seen:
+ for claim in claims[prop]:
+ if (prop not in diffto['claims'] or
+ claim not in diffto['claims'][prop]):
if prop not in temp:
temp[prop] = []
- temp[prop].append(claim1)
+ temp[prop].append(claim)
for prop in diffto['claims']:
if prop not in claims:
claims[prop] = []
@@ -3220,9 +3200,8 @@
temp[prop].append({'id': claim1['id'], 'remove': ''})
claims = temp
- for key in ('sitelinks', 'claims'):
- if len(locals()[key]) > 0:
- data[key] = locals()[key]
+ if claims:
+ data['claims'] = claims
return data
def iterlinks(self, family=None):
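The shared helper introduced here can be sketched as a standalone function (a simplification of the new _diff_to method, without the language normalization step): it keeps only entries that changed relative to the previous JSON, and marks entries removed from the source with an empty string.

```python
def diff_to(type_key, key_name, value_name, diffto, source, data):
    """Sketch of the deduplicated diff helper.

    source maps keys (e.g. language codes or site dbnames) to plain
    values; diffto is the previous JSON, or None for a full dump.
    Changed/removed entries are written into data[type_key].
    """
    source = dict(source)
    old = {} if not diffto else diffto.get(type_key, {})
    for key in old:
        if key in source:
            if source[key] == old[key][value_name]:
                # Unchanged entry: omit from the diff.
                del source[key]
        else:
            # Removed entry: an empty value signals deletion.
            source[key] = ''
    for key, value in source.items():
        source[key] = {key_name: key, value_name: value}
    if source:
        data[type_key] = source
```

The same function then serves labels, descriptions, and (with 'site'/'title' keys) sitelinks, which is the duplication this change removes.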
--
To view, visit https://gerrit.wikimedia.org/r/159710
Gerrit-MessageType: merged
Gerrit-Change-Id: If0bee33be087dd2e557d37ba59ac8938ca0b8d6c
Gerrit-PatchSet: 5
Gerrit-Project: pywikibot/core
Gerrit-Branch: master
Gerrit-Owner: XZise <CommodoreFabianus(a)gmx.de>
Gerrit-Reviewer: John Vandenberg <jayvdb(a)gmail.com>
Gerrit-Reviewer: Ladsgroup <ladsgroup(a)gmail.com>
Gerrit-Reviewer: Merlijn van Deen <valhallasw(a)arctus.nl>
Gerrit-Reviewer: Ricordisamoa <ricordisamoa(a)openmailbox.org>
Gerrit-Reviewer: XZise <CommodoreFabianus(a)gmx.de>
Gerrit-Reviewer: jenkins-bot <>