jenkins-bot has submitted this change. ( https://gerrit.wikimedia.org/r/c/pywikibot/core/+/651899 )
Change subject: [5.4] Update ROADMAP.rst
......................................................................
[5.4] Update ROADMAP.rst
Also increase version number to 5.4
Change-Id: Ie155acdf49815c4770f46465e2fc708c4af9235f
---
M .appveyor.yml
M ROADMAP.rst
M pywikibot/__metadata__.py
3 files changed, 8 insertions(+), 5 deletions(-)
Approvals:
Xqt: Looks good to me, approved
jenkins-bot: Verified
diff --git a/.appveyor.yml b/.appveyor.yml
index a096131..52f3b93 100644
--- a/.appveyor.yml
+++ b/.appveyor.yml
@@ -1,6 +1,6 @@
clone_depth: 50
skip_tags: true
-version: 5.3.{build}
+version: 5.4.{build}
environment:
APPVEYOR_PYTHON_URL: "https://raw.githubusercontent.com/dvorapa/python-appveyor-demo/master/appve…"
diff --git a/ROADMAP.rst b/ROADMAP.rst
index fb8ede4..5b25224 100644
--- a/ROADMAP.rst
+++ b/ROADMAP.rst
@@ -1,7 +1,12 @@
Current release changes
~~~~~~~~~~~~~~~~~~~~~~~
-* (no changes yet)
+* Desupported shared_image_repository() and nocapitalize() methods were removed (T89451)
+* pywikibot.cookie_jar was removed in favour of pywikibot.comms.http.cookie_jar
+* Align http.fetch() params with requests and rename 'disable_ssl_certificate_validation' to 'verify' (T265206)
+* Deprecated compat BasePage.getRestrictions() method was removed
+* Outdated Site.recentchanges() parameters have been dropped
+* site.LoginStatus has been removed in favour of login.LoginStatus
Future release notes
~~~~~~~~~~~~~~~~~~~~
@@ -13,6 +18,4 @@
* 5.0.0: HttpRequest result of http.fetch() will be replaced by requests.Response (T265206)
* 5.0.0: OptionHandler.options dict will be removed in favour of OptionHandler.opt
* 5.0.0: Methods deprecated for 5 years or longer will be removed
-* 5.0.0: Outdated Site.recentchanges() parameters will be removed
-* 5.0.0: site.LoginStatus will be removed in favour of login.LoginStatus
* 5.0.0: pagegenerators.ReferringPageGenerator is desupported and will be removed
diff --git a/pywikibot/__metadata__.py b/pywikibot/__metadata__.py
index 31db005..c5c09cb 100644
--- a/pywikibot/__metadata__.py
+++ b/pywikibot/__metadata__.py
@@ -6,7 +6,7 @@
# Distributed under the terms of the MIT license.
#
__name__ = 'pywikibot'
-__version__ = '5.3.1.dev0'
+__version__ = '5.4.0.dev0'
__description__ = 'Python MediaWiki Bot Framework'
__maintainer__ = 'The Pywikibot team'
__maintainer_email__ = 'pywikibot(a)lists.wikimedia.org'
--
To view, visit https://gerrit.wikimedia.org/r/c/pywikibot/core/+/651899
To unsubscribe, or for help writing mail filters, visit https://gerrit.wikimedia.org/r/settings
Gerrit-Project: pywikibot/core
Gerrit-Branch: master
Gerrit-Change-Id: Ie155acdf49815c4770f46465e2fc708c4af9235f
Gerrit-Change-Number: 651899
Gerrit-PatchSet: 1
Gerrit-Owner: Xqt <info(a)gno.de>
Gerrit-Reviewer: Xqt <info(a)gno.de>
Gerrit-Reviewer: jenkins-bot
Gerrit-MessageType: merged
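One of the ROADMAP entries above renames http.fetch()'s 'disable_ssl_certificate_validation' parameter to 'verify' (T265206), aligning it with requests. A minimal sketch of how such a rename can stay backward-compatible inside the function body; the parameter names mirror the pywikibot change, but this wrapper itself is hypothetical and only returns the resolved flag for illustration:

```python
import warnings


def fetch(uri, **kwargs):
    """Hypothetical sketch: accept the old flag but map it to 'verify'."""
    old = kwargs.pop('disable_ssl_certificate_validation', None)
    if old is not None:
        warnings.warn(
            "'disable_ssl_certificate_validation' is deprecated; "
            "use 'verify' instead",
            FutureWarning,
        )
        # The old flag disabled validation when True; 'verify' means the
        # opposite, so the value is inverted.
        kwargs.setdefault('verify', not old)
    return kwargs.get('verify', True)
```

Callers using the old keyword keep working but see a FutureWarning until they migrate to `verify=`.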
jenkins-bot has submitted this change. ( https://gerrit.wikimedia.org/r/c/pywikibot/core/+/651272 )
Change subject: [bugfix] fix api_tests.TestAPIMWException
......................................................................
[bugfix] fix api_tests.TestAPIMWException
Fix tests broken after:
[IMPR] remove unneeded explicit params in fetch() and request()
I335058809bf1c1732b305eac12a70cd1b82996d4
Bug: T270611
Change-Id: I559aa41b986f981e0cfcb514b5a935d4ce4c42fc
---
M tests/api_tests.py
1 file changed, 9 insertions(+), 7 deletions(-)
Approvals:
Xqt: Looks good to me, approved
jenkins-bot: Verified
diff --git a/tests/api_tests.py b/tests/api_tests.py
index eca078d..2c0b83a 100644
--- a/tests/api_tests.py
+++ b/tests/api_tests.py
@@ -42,25 +42,27 @@
'servedby': 'unittest',
}
- def _dummy_request(self, **kwargs):
- self.assertIn('body', kwargs)
+ def _dummy_request(self, *args, **kwargs):
+ self.assertLength(args, 1) # one positional argument for http.request
+ site = args[0]
+ self.assertIsInstance(site, pywikibot.BaseSite)
+ self.assertIn('data', kwargs)
self.assertIn('uri', kwargs)
- self.assertIn('site', kwargs)
- if kwargs['body'] is None:
+ if kwargs['data'] is None:
# use uri and remove script path
parameters = kwargs['uri']
- prefix = kwargs['site'].scriptpath() + '/api.php?'
+ prefix = site.scriptpath() + '/api.php?'
self.assertEqual(prefix, parameters[:len(prefix)])
parameters = parameters[len(prefix):]
else:
- parameters = kwargs['body']
+ parameters = kwargs['data']
parameters = parameters.encode('ascii') # it should be bytes anyway
# Extract parameter data from the body, it's ugly but allows us
# to verify that we actually test the right request
parameters = [p.split(b'=', 1) for p in parameters.split(b'&')]
keys = [p[0].decode('ascii') for p in parameters]
values = [unquote_to_bytes(p[1]) for p in parameters]
- values = [v.decode(kwargs['site'].encoding()) for v in values]
+ values = [v.decode(site.encoding()) for v in values]
values = [v.replace('+', ' ') for v in values]
values = [set(v.split('|')) for v in values]
parameters = dict(zip(keys, values))
--
To view, visit https://gerrit.wikimedia.org/r/c/pywikibot/core/+/651272
Gerrit-Project: pywikibot/core
Gerrit-Branch: master
Gerrit-Change-Id: I559aa41b986f981e0cfcb514b5a935d4ce4c42fc
Gerrit-Change-Number: 651272
Gerrit-PatchSet: 4
Gerrit-Owner: Mpaa <mpaa.wiki(a)gmail.com>
Gerrit-Reviewer: Xqt <info(a)gno.de>
Gerrit-Reviewer: jenkins-bot
Gerrit-MessageType: merged
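The updated _dummy_request above decodes the urlencoded request body back into a parameter dict so the test can verify exactly which API request was made. That decoding step can be sketched standalone with the stdlib; the helper name here is made up for illustration:

```python
from urllib.parse import unquote_to_bytes


def decode_api_body(body: str, encoding: str = 'utf-8') -> dict:
    """Split an application/x-www-form-urlencoded body into a dict of
    parameter name -> set of '|'-separated values, as the test above does."""
    raw = body.encode('ascii')  # it should be bytes anyway
    pairs = [p.split(b'=', 1) for p in raw.split(b'&')]
    keys = [p[0].decode('ascii') for p in pairs]
    values = [unquote_to_bytes(p[1]) for p in pairs]
    values = [v.decode(encoding) for v in values]
    values = [v.replace('+', ' ') for v in values]  # '+' encodes a space
    values = [set(v.split('|')) for v in values]    # MediaWiki multi-values
    return dict(zip(keys, values))
```

For example, `decode_api_body('action=query&titles=Main%20Page%7CFoo+Bar')` yields `{'action': {'query'}, 'titles': {'Main Page', 'Foo Bar'}}`, matching how the MediaWiki API joins multiple values with `|`.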
jenkins-bot has submitted this change. ( https://gerrit.wikimedia.org/r/c/pywikibot/core/+/649384 )
Change subject: [cleanup] Remove desupported Family methods and attributes
......................................................................
[cleanup] Remove desupported Family methods and attributes
- remove nocapitalize attribute
- remove shared_image_repository method
Bug: T89451
Change-Id: Ia613f763f5180dba6ec07b766ef606958bdd0334
---
M pywikibot/families/omegawiki_family.py
M pywikibot/families/wikipedia_family.py
M pywikibot/families/wiktionary_family.py
M pywikibot/family.py
M pywikibot/site/_basesite.py
M tests/site_tests.py
6 files changed, 1 insertion(+), 77 deletions(-)
Approvals:
Mpaa: Looks good to me, approved
jenkins-bot: Verified
diff --git a/pywikibot/families/omegawiki_family.py b/pywikibot/families/omegawiki_family.py
index 8edd09c..eecdd45 100644
--- a/pywikibot/families/omegawiki_family.py
+++ b/pywikibot/families/omegawiki_family.py
@@ -16,10 +16,6 @@
name = 'omegawiki'
domain = 'www.omegawiki.org'
- # On most Wikipedias page names must start with a capital letter, but some
- # languages don't use this.
- nocapitalize = ['omegawiki']
-
def scriptpath(self, code):
"""Return the script path for this family."""
return ''
diff --git a/pywikibot/families/wikipedia_family.py b/pywikibot/families/wikipedia_family.py
index ee0371b..8cc954f 100644
--- a/pywikibot/families/wikipedia_family.py
+++ b/pywikibot/families/wikipedia_family.py
@@ -142,10 +142,6 @@
'zh-classical', 'zh-min-nan', 'zh-yue', 'zu',
]
- # On most Wikipedias page names must start with a capital letter,
- # but some languages don't use this.
- nocapitalize = ['jbo']
-
# Languages that used to be coded in iso-8859-1
latin1old = {
'af', 'bs', 'co', 'cs', 'da', 'de', 'en', 'es', 'es', 'et', 'eu', 'fi',
diff --git a/pywikibot/families/wiktionary_family.py b/pywikibot/families/wiktionary_family.py
index 938fe2e..4d23a4e 100644
--- a/pywikibot/families/wiktionary_family.py
+++ b/pywikibot/families/wiktionary_family.py
@@ -70,12 +70,6 @@
'yi', 'zh', 'zh-min-nan', 'zu',
]
- # Other than most Wikipedias, page names must not start with a capital
- # letter on ALL Wiktionaries.
- @classproperty
- def nocapitalize(cls):
- return list(cls.langs.keys())
-
# Which languages have a special order for putting interlanguage links,
# and what order is it? If a language is not in interwiki_putfirst,
# alphabetical order on language code is used. For languages that are in
diff --git a/pywikibot/family.py b/pywikibot/family.py
index 7d79448..947119e 100644
--- a/pywikibot/family.py
+++ b/pywikibot/family.py
@@ -24,7 +24,6 @@
from pywikibot.tools import (
classproperty, deprecated, deprecated_args, frozenmap,
issue_deprecation_warning, ModuleDeprecationWrapper, PYTHON_VERSION,
- remove_last_args,
)
if PYTHON_VERSION >= (3, 9):
@@ -504,10 +503,6 @@
# 'en': "Disambiguation"
disambcatname = {} # type: Dict[str, str]
- # DEPRECATED, stores the code of the site which have a case sensitive
- # main namespace. Use the Namespace given from the Site instead
- nocapitalize = [] # type: List[str]
-
# attop is a list of languages that prefer to have the interwiki
# links at the top of the page.
interwiki_attop = [] # type: List[str]
@@ -660,22 +655,11 @@
class variable. Only penalize getting it because it must be set so that
the backwards compatibility is still available.
"""
- if name == 'nocapitalize':
- issue_deprecation_warning('nocapitalize',
- "APISite.siteinfo['case'] or "
- "Namespace.case == 'case-sensitive'",
- warning_class=FutureWarning,
- since='20150214')
- elif name == 'known_families':
+ if name == 'known_families':
issue_deprecation_warning('known_families',
'APISite.interwiki(prefix)',
warning_class=FutureWarning,
since='20150503')
- elif name == 'shared_data_repository':
- issue_deprecation_warning('shared_data_repository',
- 'APISite.data_repository()',
- warning_class=FutureWarning,
- since='20151023')
return super().__getattribute__(name)
@staticmethod
@@ -738,15 +722,6 @@
Family._families[fam] = cls
return cls
- @classproperty
- @deprecated('Family.codes or APISite.validLanguageLinks', since='20151014',
- future_warning=True)
- def iwkeys(cls):
- """DEPRECATED: List of (interwiki_forward's) family codes."""
- if cls.interwiki_forward:
- return list(pywikibot.Family(cls.interwiki_forward).langs.keys())
- return list(cls.langs.keys())
-
@deprecated('APISite.interwiki', since='20151014', future_warning=True)
def get_known_families(self, code):
"""DEPRECATED: Return dict of inter-family interwiki links."""
@@ -912,11 +887,6 @@
"""Return path to api.php."""
return '%s/api.php' % self.scriptpath(code)
- @deprecated('APISite.article_path', since='20150905', future_warning=True)
- def nicepath(self, code):
- """DEPRECATED: Return nice path prefix, e.g. '/wiki/'."""
- return '/wiki/'
-
def eventstreams_host(self, code):
"""Hostname for EventStreams."""
raise NotImplementedError('This family does not support EventStreams')
@@ -930,12 +900,6 @@
"""Return the path to title using index.php with redirects disabled."""
return '%s?title=%s&redirect=no' % (self.path(code), title)
- @deprecated('APISite.nice_get_address(title)', since='20150628',
- future_warning=True)
- def nice_get_address(self, code, title):
- """DEPRECATED: Return the nice path to title using index.php."""
- return '%s%s' % (self.nicepath(code), title)
-
def interface(self, code):
"""
Return interface to use for code.
@@ -1066,15 +1030,6 @@
"""Return the shared image repository, if any."""
return (None, None)
- # Deprecated via __getattribute__
- @remove_last_args(['transcluded'])
- def shared_data_repository(self, code):
- """Return the shared Wikibase repository, if any."""
- repo = pywikibot.Site(code, self).data_repository()
- if repo is not None:
- return repo.code, repo.family.name
- return (None, None)
-
def isPublic(self, code):
"""Check the wiki require logging in before viewing it."""
return True
diff --git a/pywikibot/site/_basesite.py b/pywikibot/site/_basesite.py
index 493493f..ce95c52 100644
--- a/pywikibot/site/_basesite.py
+++ b/pywikibot/site/_basesite.py
@@ -100,18 +100,6 @@
self._locked_pages = set()
@property
- @deprecated(
- "APISite.siteinfo['case'] or Namespace.case == 'case-sensitive'",
- since='20170504', future_warning=True)
- def nocapitalize(self):
- """
- Return whether this site's default title case is case-sensitive.
-
- DEPRECATED.
- """
- return self.siteinfo['case'] == 'case-sensitive'
-
- @property
def throttle(self):
"""Return this Site's throttle. Initialize a new one if needed."""
if not hasattr(self, '_throttle'):
diff --git a/tests/site_tests.py b/tests/site_tests.py
index b4758ea..ed9a4dd 100644
--- a/tests/site_tests.py
+++ b/tests/site_tests.py
@@ -48,11 +48,6 @@
self.assertEqual(self.site.case(), self.site.siteinfo['case'])
self.assertOneDeprecationParts('pywikibot.site.APISite.case',
'siteinfo or Namespace instance')
- self.assertIs(self.site.nocapitalize,
- self.site.siteinfo['case'] == 'case-sensitive')
- self.assertOneDeprecationParts(
- 'pywikibot.site._basesite.BaseSite.nocapitalize',
- "APISite.siteinfo['case'] or Namespace.case == 'case-sensitive'")
def test_siteinfo_normal_call(self):
"""Test calling the Siteinfo without setting dump."""
--
To view, visit https://gerrit.wikimedia.org/r/c/pywikibot/core/+/649384
Gerrit-Project: pywikibot/core
Gerrit-Branch: master
Gerrit-Change-Id: Ia613f763f5180dba6ec07b766ef606958bdd0334
Gerrit-Change-Number: 649384
Gerrit-PatchSet: 2
Gerrit-Owner: Xqt <info(a)gno.de>
Gerrit-Reviewer: Mpaa <mpaa.wiki(a)gmail.com>
Gerrit-Reviewer: jenkins-bot
Gerrit-MessageType: merged
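The family.py hunk above removes two branches from Family.__getattribute__, which is how pywikibot warns on access to deprecated class attributes before finally deleting them. A minimal sketch of that pattern, with the class and attribute simplified and the attribute value hypothetical:

```python
import warnings


class Family:
    """Minimal sketch of the __getattribute__ deprecation pattern
    used in pywikibot.family (names and values simplified here)."""

    known_families = {'wikipedia': 'wikipedia.org'}  # illustrative value

    def __getattribute__(self, name):
        # Penalize only the deprecated attribute; everything else is
        # looked up normally via the default implementation.
        if name == 'known_families':
            warnings.warn(
                'known_families is deprecated; use APISite.interwiki(prefix)',
                FutureWarning,
            )
        return super().__getattribute__(name)
```

Once callers have migrated, both the warning branch and the attribute can be dropped together, which is exactly what this change does for nocapitalize and shared_data_repository.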
jenkins-bot has submitted this change. ( https://gerrit.wikimedia.org/r/c/pywikibot/core/+/650976 )
Change subject: [IMPR] remove unneeded explicit params in fetch() and request()
......................................................................
[IMPR] remove unneeded explicit params in fetch() and request()
Remove parameters that were listed explicitly in http.fetch() and
http.request() but can instead be passed through **kwargs.
Also clean up fetch() calls that pass 'uri' as a keyword parameter.
Change-Id: I335058809bf1c1732b305eac12a70cd1b82996d4
---
M pywikibot/comms/http.py
M pywikibot/data/api.py
M pywikibot/version.py
M tests/aspects.py
M tests/http_tests.py
M tests/site_tests.py
6 files changed, 29 insertions(+), 41 deletions(-)
Approvals:
Xqt: Looks good to me, approved
jenkins-bot: Verified
diff --git a/pywikibot/comms/http.py b/pywikibot/comms/http.py
index 5a103be..aa8b6fa 100644
--- a/pywikibot/comms/http.py
+++ b/pywikibot/comms/http.py
@@ -218,8 +218,8 @@
return UserAgent().random
-def request(site, uri: Optional[str] = None, method='GET', params=None,
- body=None, headers=None, data=None, **kwargs) -> str:
+@deprecated_args(body='data')
+def request(site, uri: Optional[str] = None, headers=None, **kwargs) -> str:
"""
Request to Site with default error handling and response decoding.
@@ -239,11 +239,6 @@
@type charset: CodecInfo, str, None
@return: The received data
"""
- # body and data parameters both map to the data parameter of
- # requests.Session.request.
- if data:
- body = data
-
kwargs.setdefault('verify', site.verify_SSL_certificate())
old_validation = kwargs.pop('disable_ssl_certificate_validation', None)
if old_validation is not None:
@@ -261,7 +256,7 @@
headers['user-agent'] = user_agent(site, format_string)
baseuri = site.base_url(uri)
- r = fetch(baseuri, method, params, body, headers, **kwargs)
+ r = fetch(baseuri, headers=headers, **kwargs)
site.throttle.retry_after = int(r.response_headers.get('retry-after', 0))
return r.text
@@ -321,11 +316,9 @@
warning('Http response status {}'.format(request.status_code))
-@deprecated_args(callback=True)
-def fetch(uri, method='GET', params=None, body=None, headers=None,
- default_error_handling: bool = True,
- use_fake_user_agent: Union[bool, str] = False,
- data=None, **kwargs):
+@deprecated_args(callback=True, body='data')
+def fetch(uri, method='GET', headers=None, default_error_handling: bool = True,
+ use_fake_user_agent: Union[bool, str] = False, **kwargs):
"""
HTTP request.
@@ -346,11 +339,6 @@
@type callbacks: list of callable
@rtype: L{threadedhttp.HttpRequest}
"""
- # body and data parameters both map to the data parameter of
- # requests.Session.request.
- if data:
- body = data
-
# Change user agent depending on fake UA settings.
# Set header to new UA if needed.
headers = headers or {}
@@ -416,7 +404,7 @@
# Note that the connections are pooled which mean that a future
# HTTPS request can succeed even if the certificate is invalid and
# verify=True, when a request with verify=False happened before
- response = session.request(method, uri, params=params, data=body,
+ response = session.request(method, uri,
headers=headers, auth=auth, timeout=timeout,
**kwargs)
except Exception as e:
diff --git a/pywikibot/data/api.py b/pywikibot/data/api.py
index 4b15ec9..b5e35bc 100644
--- a/pywikibot/data/api.py
+++ b/pywikibot/data/api.py
@@ -1605,9 +1605,9 @@
"""
try:
data = http.request(
- site=self.site, uri=uri,
+ self.site, uri=uri,
method='GET' if use_get else 'POST',
- body=body, headers=headers)
+ data=body, headers=headers)
except Server504Error:
pywikibot.log('Caught HTTP 504 error; retrying')
except Server414Error:
diff --git a/pywikibot/version.py b/pywikibot/version.py
index 3656cd3..c308a73 100644
--- a/pywikibot/version.py
+++ b/pywikibot/version.py
@@ -198,8 +198,8 @@
@return: the git hash
"""
uri = 'https://github.com/wikimedia/{}/!svn/vcc/default'.format(tag)
- request = fetch(uri=uri, method='PROPFIND',
- body="<?xml version='1.0' encoding='utf-8'?>"
+ request = fetch(uri, method='PROPFIND',
+ data="<?xml version='1.0' encoding='utf-8'?>"
'<propfind xmlns=\"DAV:\"><allprop/></propfind>',
headers={'label': str(rev),
'user-agent': 'SVN/1.7.5 {pwb}'})
@@ -374,7 +374,7 @@
# Gerrit API responses include )]}' at the beginning,
# make sure to strip it out
buf = http.fetch(
- uri='https://gerrit.wikimedia.org/r/projects/pywikibot%2Fcore/' + path,
+ 'https://gerrit.wikimedia.org/r/projects/pywikibot%2Fcore/' + path,
headers={'user-agent': '{pwb}'}).text[4:]
try:
hsh = json.loads(buf)['revision']
diff --git a/tests/aspects.py b/tests/aspects.py
index 8a8d29b..de69f64 100644
--- a/tests/aspects.py
+++ b/tests/aspects.py
@@ -484,7 +484,7 @@
try:
if '://' not in hostname:
hostname = 'http://' + hostname
- r = http.fetch(uri=hostname,
+ r = http.fetch(hostname,
method='HEAD',
default_error_handling=False)
if r.exception:
diff --git a/tests/http_tests.py b/tests/http_tests.py
index b5f2fab..a6880fe 100644
--- a/tests/http_tests.py
+++ b/tests/http_tests.py
@@ -100,12 +100,12 @@
"""Test if http.fetch respects disable_ssl_certificate_validation."""
self.assertRaisesRegex(
pywikibot.FatalServerError, self.CERT_VERIFY_FAILED_RE, http.fetch,
- uri='https://testssl-expire-r2i2.disig.sk/index.en.html')
+ 'https://testssl-expire-r2i2.disig.sk/index.en.html')
http.session.close() # clear the connection
with warnings.catch_warnings(record=True) as warning_log:
response = http.fetch(
- uri='https://testssl-expire-r2i2.disig.sk/index.en.html',
+ 'https://testssl-expire-r2i2.disig.sk/index.en.html',
verify=False)
r = response.text
self.assertIsInstance(r, str)
@@ -115,7 +115,7 @@
# Verify that it now fails again
self.assertRaisesRegex(
pywikibot.FatalServerError, self.CERT_VERIFY_FAILED_RE, http.fetch,
- uri='https://testssl-expire-r2i2.disig.sk/index.en.html')
+ 'https://testssl-expire-r2i2.disig.sk/index.en.html')
http.session.close() # clear the connection
# Verify that the warning occurred
@@ -144,14 +144,14 @@
self.assertRaisesRegex(pywikibot.Server504Error,
r'Server ([^\:]+|[^\:]+:[0-9]+) timed out',
http.fetch,
- uri=self.get_httpbin_url('/status/504'))
+ self.get_httpbin_url('/status/504'))
def test_server_not_found(self):
"""Test server not found exception."""
self.assertRaisesRegex(requests.exceptions.ConnectionError,
'Max retries exceeded with url: /w/api.php',
http.fetch,
- uri='http://ru-sib.wikipedia.org/w/api.php',
+ 'http://ru-sib.wikipedia.org/w/api.php',
default_error_handling=True)
def test_invalid_scheme(self):
@@ -160,18 +160,18 @@
self.assertRaisesRegex(
requests.exceptions.InvalidSchema,
"No connection adapters were found for u?'invalid://url'",
- http.fetch, uri='invalid://url')
+ http.fetch, 'invalid://url')
def test_follow_redirects(self):
"""Test follow 301 redirects correctly."""
# The following will redirect from ' ' -> '_', and maybe to https://
- r = http.fetch(uri='http://en.wikipedia.org/wiki/Main%20Page')
+ r = http.fetch('http://en.wikipedia.org/wiki/Main%20Page')
self.assertEqual(r.status_code, 200)
self.assertIsNotNone(r.data.history)
self.assertIn('//en.wikipedia.org/wiki/Main_Page',
r.data.url)
- r = http.fetch(uri='http://en.wikia.com')
+ r = http.fetch('http://en.wikia.com')
self.assertEqual(r.status_code, 200)
self.assertEqual(r.data.url,
'https://www.fandom.com/explore')
@@ -547,7 +547,7 @@
def test_http(self):
"""Test with http, standard http interface for pywikibot."""
- r = http.fetch(uri=self.url)
+ r = http.fetch(self.url)
self.assertEqual(r.content, self.png)
@@ -585,7 +585,7 @@
def test_no_params(self):
"""Test fetch method with no parameters."""
- r = http.fetch(uri=self.url, params={})
+ r = http.fetch(self.url, params={})
if r.status_code == 503: # T203637
self.skipTest(
'503: Service currently not available for ' + self.url)
@@ -601,7 +601,7 @@
HTTPBin returns the args in their urldecoded form, so what we put in
should be the same as what we get out.
"""
- r = http.fetch(uri=self.url, params={'fish&chips': 'delicious'})
+ r = http.fetch(self.url, params={'fish&chips': 'delicious'})
if r.status_code == 503: # T203637
self.skipTest(
'503: Service currently not available for ' + self.url)
@@ -617,7 +617,7 @@
HTTPBin returns the args in their urldecoded form, so what we put in
should be the same as what we get out.
"""
- r = http.fetch(uri=self.url, params={'fish%26chips': 'delicious'})
+ r = http.fetch(self.url, params={'fish%26chips': 'delicious'})
if r.status_code == 503: # T203637
self.skipTest(
'503: Service currently not available for ' + self.url)
@@ -638,12 +638,12 @@
'X-Amzn-Trace-Id', 'X-B3-Parentspanid', 'X-B3-Spanid',
'X-B3-Traceid', 'X-Forwarded-Client-Cert',
)
- r_data_request = http.fetch(uri=self.get_httpbin_url('/post'),
+ r_data_request = http.fetch(self.get_httpbin_url('/post'),
method='POST',
data={'fish&chips': 'delicious'})
- r_body_request = http.fetch(uri=self.get_httpbin_url('/post'),
+ r_body_request = http.fetch(self.get_httpbin_url('/post'),
method='POST',
- body={'fish&chips': 'delicious'})
+ data={'fish&chips': 'delicious'})
r_data = json.loads(r_data_request.text)
r_body = json.loads(r_body_request.text)
diff --git a/tests/site_tests.py b/tests/site_tests.py
index b4758ea..e57504f 100644
--- a/tests/site_tests.py
+++ b/tests/site_tests.py
@@ -2982,7 +2982,7 @@
self.assertIsInstance(site.obsolete, bool)
self.assertTrue(site.obsolete)
self.assertEqual(site.hostname(), 'mh.wikipedia.org')
- r = http.fetch(uri='http://mh.wikipedia.org/w/api.php',
+ r = http.fetch('http://mh.wikipedia.org/w/api.php',
default_error_handling=False)
self.assertEqual(r.status_code, 200)
self.assertEqual(site.siteinfo['lang'], 'mh')
--
To view, visit https://gerrit.wikimedia.org/r/c/pywikibot/core/+/650976
Gerrit-Project: pywikibot/core
Gerrit-Branch: master
Gerrit-Change-Id: I335058809bf1c1732b305eac12a70cd1b82996d4
Gerrit-Change-Number: 650976
Gerrit-PatchSet: 2
Gerrit-Owner: Mpaa <mpaa.wiki(a)gmail.com>
Gerrit-Reviewer: Xqt <info(a)gno.de>
Gerrit-Reviewer: jenkins-bot
Gerrit-MessageType: merged
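The @deprecated_args(body='data') decorator applied to request() and fetch() above keeps the old keyword working while callers migrate. The real pywikibot.tools.deprecated_args is more featureful; the following is an illustrative reimplementation of just the behavior the diff relies on, including the `callback=True` form that drops an argument entirely:

```python
import functools
import warnings


def deprecated_args(**renames):
    """Rename deprecated keyword arguments, warning on use.

    renames maps old_name -> new_name, or old_name -> True to mark the
    argument as dropped, mirroring the calls seen in the diff above.
    """
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            for old, new in renames.items():
                if old in kwargs:
                    if new is True:  # argument is simply ignored now
                        warnings.warn('{!r} is deprecated and ignored'
                                      .format(old), FutureWarning)
                        kwargs.pop(old)
                    else:
                        warnings.warn('{!r} is deprecated; use {!r}'
                                      .format(old, new), FutureWarning)
                        kwargs[new] = kwargs.pop(old)
            return func(*args, **kwargs)
        return wrapper
    return decorator


@deprecated_args(body='data')
def fetch(uri, method='GET', **kwargs):
    # Hypothetical stand-in for http.fetch(); it just echoes its arguments
    # so the rename is observable.
    return uri, method, kwargs
```

With this in place, `fetch(url, body='x=1')` warns and behaves like `fetch(url, data='x=1')`, which is why the api_tests fix in I559aa41b only had to update the keyword names it asserts on.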