jenkins-bot has submitted this change. ( https://gerrit.wikimedia.org/r/c/pywikibot/core/+/1225150?usp=email )
Change subject: FIX: Remove invisible chars from Section.heading
......................................................................
FIX: Remove invisible chars from Section.heading
Section.heading is used to validate whether a Page.section() exists.
This may fail if the section contains invisible chars such as LTR or
RTL marks even though the MediaWiki link works. This patch removes
invisible chars from Section.heading but keeps Section.title unchanged.
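The idea behind the patch can be sketched as follows. Note that the character class below is a hypothetical stand-in for pywikibot.tools.chars.INVISIBLE_REGEX; the real pattern covers more code points:

```python
import re

# Assumed stand-in for pywikibot.tools.chars.INVISIBLE_REGEX: matches a
# few bidi/invisible format characters (ZWNJ, ZWJ, LRM, RLM, embeddings).
INVISIBLE_REGEX = re.compile('[\u200c-\u200f\u202a-\u202e]')


def clean_heading(title: str, level: int) -> str:
    """Strip equal signs, whitespace and invisible chars from a heading."""
    heading = title[level:-level].strip()
    return INVISIBLE_REGEX.sub('', heading)


print(clean_heading('== Foo\u200e bar ==', 2))  # 'Foo bar'
```

A heading containing an LRM mark thus compares equal to its visible form, which is what the section-existence check needs.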
Bug: T411307
Change-Id: Ide04ca1b1b4eac05ae4ae7211db8496e5200f87a
---
M pywikibot/textlib.py
1 file changed, 5 insertions(+), 1 deletion(-)
Approvals:
JJMC89: Looks good to me, approved
jenkins-bot: Verified
diff --git a/pywikibot/textlib.py b/pywikibot/textlib.py
index 942cef3..5e08c36 100644
--- a/pywikibot/textlib.py
+++ b/pywikibot/textlib.py
@@ -22,6 +22,7 @@
from pywikibot.family import Family
from pywikibot.time import TZoneFixedOffset
from pywikibot.tools import ModuleDeprecationWrapper, first_lower, first_upper
+from pywikibot.tools.chars import INVISIBLE_REGEX
from pywikibot.userinterfaces.transliteration import NON_ASCII_DIGITS
@@ -1112,9 +1113,12 @@
"""Return the section title without equal signs.
.. versionadded:: 8.2
+ .. versionchanged:: 11.0
+    Invisible chars like LTR or RTL are removed.
"""
level = self.level
- return self.title[level:-level].strip()
+ title = self.title[level:-level].strip()
+ return INVISIBLE_REGEX.sub('', title)
class SectionList(list):
--
To view, visit https://gerrit.wikimedia.org/r/c/pywikibot/core/+/1225150?usp=email
To unsubscribe, or for help writing mail filters, visit https://gerrit.wikimedia.org/r/settings?usp=email
Gerrit-MessageType: merged
Gerrit-Project: pywikibot/core
Gerrit-Branch: master
Gerrit-Change-Id: Ide04ca1b1b4eac05ae4ae7211db8496e5200f87a
Gerrit-Change-Number: 1225150
Gerrit-PatchSet: 3
Gerrit-Owner: Xqt <info(a)gno.de>
Gerrit-Reviewer: JJMC89 <JJMC89.Wikimedia(a)gmail.com>
Gerrit-Reviewer: jenkins-bot
jenkins-bot has submitted this change. ( https://gerrit.wikimedia.org/r/c/pywikibot/core/+/1227259?usp=email )
Change subject: tests: Fix ua format string result in TestDrySite.test_user_agent
......................................................................
tests: Fix ua format string result in TestDrySite.test_user_agent
user_agent_username() was changed to return a username set in the
PYWIKIBOT_USERNAME or PWB_USERNAME environment variables.
Bug: T414657
Change-Id: I6e244497844d0c6da00e13603e6a0e7c32e0ddc2
---
M tests/dry_site_tests.py
1 file changed, 3 insertions(+), 3 deletions(-)
Approvals:
Xqt: Looks good to me, approved
jenkins-bot: Verified
diff --git a/tests/dry_site_tests.py b/tests/dry_site_tests.py
index f9f0652..c8e492e 100755
--- a/tests/dry_site_tests.py
+++ b/tests/dry_site_tests.py
@@ -106,9 +106,9 @@
ua_username = user_agent_username()
self.assertEqual(f'Foo {ua_username}'.strip(),
user_agent(x, format_string='Foo {username}'))
- self.assertEqual(f'Foo ({x})',
- user_agent(
- x, format_string='Foo ({script_comments})'))
+ res = f'Foo ({x}; User:{ua_username})' if ua_username else f'Foo ({x})'
+ self.assertEqual(
+ res, user_agent(x, format_string='Foo ({script_comments})'))
if __name__ == '__main__':
--
To view, visit https://gerrit.wikimedia.org/r/c/pywikibot/core/+/1227259?usp=email
To unsubscribe, or for help writing mail filters, visit https://gerrit.wikimedia.org/r/settings?usp=email
Gerrit-MessageType: merged
Gerrit-Project: pywikibot/core
Gerrit-Branch: master
Gerrit-Change-Id: I6e244497844d0c6da00e13603e6a0e7c32e0ddc2
Gerrit-Change-Number: 1227259
Gerrit-PatchSet: 2
Gerrit-Owner: Xqt <info(a)gno.de>
Gerrit-Reviewer: Xqt <info(a)gno.de>
Gerrit-Reviewer: jenkins-bot
Xqt has submitted this change. ( https://gerrit.wikimedia.org/r/c/pywikibot/core/+/1226872?usp=email )
Change subject: cleanup: SVN support was already dropped. Remove hints from documentation.
......................................................................
cleanup: SVN support was already dropped. Remove hints from documentation.
Bug: T362484
Change-Id: I62b439f0d73fb7bd0389f6e4c8ea47037111ad75
---
M pywikibot/README.rst
M pywikibot/version.py
M tests/http_tests.py
3 files changed, 3 insertions(+), 3 deletions(-)
Approvals:
Xqt: Verified; Looks good to me, approved
diff --git a/pywikibot/README.rst b/pywikibot/README.rst
index 959af38..0ea442e 100644
--- a/pywikibot/README.rst
+++ b/pywikibot/README.rst
@@ -28,7 +28,7 @@
You do not need to "install" this package to be able to make use of
it. You can actually just run it from the directory where you unpacked
-it or where you have your copy of the SVN or git sources.
+it or where you have your copy of the git sources.
The first time you run a script, the package creates a file named user-config.py
diff --git a/pywikibot/version.py b/pywikibot/version.py
index 084514b..b3fdf6b 100644
--- a/pywikibot/version.py
+++ b/pywikibot/version.py
@@ -108,7 +108,7 @@
UserWarning, stacklevel=2)
exceptions = None
- # Git and SVN can silently fail, as it may be a nightly.
+ # Git can silently fail, as it may be a nightly.
if exceptions: # pragma: no cover
pywikibot.debug(f'version algorithm exceptions:\n{exceptions!r}')
diff --git a/tests/http_tests.py b/tests/http_tests.py
index 63fe428..6cccab3 100755
--- a/tests/http_tests.py
+++ b/tests/http_tests.py
@@ -198,7 +198,7 @@
self.assertNotIn(' ', http.user_agent(format_string=' {pwb} '))
self.assertIn('Pywikibot/' + pywikibot.__version__,
- http.user_agent(format_string='SVN/1.7.5 {pwb}'))
+ http.user_agent(format_string='Git/2.52.0 {pwb}'))
def test_user_agent_username(self) -> None:
"""Test http.user_agent_username function."""
--
To view, visit https://gerrit.wikimedia.org/r/c/pywikibot/core/+/1226872?usp=email
To unsubscribe, or for help writing mail filters, visit https://gerrit.wikimedia.org/r/settings?usp=email
Gerrit-MessageType: merged
Gerrit-Project: pywikibot/core
Gerrit-Branch: master
Gerrit-Change-Id: I62b439f0d73fb7bd0389f6e4c8ea47037111ad75
Gerrit-Change-Number: 1226872
Gerrit-PatchSet: 2
Gerrit-Owner: Xqt <info(a)gno.de>
Gerrit-Reviewer: Xqt <info(a)gno.de>
Gerrit-Reviewer: jenkins-bot
jenkins-bot has submitted this change. ( https://gerrit.wikimedia.org/r/c/pywikibot/core/+/1226297?usp=email )
Change subject: IMPR: Use environment variables if usernames aren't within config.py
......................................................................
IMPR: Use environment variables if usernames aren't within config.py
If no username is given for a site in user-config.py, use the
environment variables 'PYWIKIBOT_USERNAME' (usually used by tests) or
'PWB_USERNAME' (used on Toolforge) instead.
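The fallback order described above can be sketched as a small helper (name and demonstration values are hypothetical):

```python
import os


def fallback_username(username=None):
    """Sketch: prefer an explicit username, then the env-var fallbacks."""
    if not username:
        username = (os.environ.get('PYWIKIBOT_USERNAME')
                    or os.environ.get('PWB_USERNAME'))
    return username or ''


# Demonstration with made-up values:
os.environ.pop('PYWIKIBOT_USERNAME', None)
os.environ['PWB_USERNAME'] = 'ExampleBot'
print(fallback_username())            # 'ExampleBot'
print(fallback_username('Explicit'))  # 'Explicit'
```

An explicitly configured username always wins; the environment is only consulted when config.py provides nothing.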
Bug: T414201
Change-Id: Ib08ac40ba8da795b48e59b57f21b58d12bddfd9f
---
M pywikibot/comms/http.py
1 file changed, 10 insertions(+), 4 deletions(-)
Approvals:
Xqt: Looks good to me, approved
jenkins-bot: Verified
diff --git a/pywikibot/comms/http.py b/pywikibot/comms/http.py
index 3686317..a972e1a 100644
--- a/pywikibot/comms/http.py
+++ b/pywikibot/comms/http.py
@@ -33,6 +33,7 @@
import atexit
import codecs
+import os
import re
import sys
import threading
@@ -176,8 +177,15 @@
- replaces spaces (' ') with '_'
- encodes the username as 'utf-8' and if the username is not ASCII
- URL encodes the username if it is not ASCII, or contains '%'
+
+ .. versionchanged:: 11.0
+ If *username* is not given, get it from environment variables
+ 'PYWIKIBOT_USERNAME' usually used by tests or 'PWB_USERNAME' used
+ on toolforge.
"""
if not username:
+ username = os.getenv('PYWIKIBOT_USERNAME') or os.getenv('PWB_USERNAME')
+ if not username:
return ''
username = username.replace(' ', '_') # Avoid spaces or %20.
@@ -215,13 +223,11 @@
if config.user_agent_description:
script_comments.append(config.user_agent_description)
- username = ''
+ username = user_agent_username(site.username() if site else None)
if site:
script_comments.append(str(site))
- # TODO: there are several ways of identifying a user, and username
- # is not the best for a HTTP header if the username isn't ASCII.
- if site.username():
+ if username:
username = user_agent_username(site.username())
script_comments.append('User:' + username)
--
To view, visit https://gerrit.wikimedia.org/r/c/pywikibot/core/+/1226297?usp=email
To unsubscribe, or for help writing mail filters, visit https://gerrit.wikimedia.org/r/settings?usp=email
Gerrit-MessageType: merged
Gerrit-Project: pywikibot/core
Gerrit-Branch: master
Gerrit-Change-Id: Ib08ac40ba8da795b48e59b57f21b58d12bddfd9f
Gerrit-Change-Number: 1226297
Gerrit-PatchSet: 3
Gerrit-Owner: Xqt <info(a)gno.de>
Gerrit-Reviewer: Fabfur <ffurnari(a)wikimedia.org>
Gerrit-Reviewer: JAn Dudík <jan.dudik(a)gmail.com>
Gerrit-Reviewer: Xqt <info(a)gno.de>
Gerrit-Reviewer: jenkins-bot
jenkins-bot has submitted this change. ( https://gerrit.wikimedia.org/r/c/pywikibot/core/+/1226825?usp=email )
Change subject: decrease wait cycles and increase wait time for test_watch
......................................................................
decrease wait cycles and increase wait time for test_watch
Change-Id: Ic05d9741a3e17f0aa0c82a9c4d6191a07e8be313
Signed-off-by: Xqt <info(a)gno.de>
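The polling pattern the test switches to (fewer, longer waits) can be sketched as a generic helper; the function name and demonstration are hypothetical:

```python
import time


def wait_until(condition, tries=6, delay=5.0):
    """Poll *condition* up to *tries* times, sleeping *delay* seconds first.

    Mirrors the test's loop shape: sleep, then check, so the expiry has
    a chance to pass before the first lookup.
    """
    for _ in range(tries):
        time.sleep(delay)
        if condition():
            return True
    return False


# Quick demonstration with no real waiting: succeeds on the third poll.
state = {'n': 0}


def ready():
    state['n'] += 1
    return state['n'] >= 3


print(wait_until(ready, tries=6, delay=0))  # True
```

Six polls of five seconds give the same 30-second budget as the old thirty one-second polls, with far fewer API round trips.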
---
M tests/page_tests.py
1 file changed, 2 insertions(+), 4 deletions(-)
Approvals:
Xqt: Looks good to me, approved
jenkins-bot: Verified
diff --git a/tests/page_tests.py b/tests/page_tests.py
index 6de059a..8d42776 100755
--- a/tests/page_tests.py
+++ b/tests/page_tests.py
@@ -1105,11 +1105,9 @@
rv = userpage.watch(expiry='5 seconds')
self.assertTrue(rv)
self.assertIn(userpage, userpage.site.watched_pages(**wp_params))
- # Wait for the expiry to pass
- time.sleep(5)
# Retry check for unwatch to propagate for up to 30 seconds
- for _ in range(30):
- time.sleep(1)
+ for _ in range(6):
+ time.sleep(5) # Wait for the expiry to pass
if userpage not in userpage.site.watched_pages(**wp_params):
break
else:
--
To view, visit https://gerrit.wikimedia.org/r/c/pywikibot/core/+/1226825?usp=email
To unsubscribe, or for help writing mail filters, visit https://gerrit.wikimedia.org/r/settings?usp=email
Gerrit-MessageType: merged
Gerrit-Project: pywikibot/core
Gerrit-Branch: master
Gerrit-Change-Id: Ic05d9741a3e17f0aa0c82a9c4d6191a07e8be313
Gerrit-Change-Number: 1226825
Gerrit-PatchSet: 3
Gerrit-Owner: Xqt <info(a)gno.de>
Gerrit-Reviewer: Xqt <info(a)gno.de>
Gerrit-Reviewer: jenkins-bot
jenkins-bot has submitted this change. ( https://gerrit.wikimedia.org/r/c/pywikibot/core/+/1009778?usp=email )
Change subject: [IMPR] use global -code instead of -lang to determine a site
......................................................................
[IMPR] use global -code instead of -lang to determine a site
site.code is the identifying code of a site, presumed to be equal to
the wiki prefix, whereas the site language is the ISO language code for
a site: e.g. 'als' is a site code whereas its site language is 'gsw',
and 'meta' is a site code whereas its language is 'en'.
The global option should follow this distinction.
The old -lang option is kept for backward compatibility.
Patch detached from I145c0dc283ff597a697752aec901c1a14e6b1416
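The code-versus-language distinction the message describes can be illustrated with a toy mapping (the helper and mapping are illustrative, using the examples from the message):

```python
# Site codes whose ISO language differs from the code itself
# (examples taken from the commit message).
SITE_LANGUAGE = {'als': 'gsw', 'meta': 'en'}


def lang_for(code):
    """Return the ISO language for a site code; most codes match it."""
    return SITE_LANGUAGE.get(code, code)


print(lang_for('als'))   # 'gsw'
print(lang_for('meta'))  # 'en'
print(lang_for('de'))    # 'de'
```

Because the two values can differ, naming the global option `-code` is more accurate than `-lang`, which the patch keeps only for backward compatibility.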
Change-Id: Ieb78b3a99505b28f99ced4ba912c3b11431063de
---
M .github/workflows/login_tests-ci.yml
M .github/workflows/oauth_tests-ci.yml
M .github/workflows/pywikibot-ci.yml
M HISTORY.rst
M pywikibot/bot.py
M pywikibot/page/_basepage.py
M pywikibot/scripts/generate_user_files.py
M pywikibot/scripts/login.py
M pywikibot/scripts/wrapper.py
M scripts/coordinate_import.py
M scripts/harvest_template.py
M scripts/replicate_wiki.py
M scripts/transferbot.py
M tests/README.rst
M tox.ini
15 files changed, 41 insertions(+), 39 deletions(-)
Approvals:
jenkins-bot: Verified
Xqt: Looks good to me, approved
diff --git a/.github/workflows/login_tests-ci.yml b/.github/workflows/login_tests-ci.yml
index 035c9a7..cf2b878 100644
--- a/.github/workflows/login_tests-ci.yml
+++ b/.github/workflows/login_tests-ci.yml
@@ -102,7 +102,7 @@
if [ ${{matrix.site}} != false ]; then
python -Werror::UserWarning -m pwb generate_user_files -site:${{matrix.site}} -user:${{ env.PYWIKIBOT_USERNAME }} -v -debug;
else
- python -Werror::UserWarning -m pwb generate_user_files -family:${{matrix.family}} -lang:${{matrix.code}} -user:${{ env.PYWIKIBOT_USERNAME }} -v -debug;
+ python -Werror::UserWarning -m pwb generate_user_files -family:${{matrix.family}} -code:${{matrix.code}} -user:${{ env.PYWIKIBOT_USERNAME }} -v -debug;
fi
echo "usernames['wikipedia']['en'] = '${{ env.PYWIKIBOT_USERNAME }}'" >> user-config.py
echo "usernames['wikisource']['zh'] = '${{ env.PYWIKIBOT_USERNAME }}'" >> user-config.py
diff --git a/.github/workflows/oauth_tests-ci.yml b/.github/workflows/oauth_tests-ci.yml
index 381ee95..7ba998d 100644
--- a/.github/workflows/oauth_tests-ci.yml
+++ b/.github/workflows/oauth_tests-ci.yml
@@ -79,7 +79,7 @@
python pwb.py generate_family_file http://${{matrix.code}}.wikipedia.beta.wmcloud.org/ wpbeta y
- name: Generate user files
run: |
- python -Werror::UserWarning -m pwb generate_user_files -family:${{matrix.family}} -lang:${{matrix.code}} -user:${{ env.PYWIKIBOT_USERNAME }} -v -debug;
+ python -Werror::UserWarning -m pwb generate_user_files -family:${{matrix.family}} -code:${{matrix.code}} -user:${{ env.PYWIKIBOT_USERNAME }} -v -debug;
echo "usernames['commons']['beta'] = '${{ env.PYWIKIBOT_USERNAME }}'" >> user-config.py
echo "usernames['meta']['meta'] = '${{ env.PYWIKIBOT_USERNAME }}'" >> user-config.py
echo "authenticate['${{ matrix.domain }}'] = ('${{ steps.split.outputs._0 }}', '${{ steps.split.outputs._1 }}', '${{ steps.split.outputs._2 }}', '${{ steps.split.outputs._3 }}')" >> user-config.py
diff --git a/.github/workflows/pywikibot-ci.yml b/.github/workflows/pywikibot-ci.yml
index a0ca8af..342ba01 100644
--- a/.github/workflows/pywikibot-ci.yml
+++ b/.github/workflows/pywikibot-ci.yml
@@ -97,7 +97,7 @@
if [ ${{matrix.site}} != false ]; then
python -Werror::UserWarning -m pwb generate_user_files -site:${{matrix.site}} -user:${{ env.PYWIKIBOT_USERNAME }} -v -debug;
else
- python -Werror::UserWarning -m pwb generate_user_files -family:${{matrix.family}} -lang:${{matrix.code}} -user:${{ env.PYWIKIBOT_USERNAME }} -v -debug;
+ python -Werror::UserWarning -m pwb generate_user_files -family:${{matrix.family}} -code:${{matrix.code}} -user:${{ env.PYWIKIBOT_USERNAME }} -v -debug;
fi
echo "usernames['wikipedia']['en'] = '${{ env.PYWIKIBOT_USERNAME }}'" >> user-config.py
echo "usernames['wikisource']['zh'] = '${{ env.PYWIKIBOT_USERNAME }}'" >> user-config.py
diff --git a/HISTORY.rst b/HISTORY.rst
index 4736e14..8437e93 100644
--- a/HISTORY.rst
+++ b/HISTORY.rst
@@ -1578,7 +1578,7 @@
* Do not strip all whitespaces from Link.title (:phab:`T197642`)
* Introduce a common BaseDataDict as parent for LanguageDict and AliasesDict
* Replaced PageNotSaved by PageSaveRelatedError (:phab:`T267821`)
-* Add -site option as -family -lang shortcut
+* Add -site option as -family -code shortcut
* Enable APISite.exturlusage() with default parameters (:phab:`T266989`)
* Update tools._unidata._category_cf from Unicode version 13.0.0
* Move TokenWallet to site/_tokenwallet.py file
@@ -1694,7 +1694,7 @@
* Add support for ja.wikivoyage (:phab:`T261450`)
* Only run cosmetic changes on wikitext pages (:phab:`T260489`)
-* Leave a script gracefully for wrong -lang and -family option (:phab:`T259756`)
+* Leave a script gracefully for wrong -code and -family option (:phab:`T259756`)
* Change meaning of BasePage.text (:phab:`T260472`)
* site/family methods code2encodings() and code2encoding() has been removed in favour of encoding()/encodings() methods
* Site.getExpandedString() method was removed in favour of expand_text
@@ -2549,7 +2549,7 @@
- added ISBN support
- added redirect support
* Optionally uses external library for improved isbn validation
-* Automatically generating user files when -user, -family and -lang are
+* Automatically generating user files when -user, -family and -code are
provided to a script
* Page.content_model added
* Page.contributors() and Page.revision_count() added
diff --git a/pywikibot/bot.py b/pywikibot/bot.py
index 96b2695..0d28d98 100644
--- a/pywikibot/bot.py
+++ b/pywikibot/bot.py
@@ -205,9 +205,8 @@
-config:xyn The user config filename. Default is user-config.py.
--lang:xx Set the language of the wiki you want to work on,
+-code:xx Set the site code of the wiki you want to work on,
overriding the configuration in user config file.
- xx should be the site code.
-family:xyz Set the family of the wiki you want to work on, e.g.
wikipedia, wiktionary, wikivoyage, ... This will
@@ -689,7 +688,7 @@
"""Handle global command line arguments and return the rest as a list.
Takes the command line arguments as strings, processes all
- :ref:`global parameters<global options>` such as ``-lang`` or
+ :ref:`global parameters<global options>` such as ``-code`` or
``-log``, initialises the logging layer, which emits startup
information into log at level 'verbose'. This function makes sure
that global arguments are applied first, regardless of the order in
@@ -725,17 +724,20 @@
script.
.. versionchanged:: 5.2
- *-site* global option was added
+ ``-site`` global option was added
.. versionchanged:: 7.1
- *-cosmetic_changes* and *-cc* may be set directly instead of
+ ``-cosmetic_changes`` and ``-cc`` may be set directly instead of
toggling the value. Refer :func:`tools.strtobool` for valid values.
.. versionchanged:: 7.7
- *-config* global option was added.
+ ``-config`` global option was added.
.. versionchanged:: 8.0
Short site value can be given if site code is equal to family
like ``-site:meta``.
.. versionchanged:: 8.1
``-nolog`` option also discards command.log.
+ .. versionchanged:: 11.0
+ ``-lang`` option was replaced by ``-code`` but kept for backward
+ compatibility.
:param args: Command line arguments. If None,
:meth:`pywikibot.argvu<userinterfaces._interface_base.ABUIC.argvu>`
@@ -779,7 +781,7 @@
config.family = config.mylang = value
elif option == '-family':
config.family = value
- elif option == '-lang':
+ elif option in ('-code', '-lang'): # -lang might be deprecated later
config.mylang = value
elif option == '-user':
username = value
diff --git a/pywikibot/page/_basepage.py b/pywikibot/page/_basepage.py
index d6d874e..51f6449 100644
--- a/pywikibot/page/_basepage.py
+++ b/pywikibot/page/_basepage.py
@@ -211,7 +211,7 @@
) -> str:
"""Return the title of this Page, as a string.
- :param underscore: Not used with as_link, If true, replace all
+ :param underscore: Not used with *as_link*, If true, replace all
spaces with underscores.
:param with_ns: If false, omit the namespace prefix. If this
option is False and used together with *as_link* return a
@@ -229,9 +229,9 @@
a ':' before ``Category:`` and ``Image:`` links.
:param as_filename: Not used with *as_link*, If true, replace
any characters that are unsafe in filenames.
- :param insite: Only used if *as_link* is true, A site object
+ :param insite: Only used if *as_link* is true, a site object
where the title is to be shown. Default is the current
- family/lang given by ``-family`` and ``-lang`` or ``-site``
+ family/code given by ``-family`` and ``-code`` or ``-site``
option i.e. config.family and config.mylang.
:param without_brackets: Cannot be used with *as_link*, If true,
remove the last pair of brackets (usually removes
diff --git a/pywikibot/scripts/generate_user_files.py b/pywikibot/scripts/generate_user_files.py
index 8676088..be8ad99 100755
--- a/pywikibot/scripts/generate_user_files.py
+++ b/pywikibot/scripts/generate_user_files.py
@@ -508,7 +508,7 @@
:param args: Command line arguments
"""
# set the config family and mylang values to an invalid state so that
- # the script can detect that the command line arguments -family & -lang
+ # the script can detect that the command line arguments -family & -code
# or -site were used and handle_args has updated these config values,
# and 'force' mode can be activated below.
config.family, config.mylang = 'wikipedia', None
diff --git a/pywikibot/scripts/login.py b/pywikibot/scripts/login.py
index fe430b9..e8110da 100755
--- a/pywikibot/scripts/login.py
+++ b/pywikibot/scripts/login.py
@@ -13,7 +13,7 @@
-logout Log out of the current site. Combine with ``-all`` to log
out of all sites, or with :ref:`global options` ``-family``,
- ``-lang`` or ``-site`` to log out of a specific site.
+ ``-code`` or ``-site`` to log out of a specific site.
-oauth Generate OAuth authentication information.
diff --git a/pywikibot/scripts/wrapper.py b/pywikibot/scripts/wrapper.py
index f777fa1..5bdcbce 100755
--- a/pywikibot/scripts/wrapper.py
+++ b/pywikibot/scripts/wrapper.py
@@ -29,7 +29,7 @@
Currently, `<pwb options>` are :ref:`global options`. This can be used
for tests to set the default site (see :phab:`T216825`)::
- python pwb.py -lang:de bot_tests -v
+ python pwb.py -code:de bot_tests -v
.. seealso:: :mod:`pwb` entry point
.. versionchanged:: 7.0
diff --git a/scripts/coordinate_import.py b/scripts/coordinate_import.py
index aaba65d..01225e7 100755
--- a/scripts/coordinate_import.py
+++ b/scripts/coordinate_import.py
@@ -16,7 +16,7 @@
You can use any typical pagegenerator to provide with a list of pages:
- python pwb.py coordinate_import -lang:it -family:wikipedia -namespace:0 \
+ python pwb.py coordinate_import -site:wikipedia:it -namespace:0 \
-transcludes:Infobox_stazione_ferroviaria
You can also run over a set of items on the repo without coordinates and
@@ -42,7 +42,7 @@
¶ms;
"""
#
-# (C) Pywikibot team, 2013-2024
+# (C) Pywikibot team, 2013-2026
#
# Distributed under the terms of MIT license.
#
diff --git a/scripts/harvest_template.py b/scripts/harvest_template.py
index 9ac823e..3e8ac9b 100755
--- a/scripts/harvest_template.py
+++ b/scripts/harvest_template.py
@@ -52,27 +52,27 @@
parameter of "Infobox person" on English Wikipedia as Wikidata property
"P18" (image):
- python pwb.py harvest_template -lang:en -family:wikipedia -namespace:0 \
+ python pwb.py harvest_template -site:wikipedia:en -namespace:0 \
-template:"Infobox person" image P18
The following command will behave the same as the previous example and
also try to import [[links]] from "birth_place" parameter of the same
template as Wikidata property "P19" (place of birth):
- python pwb.py harvest_template -lang:en -family:wikipedia -namespace:0 \
+ python pwb.py harvest_template -site:wikipedia:en -namespace:0 \
-template:"Infobox person" image P18 birth_place P19
The following command will import both "birth_place" and "death_place"
params with -islink modifier, ie. the bot will try to import values,
even if it doesn't find a [[link]]:
- python pwb.py harvest_template -lang:en -family:wikipedia -namespace:0 \
+ python pwb.py harvest_template -site:wikipedia:en -namespace:0 \
-template:"Infobox person" -islink birth_place P19 death_place P20
The following command will do the same but only "birth_place" can be
imported without a link:
- python pwb.py harvest_template -lang:en -family:wikipedia -namespace:0 \
+ python pwb.py harvest_template -site:wikipedia:en -namespace:0 \
-template:"Infobox person" birth_place P19 -islink death_place P20
The following command will import an occupation from "occupation"
@@ -80,7 +80,7 @@
"P106" (occupation). The page won't be skipped if the item already has
that property but there is not the new value:
- python pwb.py harvest_template -lang:en -family:wikipedia -namespace:0 \
+ python pwb.py harvest_template -site:wikipedia:en -namespace:0 \
-template:"Infobox person" occupation P106 -exists:p
The following command will import band members from the "current_members"
@@ -88,7 +88,7 @@
property "P527" (has part). This will only extract multiple band members
if each is linked, and will not add duplicate claims for the same member:
- python pwb.py harvest_template -lang:en -family:wikipedia -namespace:0 \
+ python pwb.py harvest_template -site:wikipedia:en -namespace:0 \
-template:"Infobox musical artist" current_members P527 -exists:p -multi
The following command will import the category's main topic from the
@@ -98,7 +98,7 @@
property "P910" (topic's main category) unless a claim of that property
is already there:
- python pwb.py harvest_template -lang:en -family:wikipedia -namespace:14 \
+ python pwb.py harvest_template -site:wikipedia:en -namespace:14 \
-template:"Cat main" 1 P301 -inverse:P910 -islink
@@ -109,7 +109,7 @@
the -inverse option.
"""
#
-# (C) Pywikibot team, 2013-2025
+# (C) Pywikibot team, 2013-2026
#
# Distributed under the terms of MIT license.
#
diff --git a/scripts/replicate_wiki.py b/scripts/replicate_wiki.py
index d77ee59..9762e50 100755
--- a/scripts/replicate_wiki.py
+++ b/scripts/replicate_wiki.py
@@ -9,7 +9,7 @@
or::
- python pwb.py replicate_wiki [-r] -ns 10 -family:wikipedia -lang:nl li fy
+ python pwb.py replicate_wiki [-r] -ns 10 -family:wikipedia -code:nl li fy
to copy all templates from nlwiki to liwiki and fywiki. It will show
which pages have to be changed if -r is not present, and will only
@@ -33,7 +33,7 @@
-r, --replace actually replace pages (without this option
you will only get an overview page)
--o, --original original wiki (you may use -lang:<code> option
+-o, --original original wiki (you may use -code:<code> option
instead)
-ns, --namespace specify namespace
@@ -42,7 +42,7 @@
destination_wiki destination wiki(s)
"""
#
-# (C) Pywikibot team, 2012-2024
+# (C) Pywikibot team, 2012-2026
#
# Distributed under the terms of the MIT license.
#
diff --git a/scripts/transferbot.py b/scripts/transferbot.py
index 252bde8..c96f576 100755
--- a/scripts/transferbot.py
+++ b/scripts/transferbot.py
@@ -29,22 +29,22 @@
Transfer all pages in category "Query service" from the English Wikipedia to
the Arabic Wiktionary, adding "Wiktionary:Import enwp/" as prefix:
- python pwb.py transferbot -family:wikipedia -lang:en -cat:"Query service" \
+ python pwb.py transferbot -site:wikipedia:en -cat:"Query service" \
-tofamily:wiktionary -tolang:ar -prefix:"Wiktionary:Import enwp/"
Copy the template "Query service" from the English Wikipedia to the
Arabic Wiktionary:
- python pwb.py transferbot -family:wikipedia -lang:en -tofamily:wiktionary \
+ python pwb.py transferbot -site:wikipedia:en -tofamily:wiktionary \
-tolang:ar -page:"Template:Query service"
Copy 10 wanted templates of German Wikipedia from English Wikipedia to German:
- python pwb.py transferbot -family:wikipedia -lang:en -tolang:de \
+ python pwb.py transferbot -site:wikipedia:en -tolang:de \
-wantedtemplates:10 -target
"""
#
-# (C) Pywikibot team, 2014-2024
+# (C) Pywikibot team, 2014-2026
#
# Distributed under the terms of the MIT license.
#
diff --git a/tests/README.rst b/tests/README.rst
index b8a61fc..b58d311 100644
--- a/tests/README.rst
+++ b/tests/README.rst
@@ -45,7 +45,7 @@
------------------
Individual test components can be run using unittest, pytest or pwb.
-With -lang and -family or -site options pwb can be used to specify a site.
+With -code and -family or -site options pwb can be used to specify a site.
**unittest**
@@ -69,7 +69,7 @@
python pwb.py tests/api_tests -v
python pwb.py tests/site_tests -v
python pwb.py tests/api_tests -v TestParamInfo.test_init
- python pwb.py -lang:de -family:wikipedia tests/page_tests -v TestPageObject
+ python pwb.py -site:wikipedia:de tests/page_tests -v TestPageObject
**env**
diff --git a/tox.ini b/tox.ini
index 8231b1c..5a35511 100644
--- a/tox.ini
+++ b/tox.ini
@@ -9,7 +9,7 @@
[params]
# Note: tox 4 does not support multiple lines when doing parameters
# substitution.
-generate_user_files = -W error::UserWarning -m pwb generate_user_files -family:wikipedia -lang:test -v
+generate_user_files = -W error::UserWarning -m pwb generate_user_files -site:wikipedia:test -v
# ignores: gui.py (needs tkinter), memento.py (has too many timeouts)
DOCTEST_IGNORES = --ignore-glob=*gui.py --ignore-glob=*memento.py
--
To view, visit https://gerrit.wikimedia.org/r/c/pywikibot/core/+/1009778?usp=email
To unsubscribe, or for help writing mail filters, visit https://gerrit.wikimedia.org/r/settings?usp=email
Gerrit-MessageType: merged
Gerrit-Project: pywikibot/core
Gerrit-Branch: master
Gerrit-Change-Id: Ieb78b3a99505b28f99ced4ba912c3b11431063de
Gerrit-Change-Number: 1009778
Gerrit-PatchSet: 7
Gerrit-Owner: Xqt <info(a)gno.de>
Gerrit-Reviewer: Xqt <info(a)gno.de>
Gerrit-Reviewer: jenkins-bot
jenkins-bot has submitted this change. ( https://gerrit.wikimedia.org/r/c/pywikibot/core/+/1226226?usp=email )
Change subject: Fix docstrings
......................................................................
Fix docstrings
Change-Id: I779c82e2abdfd465c9d501b24919e1df45398b9d
---
M pywikibot/__init__.py
M pywikibot/_wbtypes.py
M pywikibot/bot.py
M pywikibot/comms/eventstreams.py
M pywikibot/date.py
M pywikibot/diff.py
M pywikibot/logentries.py
M pywikibot/login.py
M pywikibot/page/_basepage.py
M pywikibot/page/_collections.py
M pywikibot/page/_filepage.py
M pywikibot/page/_links.py
M pywikibot/page/_user.py
M pywikibot/page/_wikibase.py
M pywikibot/pagegenerators/_filters.py
M pywikibot/pagegenerators/_generators.py
M pywikibot/proofreadpage.py
M pywikibot/scripts/generate_user_files.py
M pywikibot/site/_apisite.py
M pywikibot/site/_basesite.py
M pywikibot/site/_extensions.py
M pywikibot/site/_generators.py
M pywikibot/textlib.py
M pywikibot/time.py
M pywikibot/tools/__init__.py
M pywikibot/tools/_deprecate.py
M pywikibot/tools/djvu.py
27 files changed, 271 insertions(+), 305 deletions(-)
Approvals:
jenkins-bot: Verified
Xqt: Looks good to me, approved
diff --git a/pywikibot/__init__.py b/pywikibot/__init__.py
index 21e5f87..a9ae08b 100644
--- a/pywikibot/__init__.py
+++ b/pywikibot/__init__.py
@@ -194,19 +194,19 @@
.. versionchanged:: 10.3
accept a trailing slash in *url* after domain.
- :param code: Language code (override config.mylang)
- code may also be a sitename like 'wikipedia:test'
+ :param code: Site code (override config.mylang); *code* may also be
+ a sitename like 'wikipedia:test'
:param fam: Family name or object (override config.family)
:param user: Bot user name to use on this site (override
config.usernames)
- :param interface: Site class or name of class in :py:obj:`pywikibot.site`
- (override config.site_interface)
- :param url: Instead of code and fam, does try to get a Site based on
- the URL.
- :raises ValueError: URL and pair of code and family given
- :raises ValueError: Invalid interface name
- :raises ValueError: Missing Site code
- :raises ValueError: Missing Site family
+ :param interface: Site class or name of class in
+ :py:obj:`pywikibot.site` (override config.site_interface)
+ :param url: Instead of *code* and *fam*, does try to get a Site
+ based on the URL.
+ :raises ValueError: *url* and pair of *code* and *fam* given
+ :raises ValueError: Invalid *interface* name
+ :raises ValueError: Missing Site *code*
+ :raises ValueError: Missing Site *fam*
"""
if url:
# Either code and fam or url with optional fam for AutoFamily name
diff --git a/pywikibot/_wbtypes.py b/pywikibot/_wbtypes.py
index 92a2cda..fe661c0 100644
--- a/pywikibot/_wbtypes.py
+++ b/pywikibot/_wbtypes.py
@@ -304,7 +304,7 @@
precision)*radius*math.cos(math.radians(self.lat))))
:return: Dimension in meters
- :raises ValueError: If neither dim nor precision is set
+ :raises ValueError: If neither *dim* nor *precision* is set
"""
if self._dim is not None:
return self._dim
@@ -339,7 +339,7 @@
that provided with the Coordinate
:param lazy_load: Do not raise :exc:`exceptions.NoPageError` if
ItemPage does not exist
- :return: Pywikibot.ItemPage of the globe
+ :return: :class:`pywikibot.ItemPage` of the globe
"""
if isinstance(self._entity, pywikibot.ItemPage):
return self._entity
@@ -1047,7 +1047,7 @@
that provided with the WbQuantity.
:param lazy_load: Do not raise NoPage if ItemPage does not
exist.
- :return: Pywikibot.ItemPage
+ :return: :class:`pywikibot.ItemPage`
"""
if not isinstance(self._unit, str):
return self._unit
diff --git a/pywikibot/bot.py b/pywikibot/bot.py
index 7e51b77..96b2695 100644
--- a/pywikibot/bot.py
+++ b/pywikibot/bot.py
@@ -1977,20 +1977,17 @@
Source claims (P143) can be created for specific sites
- :cvar use_from_page: If True (default) it will apply
+ :cvar bool | None use_from_page: If True (default) it will apply
ItemPage.fromPage for every item. If False it assumes that the
pages are actually already ItemPage (page in treat_page_and_item
will be None). If None it'll use ItemPage.fromPage when the page
is not in the site's item namespace.
- :type use_from_page: Bool, None
- :cvar treat_missing_item: Whether pages without items should be
+ :cvar bool treat_missing_item: Whether pages without items should be
treated. Note that this is checked after create_missing_item.
- :type treat_missing_item: Bool
- :ivar create_missing_item: If True, new items will be created if the
+ :ivar bool create_missing_item: If True, new items will be created if the
current page doesn't have one. Subclasses should override this
in the initializer with a bool value or using self.opt
attribute.
- :type create_missing_item: Bool
"""
use_from_page = True
@@ -2118,7 +2115,7 @@
"""Create a Claim usable as a source for Wikibase statements.
:param site: Site that is the source of assertions.
- :return: Pywikibot.Claim or None
+ :return: :class:`pywikibot.Claim` or None
"""
source = None
item = i18n.translate(site, self.source_values)
@@ -2231,7 +2228,7 @@
(optional). Note that data created from the page have higher
priority.
:param summary: Optional edit summary to replace the default one
- :return: Pywikibot.ItemPage or None
+ :return: :class:`pywikibot.ItemPage` or None
"""
if not summary:
summary = ('Bot: New item with sitelink from '
diff --git a/pywikibot/comms/eventstreams.py b/pywikibot/comms/eventstreams.py
index eef4cd9..28697bc 100644
--- a/pywikibot/comms/eventstreams.py
+++ b/pywikibot/comms/eventstreams.py
@@ -28,6 +28,7 @@
from pywikibot import Site, Timestamp, config, debug, warning
from pywikibot.backports import NoneType
from pywikibot.comms.http import user_agent
+from pywikibot.site import BaseSite
from pywikibot.tools import cached, deprecated_args
from pywikibot.tools.collections import GeneratorWrapper
@@ -120,6 +121,12 @@
def __init__(self, **kwargs) -> None:
"""Initializer.
+ .. seealso:: https://stream.wikimedia.org/?doc#streams for
+ available Wikimedia stream types to be passed with `streams`
+ parameter.
+ .. note:: *retry* keyword argument is used instead of the
+ underlying *reconnection_time* argument which is ignored.
+
:keyword bool canary: If True, include canary events, see
https://w.wiki/7$2z for more info.
:keyword APISite site: A project site object. Used if no *url*
@@ -167,14 +174,9 @@
:param kwargs: Other keyword arguments passed to `requests_sse`
and `requests` library
- :raises ModuleNotFoundError: Requests-sse is not installed
+ :raises ModuleNotFoundError: requests-sse package is not
+ installed
:raises NotImplementedError: No stream types specified
-
- .. seealso:: https://stream.wikimedia.org/?doc#streams for
- available Wikimedia stream types to be passed with `streams`
- parameter.
- .. note:: *retry* keyword argument is used instead of the
- underlying *reconnection_time* argument which is ignored.
"""
if isinstance(EventSource, ModuleNotFoundError):
raise ImportError(INSTALL_MSG) from EventSource
@@ -307,15 +309,14 @@
2. ``return data['type'] in ('edit', 'log')``
3. ``return data['bot'] is True``
- :keyword ftype: The filter type, one of 'all', 'any', 'none'.
+ :keyword str ftype: The filter type, one of 'all', 'any', 'none'.
Default value is 'all'
- :type ftype: Str
- :param args: You may pass your own filter functions here.
- Every function should be able to handle the data dict from events.
- :type args: Callable
+ :param Callable args: You may pass your own filter functions
+ here. Every function should be able to handle the data dict
+ from events.
:param kwargs: Any key returned by event data with an event data value
for this given key.
- :type kwargs: Str, list, tuple or other sequence
+ :type kwargs: str, list, tuple or other sequence
:raise TypeError: A given args parameter is not a callable.
"""
def _is(data, key=None, value=None):
@@ -417,16 +418,15 @@
del self.source
-def site_rc_listener(site, total: int | None = None):
+def site_rc_listener(site: BaseSite, total: int | None = None):
"""Yield changes received from EventStream.
- :param site: The Pywikibot.Site object to yield live recent changes
+ :param site: The pywikibot.Site object to yield live recent changes
for
- :type site: Pywikibot.BaseSite
:param total: The maximum number of changes to return
- :return: Pywikibot.comms.eventstream.rc_listener configured for
- given site
- :raises ModuleNotFoundError: Requests-sse installation is required
+ :return: A recent changes listener configured for the given site
+ :raises ModuleNotFoundError: requests-sse package installation is
+ required
"""
if isinstance(EventSource, ModuleNotFoundError):
raise ModuleNotFoundError(INSTALL_MSG) from EventSource
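The `ftype` semantics of `register_filter()` documented above ('all', 'any', 'none' over callables that take the event data dict) can be sketched as a small combinator (a hypothetical standalone helper; the real `EventStreams` class keeps a filter registry instead of returning a function):

```python
from typing import Callable

def combine_filters(ftype: str,
                    *filters: Callable[[dict], bool]) -> Callable[[dict], bool]:
    """Combine event filters with 'all', 'any' or 'none' semantics."""
    if ftype == 'all':
        # every filter must accept the event
        return lambda data: all(f(data) for f in filters)
    if ftype == 'any':
        # at least one filter must accept the event
        return lambda data: any(f(data) for f in filters)
    if ftype == 'none':
        # no filter may accept the event
        return lambda data: not any(f(data) for f in filters)
    raise ValueError(f"ftype must be 'all', 'any' or 'none', not {ftype!r}")
```

For example, combining the two sample filters from the docstring with `'all'` yields a predicate that only accepts bot edits of type 'edit' or 'log'.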
diff --git a/pywikibot/date.py b/pywikibot/date.py
index 73fb4f9..9e50a04 100644
--- a/pywikibot/date.py
+++ b/pywikibot/date.py
@@ -1956,7 +1956,8 @@
:param lang: Language code
:param title: Value to format
- :return: DictName ('YearBC', 'December', ...) and value (a year, date, ...)
+ :return: dict name ('YearBC', 'December', ...) and value
+ (a year, date, ...)
"""
for dict_name, dictionary in formats.items():
with suppress(Exception):
diff --git a/pywikibot/diff.py b/pywikibot/diff.py
index 813b2b1..00bff5f 100644
--- a/pywikibot/diff.py
+++ b/pywikibot/diff.py
@@ -625,7 +625,7 @@
This method is similar to Python's :pylib:`difflib.get_close_matches()
<difflib#difflib.get_close_matches>` but also gives ratio back and
- Has an *ignorecase* parameter to compare case-insensitive.
+ has an *ignorecase* parameter to compare case-insensitive.
SequenceMatcher is used to return a list of the best "good enough"
matches together with their ratio. The ratio is computed by the
diff --git a/pywikibot/logentries.py b/pywikibot/logentries.py
index 0c9d70a..14f9dfd 100644
--- a/pywikibot/logentries.py
+++ b/pywikibot/logentries.py
@@ -188,7 +188,7 @@
def duration(self) -> datetime.timedelta | None:
"""Return a datetime.timedelta representing the block duration.
- :return: Datetime.timedelta, or None if block is indefinite.
+ :return: datetime.timedelta, or None if block is indefinite.
"""
# Doing the difference is easier than parsing the string
return (self.expiry() - self.timestamp()
diff --git a/pywikibot/login.py b/pywikibot/login.py
index d2ec247..aee1f3c 100644
--- a/pywikibot/login.py
+++ b/pywikibot/login.py
@@ -585,7 +585,7 @@
:param password: Consumer secret
:raises pywikibot.exceptions.NoUsernameError: No username is
configured for the requested site.
- :raises ImportError: Mwoauth isn't installed
+ :raises ImportError: mwoauth package isn't installed
"""
if isinstance(mwoauth, ImportError):
raise ImportError(f'mwoauth is not installed: {mwoauth}.')
diff --git a/pywikibot/page/_basepage.py b/pywikibot/page/_basepage.py
index 4000c7d..d6d874e 100644
--- a/pywikibot/page/_basepage.py
+++ b/pywikibot/page/_basepage.py
@@ -77,7 +77,7 @@
'_timestamp',
)
- def __init__(self, source, title: str = '', ns=0) -> None:
+ def __init__(self, source, title: str = '', ns: int = 0) -> None:
"""Instantiate a Page object.
Three calling formats are supported:
@@ -99,14 +99,12 @@
wikitext, URLs, or another non-normalized source.
:param source: The source of the page
- :type source: Pywikibot.page.BaseLink (or subclass),
+ :type source: pywikibot.page.BaseLink (or subclass),
pywikibot.page.Page (or subclass), or pywikibot.page.Site
:param title: Normalized title of the page; required if source is a
Site, ignored otherwise
- :type title: Str
:param ns: Namespace number; required if source is a Site, ignored
otherwise
- :type ns: Int
"""
if title is None:
raise ValueError('Title cannot be None.')
@@ -190,7 +188,7 @@
def pageid(self) -> int:
"""Return pageid of the page.
- :return: Pageid or 0 if page does not exist
+ :return: Page id or 0 if page does not exist
"""
if not hasattr(self, '_pageid'):
self.site.loadpageinfo(self)
@@ -213,29 +211,31 @@
) -> str:
"""Return the title of this Page, as a string.
- :param underscore: Not used with as_link, If true, replace all ' '
- characters with '_'.
+ :param underscore: Not used with *as_link*. If true, replace
+ all spaces with underscores.
:param with_ns: If false, omit the namespace prefix. If this
- option is False and used together with as_link return a labeled
- link like [[link|label]].
+ option is False and used together with *as_link* return a
+ labeled link like ``[[link|label]]``.
:param with_section: If false, omit the section.
- :param as_url: Not used with as_link, If true, quote title as if in an
- URL.
- :param as_link: If true, return the title in the form of a wikilink.
- :param allow_interwiki: Only used if as_link is true, If true, format
- the link as an interwiki link if necessary.
- :param force_interwiki: Only used if as_link is true, If true, always
- format the link as an interwiki link.
- :param textlink: Only used if as_link is true, If true, place a ':'
- before Category: and Image: links.
- :param as_filename: Not used with as_link, If true, replace any
- characters that are unsafe in filenames.
- :param insite: Only used if as_link is true, A site object where the
- title is to be shown. Default is the current family/lang given by
- -family and -lang or -site option i.e. config.family and
- config.mylang.
- :param without_brackets: Cannot be used with as_link, If true, remove
- the last pair of brackets (usually removes disambiguation brackets)
+ :param as_url: Not used with *as_link*. If true, quote title as
+ if in a URL.
+ :param as_link: If true, return the title in the form of a
+ wikilink.
+ :param allow_interwiki: Only used if *as_link* is true. If true,
+ format the link as an interwiki link if necessary.
+ :param force_interwiki: Only used if *as_link* is true. If true,
+ always format the link as an interwiki link.
+ :param textlink: Only used if *as_link* is true. If true, place
+ a ':' before ``Category:`` and ``Image:`` links.
+ :param as_filename: Not used with *as_link*. If true, replace
+ any characters that are unsafe in filenames.
+ :param insite: Only used if *as_link* is true. A site object
+ where the title is to be shown. Default is the current
+ family/lang given by ``-family`` and ``-lang`` or ``-site``
+ option, i.e. config.family and config.mylang.
+ :param without_brackets: Cannot be used with *as_link*. If true,
+ remove the last pair of brackets (usually removes
+ disambiguation brackets).
"""
title = self._link.canonical_title()
label = self._link.title
@@ -626,9 +626,13 @@
.. versionchanged:: 7.1
`force` parameter was added;
- `_get_parsed_page` becomes a Public method
+ `_get_parsed_page` becomes a public method
+ .. seealso::
+ :meth:`APISite.get_parsed_page()
+ <pywikibot.site._apisite.APISite.get_parsed_page>`
+ :param force: Force updating from the live site
"""
if not hasattr(self, '_parsed_text') or force:
self._parsed_text = self.site.get_parsed_page(self)
@@ -641,12 +645,11 @@
intro: bool = True) -> str:
"""Retrieve an extract of this page.
- .. seealso::
- :meth:`APISite.get_parsed_page
- <pywikibot.site._apisite.APISite.get_parsed_page>`
-
.. versionadded:: 7.1
+ .. seealso:: :meth:`APISite.extract()
+ <pywikibot.site._extensions.TextExtractsMixin.extract>`.
+
:param variant: The variant of extract, either 'plain' for plain
text, 'html' for limited HTML (both excludes templates and
any text formatting) or 'wiki' for bare wikitext which also
@@ -663,9 +666,6 @@
`sentences` parameter.
:raises ValueError: `variant` parameter must be "plain", "html" or
"wiki".
-
- .. seealso:: :meth:`APISite.extract()
- <pywikibot.site._extensions.TextExtractsMixin.extract>`.
"""
if variant in ('plain', 'html'):
extract = self.site.extract(self, chars=chars, sentences=sentences,
@@ -1066,9 +1066,10 @@
"""Return an iterable of redirects to this page.
.. versionadded:: 7.0
- :param filter_fragments: If True, only return redirects with fragments.
- If False, only return redirects without fragments. If None, return
- Both (no filtering).
+
+ :param filter_fragments: If True, only return redirects with
+ fragments. If False, only return redirects without fragments.
+ If None, return both (no filtering).
:param namespaces: Only return redirects from these namespaces
:param total: Maximum number of redirects to retrieve in total
:param content: Load the current content of each redirect
@@ -1476,7 +1477,7 @@
or `never`. For absolute timestamps the :class:`Timestamp`
class can be used.
:return: True if successful, False otherwise.
- :raises APIError: Badexpiry: Invalid value for expiry parameter
+ :raises APIError: badexpiry: Invalid value for expiry parameter
:raises KeyError: 'watch' isn't in API response
:raises TypeError: Unexpected keyword argument
"""
@@ -1492,18 +1493,14 @@
def purge(self, **kwargs) -> bool:
"""Purge the server's cache for this page.
- :keyword redirects: Automatically resolve redirects.
- :type redirects: Bool
- :keyword converttitles: Convert titles to other variants if
+ :keyword bool redirects: Automatically resolve redirects.
+ :keyword bool converttitles: Convert titles to other variants if
necessary. Only works if the wiki's content language
supports variant conversion.
- :type converttitles: Bool
- :keyword forcelinkupdate: Update the links tables.
- :type forcelinkupdate: Bool
- :keyword forcerecursivelinkupdate: Update the links table, and
- update the links tables for any page that uses this page as
- a template.
- :type forcerecursivelinkupdate: Bool
+ :keyword bool forcelinkupdate: Update the links tables.
+ :keyword bool forcerecursivelinkupdate: Update the links table,
+ and update the links tables for any page that uses this page
+ as a template.
"""
self.clear_cache()
return self.site.purgepages([self], **kwargs)
@@ -2300,7 +2297,7 @@
def change_category(self, old_cat, new_cat,
summary: str | None = None,
- sort_key=None,
+ sort_key: str | bool | None = None,
in_place: bool = True,
include: list[str] | None = None,
show_diff: bool = False) -> bool:
@@ -2310,21 +2307,18 @@
The `show_diff` parameter
:param old_cat: Category to be removed
- :type old_cat: Pywikibot.page.Category
+ :type old_cat: pywikibot.page.Category
:param new_cat: Category to be added, if any
- :type new_cat: Pywikibot.page.Category or None
-
+ :type new_cat: pywikibot.page.Category or None
:param summary: String to use as an edit summary
-
- :param sort_key: SortKey to use for the added category.
- Unused if newCat is None, or if inPlace=True
- If sortKey=True, the sortKey used for oldCat will be used.
-
+ :param sort_key: Sort key to use for the added category. Unused
+ if *new_cat* is None, or if *in_place* is True. If *sort_key*
+ is True, the sort key used for *old_cat* will be used.
:param in_place: If True, change categories in place rather than
rearranging them.
-
- :param include: List of tags not to be disabled by default in relevant
- textlib functions, where CategoryLinks can be searched.
+ :param include: List of tags not to be disabled by default in
+ relevant :mod:`textlib` functions, where category links can
+ be searched.
:param show_diff: Show changes between oldtext and newtext
(default: False)
@@ -2414,7 +2408,7 @@
the link will have http(s) protocol prepended. On Wikimedia
wikis the protocol is already present.
:return: The reduced link.
- :raises APIError: Urlshortener-ratelimit exceeded
+ :raises APIError: urlshortener-ratelimit exceeded
"""
wiki = self.site
if self.site.family.shared_urlshortner_wiki:
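A few of the `BasePage.title()` options documented above can be mimicked with plain string handling (a hypothetical standalone helper for illustration only; the real method also handles namespaces, interwiki links, labels and more):

```python
from urllib.parse import quote

def format_title(title: str,
                 underscore: bool = False,
                 as_url: bool = False,
                 with_section: bool = True) -> str:
    """Mimic three BasePage.title() options on a bare title string."""
    if not with_section:
        # drop everything from the first '#' onwards
        title = title.split('#', 1)[0]
    if underscore or as_url:
        # URL form also uses underscores instead of spaces
        title = title.replace(' ', '_')
    if as_url:
        # percent-encode the UTF-8 bytes of the title
        title = quote(title.encode('utf-8'), safe='')
    return title
```

For example, `format_title('Main Page', underscore=True)` yields `'Main_Page'`.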
diff --git a/pywikibot/page/_collections.py b/pywikibot/page/_collections.py
index 5daad04..b95b059 100644
--- a/pywikibot/page/_collections.py
+++ b/pywikibot/page/_collections.py
@@ -73,7 +73,7 @@
"""Helper function to return language codes of a site object.
:param key: Input key to be normalized
- :type key: Pywikibot.site.BaseSite or str
+ :type key: pywikibot.site.BaseSite or str
"""
if isinstance(key, BaseSite):
key = key.lang
@@ -154,7 +154,7 @@
:param data: Data to normalize
:return: The dict with normalized data
- :raises TypeError: Data values must be a list
+ :raises TypeError: *data* values must be a list
"""
norm_data = {}
for key, values in data.items():
@@ -321,11 +321,10 @@
"""A structure holding SiteLinks for a Wikibase item."""
- def __init__(self, repo, data=None) -> None:
+ def __init__(self, repo: pywikibot.site.DataSite, data=None) -> None:
"""Initializer.
:param repo: The Wikibase site on which badges are defined
- :type repo: Pywikibot.site.DataSite
"""
super().__init__()
self.repo = repo
@@ -344,11 +343,10 @@
return cls(repo, data)
@staticmethod
- def getdbName(site):
+ def getdbName(site: BaseSite | str):
"""Helper function to obtain a dbName for a Site.
:param site: The site to look up.
- :type site: Pywikibot.site.BaseSite or str
"""
if isinstance(site, BaseSite):
return site.dbName()
@@ -358,7 +356,7 @@
"""Get the SiteLink with the given key.
:param key: Site key as Site instance or db key
- :type key: Pywikibot.Site or str
+ :type key: pywikibot.Site or str
:rtype: pywikibot.page.SiteLink
"""
key = self.getdbName(key)
@@ -419,11 +417,10 @@
return obj
@classmethod
- def normalizeData(cls, data) -> dict:
+ def normalizeData(cls, data: list | dict[str, Any]) -> dict:
"""Helper function to expand data into the Wikibase API structure.
:param data: Data to normalize
- :type data: List or dict
:return: The dict with normalized data
"""
norm_data = {}
@@ -497,7 +494,7 @@
"""Initializer.
:param repo: Wikibase site
- :type repo: Pywikibot.site.DataSite
+ :type repo: pywikibot.site.DataSite
:param data: Iterable of LexemeSubEntity
:type data: Iterable
"""
@@ -564,7 +561,6 @@
"""Helper function to expand data into the Wikibase API structure.
:param data: Data to normalize
- :type data: List
:return: The altered dict from parameter data.
"""
raise NotImplementedError # TODO
diff --git a/pywikibot/page/_filepage.py b/pywikibot/page/_filepage.py
index b8f01e3..bc834e1 100644
--- a/pywikibot/page/_filepage.py
+++ b/pywikibot/page/_filepage.py
@@ -130,13 +130,12 @@
def get_file_info(self, ts) -> dict:
"""Retrieve and store information of a specific Image rev of FilePage.
- This function will load also metadata.
- It is also used as a helper in FileInfo to load metadata lazily.
+ This function will also load metadata. It is also used as a
+ helper in FileInfo to load metadata lazily.
.. versionadded:: 8.6
- :param ts: Timestamp of the Image rev. To retrieve
-
+ :param ts: Timestamp of the Image revision to retrieve
:return: Instance of FileInfo()
"""
self.site.loadimageinfo(self, history=False, timestamp=ts)
diff --git a/pywikibot/page/_links.py b/pywikibot/page/_links.py
index 3bf816c..5c59a8e 100644
--- a/pywikibot/page/_links.py
+++ b/pywikibot/page/_links.py
@@ -55,7 +55,7 @@
:param namespace: The namespace of the page linked to. Can be
provided as either an int, a Namespace instance or a str,
defaults to the MAIN namespace.
- :type namespace: Int, pywikibot.Namespace or str
+ :type namespace: int, pywikibot.Namespace, str or None
:param site: The Site object for the wiki linked to. Can be
provided as either a Site instance or a db key, defaults to
pywikibot.Site().
@@ -209,7 +209,7 @@
"""Create a BaseLink to a Page.
:param page: Target pywikibot.page.Page
- :type page: Pywikibot.page.Page
+ :type page: pywikibot.page.Page
:rtype: pywikibot.page.BaseLink
"""
title = page.title(with_ns=False,
@@ -248,18 +248,19 @@
'|&#x[0-9A-Fa-f]+;'
)
- def __init__(self, text, source=None, default_namespace=0) -> None:
+ def __init__(self,
+ text: str,
+ source=None,
+ default_namespace: int = 0) -> None:
"""Initializer.
:param text: The link text (everything appearing between [[ and
]] on a wiki page)
- :type text: Str
:param source: The Site on which the link was found (not
necessarily the site to which the link refers)
:type source: Site or BasePage
:param default_namespace: A namespace to use if the link does
not contain one (defaults to 0)
- :type default_namespace: Int
:raises UnicodeError: Text could not be converted to unicode.
"""
source_is_page = isinstance(source, pywikibot.page.BasePage)
@@ -555,7 +556,7 @@
"""Create a Link to a Page.
:param page: Target Page
- :type page: Pywikibot.page.Page
+ :type page: pywikibot.page.Page
:param source: Link from site source
:param source: Site
:rtype: pywikibot.page.Link
@@ -609,22 +610,23 @@
return link
@classmethod
- def create_separated(cls, link, source, default_namespace=0, section=None,
- label=None):
+ def create_separated(cls,
+ link: str,
+ source,
+ default_namespace: int = 0,
+ section: str | None = None,
+ label: str | None = None) -> Link:
"""Create a new instance but overwrite section or label.
The returned Link instance is already parsed.
:param link: The original link text.
- :type link: Str
:param source: The source of the link.
:type source: Site
:param default_namespace: The namespace this link uses when no
namespace is defined in the link text.
- :type default_namespace: Int
:param section: The new section replacing the one in link. If
None (default) it doesn't replace it.
- :type section: None or str
:param label: The new label replacing the one in link. If None
(default) it doesn't replace it.
"""
@@ -655,17 +657,16 @@
# Components used for __repr__
_items = ('_sitekey', '_rawtitle', 'badges')
- def __init__(self, title, site=None, badges=None) -> None:
+ def __init__(self, title: str, site=None, badges=None) -> None:
"""Initializer.
:param title: The title of the linked page including namespace
- :type title: Str
:param site: The Site object for the wiki linked to. Can be
provided as either a Site instance or a db key, defaults to
pywikibot.Site().
- :type site: Pywikibot.Site or str
+ :type site: pywikibot.Site or str
:param badges: List of badges
- :type badges: [pywikibot.ItemPage]
+ :type badges: [pywikibot.ItemPage] or None
"""
# split of namespace from title
namespace = None
@@ -679,17 +680,16 @@
self._badges = set(badges)
@staticmethod
- def _parse_namespace(title, site=None):
+ def _parse_namespace(title: str, site=None):
"""Parse enough of a title with a ':' to determine the namespace.
+ :param title: The title of the linked page including namespace
:param site: The Site object for the wiki linked to. Can be
provided as either a Site instance or a db key, defaults to
pywikibot.Site().
:type site: Pywikibot.Site or str
- :param title: The title of the linked page including namespace
- :type title: Str
:return: A (site, namespace, title) tuple
- :rtype: (pywikibot.Site, pywikibot.Namespace or None, str)
+ :rtype: tuple[pywikibot.Site, pywikibot.Namespace or None, str]
"""
# need a Site instance to evaluate local namespaces
site = site or pywikibot.Site()
@@ -723,7 +723,7 @@
:param data: JSON containing SiteLink data
:param site: The Wikibase site
- :type site: Pywikibot.site.DataSite
+ :type site: pywikibot.site.DataSite
"""
sl = cls(data['title'], data['site'])
repo = site or sl.site.data_repository()
diff --git a/pywikibot/page/_user.py b/pywikibot/page/_user.py
index 8b491a4..893a637 100644
--- a/pywikibot/page/_user.py
+++ b/pywikibot/page/_user.py
@@ -336,26 +336,22 @@
def logevents(self, **kwargs):
"""Yield user activities.
- :keyword logtype: Only iterate entries of this type (see
+ :keyword str logtype: Only iterate entries of this type (see
mediawiki api documentation for available types)
- :type logtype: Str
:keyword page: Only iterate entries affecting this page
:type page: Page or str
:keyword namespace: Namespace to retrieve logevents from
- :type namespace: Int or Namespace
+ :type namespace: int or Namespace
:keyword start: Only iterate entries from and after this
Timestamp
:type start: Timestamp or ISO date string
:keyword end: Only iterate entries up to and through this
Timestamp
:type end: Timestamp or ISO date string
- :keyword reverse: If True, iterate oldest entries first
+ :keyword bool reverse: If True, iterate oldest entries first
(default: newest)
- :type reverse: Bool
- :keyword tag: Only iterate entries tagged with this tag
- :type tag: Str
- :keyword total: Maximum number of events to iterate
- :type total: Int
+ :keyword str tag: Only iterate entries tagged with this tag
+ :keyword int total: Maximum number of events to iterate
:rtype: iterable
"""
return self.site.logevents(user=self.username, **kwargs)
diff --git a/pywikibot/page/_wikibase.py b/pywikibot/page/_wikibase.py
index e06188b..5f7d3f6 100644
--- a/pywikibot/page/_wikibase.py
+++ b/pywikibot/page/_wikibase.py
@@ -19,7 +19,7 @@
from collections import OrderedDict, defaultdict
from contextlib import suppress
from itertools import chain
-from typing import TYPE_CHECKING, Any, NoReturn
+from typing import TYPE_CHECKING, Any, Literal, NoReturn
import pywikibot
from pywikibot.exceptions import (
@@ -88,13 +88,9 @@
(e.g., 'labels', 'claims') to appropriate collection classes
(e.g., :class:`LanguageDict<pywikibot.page._collections.LanguageDict>`,
:class:`ClaimCollection<pywikibot.page._collections.ClaimCollection>`)
-
- :cvar entity_type: entity type identifier
- :type entity_type: Str
-
- :cvar title_pattern: regular expression which matches all possible
+ :cvar str entity_type: entity type identifier
+ :cvar str title_pattern: regular expression which matches all possible
entity ids for this entity type
- :type title_pattern: Str
"""
DATA_ATTRIBUTES: dict[str, Any] = {}
@@ -104,8 +100,7 @@
:param repo: Entity repository.
:type repo: DataSite
- :param id_: Entity identifier.
- :type id_: Str or None, -1 and None mean non-existing
+ :param id_: Entity identifier; -1 and None mean non-existing
"""
self.repo = repo
self.id = id_ if id_ is not None else '-1'
@@ -555,7 +550,7 @@
.. versionadded:: 8.5
:param claim: The claim to add
- :type claim: Pywikibot.page.Claim
+ :type claim: pywikibot.page.Claim
:param bot: Whether to flag as bot (if possible)
"""
if claim.on_item is not None:
@@ -604,11 +599,10 @@
:param site: Wikibase data site
:type site: Pywikibot.site.DataSite
:param title: Normalized title of the page
- :type title: Str
:keyword ns: Namespace
:type ns: Namespace instance, or int
:keyword entity_type: Wikibase entity type
- :type entity_type: Str ('item' or 'property')
+ :type entity_type: Literal['item', 'property']
:raises TypeError: Incorrect use of parameters
:raises ValueError: Incorrect namespace
:raises pywikibot.exceptions.Error: title parsing problems
@@ -948,17 +942,15 @@
'sitelinks': SiteLinkCollection,
}
- def __init__(self, site, title=None, ns=None) -> None:
+ def __init__(self, site, title=None, ns: str | None = None) -> None:
"""Initializer.
:param site: Data repository
- :type site: Pywikibot.site.DataSite
- :param title: Identifier of item, "Q###",
- -1 or None for an empty item.
- :type title: Str
- :type ns: Namespace
+ :type site: pywikibot.site.DataSite
+ :param title: Identifier of item, "Q###"; '-1' or None for an
+ empty item.
+ :param ns: Namespace, None for default site.item_namespace
:type ns: Namespace instance, or int, or None
- for default item_namespace
"""
if ns is None:
ns = site.item_namespace
@@ -1070,7 +1062,7 @@
"""Get the ItemPage for a Page that links to it.
:param page: Page to look for corresponding data item
- :type page: Pywikibot.page.Page
+ :type page: pywikibot.page.Page
:param lazy_load: Do not raise NoPageError if either page or
corresponding ItemPage does not exist.
:rtype: pywikibot.page.ItemPage
@@ -1108,7 +1100,7 @@
"""Get the ItemPage from its entity uri.
:param site: The Wikibase site for the item.
- :type site: Pywikibot.site.DataSite
+ :type site: pywikibot.site.DataSite
:param uri: Entity uri for the Wikibase item.
:param lazy_load: Do not raise NoPageError if ItemPage does not
exist.
@@ -1197,7 +1189,7 @@
:param family: String/Family object which represents what family
of links to iterate
- :type family: Str|pywikibot.family.Family
+ :type family: str | pywikibot.family.Family | None
:return: Iterator of pywikibot.Page objects
:rtype: iterator
"""
@@ -1221,7 +1213,7 @@
raises NoSiteLinkError instead of NoPageError.
:param site: Site to find the linked page of.
- :type site: Pywikibot.Site or database name
+ :type site: pywikibot.Site or database name
:param force: Override caching
:raise IsRedirectPageError: instance is a redirect page
:raise NoSiteLinkError: site is not in :attr:`sitelinks`
@@ -1283,7 +1275,7 @@
"""Merge the item into another item.
:param item: The item to merge into
- :type item: Pywikibot.page.ItemPage
+ :type item: pywikibot.page.ItemPage
"""
data = self.repo.mergeItems(from_item=self, to_item=item, **kwargs)
if not data.get('success', 0):
@@ -1501,7 +1493,7 @@
"""Initializer.
:param site: Data repository
- :type site: Pywikibot.site.DataSite
+ :type site: pywikibot.site.DataSite
:param id: Id of the property
:param datatype: Datatype of the property; if not given, it will
be queried via the API
@@ -1569,14 +1561,16 @@
'claims': ClaimCollection,
}
- def __init__(self, source, title=None, datatype=None) -> None:
+ def __init__(self,
+ source,
+ title: str | None = None,
+ datatype: str | None = None) -> None:
"""Initializer.
:param source: Data repository property is on
- :type source: Pywikibot.site.DataSite
+ :type source: pywikibot.site.DataSite
:param title: Identifier of property, like "P##", "-1" or None
for an empty property.
- :type title: Str
:param datatype: Datatype for a new property.
:type datatype: Str
"""
@@ -1974,14 +1968,13 @@
def changeTarget(
self,
- value=None,
- snaktype: str = 'value',
+ value: Any = None,
+ snaktype: Literal['value', 'somevalue', 'novalue'] = 'value',
**kwargs
) -> None:
"""Set the target value in the data repository.
:param value: The new target value.
- :type value: Object
:param snaktype: The new snak type ('value', 'somevalue', or
'novalue').
"""
@@ -2004,18 +1997,15 @@
"""
return self.target
- def getSnakType(self) -> str:
- """Return the type of snak.
-
- :return: Str ('value', 'somevalue' or 'novalue')
- """
+ def getSnakType(self) -> Literal['value', 'somevalue', 'novalue']:
+ """Return the type of snak."""
return self.snaktype
- def setSnakType(self, value) -> None:
+ def setSnakType(self,
+ value: Literal['value', 'somevalue', 'novalue']) -> None:
"""Set the type of snak.
:param value: Type of snak
- :type value: Str ('value', 'somevalue', or 'novalue')
"""
if value in self.SNAK_TYPES:
self.snaktype = value
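Note that `Literal` members are comma-separated: `Literal['value' | 'somevalue']` is invalid, since `|` between plain strings fails when the annotation is evaluated. A minimal sketch of the snak-type guard with the corrected annotation (a hypothetical standalone function mirroring the runtime check above, not the pywikibot method itself):

```python
from typing import Literal, get_args

SnakType = Literal['value', 'somevalue', 'novalue']

def set_snak_type(value: SnakType) -> SnakType:
    # Accept only the three literal members; reject anything else,
    # like the `value in self.SNAK_TYPES` guard above.
    if value not in get_args(SnakType):
        raise ValueError(f'Unknown snak type: {value!r}')
    return value
```

`get_args()` recovers the allowed strings from the `Literal` alias, so the runtime guard and the static annotation cannot drift apart.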
@@ -2111,7 +2101,7 @@
"""Add the given qualifier.
:param qualifier: The qualifier to add
- :type qualifier: Pywikibot.page.Claim
+ :type qualifier: pywikibot.page.Claim
"""
self._assert_mainsnak('Cannot add qualifiers to a {}')
if qualifier.on_item is not None:
@@ -2131,7 +2121,7 @@
"""Remove the qualifier. Call removeQualifiers().
:param qualifier: The qualifier to remove
- :type qualifier: Pywikibot.page.Claim
+ :type qualifier: pywikibot.page.Claim
"""
self.removeQualifiers([qualifier], **kwargs)
@@ -2288,14 +2278,13 @@
# 'senses': LexemeSenseCollection,
}
- def __init__(self, site, title=None) -> None:
+ def __init__(self, site, title: str | None = None) -> None:
"""Initializer.
:param site: Data repository
:type site: Pywikibot.site.DataSite
- :param title: Identifier of lexeme, "L###",
- -1 or None for an empty lexeme.
- :type title: Str or None
+ :param title: Identifier of lexeme, "L###"; '-1' or None for an
+ empty lexeme.
"""
# Special case for empty lexeme.
if title is None or title == '-1':
@@ -2333,19 +2322,21 @@
return data
- def get(self, force=False, get_redirect=False, *args, **kwargs):
+ def get(self,
+ force: bool = False,
+ get_redirect: bool = False,
+ *args,
+ **kwargs):
"""Fetch all lexeme data, and cache it.
- :param force: Override caching
- :type force: Bool
- :param get_redirect: Return the lexeme content, do not follow the
- redirect, do not raise an exception.
- :type get_redirect: Bool
- :raise NotImplementedError: a value in args or kwargs
-
.. note:: dicts returned by this method are references to content
of this entity and their modifying may indirectly cause
unwanted change to the live content
+
+ :param force: Override caching
+ :param get_redirect: Return the lexeme content, do not follow the
+ redirect, do not raise an exception.
+ :raise NotImplementedError: a value in args or kwargs
"""
data = super().get(force, *args, **kwargs)
@@ -2391,18 +2382,15 @@
:param form: The form to add
:type form: Form
- :keyword bot: Whether to flag as bot (if possible)
- :type bot: Bool
- :keyword asynchronous: If True, launch a separate thread to add
- form asynchronously
- :type asynchronous: Bool
- :keyword callback: A callable object that will be called after
- the claim has been added. It must take two arguments: (1) a
- LexemePage object, and (2) an exception instance, which will
- be None if the entity was saved successfully. This is
- intended for use by bots that need to keep track of which
- saves were successful.
- :type callback: Callable
+ :keyword bool bot: Whether to flag as bot (if possible)
+ :keyword bool asynchronous: If True, launch a separate thread to
+ add form asynchronously
+ :keyword Callable callback: A callable object that will be
+ called after the claim has been added. It must take two
+ arguments: (1) a LexemePage object, and (2) an exception
+ instance, which will be None if the entity was saved
+ successfully. This is intended for use by bots that need to
+ keep track of which saves were successful.
"""
if form.on_lexeme is not None:
raise ValueError('The provided LexemeForm instance is already '
@@ -2419,7 +2407,7 @@
"""Remove a form from the lexeme.
:param form: The form to remove
- :type form: Pywikibot.LexemeForm
+ :type form: pywikibot.LexemeForm
"""
data = self.repo.remove_form(form, **kwargs)
form.on_lexeme.latest_revision_id = data['lastrevid']
@@ -2499,18 +2487,15 @@
:param claim: The claim to add
:type claim: Claim
- :keyword bot: Whether to flag as bot (if possible)
- :type bot: Bool
- :keyword asynchronous: If True, launch a separate thread to add
- claim asynchronously
- :type asynchronous: Bool
- :keyword callback: A callable object that will be called after
- the claim has been added. It must take two arguments: (1) a
- Form object, and (2) an exception instance, which will be
- None if the form was saved successfully. This is intended
- for use by bots that need to keep track of which saves were
- successful.
- :type callback: Callable
+ :keyword bool bot: Whether to flag as bot (if possible)
+ :keyword bool asynchronous: If True, launch a separate thread to
+ add claim asynchronously
+ :keyword Callable callback: A callable object that will be
+ called after the claim has been added. It must take two
+ arguments: (1) a Form object, and (2) an exception instance,
+ which will be None if the form was saved successfully. This
+ is intended for use by bots that need to keep track of which
+ saves were successful.
"""
self.repo.addClaim(self, claim, **kwargs)
claim.on_item = self
diff --git a/pywikibot/pagegenerators/_filters.py b/pywikibot/pagegenerators/_filters.py
index 2d2f399..1f1ff4a 100644
--- a/pywikibot/pagegenerators/_filters.py
+++ b/pywikibot/pagegenerators/_filters.py
@@ -182,19 +182,20 @@
cls,
generator: Iterable[pywikibot.page.WikibasePage],
prop: str,
- claim: str,
+ claim,
qualifiers: dict[str, str] | None = None,
negate: bool = False,
) -> Generator[pywikibot.page.WikibasePage]:
"""Yield all ItemPages which contain certain claim in a property.
:param prop: Property id to check
- :param claim: Value of the property to check. For instance, an
- ItemPage instance or a string, e.g. 'Q37470'.
+ :param claim: Value of the property to check. An ItemPage
+ instance or a string, e.g. 'Q37470'.
+ :type claim: ItemPage | str
:param qualifiers: Dict of qualifiers that must be present, or
None if qualifiers are irrelevant
:param negate: True if pages that do *not* contain the specified
- claim should be yielded; Otherwise False
+ claim should be yielded; otherwise False
"""
qualifiers = qualifiers or {}
for page in generator:
@@ -328,7 +329,7 @@
In all the other cases, no filter is applied.
:param generator: A generator object
- :param quality: Proofread-page quality levels (valid range 0-4)
+ :param quality: Any proofread-page quality levels (valid range 0-4)
"""
for page in generator:
if page.namespace() == page.site.proofread_page_ns:
diff --git a/pywikibot/pagegenerators/_generators.py b/pywikibot/pagegenerators/_generators.py
index 6bde90e..1b76e1d 100644
--- a/pywikibot/pagegenerators/_generators.py
+++ b/pywikibot/pagegenerators/_generators.py
@@ -1478,7 +1478,6 @@
def buildQuery(self, id: int):
"""Get the querystring options to query PagePile.
- :param id: Int
:return: Dictionary of querystring parameters to use in the
query
"""
diff --git a/pywikibot/proofreadpage.py b/pywikibot/proofreadpage.py
index a814a6e..92ce8a9 100644
--- a/pywikibot/proofreadpage.py
+++ b/pywikibot/proofreadpage.py
@@ -432,8 +432,8 @@
def __init__(self, source: PageSourceType, title: str = '') -> None:
"""Instantiate a ProofreadPage object.
- :raises UnknownExtensionError: Source Site has no ProofreadPage
- Extension.
+ :raises UnknownExtensionError: *source* Site has no
+ ProofreadPage Extension.
"""
if not isinstance(source, pywikibot.site.BaseSite):
site = source.site
diff --git a/pywikibot/scripts/generate_user_files.py b/pywikibot/scripts/generate_user_files.py
index 6426011..8676088 100755
--- a/pywikibot/scripts/generate_user_files.py
+++ b/pywikibot/scripts/generate_user_files.py
@@ -470,7 +470,7 @@
raise
-def ask_for_dir_change(force) -> tuple[bool, bool]:
+def ask_for_dir_change(force: bool) -> tuple[bool, bool]:
"""Ask whether the base directory is has to be changed.
Only give option for directory change if user-config.py or user-
@@ -478,7 +478,6 @@
config.py also exists in the requested directory.
:param force: Skip asking for directory change
- :type force: Bool
:return: Whether user file or password file exists already
"""
global base_dir
diff --git a/pywikibot/site/_apisite.py b/pywikibot/site/_apisite.py
index 282cc49..ac166b3 100644
--- a/pywikibot/site/_apisite.py
+++ b/pywikibot/site/_apisite.py
@@ -3028,7 +3028,7 @@
:return: True if API returns expected response; False otherwise.
If *unwatch* is False, *expiry* is None or specifies no
defined end date, return False if the page does not exist.
- :raises APIError: Badexpiry: Invalid value for expiry parameter
+ :raises APIError: badexpiry: Invalid value for expiry parameter
:raises KeyError: 'watch' isn't in API response
:raises TypeError: Unexpected keyword argument
"""
diff --git a/pywikibot/site/_basesite.py b/pywikibot/site/_basesite.py
index 123cd74..092de9a 100644
--- a/pywikibot/site/_basesite.py
+++ b/pywikibot/site/_basesite.py
@@ -35,15 +35,13 @@
"""Site methods that are independent of the communication interface."""
- def __init__(self, code: str, fam=None, user=None) -> None:
+ def __init__(self, code: str, fam=None, user: str | None = None) -> None:
"""Initializer.
:param code: The site's language code
- :type code: Str
:param fam: Wiki family name (optional)
- :type fam: Str or pywikibot.family.Family
+ :type fam: str or pywikibot.family.Family or None
:param user: Bot user name (optional)
- :type user: Str
"""
if code.lower() != code:
# Note the Site function in __init__ also emits a UserWarning
@@ -320,7 +318,7 @@
at the same time, even to different sections.
:param page: The page to be locked
- :type page: Pywikibot.Page
+ :type page: pywikibot.Page
:param block: If true, wait until the page is available to be
locked; otherwise, raise an exception if page can't be
locked
@@ -337,7 +335,7 @@
"""Unlock page. Call as soon as a write operation has completed.
:param page: The page to be locked
- :type page: Pywikibot.Page
+ :type page: pywikibot.Page
"""
with self._pagemutex:
self._locked_pages.discard(page.title(with_section=False))
diff --git a/pywikibot/site/_extensions.py b/pywikibot/site/_extensions.py
index 6f6ab18..ed662b9 100644
--- a/pywikibot/site/_extensions.py
+++ b/pywikibot/site/_extensions.py
@@ -64,11 +64,11 @@
def notifications(self, **kwargs):
"""Yield Notification objects from the Echo extension.
- :Keyword str | None format: Notification output format.
+ .. seealso:: :api:`Notifications` for other keywords.
+
+ :keyword str | None format: Notification output format.
Possible values are ``model``, ``special``, or ``None``.
The default is ``special``.
-
- .. seealso:: :api:`Notifications` for other keywords.
"""
params = {
'action': 'query',
@@ -128,7 +128,7 @@
3: 'Proofread', 4: 'Validated'}
:param expiry: Either a number of days or a datetime.timedelta object
- :type expiry: Int (days), :py:obj:`datetime.timedelta`, False (config)
+ :type expiry: int (days), :py:obj:`datetime.timedelta`, False (config)
:return: A tuple containing _proofread_index_ns,
self._proofread_page_ns and self._proofread_levels.
:rtype: Namespace, Namespace, dict
@@ -226,7 +226,7 @@
"""Load [[mw:Extension:PageImages]] info.
:param page: The page for which to obtain the image
- :type page: Pywikibot.Page
+ :type page: pywikibot.Page
:raises APIError: PageImages extension is not installed
"""
title = page.title(with_section=False)
@@ -246,7 +246,7 @@
"""Iterate global image usage for a given FilePage.
:param page: The page to return global image usage for.
- :type page: Pywikibot.FilePage
+ :type page: pywikibot.FilePage
:param total: Iterate no more than this number of pages in
total.
:raises TypeError: Input page is not a FilePage.
@@ -391,13 +391,11 @@
"""APISite mixin for Thanks extension."""
@need_extension('Thanks')
- def thank_revision(self, revid, source=None):
+ def thank_revision(self, revid: int, source: str | None = None):
"""Corresponding method to the 'action=thank' API action.
:param revid: Revision ID for the revision to be thanked.
- :type revid: Int
:param source: A source for the thanking operation.
- :type source: Str
:raise APIError: On thanking oneself or other API errors.
:return: The API response.
"""
@@ -415,16 +413,14 @@
"""APISite mixin for UrlShortener extension."""
@need_extension('UrlShortener')
- def create_short_link(self, url):
+ def create_short_link(self, url: str) -> str:
"""Return a shortened link.
Note that on Wikimedia wikis only metawiki supports this action,
and this wiki can process links to all WM domains.
:param url: The link to reduce, with protocol prefix.
- :type url: Str
:return: The reduced link, without protocol prefix.
- :rtype: str
"""
req = self.simple_request(action='shortenurl', url=url)
data = req.submit()
@@ -451,8 +447,8 @@
:param chars: Maximum characters to return.
:param sentences: How many sentences to return.
:param intro: Return only content before the first section.
- :param plaintext: Return extracts as plain text instead of limited
- HTML.
+ :param plaintext: Return extracts as plain text instead of
+ limited HTML.
:return: The extract of the page.
.. seealso::
diff --git a/pywikibot/site/_generators.py b/pywikibot/site/_generators.py
index a58080d..f5e9fff 100644
--- a/pywikibot/site/_generators.py
+++ b/pywikibot/site/_generators.py
@@ -721,7 +721,7 @@
def _rvprops(self, content: bool = False) -> list[str]:
"""Setup rvprop items for loadrevisions and preloadpages.
- :return: Rvprop items
+ :return: rvprop items
"""
props = ['comment', 'contentmodel', 'flags', 'ids', 'parsedcomment',
'sha1', 'size', 'tags', 'timestamp', 'user', 'userid']
@@ -1296,12 +1296,13 @@
:param image: The image to search for (FilePage need not exist on
the wiki)
- :param namespaces: If present, only iterate pages in these namespaces
- :param filterredir: If True, only yield redirects; if False (and not
- None), only yield non-redirects (default: yield both)
+ :param namespaces: If present, only iterate pages in these
+ namespaces.
+ :param filterredir: If True, only yield redirects; if False (and
+ not None), only yield non-redirects (default: yield both).
:param total: Iterate no more than this number of pages in total
- :param content: If True, load the current content of each iterated page
- (default False)
+ :param content: If True, load the current content of each
+ iterated page (default False)
:raises KeyError: A namespace identifier was not resolved
:raises TypeError: A namespace identifier has an inappropriate
type such as NoneType or bool
@@ -1314,7 +1315,7 @@
namespaces=namespaces,
total=total, g_content=content, **iuargs)
- def logevents(
+ def logevents( # docsig: disable=SIG305
self,
logtype: str | None = None,
user: str | None = None,
@@ -1333,25 +1334,29 @@
.. note:: logevents with `logtype='block'` only logs user blocks
whereas `site.blocks` iterates all blocks including IP ranges.
- .. note:: Due to an API limitation, if namespace param contains
- multiple namespaces, log entries from all namespaces will be
- fetched from the API and will be filtered later during
- iteration.
-
:param logtype: Only iterate entries of this type (see mediawiki
api documentation for available types)
:param user: Only iterate entries that match this user name
:param page: Only iterate entries affecting this page
- :param namespace: Namespace(s) to retrieve logevents from
+ :param namespace: Namespace(s) to retrieve logevents from.
+
+ .. note:: Due to an API limitation, if namespace param
+ contains multiple namespaces, log entries from all
+ namespaces will be fetched from the API and will be
+ filtered later during iteration.
+
:param start: Only iterate entries from and after this Timestamp
- :param end: Only iterate entries up to and through this Timestamp
- :param reverse: If True, iterate oldest entries first (default: newest)
+ :param end: Only iterate entries up to and through this
+ Timestamp
+ :param reverse: If True, iterate oldest entries first (default:
+ newest)
:param tag: Only iterate entries tagged with this tag
:param total: Maximum number of events to iterate
:raises KeyError: The namespace identifier was not resolved
:raises TypeError: The namespace identifier has an inappropriate
- type such as bool, or an iterable with more than one namespace
+ type such as bool, or an iterable with more than one
+ namespace.
"""
if start and end:
self.assert_valid_iter_params('logevents', start, end, reverse)
@@ -1405,9 +1410,9 @@
.. seealso:: :api:`RecentChanges`
:param start: Timestamp to start listing from
- :type start: Pywikibot.Timestamp
+ :type start: pywikibot.Timestamp
:param end: Timestamp to end listing at
- :type end: Pywikibot.Timestamp
+ :type end: pywikibot.Timestamp
:param reverse: If True, start with oldest changes (default: newest)
:param namespaces: Only iterate pages in these namespaces
:param changetype: Only iterate changes of this type ("edit" for
@@ -1668,7 +1673,7 @@
or self.has_right('undelete'))):
raise UserRightsError(err + 'deleted content.')
- def deletedrevs(
+ def deletedrevs( # docsig: disable=SIG305
self,
titles: str
| pywikibot.Page
@@ -1692,7 +1697,6 @@
also given with the content request.
.. seealso:: :api:`Deletedrevisions`
- .. note:: either titles or revids must be set but not both
:param titles: The page titles to check for deleted revisions
:param start: Iterate revisions starting at this Timestamp
@@ -1700,7 +1704,10 @@
:param reverse: Iterate oldest revisions first (default: newest)
:param content: If True, retrieve the content of each revision
:param total: Number of revisions to retrieve
- :keyword revids: Get revisions by their ID
+ :keyword revids: Get revisions by their ID.
+
+ .. note:: Either titles or revids must be set but not both
+
:keyword user: List revisions by this user
:keyword excludeuser: Exclude revisions by this user
:keyword tag: Only list revision tagged with this tag
diff --git a/pywikibot/textlib.py b/pywikibot/textlib.py
index 13cb656..942cef3 100644
--- a/pywikibot/textlib.py
+++ b/pywikibot/textlib.py
@@ -1050,7 +1050,7 @@
:param site: The site that the text is coming from. Required for
reorder of categories and interlanguage links. The default site
is used otherwise.
- :type site: Pywikibot.Site
+ :type site: pywikibot.Site
"""
# Translating the \\n (e.g. from command line) into binary \n
add = add.replace('\\n', '\n')
@@ -1456,7 +1456,7 @@
:param text: The text that needs to be modified.
:param site: The site that the text is coming from.
- :type site: Pywikibot.Site
+ :type site: pywikibot.Site
:param marker: If defined, marker is placed after the last language
link, or at the end of text if there are no language links.
:return: The modified text.
@@ -1488,7 +1488,7 @@
:param text: The text that needs to be modified.
:param site: The site that the text is coming from.
- :type site: Pywikibot.Site
+ :type site: pywikibot.Site
:param marker: If defined, marker is placed after the last language
link, or at the end of text if there are no language links.
:param separator: The separator string that will be removed if
@@ -1737,7 +1737,7 @@
:param text: The text that needs to be modified.
:param site: The site that the text is coming from.
- :type site: Pywikibot.Site
+ :type site: pywikibot.Site
:param marker: If defined, marker is placed after the last category
link, or at the end of text if there are no category links.
:return: The modified text.
@@ -1768,7 +1768,7 @@
:param text: The text that needs to be modified.
:param site: The site that the text is coming from.
- :type site: Pywikibot.Site
+ :type site: pywikibot.Site
:param marker: If defined, marker is placed after the last category
link, or at the end of text if there are no category links.
:param separator: The separator string that will be removed if
@@ -1790,8 +1790,10 @@
"""Replace old category with new one and return the modified text.
:param oldtext: Content of the old category
- :param oldcat: Pywikibot.Category object of the old category
- :param newcat: Pywikibot.Category object of the new category
+ :param oldcat: :class:`pywikibot.Category` object of the old
+ category
+ :param newcat: :class:`pywikibot.Category` object of the new
+ category
:param add_only: If add_only is True, the old category won't be
replaced and the category given will be added after it.
:return: The modified text
@@ -1939,7 +1941,7 @@
be either the raw name, [[Category:..]] or [[cat_localised_ns:...]].
:type categories: Iterable
:param insite: Used to localise the category namespace.
- :type insite: Pywikibot.Site
+ :type insite: pywikibot.Site
:return: String of categories
"""
if not categories:
diff --git a/pywikibot/time.py b/pywikibot/time.py
index 57dc30c..73576b3 100644
--- a/pywikibot/time.py
+++ b/pywikibot/time.py
@@ -472,7 +472,7 @@
:param timestamp: A timestamp to calculate a more accurate duration offset
used by years
- :type timestamp: Datetime.datetime
+ :type timestamp: datetime.datetime
:return: The corresponding timedelta object
"""
key, duration = parse_duration(string)
diff --git a/pywikibot/tools/__init__.py b/pywikibot/tools/__init__.py
index 2562ea0..182a83c 100644
--- a/pywikibot/tools/__init__.py
+++ b/pywikibot/tools/__init__.py
@@ -212,7 +212,7 @@
def __init__(
self,
message: str = '',
- category=Warning,
+ category: type[Warning] = Warning,
filename: str = ''
) -> None:
"""Initialize the object.
@@ -225,7 +225,6 @@
(case-insensitive)
:param category: A class (a subclass of Warning) of which the
warning category must be a subclass in order to match.
- :type category: Type
:param filename: A string containing a regular expression that
the start of the path to the warning module must match.
(case-sensitive)
diff --git a/pywikibot/tools/_deprecate.py b/pywikibot/tools/_deprecate.py
index 92751f5..27984e9 100644
--- a/pywikibot/tools/_deprecate.py
+++ b/pywikibot/tools/_deprecate.py
@@ -57,14 +57,13 @@
"""
-def add_decorated_full_name(obj, stacklevel: int = 1) -> None:
+def add_decorated_full_name(obj: Any, stacklevel: int = 1) -> None:
"""Extract full object name, including class, and store in __full_name__.
This must be done on all decorators that are chained together,
otherwise the second decorator will have the wrong full name.
:param obj: An object being decorated
- :type obj: Object
:param stacklevel: Stack level to use
"""
if hasattr(obj, '__full_name__'):
@@ -208,14 +207,15 @@
.. versionchanged:: 8.2
*warning_class* and *since* are keyword-only parameters.
- :param name: The name of the deprecated object
- :param instead: Suggested replacement for the deprecated object
- :param depth: *depth* + 1 will be used as stacklevel for the
- Warnings
- :param warning_class: A warning class (category) to be used,
- Defaults to FutureWarning
- :param since: A version string when the method or function was
- Deprecated
+
+ :param name: The name of the deprecated object
+ :param instead: Suggested replacement for the deprecated object
+ :param depth: Stacklevel depth; *depth* + 1 will be used as
+ stacklevel for the warnings
+ :param warning_class: A warning class (category) to be used,
+ defaults to FutureWarning
+ :param since: A version string when the method or function was
+ deprecated
"""
msg = _build_msg_string(instead, since)
if warning_class is None:
@@ -245,13 +245,12 @@
:param obj: Function being wrapped
:type obj: Object
"""
- def wrapper(*args, **kwargs):
+ def wrapper(*args, **kwargs) -> Any:
"""Replacement function.
:param args: Args passed to the decorated function.
:param kwargs: Kwargs passed to the decorated function.
:return: The value returned by the decorated function
- :rtype: any
"""
name = obj.__full_name__
depth = get_wrapper_depth(wrapper) + 1
diff --git a/pywikibot/tools/djvu.py b/pywikibot/tools/djvu.py
index 383989c..56db287 100644
--- a/pywikibot/tools/djvu.py
+++ b/pywikibot/tools/djvu.py
@@ -223,11 +223,10 @@
return self._remove_control_chars(stdoutdata)
@check_page_number
- def whiten_page(self, n) -> bool:
+ def whiten_page(self, n: int) -> bool:
"""Replace page 'n' of djvu file with a blank page.
:param n: Page n of djvu file
- :type n: Int
"""
# tmp files for creation/insertion of a white page.
white_ppm = os.path.join(self.dirname, 'white_page.ppm')
@@ -274,11 +273,10 @@
return True
@check_page_number
- def delete_page(self, n) -> bool:
+ def delete_page(self, n: int) -> bool:
"""Delete page 'n' of djvu file.
:param n: Page n of djvu file
- :type n: Int
"""
n_tot = self.number_of_images()
--
To view, visit https://gerrit.wikimedia.org/r/c/pywikibot/core/+/1226226?usp=email
To unsubscribe, or for help writing mail filters, visit https://gerrit.wikimedia.org/r/settings?usp=email
Gerrit-MessageType: merged
Gerrit-Project: pywikibot/core
Gerrit-Branch: master
Gerrit-Change-Id: I779c82e2abdfd465c9d501b24919e1df45398b9d
Gerrit-Change-Number: 1226226
Gerrit-PatchSet: 2
Gerrit-Owner: Xqt <info(a)gno.de>
Gerrit-Reviewer: Xqt <info(a)gno.de>
Gerrit-Reviewer: jenkins-bot
jenkins-bot has submitted this change. ( https://gerrit.wikimedia.org/r/c/pywikibot/core/+/1225072?usp=email )
Change subject: Fix: Do not raise UnknownExtensionError on non-Wikibase sites
......................................................................
Fix: Do not raise UnknownExtensionError on non-Wikibase sites
If a site is not connected to a Wikibase, return None instead of
raising UnknownExtensionError. This indicates that no related Page
was found.
Also no longer search for an archiveheader template if it is already
given with the archive template.
Bug: T414068
Change-Id: Ifeb4a9ba1f998034f6f365d3cbdda5110de86919
---
M pywikibot/site/_apisite.py
M scripts/archivebot.py
M tests/wikibase_tests.py
3 files changed, 24 insertions(+), 21 deletions(-)
Approvals:
jenkins-bot: Verified
Xqt: Looks good to me, approved
Tacsipacsi: Looks good to me, but someone else must approve
diff --git a/pywikibot/site/_apisite.py b/pywikibot/site/_apisite.py
index dce623e..282cc49 100644
--- a/pywikibot/site/_apisite.py
+++ b/pywikibot/site/_apisite.py
@@ -46,7 +46,6 @@
SiteDefinitionError,
SpamblacklistError,
TitleblacklistError,
- UnknownExtensionError,
)
from pywikibot.site._basesite import BaseSite
from pywikibot.site._decorators import need_right
@@ -1384,17 +1383,16 @@
.. versionchanged:: 7.7
No longer raise NotimplementedError if used with a Wikibase
site.
+ .. versionchanged:: 11.0
+ No longer raise UnknownExtensionError if site is not
+ connected to a Wikibase but return None instead.
:param item: Id number of item, "Q###",
:return: Page, or Category object given by Wikibase item number
for this site object.
-
- :raises pywikibot.exceptions.UnknownExtensionError: site has no
- Wikibase extension
"""
if not self.has_data_repository:
- raise UnknownExtensionError(
- f'Wikibase is not implemented for {self}.')
+ return None
repo = self.data_repository()
dp = pywikibot.ItemPage(repo, item)
diff --git a/scripts/archivebot.py b/scripts/archivebot.py
index 369d5cc..9b80a85 100755
--- a/scripts/archivebot.py
+++ b/scripts/archivebot.py
@@ -188,7 +188,7 @@
The ``-namespace`` option is now respected by ``-page`` option.
"""
#
-# (C) Pywikibot team, 2006-2025
+# (C) Pywikibot team, 2006-2026
#
# Distributed under the terms of the MIT license.
#
@@ -469,6 +469,10 @@
the current timestamp to the previous if the current is lower.
.. versionchanged:: 7.7
Load unsigned threads using timestamp of the next thread.
+ .. versionchanged:: 11.0
+ Use explicit check for 'archiveheader' to avoid eager
+ evaluation of :meth:`get_header_template` when an
+ archiveheader exists within the archive template.
"""
self.header = ''
self.threads = []
@@ -476,8 +480,16 @@
try:
text = self.get()
except NoPageError:
- self.header = self.archiver.get_attr('archiveheader',
- self.get_header_template())
+ # Use explicit check for 'archiveheader' instead of passing it as
+ # a default to get_attr(), because get_attr evaluates the default
+ # eagerly. Without this, get_header_template() would always be
+ # called, even when an archiveheader exists within the archive
+ # template, defeating lazy fallback.
+ if 'archiveheader' in self.archiver.attributes:
+ self.header = self.archiver.get_attr('archiveheader')
+ else:
+ self.header = self.get_header_template()
+
if self.params:
self.header = self.header % self.params
return
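The eager-evaluation pitfall this hunk works around is generic Python call semantics and can be shown with a minimal, hypothetical sketch (the `get_attr` and `expensive_fallback` names below are illustrative stand-ins, not the actual archivebot API):

```python
def get_attr(attributes, key, default=None):
    # Like dict.get(): the default is evaluated by the CALLER
    # before this function even runs.
    return attributes.get(key, default)


calls = []


def expensive_fallback():
    # Stands in for get_header_template(); records each invocation.
    calls.append('called')
    return 'fallback header'


attributes = {'archiveheader': 'given header'}

# Eager form: expensive_fallback() runs even though the key exists.
header = get_attr(attributes, 'archiveheader', expensive_fallback())
assert header == 'given header' and calls == ['called']

# Lazy form used by the patch: fall back only when the key is missing.
calls.clear()
if 'archiveheader' in attributes:
    header = attributes['archiveheader']
else:
    header = expensive_fallback()
assert header == 'given header' and calls == []
```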
diff --git a/tests/wikibase_tests.py b/tests/wikibase_tests.py
index 4097eb4..eb8cf13 100755
--- a/tests/wikibase_tests.py
+++ b/tests/wikibase_tests.py
@@ -1,7 +1,7 @@
#!/usr/bin/env python3
"""Tests for the Wikidata parts of the page module."""
#
-# (C) Pywikibot team, 2008-2025
+# (C) Pywikibot team, 2008-2026
#
# Distributed under the terms of the MIT license.
#
@@ -19,7 +19,6 @@
IsNotRedirectPageError,
IsRedirectPageError,
NoPageError,
- UnknownExtensionError,
WikiBaseError,
)
from pywikibot.page import ItemPage, PropertyPage, WikibasePage
@@ -1425,18 +1424,12 @@
with self.assertRaisesRegex(WikiBaseError, regex):
self.wdp.data_item()
- def test_has_data_repository(self, key) -> None:
- """Test that site has no data repository."""
- site = self.get_site(key)
- self.assertFalse(site.has_data_repository)
-
- def test_page_from_repository_fails(self, key) -> None:
- """Test that page_from_repository method fails."""
+ def test_missing_data_repository(self, key) -> None:
+ """Test page_from_repository with no data repository."""
site = self.get_site(key)
dummy_item = 'Q1'
- regex = r'^Wikibase is not implemented for .+\.$'
- with self.assertRaisesRegex(UnknownExtensionError, regex):
- site.page_from_repository(dummy_item)
+ self.assertFalse(site.has_data_repository)
+ self.assertIsNone(site.page_from_repository(dummy_item))
class TestJSON(WikidataTestCase):
--
To view, visit https://gerrit.wikimedia.org/r/c/pywikibot/core/+/1225072?usp=email
To unsubscribe, or for help writing mail filters, visit https://gerrit.wikimedia.org/r/settings?usp=email
Gerrit-MessageType: merged
Gerrit-Project: pywikibot/core
Gerrit-Branch: master
Gerrit-Change-Id: Ifeb4a9ba1f998034f6f365d3cbdda5110de86919
Gerrit-Change-Number: 1225072
Gerrit-PatchSet: 7
Gerrit-Owner: Xqt <info(a)gno.de>
Gerrit-Reviewer: MarcoAurelio <maurelio(a)toolforge.org>
Gerrit-Reviewer: Tacsipacsi <tacsipacsi(a)jnet.hu>
Gerrit-Reviewer: Xqt <info(a)gno.de>
Gerrit-Reviewer: jenkins-bot
jenkins-bot has submitted this change. ( https://gerrit.wikimedia.org/r/c/pywikibot/core/+/1225104?usp=email )
Change subject: Fix Docsig's SIG402 issues
......................................................................
Fix Docsig's SIG402 issues
Bug: T413960
Change-Id: I009bfecb9e409dfe65a438ae979f21b1f02888fa
---
M pyproject.toml
M pywikibot/__init__.py
M pywikibot/_wbtypes.py
M pywikibot/backports.py
M pywikibot/bot.py
M pywikibot/bot_choice.py
M pywikibot/comms/eventstreams.py
M pywikibot/config.py
M pywikibot/cosmetic_changes.py
M pywikibot/date.py
M pywikibot/diff.py
M pywikibot/editor.py
M pywikibot/family.py
M pywikibot/i18n.py
M pywikibot/interwiki_graph.py
M pywikibot/logentries.py
M pywikibot/login.py
M pywikibot/page/_basepage.py
M pywikibot/page/_category.py
M pywikibot/page/_collections.py
M pywikibot/page/_filepage.py
M pywikibot/page/_links.py
M pywikibot/page/_page.py
M pywikibot/page/_user.py
M pywikibot/page/_wikibase.py
M pywikibot/pagegenerators/__init__.py
M pywikibot/pagegenerators/_factory.py
M pywikibot/pagegenerators/_filters.py
M pywikibot/pagegenerators/_generators.py
M pywikibot/proofreadpage.py
M pywikibot/scripts/generate_user_files.py
M pywikibot/scripts/login.py
M pywikibot/scripts/wrapper.py
M pywikibot/site/_apisite.py
M pywikibot/site/_basesite.py
M pywikibot/site/_decorators.py
M pywikibot/site/_extensions.py
M pywikibot/site/_generators.py
M pywikibot/site/_namespace.py
M pywikibot/specialbots/_upload.py
M pywikibot/textlib.py
M pywikibot/time.py
M pywikibot/tools/__init__.py
M pywikibot/tools/_deprecate.py
M pywikibot/tools/djvu.py
M pywikibot/tools/itertools.py
M pywikibot/userinterfaces/gui.py
M pywikibot/userinterfaces/terminal_interface_base.py
M pywikibot/version.py
49 files changed, 1,292 insertions(+), 1,297 deletions(-)
Approvals:
jenkins-bot: Verified
Xqt: Looks good to me, approved
--
To view, visit https://gerrit.wikimedia.org/r/c/pywikibot/core/+/1225104?usp=email
To unsubscribe, or for help writing mail filters, visit https://gerrit.wikimedia.org/r/settings?usp=email
Gerrit-MessageType: merged
Gerrit-Project: pywikibot/core
Gerrit-Branch: master
Gerrit-Change-Id: I009bfecb9e409dfe65a438ae979f21b1f02888fa
Gerrit-Change-Number: 1225104
Gerrit-PatchSet: 2
Gerrit-Owner: Sanjai Siddharthan <sanjaisiddharth2002(a)gmail.com>
Gerrit-Reviewer: Xqt <info(a)gno.de>
Gerrit-Reviewer: jenkins-bot
jenkins-bot has submitted this change. ( https://gerrit.wikimedia.org/r/c/pywikibot/core/+/1224577?usp=email )
Change subject: families: Add support for meta.wikimedia.beta.wmcloud.org
......................................................................
families: Add support for meta.wikimedia.beta.wmcloud.org
Bug: T413060
Change-Id: I7f85173982175eb939549c3c7f5ea91188c24701
---
M pywikibot/families/meta_family.py
1 file changed, 16 insertions(+), 5 deletions(-)
Approvals:
jenkins-bot: Verified
Xqt: Looks good to me, approved
diff --git a/pywikibot/families/meta_family.py b/pywikibot/families/meta_family.py
index cb308ce..6b029c5 100644
--- a/pywikibot/families/meta_family.py
+++ b/pywikibot/families/meta_family.py
@@ -1,6 +1,6 @@
"""Family module for Meta Wiki."""
#
-# (C) Pywikibot team, 2005-2024
+# (C) Pywikibot team, 2005-2026
#
# Distributed under the terms of the MIT license.
#
@@ -9,18 +9,29 @@
from pywikibot import family
-# The Wikimedia Meta-Wiki family
-class Family(family.WikimediaOrgFamily):
+class Family(family.WikimediaFamily):
- """Family class for Meta Wiki."""
+ """Family class for Meta Wiki.
+
+ .. versionchanged:: 11.0
+ beta site code was added.
+ """
name = 'meta'
+ langs = {
+ 'meta': 'meta.wikimedia.org',
+ 'beta': 'meta.wikimedia.beta.wmcloud.org',
+ }
+ # Sites we want to edit but not count as real languages
+ test_codes = ['beta']
interwiki_forward = 'wikipedia'
cross_allowed = ['meta']
+ # Templates that indicate a category redirect
+ # Redirects to these templates are automatically included
category_redirect_templates = {
- 'meta': (
+ '_default': (
'Category redirect',
),
}
--
To view, visit https://gerrit.wikimedia.org/r/c/pywikibot/core/+/1224577?usp=email
Gerrit-MessageType: merged
Gerrit-Project: pywikibot/core
Gerrit-Branch: master
Gerrit-Change-Id: I7f85173982175eb939549c3c7f5ea91188c24701
Gerrit-Change-Number: 1224577
Gerrit-PatchSet: 2
Gerrit-Owner: Xqt <info(a)gno.de>
Gerrit-Reviewer: Xqt <info(a)gno.de>
Gerrit-Reviewer: Zache-tool <kimmo.virtanen(a)gmail.com>
Gerrit-Reviewer: jenkins-bot
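The change above maps site codes to hostnames via a `langs` dict and marks `beta` as a test code. As a rough illustration of that pattern (not Pywikibot's real class hierarchy — `MetaFamilySketch` and `hostname` are hypothetical names for this sketch):

```python
# Minimal sketch of the langs/test_codes mapping from the diff above.
# MetaFamilySketch and hostname() are hypothetical illustration names.
class MetaFamilySketch:
    name = 'meta'

    # Map site codes to hostnames, as in meta_family.py
    langs = {
        'meta': 'meta.wikimedia.org',
        'beta': 'meta.wikimedia.beta.wmcloud.org',
    }

    # Sites we want to edit but not count as real languages
    test_codes = ['beta']

    def hostname(self, code: str) -> str:
        """Return the hostname for a given site code."""
        return self.langs[code]


fam = MetaFamilySketch()
print(fam.hostname('beta'))  # meta.wikimedia.beta.wmcloud.org
```

Keeping `beta` in `test_codes` lets tooling treat the beta cluster as editable without counting it as a real language edition.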
jenkins-bot has submitted this change. ( https://gerrit.wikimedia.org/r/c/pywikibot/core/+/1225003?usp=email )
Change subject: tests: Fix flaky test_watch by using polling instead of fixed sleep
......................................................................
tests: Fix flaky test_watch by using polling instead of fixed sleep
The test_watch test in TestPageUserAction was failing intermittently because
it relied on a fixed 15-second sleep to wait for a watchlist update (unwatch)
to propagate. If the server or test environment was slow, the page would
still appear in the watchlist after the sleep, causing the
assertNotIn assertion to fail.
This patch replaces the fixed time.sleep(15) with a retry loop that polls
watched_pages every second for up to 30 seconds. This ensures:
* The test passes as soon as the change propagates (potentially faster).
* The test is robust against delays up to 30 seconds, reducing flakiness.
Bug: T414166
Change-Id: I3bc6a6d7a819e2f8548b49b1e6badd67b823865f
---
M AUTHORS.rst
M tests/page_tests.py
2 files changed, 10 insertions(+), 2 deletions(-)
Approvals:
Xqt: Looks good to me, approved
jenkins-bot: Verified
diff --git a/AUTHORS.rst b/AUTHORS.rst
index 04188ba..135011d 100644
--- a/AUTHORS.rst
+++ b/AUTHORS.rst
@@ -302,6 +302,7 @@
Sanjai Siddharthan
+ Sarthak Singh
Serio Santoro
Scot Wilcoxon
Shardul C
diff --git a/tests/page_tests.py b/tests/page_tests.py
index 1b1791e..6de059a 100755
--- a/tests/page_tests.py
+++ b/tests/page_tests.py
@@ -1105,8 +1105,15 @@
rv = userpage.watch(expiry='5 seconds')
self.assertTrue(rv)
self.assertIn(userpage, userpage.site.watched_pages(**wp_params))
- time.sleep(15)
- self.assertNotIn(userpage, userpage.site.watched_pages(**wp_params))
+ # Wait for the expiry to pass
+ time.sleep(5)
+ # Retry check for unwatch to propagate for up to 30 seconds
+ for _ in range(30):
+ time.sleep(1)
+ if userpage not in userpage.site.watched_pages(**wp_params):
+ break
+ else:
+ self.fail('unwatch failed')
class TestPageDelete(TestCase):
--
To view, visit https://gerrit.wikimedia.org/r/c/pywikibot/core/+/1225003?usp=email
Gerrit-MessageType: merged
Gerrit-Project: pywikibot/core
Gerrit-Branch: master
Gerrit-Change-Id: I3bc6a6d7a819e2f8548b49b1e6badd67b823865f
Gerrit-Change-Number: 1225003
Gerrit-PatchSet: 7
Gerrit-Owner: SarthakSingh2904 <sarthakunited2022(a)gmail.com>
Gerrit-Reviewer: Xqt <info(a)gno.de>
Gerrit-Reviewer: jenkins-bot
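The poll-with-timeout pattern this change adopts can be sketched as a small standalone helper; `wait_until` is a hypothetical name, not part of the patch:

```python
import time


def wait_until(predicate, timeout: float = 30.0, interval: float = 1.0) -> bool:
    """Poll *predicate* until it returns True or *timeout* elapses.

    Returns True as soon as the condition holds, or False on timeout,
    mirroring the retry loop the test now uses instead of a fixed sleep.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if predicate():
            return True
        time.sleep(interval)
    return False


# Example: the condition already holds, so this returns immediately.
print(wait_until(lambda: True, timeout=1, interval=0.01))  # True
```

Compared to a fixed `time.sleep(15)`, this returns as soon as the condition becomes true and only exhausts the full timeout in the failure case.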
jenkins-bot has submitted this change. ( https://gerrit.wikimedia.org/r/c/pywikibot/core/+/1225074?usp=email )
Change subject: IMPR: Add user-agent to message if Non-JSON response received from server
......................................................................
IMPR: Add user-agent to message if Non-JSON response received from server
The user agent (UA) can be responsible for some non-JSON responses.
Bug: T414170
Change-Id: I23dec15b3ef569e110020cd1492f1bdfa8952f2d
---
M pywikibot/data/api/_requests.py
1 file changed, 2 insertions(+), 0 deletions(-)
Approvals:
jenkins-bot: Verified
Xqt: Looks good to me, approved
diff --git a/pywikibot/data/api/_requests.py b/pywikibot/data/api/_requests.py
index 419439e..9ec54dd 100644
--- a/pywikibot/data/api/_requests.py
+++ b/pywikibot/data/api/_requests.py
@@ -799,11 +799,13 @@
text = removeDisabledParts(response.text, ['script'])
text = re.sub('\n{2,}', '\n',
'\n'.join(removeHTMLParts(text).splitlines()[:20]))
+ ua = response.request.headers.get('User-Agent')
msg = f"""\
Non-JSON response received from server {self.site} for url
{response.url}
The server may be down.
Status code: {response.status_code}
+User agent: {ua}
The text message is:
{text}
--
To view, visit https://gerrit.wikimedia.org/r/c/pywikibot/core/+/1225074?usp=email
Gerrit-MessageType: merged
Gerrit-Project: pywikibot/core
Gerrit-Branch: master
Gerrit-Change-Id: I23dec15b3ef569e110020cd1492f1bdfa8952f2d
Gerrit-Change-Number: 1225074
Gerrit-PatchSet: 1
Gerrit-Owner: Xqt <info(a)gno.de>
Gerrit-Reviewer: Xqt <info(a)gno.de>
Gerrit-Reviewer: jenkins-bot
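To show the effect of the change, here is a sketch of composing such a diagnostic message with the User-Agent included; `non_json_message` and its parameters are hypothetical names, and the wording only approximates the patched format string:

```python
def non_json_message(site, url, status_code, user_agent, text):
    """Build a diagnostic message that includes the request's User-Agent."""
    return (f'Non-JSON response received from server {site} for url\n'
            f'{url}\n'
            'The server may be down.\n'
            f'Status code: {status_code}\n'
            f'User agent: {user_agent}\n'
            'The text message is:\n'
            f'{text}')


msg = non_json_message('wikipedia:en', 'https://example.org/w/api.php',
                       502, 'Pywikibot/11.0', '<html>Bad Gateway</html>')
print('User agent:' in msg)  # True
```

Surfacing the User-Agent in the error output makes it easier to spot cases where the server rejected the request because of the UA header itself.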
jenkins-bot has submitted this change. ( https://gerrit.wikimedia.org/r/c/pywikibot/core/+/1225073?usp=email )
Change subject: doc: Add typing information and improve docstring for Request._http_request
......................................................................
doc: Add typing information and improve docstring for Request._http_request
Change-Id: I6ca773d1984d6df124918b5fd915920ef71bb066
---
M pywikibot/data/api/_requests.py
1 file changed, 36 insertions(+), 4 deletions(-)
Approvals:
jenkins-bot: Verified
Xqt: Looks good to me, approved
diff --git a/pywikibot/data/api/_requests.py b/pywikibot/data/api/_requests.py
index a8289f0..419439e 100644
--- a/pywikibot/data/api/_requests.py
+++ b/pywikibot/data/api/_requests.py
@@ -20,7 +20,7 @@
from contextlib import suppress
from email.mime.nonmultipart import MIMENonMultipart
from pathlib import Path
-from typing import Any, NoReturn
+from typing import TYPE_CHECKING, Any, NoReturn
from urllib.parse import unquote, urlencode, urlparse
from warnings import warn
@@ -42,6 +42,9 @@
from pywikibot.tools import deprecated
+if TYPE_CHECKING:
+ import requests
+
__all__ = ('CachedRequest', 'Request', 'encode_url')
TEST_RUNNING = os.environ.get('PYWIKIBOT_TEST_RUNNING', '0') == '1'
@@ -678,9 +681,25 @@
f'Headers: {headers!r}\nURI: {uri!r}\nBody: {body!r}')
return use_get, uri, body, headers
- def _http_request(self, use_get: bool, uri: str, data, headers,
- paramstring) -> tuple:
- """Get or post a http request with exception handling.
+ def _http_request(
+ self,
+ use_get: bool,
+ uri: str,
+ data: dict[str, str | int | float | bool] | None,
+ headers: dict[str, str] | None,
+ paramstring: str
+ ) -> tuple[requests.Response | None, bool]:
+ """Send an HTTP GET or POST request with exception handling.
+
+ This method wraps :func:`comms.http.request` to send a request
+ to the site's server, handle common HTTP errors, and optionally
+ retry using an alternative scheme or method.
+
+ .. note::
+ ImportError during request handling will terminate the
+ program; it is not propagated as an exception. Any other
+ unexpected exceptions are logged and trigger a wait before
+ retrying; they are not propagated to the caller.
.. versionchanged:: 8.2
change the scheme if the previous request didn't have json
@@ -688,9 +707,22 @@
.. versionchanged:: 9.2
no wait cycles for :exc:`ImportError` and :exc:`NameError`.
+ :param use_get: If True, send a GET request; otherwise send POST.
+ :param uri: The URI path to request on the site.
+ :param data: The data to send in the request body (for POST) or
+ query string (for GET).
+ :param headers: HTTP headers to include in the request.
+ :param paramstring: A string representing the request parameters
+ (used for logging/debug).
:return: a tuple containing requests.Response object from
:func:`comms.http.request` and *use_get* value
+ :raises Client414Error: If a 414 URI Too Long occurs on a POST
+ request after GET retry failed.
+ :raises ConnectionError: For network connection errors.
+ :raises FatalServerError: For critical server errors.
+ :raises NameError: If a NameError occurs during request handling.
+
:meta public:
"""
kwargs = {}
--
To view, visit https://gerrit.wikimedia.org/r/c/pywikibot/core/+/1225073?usp=email
Gerrit-MessageType: merged
Gerrit-Project: pywikibot/core
Gerrit-Branch: master
Gerrit-Change-Id: I6ca773d1984d6df124918b5fd915920ef71bb066
Gerrit-Change-Number: 1225073
Gerrit-PatchSet: 1
Gerrit-Owner: Xqt <info(a)gno.de>
Gerrit-Reviewer: Xqt <info(a)gno.de>
Gerrit-Reviewer: jenkins-bot
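The `TYPE_CHECKING` guard used in this change keeps `requests` an annotation-only dependency: the import runs for static type checkers but never at runtime. A minimal sketch of the pattern (the `handle` function is a hypothetical example, not code from the patch):

```python
from __future__ import annotations

from typing import TYPE_CHECKING

if TYPE_CHECKING:
    # Imported only for static type checkers; never executed at runtime.
    import requests


def handle(response: requests.Response | None) -> bool:
    """Return True if a response object was received at all."""
    return response is not None


print(handle(None))  # False
```

With `from __future__ import annotations`, annotations are stored as strings and never evaluated, so this module runs even where `requests` is not installed, while tools like mypy still see the precise `requests.Response | None` type.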