Hi,
I write scripts, mostly for temporary tasks. I understand that pwb.py
handles arguments and logging and finds my script in scripts/userscripts.
Most of these scripts don't expect arguments; the task is wired in. So is
there any advantage to using pwb.py in this case?
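For context, here is a rough stdlib-only sketch (a toy imitation, not pwb.py's actual parser) of the kind of global-option handling the wrapper gives every script for free, even one whose task is wired in:

```python
import sys

# Toy imitation of pwb.py's global-option handling: options such as
# -lang:, -family:, -user: and -simulate are consumed by the wrapper
# before the script runs, so even a "no-argument" script can still be
# steered from the command line when run through pwb.py.
GLOBAL_PREFIXES = ('-lang:', '-family:', '-user:', '-simulate')

def split_args(argv):
    """Separate pwb-style global options from script-local arguments."""
    global_opts = [a for a in argv if a.startswith(GLOBAL_PREFIXES)]
    local_args = [a for a in argv if not a.startswith(GLOBAL_PREFIXES)]
    return global_opts, local_args

print(split_args(['-lang:hu', '-simulate', 'mytask']))
# → (['-lang:hu', '-simulate'], ['mytask'])
```

Running the script directly instead of through the wrapper loses this handling, along with the logging setup mentioned above.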
--
Bináris
ERROR: Traceback (most recent call last):
  File "C:\Users\IMOE001\Downloads\pywikibot-master\pywikibot-master\pywikibot\site\_basesite.py", line 192, in __getattr__
    method = getattr(self.family, attr)
AttributeError: 'Family' object has no attribute '_namespaces'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Users\IMOE001\Downloads\pywikibot-master\pywikibot-master\pywikibot\tools\__init__.py", line 726, in wrapper
    return getattr(obj, cache_name)
  File "C:\Users\IMOE001\Downloads\pywikibot-master\pywikibot-master\pywikibot\site\_basesite.py", line 200, in __getattr__
    raise AttributeError("{} instance has no attribute '{}'"
AttributeError: APISite instance has no attribute '_namespaces'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Users\IMOE001\Downloads\pywikibot-master\pywikibot-master\pywikibot\data\api\_requests.py", line 681, in _http_request
    response = http.request(self.site, uri=uri,
  File "C:\Users\IMOE001\Downloads\pywikibot-master\pywikibot-master\pywikibot\comms\http.py", line 233, in request
    r = fetch(baseuri, headers=headers, **kwargs)
  File "C:\Users\IMOE001\Downloads\pywikibot-master\pywikibot-master\pywikibot\comms\http.py", line 399, in fetch
    callback(response)
  File "C:\Users\IMOE001\Downloads\pywikibot-master\pywikibot-master\pywikibot\comms\http.py", line 272, in error_handling_callback
    raise FatalServerError(str(response))
pywikibot.exceptions.FatalServerError: HTTPSConnectionPool(host='he.wikisource.org', port=443): Max retries exceeded with url: /w/api.php?action=query&meta=siteinfo%7Cuserinfo&siprop=namespaces%7Cnamespacealiases%7Cgeneral&continue=&uiprop=blockinfo%7Chasmsg&maxlag=5&format=json (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:997)')))
Traceback (most recent call last):
  File "C:\Users\IMOE001\Downloads\pywikibot-master\pywikibot-master\pywikibot\site\_basesite.py", line 192, in __getattr__
    method = getattr(self.family, attr)
AttributeError: 'Family' object has no attribute '_namespaces'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Users\IMOE001\Downloads\pywikibot-master\pywikibot-master\pywikibot\tools\__init__.py", line 726, in wrapper
    return getattr(obj, cache_name)
  File "C:\Users\IMOE001\Downloads\pywikibot-master\pywikibot-master\pywikibot\site\_basesite.py", line 200, in __getattr__
    raise AttributeError("{} instance has no attribute '{}'"
AttributeError: APISite instance has no attribute '_namespaces'. Did you mean: 'namespaces'?

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Users\IMOE001\Downloads\pywikibot-master\pywikibot-master\pwb.py", line 39, in <module>
    sys.exit(main())
  File "C:\Users\IMOE001\Downloads\pywikibot-master\pywikibot-master\pwb.py", line 35, in main
    runpy.run_path(str(path), run_name='__main__')
  File "C:\Program Files\WindowsApps\PythonSoftwareFoundation.Python.3.10_3.10.2288.0_x64__qbz5n2kfra8p0\lib\runpy.py", line 289, in run_path
    return _run_module_code(code, init_globals, run_name,
  File "C:\Program Files\WindowsApps\PythonSoftwareFoundation.Python.3.10_3.10.2288.0_x64__qbz5n2kfra8p0\lib\runpy.py", line 96, in _run_module_code
    _run_code(code, mod_globals, init_globals,
  File "C:\Program Files\WindowsApps\PythonSoftwareFoundation.Python.3.10_3.10.2288.0_x64__qbz5n2kfra8p0\lib\runpy.py", line 86, in _run_code
    exec(code, run_globals)
  File "C:\Users\IMOE001\Downloads\pywikibot-master\pywikibot-master\pywikibot\scripts\wrapper.py", line 516, in <module>
    main()
  File "C:\Users\IMOE001\Downloads\pywikibot-master\pywikibot-master\pywikibot\scripts\wrapper.py", line 500, in main
    if not execute():
  File "C:\Users\IMOE001\Downloads\pywikibot-master\pywikibot-master\pywikibot\scripts\wrapper.py", line 487, in execute
    run_python_file(filename, script_args, module)
  File "C:\Users\IMOE001\Downloads\pywikibot-master\pywikibot-master\pywikibot\scripts\wrapper.py", line 147, in run_python_file
    exec(compile(source, filename, 'exec', dont_inherit=True),
  File "C:\Users\IMOE001\Downloads\pywikibot-master\pywikibot-master\scripts\redirect.py", line 742, in <module>
    main()
  File "C:\Users\IMOE001\Downloads\pywikibot-master\pywikibot-master\scripts\redirect.py", line 733, in main
    if gen_factory.namespaces:
  File "C:\Users\IMOE001\Downloads\pywikibot-master\pywikibot-master\pywikibot\pagegenerators\_factory.py", line 191, in namespaces
    self.site.namespaces.resolve(self._namespaces))
  File "C:\Users\IMOE001\Downloads\pywikibot-master\pywikibot-master\pywikibot\tools\__init__.py", line 728, in wrapper
    val = fn(obj)
  File "C:\Users\IMOE001\Downloads\pywikibot-master\pywikibot-master\pywikibot\site\_basesite.py", line 246, in namespaces
    return NamespacesDict(self._build_namespaces())
  File "C:\Users\IMOE001\Downloads\pywikibot-master\pywikibot-master\pywikibot\site\_apisite.py", line 1063, in _build_namespaces
    for nsdata in self.siteinfo.get('namespaces', cache=False).values():
  File "C:\Users\IMOE001\Downloads\pywikibot-master\pywikibot-master\pywikibot\site\_siteinfo.py", line 304, in get
    preloaded = self._get_general(key, expiry)
  File "C:\Users\IMOE001\Downloads\pywikibot-master\pywikibot-master\pywikibot\site\_siteinfo.py", line 242, in _get_general
    default_info = self._get_siteinfo(props, expiry)
  File "C:\Users\IMOE001\Downloads\pywikibot-master\pywikibot-master\pywikibot\site\_siteinfo.py", line 167, in _get_siteinfo
    data = request.submit()
  File "C:\Users\IMOE001\Downloads\pywikibot-master\pywikibot-master\pywikibot\data\api\_requests.py", line 1236, in submit
    self._data = super().submit()
  File "C:\Users\IMOE001\Downloads\pywikibot-master\pywikibot-master\pywikibot\data\api\_requests.py", line 965, in submit
    response, use_get = self._http_request(use_get, uri, body, headers,
  File "C:\Users\IMOE001\Downloads\pywikibot-master\pywikibot-master\pywikibot\data\api\_requests.py", line 681, in _http_request
    response = http.request(self.site, uri=uri,
  File "C:\Users\IMOE001\Downloads\pywikibot-master\pywikibot-master\pywikibot\comms\http.py", line 233, in request
    r = fetch(baseuri, headers=headers, **kwargs)
  File "C:\Users\IMOE001\Downloads\pywikibot-master\pywikibot-master\pywikibot\comms\http.py", line 399, in fetch
    callback(response)
  File "C:\Users\IMOE001\Downloads\pywikibot-master\pywikibot-master\pywikibot\comms\http.py", line 272, in error_handling_callback
    raise FatalServerError(str(response))
pywikibot.exceptions.FatalServerError: HTTPSConnectionPool(host='he.wikisource.org', port=443): Max retries exceeded with url: /w/api.php?action=query&meta=siteinfo%7Cuserinfo&siprop=namespaces%7Cnamespacealiases%7Cgeneral&continue=&uiprop=blockinfo%7Chasmsg&maxlag=5&format=json (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:997)')))
CRITICAL: Exiting due to uncaught exception <class 'pywikibot.exceptions.FatalServerError'>
Best regards,
ישראל קלר.
Hi,
I want to write a script that monitors block logs, or the list of active
blocks made for a certain reason (open proxy), and warns me when a block
is about to expire.
The warning may be written either to a noticeboard or to a mailing list;
the main thing is to find these blocks.
Rationale: we give a one-year block to proxies, but after expiration they
are likely to still be open proxies and need review.
Do we have a clever tool for this? How would you begin the task?
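One possible starting point (a sketch, not an existing tool): the MediaWiki API's list=blocks module returns active blocks with their reason and expiry, so the "expires soon" part is plain date arithmetic. The field names below follow that API's JSON output; the proxy-reason filter and the sample records are assumptions:

```python
from datetime import datetime, timedelta, timezone

def expiring_soon(blocks, days=30, now=None, reason_substring='proxy'):
    """Return blocks whose reason mentions `reason_substring` and whose
    expiry (ISO 8601, as in the list=blocks API output) falls within
    the next `days` days."""
    now = now or datetime.now(timezone.utc)
    horizon = now + timedelta(days=days)
    result = []
    for block in blocks:
        if reason_substring not in block.get('reason', '').lower():
            continue
        expiry = block.get('expiry', 'infinity')
        if expiry in ('infinity', 'indefinite'):
            continue  # never expires, nothing to warn about
        when = datetime.fromisoformat(expiry.replace('Z', '+00:00'))
        if now <= when <= horizon:
            result.append(block)
    return result

# Made-up sample data in the API's shape, with a fixed "now" for clarity.
sample = [
    {'user': '192.0.2.1', 'reason': 'Open proxy',
     'expiry': '2023-01-10T00:00:00Z'},
    {'user': 'SomeVandal', 'reason': 'vandalism',
     'expiry': '2023-01-05T00:00:00Z'},
]
fixed_now = datetime(2023, 1, 1, tzinfo=timezone.utc)
print(expiring_soon(sample, days=30, now=fixed_now))
```

The warning step could then write the result to a noticeboard page or mail it; running the script on a schedule (cron or Windows Task Scheduler) completes the loop.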
--
Bináris
Hi,
for the past few weeks I've been looking at Pywikibot's documentation
to figure out if there's anything I can do to help improve it. Most of
this was done as part of [1]. This week I put my conclusions and
proposals for such improvements in [2]. I am hoping to work on some of
them until the end of this year, but I would appreciate feedback,
ideas, and suggestions about these proposals first.
Please take a look at [2] and use the talk page to provide your
feedback. You can also subscribe to [3] to follow my work on this or
help contribute to these improvements in other ways.
Thank you,
Kamil
[1]: https://phabricator.wikimedia.org/T312992
[2]: https://www.mediawiki.org/wiki/User:KBach-WMF/Collections/PywikibotCollecti…
[3]: https://phabricator.wikimedia.org/T320625
--
Kamil Bach
Technical Writer
Wikimedia Foundation
Hi,
I am new to core.
Is there currently any possibility to work only on flagged pages?
https://en.wikipedia.org/wiki/Wikipedia:Flagged_revisions /
https://de.wikipedia.org/wiki/Hilfe:Gesichtete_Versionen
It would be useful to work only on currently flagged or currently unflagged
versions. For the flagged versions, a command-line switch would be useful
for non-programmers.
Rationale:
When unflagged versions are edited by bots, the diff to check can be messy,
and users get angry with bot owners.
Although some changes (e.g. spelling corrections) are necessary, others,
for example cosmetic changes, could be prohibited by communities on
unflagged pages.
--
Bináris