Happy Monday,
Some users create links like this (a kind of URL-encoded section anchor?):
[[Második világháború#Partrasz.C3.A1ll.C3.A1s Szic.C3.ADli.C3.A1ban
.28Huskey hadm.C5.B1velet.29|Huskey hadműveletben]]
So the section title must have been copied from the URL.
Do we have a ready tool to fix these?
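For reference, the encoding is MediaWiki's legacy anchor scheme: UTF-8 percent-encoding with '.' substituted for '%'. A minimal decoding sketch (the function name is mine, and note that a bare regex like this can false-match a literal '.XX' sequence in a title):

```python
import re
import urllib.parse

def decode_legacy_anchor(anchor: str) -> str:
    """Decode a MediaWiki legacy section anchor, where percent-encoded
    UTF-8 bytes use '.' instead of '%' (e.g. '.C3.A1' -> 'á')."""
    # Turn '.C3'-style escapes back into '%C3', then percent-decode.
    percent = re.sub(r'\.([0-9A-F]{2})', r'%\1', anchor)
    return urllib.parse.unquote(percent)

print(decode_legacy_anchor(
    'Partrasz.C3.A1ll.C3.A1s Szic.C3.ADli.C3.A1ban .28Huskey hadm.C5.B1velet.29'))
# Partraszállás Szicíliában (Huskey hadművelet)
```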
--
Bináris
Hi,
+cc pywikibot(a)lists.wikimedia.org
On 12/22/2016 03:55 PM, Anthony Di Franco wrote:
> Hi all,
> I'm doing some renovations on recitation-bot and running into trouble when
> the time comes for pywikibot to upload article data to wikisource and
> commons. The thread doing so hangs without any sort of informative error. I
> made sure that the unix user under which the web service that is using
> pywikibot is running is logged into each wiki per Max's advice but I still
> have the problem. I'm going to try to get more information about what's
> going on but would also appreciate pointers about what might be going
> wrong. Particularly, the web service is now running under Kubernetes rather
> than sun grid engine, so I suspect that the login state might not be making
> it into the container - can anyone advise on where the login state is
> maintained and whether this will be transferred into the kubernetes
> container?
Pywikibot stores all of its state in the same directory that your
user-config.py file is in.
In my experience, a hanging Python process is typically an accidental
infinite loop. Adding debug logging can help pinpoint where it starts
hanging and narrow down the problematic code.
-- Legoktm
We run a private wiki, requiring logins to read. Pywikibot worked fine with MediaWiki 1.26, but since upgrading to MediaWiki 1.27.1, Pywikibot can no longer log in. Any debugging help is appreciated! The symptom is:
$ pwb.py login.py
ERROR: APIError: readapidenied: You need read permission to use this module
Password for user Danb on examplewiki:en (no characters will be shown): ********
Logging in to examplewiki:en as Danb
WARNING: API warning (login): Fetching a token via action=login is deprecated. Use action=query&meta=tokens&type=login instead.
ERROR: Login failed (Failed).
My family file DOES contain an isPublic method that returns False, as instructed for private wikis at https://www.mediawiki.org/wiki/Manual:Pywikibot/Use_on_third-party_wikis#Bo….
Also, authentication is by LDAP (using extension LdapAuthentication from mediawiki.org) and requires username, password, and domain. This worked fine for Pywikibot in MW 1.26.
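For comparison, a minimal private-wiki family file looks roughly like this (a non-runnable config sketch, not DanB's actual file; the class layout follows Pywikibot 2.0 family files, and the names and URL are the placeholders from above):

```python
# families/examplewiki_family.py -- hypothetical minimal family file
from pywikibot import family

class Family(family.Family):
    name = 'examplewiki'
    langs = {'en': 'examplewiki.net'}

    def scriptpath(self, code):
        return '/w'

    def protocol(self, code):
        return 'https'

    def isPublic(self):
        # Tell Pywikibot it must log in before issuing any API read,
        # which avoids the initial "readapidenied" error on private wikis.
        return False
```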
I tried generating a new family file, and that failed too:
$ ./generate_family_file.py https://examplewiki.net/wiki/Home examplewiki
Generating family file from https://examplewiki.net/wiki/Home
Traceback (most recent call last):
File "./generate_family_file.py", line 321, in <module>
FamilyFileGenerator(*sys.argv[1:]).run()
File "./generate_family_file.py", line 94, in run
w = Wiki(self.base_url)
File "./generate_family_file.py", line 262, in __init__
self._parse_post_117(wp, fromurl)
File "./generate_family_file.py", line 293, in _parse_post_117
info = json.loads(data.read().decode(data.charset))['query']['general']
KeyError: u'query'
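The KeyError suggests the siteinfo response carried an API error object instead of a 'query' key -- on a private wiki, plausibly the same readapidenied as above. A small sketch of parsing that surfaces the error instead of the bare KeyError (the response body below is simulated, with an assumed shape):

```python
import json

def parse_siteinfo(raw: bytes):
    """Parse an action=query&meta=siteinfo response body, surfacing API
    errors instead of a bare KeyError (the failure mode in the traceback)."""
    data = json.loads(raw.decode('utf-8'))
    if 'error' in data:
        # A private wiki answers with e.g. code 'readapidenied' until
        # you are logged in, so the 'query' key is missing entirely.
        raise RuntimeError('API error: %s' % data['error'].get('code'))
    return data['query']['general']

# Simulated response from a private wiki:
denied = b'{"error": {"code": "readapidenied", "info": "You need read permission"}}'
try:
    parse_siteinfo(denied)
except RuntimeError as e:
    print(e)  # API error: readapidenied
```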
Here is the Pywikibot version info:
$ pwb.py version.py
Pywikibot: [https] r-pywikibot-core (fd4707e, g5894, 2016/12/05, 19:10:45, n/a)
Release version: 2.0rc5
httplib2 version: 0.9
cacerts: /home/danb/pywikibot/externals/httplib2/python2/httplib2/cacerts.txt
certificate test: ok
Python: 2.7.12 (default, Jul 1 2016, 15:12:24)
[GCC 5.4.0 20160609]
unicode test: ok
PYWIKIBOT2_DIR: Not set
PYWIKIBOT2_DIR_PWB: /home/danb/pywikibot
PYWIKIBOT2_NO_USER_CONFIG: Not set
Config base dir: /home/danb/pywikibot
Usernames for family "examplewiki":
en: danb (also sysop)
Thanks again,
DanB
Hello,
I'm not sure I reported this to the correct mailing list, because I have
had no news about it.
According to a frwiki user, the cosmetic change <br> -> <br /> is incorrect
because the correct tag is <br>, even if <br /> is accepted.
See
https://fr.m.wikipedia.org/wiki/Utilisateur:HerculeBot#/talk/38 for details
(in French)
Can someone correct this in the default cosmetic changes used by the
Pywikipedia scripts?
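If the substitution should go the other way, it is a one-line regex; a hedged sketch (the names are mine, and this is not the actual cosmetic_changes code):

```python
import re

# Normalize XHTML-style <br /> back to HTML5 <br>, the direction the
# frwiki user asks for; case and spacing variants are covered.
BR_RE = re.compile(r'<br\s*/\s*>', re.IGNORECASE)

def normalize_br(text: str) -> str:
    return BR_RE.sub('<br>', text)

print(normalize_br('line one<br />line two<BR/>end'))
# line one<br>line two<br>end
```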
Regards
Hercule
Hi,
My name is Mayank Jindal. I am a third-year undergraduate student at the
Indian Institute of Technology, Kharagpur. I want to take part in GSoC 2017
with Wikimedia.
I have seen some Pywikibot-related projects on the possible tech projects
list, so I am getting started with easy bugs in Pywikibot.
I have started with - https://phabricator.wikimedia.org/T150299
Kindly help me in getting started with development on pywikibot.
--
Kind Regards,
Mayank Jindal,
Third year undergraduate student,
Indian Institute of Technology Kharagpur
Mobile : +91- 7076670299 || 8875432718
Hi,
I've noticed https://phabricator.wikimedia.org/T70797 has been
proposed for GCI, but cannot be published due to a lack of mentor
(mentor being Ladsgroup, I suppose?)
Is there anyone else who can take over as mentor? (it will go through
code review anyway, so we can all take a look)
Strainu
Hello,
I used Compat on Tools Labs. I am now trying to install Core, but it doesn't
work.
I followed the instructions on
https://www.mediawiki.org/wiki/Manual:Pywikibot/Installation/Labs
Logged in as my bot account, I run:
tools.herculebot@tools-bastion-03:~$ git clone --recursive
https://gerrit.wikimedia.org/r/pywikibot/core.git pywikibot-core
Cloning into 'pywikibot-core'...
remote: Counting objects: 1488, done
remote: Finding sources: 100% (200/200)
remote: Getting sizes: 100% (113/113)
remote: Compressing objects: 100% (2623205/2623205)
error: RPC failed; result=56, HTTP code = 200MiB | 2.29 MiB/s
fatal: The remote end hung up unexpectedly
fatal: early EOF
fatal: index-pack failed
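For what it's worth, "RPC failed; result=56" during a large clone is usually a dropped HTTP connection on a big pack; two common workarounds, assuming that is the cause here (the buffer size is an arbitrary example):

```shell
# 1. Raise git's HTTP post buffer (here ~500 MB) so large transfers
#    are less likely to be cut off mid-stream:
git config --global http.postBuffer 524288000

# 2. Or fetch a shallow clone first (far less data), then deepen it
#    and pull in the submodules (shown as comments; they need network):
#   git clone --depth 1 https://gerrit.wikimedia.org/r/pywikibot/core.git pywikibot-core
#   cd pywikibot-core
#   git fetch --unshallow
#   git submodule update --init
```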
Can anybody help me? I can no longer run the pywikipedia scripts on
fr.wiki.
Regards
Hercule
Hi Martin,
On 4 December 2016 at 18:29, Martin Urbanec <martin.urbanec(a)wikimedia.cz>
wrote:
> I was running weblinkchecker.py for the whole cswiki (the job was submitted to the
> grid at Sun, 20 Nov 2016 16:54:24 GMT) because I wished to have a list of
> dead links. This may correspond with the UA (because I used a script named
> weblinkchecker.py). I trusted that this script wouldn't do anything wrong,
> because this script was and still is in the standard core package. I also use
> the 3.0-dev version of pywikibot and Python 2.7.6.
>
>
It probably wasn't you, but it was indeed the standard weblinkchecker
causing this. Apparently no throttling is implemented -- just a maximum
number of parallel connections. Many parallel connections are fine... but
not to the same host. This bot was running on eswiki, and eswiki has
thousands of links to http://www.minorplanetcenter.net/.
I have contacted the user, and will file a bug for Pywikibot to get this
solved on that end.
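A per-host throttle of the sort that is missing could look like this (a sketch, not weblinkchecker's actual code; the class name and delay are illustrative):

```python
import time
from collections import defaultdict
from urllib.parse import urlsplit

class HostThrottle:
    """Enforce a minimum delay between requests to the same host,
    while requests to different hosts proceed unhindered."""

    def __init__(self, min_delay=5.0):
        self.min_delay = min_delay
        self.last_hit = defaultdict(float)  # host -> monotonic timestamp

    def wait(self, url):
        host = urlsplit(url).hostname
        elapsed = time.monotonic() - self.last_hit[host]
        if elapsed < self.min_delay:
            time.sleep(self.min_delay - elapsed)
        self.last_hit[host] = time.monotonic()

throttle = HostThrottle(min_delay=0.1)
for url in ['http://www.minorplanetcenter.net/a',
            'http://www.minorplanetcenter.net/b']:
    throttle.wait(url)  # second call sleeps until 0.1 s have passed
```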
Best,
Merlijn
Forwarding this as there might be a pywikibot operator subscribed to this list.
--
User:Whym
Member of Wikimedian Society of Tokyo (東京ウィキメディアン会) / http://tokyo.wikimedia.jp
---------- Forwarded message ----------
From: Maximilian Doerr <maximilian.doerr(a)gmail.com>
Date: Sun, Dec 4, 2016 at 1:51 PM
Subject: [Labs-l] Someone using a Python framework is relentlessly
hammering Harvard sites, resulting in an IP range ban.
To: labs-l(a)lists.wikimedia.org
Would the user who is querying the Harvard sites for planet data, who
is carrying the UA “weblinkchecker Pywikibot/3.0-dev (g7171)
requests/2.2.1 Python/2.7.6.final.0”, please stop, or severely
throttle the GET requests. It’s making 168 requests a minute to that
site, and consequently they banned Labs from accessing it, according
to the IT department there, who kindly shared the access log with me.
I would imagine it’s also not being very friendly with the bandwidth
usage of Labs itself.
Consequently, InternetArchiveBot cannot ascertain whether the site is
alive or not because of this ban, and while I am working on a solution
for these cases, it’s really best if the bot is able to make these
decisions on its own rather than deferring to a whitelist of sorts.
Cyberpower678
English Wikipedia Account Creation Team
Mailing List Moderator
Global User Renamer
_______________________________________________
Labs-l mailing list
Labs-l(a)lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/labs-l