Happy Monday,
There are strange people who make links like this (kind of URL-encoded?):
[[Második világháború#Partrasz.C3.A1ll.C3.A1s Szic.C3.ADli.C3.A1ban
.28Huskey hadm.C5.B1velet.29|Huskey hadműveletben]]
So the section title must have been copied from the URL.
Do we have a ready tool to fix these?
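For context: these anchors use MediaWiki's old dot-encoding, where each non-ASCII byte of the section title becomes ".XX" (dot plus two hex digits) instead of "%XX". A minimal decoding sketch, assuming such a tool doesn't already exist (the function name is mine, not an existing pywikibot helper):

```python
import re
from urllib.parse import unquote

def decode_dot_anchor(anchor):
    # Old-style MediaWiki anchors encode each non-ASCII byte as ".XX"
    # (dot + two uppercase hex digits). Map those back to "%XX" and
    # percent-decode. Caveat: a literal dot followed by two hex-looking
    # characters in the title would be a false positive.
    return unquote(re.sub(r"\.([0-9A-F]{2})", r"%\1", anchor))

print(decode_dot_anchor("Partrasz.C3.A1ll.C3.A1s"))  # Partraszállás
```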
--
Bináris
Hello all
From one of my assignments as a bot operator I have some code which
does template parsing and general text parsing (e.g. Image/File tags).
It is not using regex and thus able to correctly parse nested
templates and other such nasty things. I have written those as library
classes and written tests for them which cover almost all of the code.
I would now really like to contribute that code back to the community.
Would you be interested in adding this code to the pywikibot
framework? If yes, can I send the code to someone for code review or
how do you usually operate?
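For readers wondering why regex falls short here: matching nested {{...}} pairs needs a depth counter, which regular expressions can't express directly. A minimal illustrative scanner (my sketch under that assumption, not Hannes' actual library code):

```python
def find_templates(text):
    """Return the outermost {{...}} spans in wikitext, handling
    nesting via a brace-depth counter (illustrative sketch only)."""
    spans, depth, start = [], 0, None
    i = 0
    while i < len(text) - 1:
        if text[i:i + 2] == "{{":
            if depth == 0:
                start = i          # remember where the outer template opens
            depth += 1
            i += 2
        elif text[i:i + 2] == "}}" and depth:
            depth -= 1
            if depth == 0:
                spans.append(text[start:i + 2])
            i += 2
        else:
            i += 1
    return spans

print(find_templates("{{cite|{{inner|x}}|y}} text {{stub}}"))
```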
Greetings
Hannes
PS: wiki userpage is http://en.wikipedia.org/wiki/User:Hannes_R%C3%B6st
Did this affect anyone?
Did pywikibot gracefully handle the problem?
If not, what happened and how can we improve for next time?
---------- Forwarded message ---------
From: Sjoerd de Bruin <sjoerddebruin(a)me.com>
Date: Thu, 30 Jun 2016 02:50
Subject: [Wikidata] Temporary outage of adding statements
To: Discussion list for the Wikidata project. <wikidata(a)lists.wikimedia.org>
Hello everyone,
It was impossible to add any statements to items from roughly 19:00 until
19:37 (both times UTC) due to a bug that appeared after deployment of new software.
More information on Phabricator [1], please check your bot runs for missing
data. We are sorry for any inconvenience caused.
Greetings,
Sjoerd de Bruin
[1] https://phabricator.wikimedia.org/T138974
_______________________________________________
Wikidata mailing list
Wikidata(a)lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata
Forwarding, not sure if this breaks pywikibot.
-------- Forwarded message --------
Subject: [Mediawiki-api-announce] [BREAKING CHANGE]
meta=notifications outputs array instead of object since 1.27.0-wmf.22
Date: Tue, 28 Jun 2016 15:49:03 +0200
From: Roan Kattouw <roan.kattouw(a)gmail.com>
Reply-To: mediawiki-api(a)lists.wikimedia.org
To: MediaWiki API announcements & discussion
<mediawiki-api-announce(a)lists.wikimedia.org>
As of 1.27.0-wmf.22 (deployed to WMF wikis April 26-28) and MediaWiki
release 1.27 (specifically, this change
<https://phabricator.wikimedia.org/rECHOb10bd700333d2eef10c7df0a9946b87fd451…>
in Echo), meta=notifications will return its result as an array rather
than an object.
Previously, the meta=notifications output looked like:
{
    "query": {
        "notifications": {
            "list": {
                "12345": {
                    "id": "12345",
                    "type": "edit-user-talk",
                    ...
                }, ...
            }
Now, it looks like:
{
    "query": {
        "notifications": {
            "list": [
                {
                    "id": "12345",
                    "type": "edit-user-talk",
                    ...
                }, ...
            ]
My apologies for the late announcement, and thanks to APerson and Ciencia
Al Poder for noticing <https://phabricator.wikimedia.org/T138690> and
encouraging me to write this announcement.
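Bot code that iterated over the old object-shaped "list" by key needs updating. A small normalizing helper that accepts both shapes (a hypothetical sketch, not part of pywikibot):

```python
def notification_list(result):
    # The "list" field was a JSON object keyed by notification id
    # before 1.27.0-wmf.22 and is a plain array afterwards; normalize
    # both shapes to a list of notification dicts.
    lst = result["query"]["notifications"]["list"]
    if isinstance(lst, dict):
        return list(lst.values())
    return lst

old = {"query": {"notifications": {"list": {"12345": {"id": "12345"}}}}}
new = {"query": {"notifications": {"list": [{"id": "12345"}]}}}
print(notification_list(old) == notification_list(new))  # True
```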
Hello !
I'm contributing to the genealogy wiki Rodovid (http://en.rodovid.org/wk/Main_Page), which runs version 1.9.3 of MediaWiki, and I have difficulties upgrading from compat to core (perhaps because I'm a beginner with Python).
I can log in to Wikipedia without problems, but when I try to do the same with Rodovid, typing "login" does not lead to the prompt where the password should be entered. Instead there is the message:
WARNING: API error unknown_meta: Unrecognised value for parameter 'meta'
ERROR: APIError: unknown_meta: Unrecognised value for parameter 'meta' [help:
then the entire API help page of Rodovid is displayed, followed by a loop of the lines:
WARNING: Http response status 503
WARNING: Non-JSON response received from server rodovid:fr; the server may be down.
WARNING: Waiting 5 seconds before retrying.
And if I try with the latest stable version, after having entered the password, I get:
Logging in to rodovid:fr as Ad
Traceback (most recent call last):
…
File "C:\compat\pywikibot\site.py", line 1684, in login
if loginMan.login(retry=True):
File "C:\compat\pywikibot\login.py", line 239, in login
cookiedata = self.getCookie()
File "C:\compat\pywikibot\data\api.py", line 2560, in getCookie
prefix = login_result['login']['cookieprefix']
KeyError: u'cookieprefix'
<type 'exceptions.KeyError'>
CRITICAL: Waiting for 1 network thread(s) to finish. Press ctrl-c to abort
Can someone tell me what's wrong?
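One way to narrow such failures down is to classify the raw api.php response before trying to use it: a MediaWiki as old as 1.9 predates most modern 'meta' values and may answer with an HTML help page instead of JSON, which matches the "Non-JSON response" warning above. A diagnostic sketch (mine, not pywikibot code):

```python
import json

def classify_api_response(body):
    # Classify a raw api.php response body: very old or broken wikis
    # may return an HTML help page instead of JSON, and old APIs reject
    # modern parameter values with an error such as "unknown_meta".
    try:
        data = json.loads(body)
    except ValueError:
        return "non-JSON response (server too old or down?)"
    if "error" in data:
        return "API error: " + data["error"].get("code", "unknown")
    return "ok"

print(classify_api_response('{"error": {"code": "unknown_meta"}}'))
```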
Hello,
About a month ago I sent a mail[0] about my GSoC project on porting
catimages to pywikibot-core with my mentors DrTrigon[1] and jayvdb[2]. As a
step towards that, we've made a library to analyze files on Commons using
exif data and computer vision techniques, which will be used in the bot. We
recently released v0.1.0.
Currently, the library is able to identify mimetypes, detect barcodes,
detect faces, read exif data, and measure the average color. You can read
more about the library at User:AbdealiJK/file-metadata[3]. It contains
installation instructions and also a simple script using pywikibot which
can be used to analyze files on Commons.
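As an illustration of the simplest of those measurements, averaging color channels over pixels can be sketched as follows (an illustrative stand-in, not file-metadata's actual implementation):

```python
def average_color(pixels):
    # Mean of each RGB channel over an iterable of (r, g, b) tuples;
    # a real tool would first decode the image file to pixels
    # (e.g. with Pillow) before averaging.
    n = 0
    totals = [0.0, 0.0, 0.0]
    for r, g, b in pixels:
        totals[0] += r
        totals[1] += g
        totals[2] += b
        n += 1
    return tuple(t / n for t in totals)

print(average_color([(0, 0, 0), (255, 255, 255)]))  # (127.5, 127.5, 127.5)
```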
We've been running the library on a number of files (35,000+) on Commons to
test for corner cases and to check its validity. You can find the logs of
that analysis on
https://commons.wikimedia.org/wiki/User:AbdealiJKTravis/logs . Some
discrepancies have been seen, and it would be great to hear your
comments on them.
It would also be immensely helpful if users could install the library and
test it out. If any problems arise, please open an issue at the bug
tracker[4] on GitHub or on the Talk page so that we can help you out and
also make the library more robust.
Regards,
Abdeali JK
References:
[0] - https://lists.wikimedia.org/pipermail/commons-l/2016-May/007740.html
[1] - https://commons.wikimedia.org/wiki/User:DrTrigon
[2] - https://commons.wikimedia.org/wiki/User:Jayvdb
[3] - https://commons.wikimedia.org/wiki/User:AbdealiJK/file-metadata
[4] - https://github.com/AbdealiJK/file-metadata/issues