https://bugzilla.wikimedia.org/show_bug.cgi?id=60739
Web browser: ---
Bug ID: 60739
Summary: Missing message for the anti-spam filter error
Product: Pywikibot
Version: compat (1.0)
Hardware: PC
OS: Windows XP
Status: UNCONFIRMED
Severity: normal
Priority: Unprioritized
Component: General
Assignee: Pywikipedia-bugs(a)lists.wikimedia.org
Reporter: basilicofresco(a)gmail.com
Classification: Unclassified
Mobile Platform: ---
At the moment you do not get any error message when replace.py tries to change
a page and the spam filter blocks the edit.
For example, running
replace.py "htttp" "http" -page:"Clinique makeup" -lang:en
does not save the edit, but you still get:
1 page was changed.
Updating page [[Clinique makeup]] via API
The variable "error" at line 10317 in wikipedia.py is None, but should be
SpamfilterError.
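A minimal sketch of the kind of check that could raise SpamfilterError here,
assuming the API reports the hit with error code "spamblacklist" (the response
keys are assumptions; this is not the actual wikipedia.py code):

class SpamfilterError(Exception):
    """Raised when the spam blacklist rejects an edit."""
    def __init__(self, url):
        self.url = url
        Exception.__init__(self, 'Edit blocked by spam filter: %s' % url)

def check_edit_result(result):
    """Raise SpamfilterError if the API reply reports a spam-blacklist hit."""
    error = result.get('error', {})
    if error.get('code') == 'spamblacklist':
        raise SpamfilterError(error.get('spamblacklist', ''))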
version.py:
Pywikibot: [https] r/pywikibot/compat (r10829, 5b8f363, 2014/02/02, 13:13:58, ok)
Release version: 1.0b1
Python: 2.7.3 (default, Apr 10 2012, 23:31:26) [MSC v.1500 32 bit (Intel)]
config-settings:
use_api = True
use_api_login = True
unicode test: ok
https://bugzilla.wikimedia.org/show_bug.cgi?id=55109
Web browser: ---
Bug ID: 55109
Summary: Multilingual development
Product: Pywikibot
Version: unspecified
Hardware: All
OS: All
Status: NEW
Severity: enhancement
Priority: Unprioritized
Component: General
Assignee: Pywikipedia-bugs(a)lists.wikimedia.org
Reporter: legoktm.wikipedia(a)gmail.com
Classification: Unclassified
Mobile Platform: ---
Originally from: http://sourceforge.net/p/pywikipediabot/feature-requests/101/
Reported by: Anonymous user
Created on: 2007-08-06 17:47:39
Subject: Multilingual development
Original description:
(The report was posted in French, followed by a machine translation into
English; a cleaned-up English translation follows.)
Hello,
I would like to propose a system that would make the bot multilingual. At the
moment, every message sent to the console is in English, yet the purpose of a
bot is to adapt to the many languages its users may speak. This is why I
propose the following system:
Create a new directory 'lang'. It would contain one file per existing language
code, named XX.py (XX being the ISO 639 code of the language).
When the various scripts want to print a message to the console, the call used
is very often 'wikipedia.output' or 'wikipedia.input'. This call would ask the
xx.py file for the message with a given number, the choice of xx being given by
the mylang variable in user-config.py. The xx.py file would then return the
message to display, of course filling in %s-style (or other) format variables.
Example:
In user-config.py I have "mylang = 'fr'".
Replace.py at line 375 contains the command
"wikipedia.input(u'Please enter the new text:')".
The new system would replace "wikipedia.input(u'Please enter the new text:')"
with "wikipedia.input.message(284)", which would call lang/fr.py and ask it to
return message no. 284, namely "s'il vous plais, entrez le nouveaux texte :".
I hope I have made my request clear.
Thank you for your attention.
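A minimal sketch of the proposed lookup (the module layout, function name and
message numbers are the reporter's proposal, not an existing Pywikibot API;
lang/fr.py would hold a dict such as
messages = {284: u"S'il vous plais, entrez le nouveaux texte : "}):

import importlib

def message(number, lang, *args):
    """Return message `number` from lang/<lang>.py, falling back to English."""
    try:
        module = importlib.import_module('lang.%s' % lang)
    except ImportError:
        module = importlib.import_module('lang.en')
    text = module.messages[number]
    return text % args if args else text

# Usage as proposed, with mylang = 'fr' in user-config.py:
# wikipedia.input(message(284, config.mylang))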
https://bugzilla.wikimedia.org/show_bug.cgi?id=55007
Web browser: ---
Bug ID: 55007
Summary: missing pagegenerators in rewrite branch
Product: Pywikibot
Version: unspecified
Hardware: All
OS: All
Status: NEW
Severity: enhancement
Priority: Unprioritized
Component: General
Assignee: Pywikipedia-bugs(a)lists.wikimedia.org
Reporter: legoktm.wikipedia(a)gmail.com
Classification: Unclassified
Mobile Platform: ---
Originally from: http://sourceforge.net/p/pywikipediabot/feature-requests/339/
Reported by: cdpark
Created on: 2013-05-29 03:19:17
Subject: missing pagegenerators in rewrite branch
Original description:
Some pagegenerator functions are missing in the rewrite branch:
EdittimeFilterPageGenerator()
ImageGenerator()
LogpagesPageGenerator()
PageTitleFilterPageGenerator()
RandomPageGenerator()
RandomRedirectPageGenerator()
UnCategorizedTemplatesGenerator()
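A minimal sketch of how one of these might be ported, assuming the core Site
object exposes a randompages() method (the method name and its keyword are
assumptions to be checked against the core API):

import pywikibot

def RandomPageGenerator(total=10, site=None):
    """Yield `total` random pages from `site`."""
    if site is None:
        site = pywikibot.Site()
    for page in site.randompages(total=total):
        yield page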
https://bugzilla.wikimedia.org/show_bug.cgi?id=62438
Bug ID: 62438
Summary: Make library functions for getSource / cacheSources
Product: Pywikibot
Version: core (2.0)
Hardware: All
OS: All
Status: NEW
Severity: normal
Priority: Unprioritized
Component: Wikidata
Assignee: Pywikipedia-bugs(a)lists.wikimedia.org
Reporter: maarten(a)mdammers.nl
Web browser: ---
Mobile Platform: ---
We now have getSource / cacheSources in several bots. These should become
generic, non-Wikipedia-specific functions in a central library which the
different bots can use.
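A minimal sketch of what such shared helpers might look like, following the
pattern used in harvest_template.py (the function signatures and the on-wiki
source list page are assumptions, not an agreed design):

import json
import pywikibot

def cache_sources(repo, family='wikipedia'):
    """Return a {lang: ItemPage} map of 'imported from' source items."""
    page = pywikibot.Page(repo, u'Wikidata:List of wikis/python')
    data = json.loads(page.get())[family]
    return dict((lang, pywikibot.ItemPage(repo, item))
                for lang, item in data.items())

def get_source(repo, sources, site):
    """Return a P143 ('imported from') claim for `site`, or None."""
    item = sources.get(site.code)
    if item is None:
        return None
    claim = pywikibot.Claim(repo, 'P143')
    claim.setTarget(item)
    return claim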
https://bugzilla.wikimedia.org/show_bug.cgi?id=55161
Web browser: ---
Bug ID: 55161
Summary: pageform file and cosmetic changes
Product: Pywikibot
Version: unspecified
Hardware: All
OS: All
Status: NEW
Severity: normal
Priority: Unprioritized
Component: General
Assignee: Pywikipedia-bugs(a)lists.wikimedia.org
Reporter: legoktm.wikipedia(a)gmail.com
Classification: Unclassified
Mobile Platform: ---
Originally from: http://sourceforge.net/p/pywikipediabot/bugs/1545/
Reported by: jandudik
Created on: 2012-11-28 10:51:03
Subject: pageform file and cosmetic changes
Original description:
I tried to make several redirects:
pagefromfile.py -start:XXX -end:XXX -file:hsb.txt -notitle -lang:hsb
where hsb.txt contained
XXX
'''Carex dioica'''
#REDIRECT[[dwójna rězna]]
YYY
...
But the bot imported the article with cosmetic changes, and the page does not
contain "#REDIRECT" but "# REDIRECT" (with a space after the "#"), which breaks
the redirect:
http://hsb.wikipedia.org/w/index.php?title=Carex_dioica&oldid=274441
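A minimal sketch of the kind of guard pagefromfile.py could apply before
running cosmetic changes (the regex only covers the English keyword and is an
assumption; real code would need the per-language redirect magic words):

import re

REDIRECT_RE = re.compile(r'^\s*#\s*REDIRECT\s*\[\[', re.IGNORECASE)

def is_redirect_text(text):
    """Return True if the wikitext starts with a redirect statement."""
    return bool(REDIRECT_RE.match(text))

# Only apply cosmetic changes to non-redirect pages, e.g.:
# if not is_redirect_text(contents):
#     contents = apply_cosmetic_changes(contents)   # hypothetical hook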
https://bugzilla.wikimedia.org/show_bug.cgi?id=55162
Web browser: ---
Bug ID: 55162
Summary: Bot saves big pages but doesn't know that it was saved
Product: Pywikibot
Version: unspecified
Hardware: All
OS: All
Status: NEW
Severity: normal
Priority: Unprioritized
Component: General
Assignee: Pywikipedia-bugs(a)lists.wikimedia.org
Reporter: legoktm.wikipedia(a)gmail.com
Classification: Unclassified
Mobile Platform: ---
Originally from: http://sourceforge.net/p/pywikipediabot/bugs/1543/
Reported by: eugo
Created on: 2012-11-19 09:25:39
Subject: Bot saves big pages but doesn't know that it was saved
Original description:
The bot can edit (i.e. the edit is actually made) a huge page (>900 KB),
but it gets back an error and cannot proceed:
"""
Updating page [[Wikipedia:Bibliotheksrecherche/Anfragen/Archiv/2012]] via API
<urlopen error [Errno 35] Resource temporarily unavailable>
WARNING: Could not open 'http://de.wikipedia.org/w/api.php'. Maybe the server
or your connection is down. Retrying in 1 minutes...
"""
Run the attached script to see this error.
I remember there was a similar problem with Wikimedia's server configuration
some time ago. But I don't know the details anymore...
---
Pywikipedia [http] trunk/pywikipedia (r10741, 2012/11/18, 20:22:23)
Python 2.7.3 (v2.7.3:70274d53c1dd, Apr 9 2012, 20:52:43)
[GCC 4.2.1 (Apple Inc. build 5666) (dot 3)]
config-settings:
use_api = True
use_api_login = True
unicode test: ok
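When the save actually went through but the HTTP response was lost, blindly
retrying can duplicate the edit. A minimal sketch of a pre-retry check (the
attribute names assume the Page API; this is illustrative, not a drop-in fix
for wikipedia.py):

def already_saved(page, newtext):
    """Return True if the wiki already shows the text we tried to save."""
    current = page.get(force=True)  # bypass the cached copy
    return current == newtext

# Inside the retry loop:
# if already_saved(page, newtext):
#     break  # the edit landed even though the response was lost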
https://bugzilla.wikimedia.org/show_bug.cgi?id=55091
Web browser: ---
Bug ID: 55091
Summary: interwiki summary should tell deletion reason
Product: Pywikibot
Version: unspecified
Hardware: All
OS: All
Status: NEW
Severity: enhancement
Priority: Unprioritized
Component: General
Assignee: Pywikipedia-bugs(a)lists.wikimedia.org
Reporter: legoktm.wikipedia(a)gmail.com
Classification: Unclassified
Mobile Platform: ---
Originally from: http://sourceforge.net/p/pywikipediabot/feature-requests/183/
Reported by: Anonymous user
Created on: 2009-02-23 18:56:23
Subject: interwiki summary should tell deletion reason
Original description:
When removing an interwiki link, the summary should state the reason for the
removal (disambig, namespace, missing page). This would allow better control of
the bot, since removing links often leads to problems and some trouble. Some
abbreviations would be fine; something like this would help:
deleting: de, en (dg); fi, fr (ns); af, bg (mp)
<w:de:User:Xqt>
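A minimal sketch of building the suggested summary fragment (the reason
abbreviations follow the report; the data structure is an assumption):

def removal_summary(removals):
    """removals: e.g. {'dg': ['de', 'en'], 'ns': ['fi', 'fr'], 'mp': ['af', 'bg']}"""
    parts = ['%s (%s)' % (', '.join(sorted(langs)), reason)
             for reason, langs in sorted(removals.items()) if langs]
    return 'deleting: ' + '; '.join(parts)

# removal_summary({'dg': ['de', 'en'], 'ns': ['fi', 'fr'], 'mp': ['af', 'bg']})
# -> 'deleting: de, en (dg); af, bg (mp); fi, fr (ns)'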
https://bugzilla.wikimedia.org/show_bug.cgi?id=63800
Bug ID: 63800
Summary: Page generators not working with Wikidata
Product: Pywikibot
Version: core (2.0)
Hardware: All
OS: All
Status: UNCONFIRMED
Severity: major
Priority: Unprioritized
Component: pagegenerators
Assignee: Pywikipedia-bugs(a)lists.wikimedia.org
Reporter: sofardamngood(a)reallymymail.com
Web browser: ---
Mobile Platform: ---
Pagegenerators has been broken for Wikidata scripts like harvest_template or
claimit since February. The diff
http://git.wikimedia.org/blob/pywikibot%2Fcore.git/b9ddecb363a1c208b507dbfe…
is the last working version.
A command like
python pwb.py claimit -family:wikipedia -lang:en -transcludes:'Infobox video game' P19 Q30
is supposed to create a generator with pages transcluding the template on the
given Wikipedia, but the current version of pagegenerators ignores the
arguments and tries to fetch the pages from wikidatawiki instead, which of
course fails.
More information about this bug is available here:
https://de.wikipedia.org/w/index.php?title=Benutzer_Diskussion:Xqt&oldid=12…
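Until the -family/-lang handling is fixed, a workaround is to build the
generator explicitly against the intended site (a minimal sketch; the
getReferences keyword name is an assumption against the core API of that time):

import pywikibot

def transcluding_pages(template_title, lang='en', family='wikipedia'):
    """Yield pages transcluding the given template on the given wiki."""
    site = pywikibot.Site(lang, family)
    template = pywikibot.Page(site, template_title, ns=10)
    for page in template.getReferences(onlyTemplateInclusion=True):
        yield page

# for page in transcluding_pages('Infobox video game'):
#     ...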
https://bugzilla.wikimedia.org/show_bug.cgi?id=63671
Bug ID: 63671
Summary: Wikidataquery IP or URL (wikidataquery.eu) seems
broken
Product: Pywikibot
Version: core (2.0)
Hardware: All
OS: All
Status: NEW
Severity: major
Priority: Unprioritized
Component: Wikidata
Assignee: Pywikipedia-bugs(a)lists.wikimedia.org
Reporter: info(a)gno.de
Web browser: ---
Mobile Platform: ---
See https://travis-ci.org/wikimedia/pywikibot-core/jobs/22514348
IP 208.80.153.172 is not reachable.
local test with URL "wikidataquery.eu" results in:
c:\Pywikipedia\ssh\pywikibot\core>pwb.py tests/wikidataquery_tests
.....E
Exception in thread Thread-1:
Traceback (most recent call last):
  File "C:\Python27\lib\threading.py", line 551, in __bootstrap_inner
    self.run()
  File "C:\Pywikipedia\ssh\pywikibot\core\pywikibot\comms\threadedhttp.py", line 350, in run
    item.data = self.http.request(*item.args, **item.kwargs)
  File "C:\Pywikipedia\ssh\pywikibot\core\pywikibot\comms\threadedhttp.py", line 251, in request
    uri, method, body, headers, response, content, max_redirects)
ValueError: need more than 0 values to unpack
WARNING: Failed to retrieve
http://wikidataquery.eu/api?q=claim%5B105%5D%20AND%20noclaim%5B225%5D%20AND%20claim%5B100%5D
E
======================================================================
ERROR: testQueryApiGetter (__main__.TestApiSlowFunctions)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "tests/wikidataquery_tests.py", line 249, in testQueryApiGetter
    data = w.query(q)
  File "C:\Pywikipedia\ssh\pywikibot\core\pywikibot\data\wikidataquery.py", line 530, in query
    data = self.getDataFromHost(fullQueryString)
  File "C:\Pywikipedia\ssh\pywikibot\core\pywikibot\data\wikidataquery.py", line 501, in getDataFromHost
    resp = http.request(None, url)
  File "C:\Pywikipedia\ssh\pywikibot\core\pywikibot\comms\http.py", line 156, in request
    if request.data[0].status == 504:
TypeError: 'NoneType' object has no attribute '__getitem__'
----------------------------------------------------------------------
Ran 6 tests in 21.124s
FAILED (errors=1)
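The last frames show http.request indexing request.data without checking it; a
minimal sketch of the kind of guard that would turn the TypeError into a
clearer error (the names follow the traceback, the surrounding code is assumed):

def check_response(request, uri):
    """Validate the threaded fetch result before using it."""
    if request.data is None:
        raise IOError('No response received from %s' % uri)
    if isinstance(request.data, Exception):
        raise request.data
    if request.data[0].status == 504:
        raise IOError('Gateway timeout fetching %s' % uri)
    return request.data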
https://bugzilla.wikimedia.org/show_bug.cgi?id=63667
Bug ID: 63667
Summary: Travis test fails for
timestripper_tests.test_timestripper()
Product: Pywikibot
Version: core (2.0)
Hardware: All
OS: All
Status: NEW
Severity: minor
Priority: Unprioritized
Component: General
Assignee: Pywikipedia-bugs(a)lists.wikimedia.org
Reporter: info(a)gno.de
Web browser: ---
Mobile Platform: ---
The Travis test fails for timestripper_tests.test_timestripper(), but the local
test works fine. See the (temporary) patch https://gerrit.wikimedia.org/r/#/c/124559/