Revision: 5357
Author: nicdumz
Date: 2008-05-12 16:17:13 +0000 (Mon, 12 May 2008)
Log Message:
-----------
Trying to fix bug #1962464
Modified Paths:
--------------
trunk/pywikipedia/weblinkchecker.py
Modified: trunk/pywikipedia/weblinkchecker.py
===================================================================
--- trunk/pywikipedia/weblinkchecker.py 2008-05-12 12:57:09 UTC (rev 5356)
+++ trunk/pywikipedia/weblinkchecker.py 2008-05-12 16:17:13 UTC (rev 5357)
@@ -417,10 +417,20 @@
except httplib.error, error:
return False, u'HTTP Error: %s' % error.__class__.__name__
except socket.error, error:
- # TODO: decode error[1]. On Linux, it's encoded in UTF-8.
+ # http://docs.python.org/lib/module-socket.html :
+ # socket.error :
+ # The accompanying value is either a string telling what went
+ # wrong or a pair (errno, string) representing an error
+ # returned by a system call, similar to the value
+ # accompanying os.error
+ if isinstance(error, basestring):
+ msg = error
+ else:
+ msg = error[1]
+ # TODO: decode msg. On Linux, it's encoded in UTF-8.
# How is it encoded in Windows? Or can we somehow just
# get the English message?
- return False, u'Socket Error: %s' % repr(error[1])
+ return False, u'Socket Error: %s' % repr(msg)
if wasRedirected:
if self.url in self.redirectChain:
if useHEAD:
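The patch above guards against the `IndexError` reported in bug #1962464: the value carried by `socket.error` may be either a plain string or an `(errno, string)` pair, so indexing `error[1]` unconditionally can fail. The original code is Python 2 (`basestring`, `except ... , error`); the following is a minimal Python 3 sketch of the same normalization, operating on the accompanying value rather than the exception object (names are illustrative, not the pywikipedia code itself):

```python
def socket_error_message(error_value):
    """Return the human-readable part of a socket.error value.

    Per the socket module docs quoted in the patch, the value is either
    a string describing what went wrong, or an (errno, string) pair
    returned by a system call.
    """
    if isinstance(error_value, str):
        # Plain string: the value already is the message.
        return error_value
    # Otherwise assume the documented (errno, string) pair and take
    # the string half.
    return error_value[1]
```

For example, `socket_error_message("timed out")` returns the string unchanged, while `socket_error_message((110, "Connection timed out"))` extracts the message from the pair.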
Bugs item #1961161, was opened at 2008-05-09 14:43
Message generated for change (Comment added) made by djbarrett
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=1961161&group_…
Please note that this message will contain a full copy of the comment thread,
including the initial issue submission, for this request,
not just the latest update.
Category: None
Group: None
Status: Closed
Resolution: Accepted
Priority: 5
Private: No
Submitted By: Daniel Barrett (djbarrett)
Assigned to: Nobody/Anonymous (nobody)
Summary: Cannot authenticate with LDAPauthentication extension
Initial Comment:
pywikipedia cannot authenticate to MediaWiki if the wiki is running the LDAPauthentication extension, http://www.mediawiki.org/wiki/Extension:LDAP_Authentication.
The problem (most likely) is that LDAP authentication requires a domain, and pywikipedia does not provide it.
When in use, LDAPauthentication adds a domain dropdown to the MediaWiki login page, named wpDomain:
<select name="wpDomain" value="invaliddomain" tabindex="3">
<option>domain1</option>
<option>domain2</option>
<option>domain3</option>
</select>
When you run login.py, you get this behavior:
Password for user mybot on mywiki:en: *******
Logging in to mywiki:en as mybot
Login failed. Wrong password or CAPTCHA answer?
and the error reported by MediaWiki is:
PHP Notice: Undefined index: invaliddomain in extensions\LdapAuthentication.php on line 170
LDAPauthentication is an important extension and I think pywikipedia should support it, just like you support CAPTCHAs today.
Thank you.
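The fix (attached later in this thread as ldap-diff.txt) amounts to posting the extension's `wpDomain` field along with the usual login credentials. A hypothetical sketch of that idea, assuming the standard MediaWiki login form field names (`wpName`, `wpPassword`, `wpLoginattempt`); this is not the actual patch:

```python
def build_login_data(username, password, domain=None):
    """Assemble POST fields for MediaWiki's Special:Userlogin form."""
    data = {
        'wpName': username,
        'wpPassword': password,
        'wpLoginattempt': 'Log in',
    }
    if domain:
        # The LDAPauthentication extension adds a wpDomain dropdown to
        # the login page; the selected domain must be posted with the
        # credentials, or the extension sees "invaliddomain" and fails.
        data['wpDomain'] = domain
    return data
```

Without the optional domain, the dictionary matches an ordinary login, so wikis that do not run the extension are unaffected.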
----------------------------------------------------------------------
>Comment By: Daniel Barrett (djbarrett)
Date: 2008-05-12 12:06
Message:
Logged In: YES
user_id=558133
Originator: YES
Thank you. I have confirmed that the patch is working for our LDAP site.
----------------------------------------------------------------------
Comment By: Daniel Herding (wikipedian)
Date: 2008-05-09 20:31
Message:
Logged In: YES
user_id=880694
Originator: NO
I applied the patch, thank you.
Untested because I don't have access to a wiki with LDAP auth, but at
least it doesn't break
normal login.
----------------------------------------------------------------------
Comment By: Daniel Barrett (djbarrett)
Date: 2008-05-09 15:42
Message:
Logged In: YES
user_id=558133
Originator: YES
Attached is a diff implementing support for LDAP authentication.
File Added: ldap-diff.txt
----------------------------------------------------------------------
Comment By: Daniel Barrett (djbarrett)
Date: 2008-05-09 14:46
Message:
Logged In: YES
user_id=558133
Originator: YES
A fix is detailed at
http://www.mediawiki.org/wiki/Extension_talk:LDAP_Authentication#Making_wor….
Could this be made official?
----------------------------------------------------------------------
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=1961161&group_…
Bugs item #1962464, was opened at 2008-05-12 11:39
Message generated for change (Comment added) made by djbarrett
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=1962464&group_…
Category: General
Group: None
Status: Open
Resolution: None
Priority: 5
Private: No
Submitted By: Nobody/Anonymous (nobody)
Assigned to: Nobody/Anonymous (nobody)
Summary: weblinkchecker.py exception
Initial Comment:
Exception in thread NameOfPage - http://example.com/webservices/something/renderservice.asmx:
Traceback (most recent call last):
File "c:\Python25\lib\threading.py", line 486, in __bootstrap_inner
self.run()
File "weblinkchecker.py", line 485, in run
ok, message = linkChecker.check()
File "weblinkchecker.py", line 423, in check
return False, u'Socket Error: %s' % repr(error[1])
IndexError: tuple index out of range
This is the most recent version in svn (as of today).
----------------------------------------------------------------------
Comment By: Daniel Barrett (djbarrett)
Date: 2008-05-12 11:40
Message:
Logged In: YES
user_id=558133
Originator: NO
This was submitted by SourceForge user djbarrett.
----------------------------------------------------------------------
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=1962464&group_…
Bugs item #1962460, was opened at 2008-05-12 11:29
Message generated for change (Comment added) made by djbarrett
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=1962460&group_…
Category: None
Group: None
Status: Open
Resolution: None
Priority: 5
Private: No
Submitted By: Nobody/Anonymous (nobody)
Assigned to: Nobody/Anonymous (nobody)
Summary: weblinkchecker.py exits with no arguments
Initial Comment:
Running this command:
python weblinkchecker.py
simply prints a usage message and exits. However, according to the usage message, it is supposed to "load all wiki pages in alphabetical order using the Special:Allpages feature".
No error message is printed.
"python weblinkchecker.py SomePageName" works fine.
----------------------------------------------------------------------
Comment By: Daniel Barrett (djbarrett)
Date: 2008-05-12 11:40
Message:
Logged In: YES
user_id=558133
Originator: NO
This was submitted by SourceForge user djbarrett.
----------------------------------------------------------------------
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=1962460&group_…
Bugs item #1962374, was opened at 2008-05-12 14:41
Message generated for change (Settings changed) made by darkoneko
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=1962374&group_…
Category: None
Group: None
>Status: Closed
Resolution: None
Priority: 5
Private: No
Submitted By: DarkoNeko (darkoneko)
Assigned to: Nobody/Anonymous (nobody)
Summary: interwiki.py got an infinite loop ?
Initial Comment:
used version : version 5355 (latest)
Tested with :
python interwiki.py -continue -restore -autonomous -lang:ja -nlog -putthrottle:3 (usual run)
python interwiki.py "My Chemical Romance" (to locate the problem)
Once it arrived on the page "My Chemical Romance" and was making the rowiki update, the program started consuming a full CPU and did not go any further.
My guess is there's an infinite loop of some sort.
Screenshot : http://www.enregistrersous.com/images2/2/161327732720080512143322.html
NB: it shows 50% because it's a dual-CPU machine; one of the two CPUs is at 100%.
----------------------------------------------------------------------
Comment By: DarkoNeko (darkoneko)
Date: 2008-05-12 15:01
Message:
Logged In: YES
user_id=1809111
Originator: YES
Tested with a non-vandalised version of "My Chemical Romance"; it worked fine.
Sorry for the scare.
----------------------------------------------------------------------
Comment By: Philippe Elie (phil_e)
Date: 2008-05-12 14:58
Message:
Logged In: YES
user_id=318973
Originator: NO
That was caused by a 700 KB vandalism in the page; older versions of
interwiki.py (svn rev. 5014) had no trouble with big pages.
----------------------------------------------------------------------
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=1962374&group_…
Revision: 5356
Author: russblau
Date: 2008-05-12 12:57:09 +0000 (Mon, 12 May 2008)
Log Message:
-----------
Not all wikis belong to Wikimedia Foundation.
Modified Paths:
--------------
trunk/pywikipedia/wikipedia.py
Modified: trunk/pywikipedia/wikipedia.py
===================================================================
--- trunk/pywikipedia/wikipedia.py 2008-05-11 08:45:24 UTC (rev 5355)
+++ trunk/pywikipedia/wikipedia.py 2008-05-12 12:57:09 UTC (rev 5356)
@@ -223,13 +223,13 @@
"""Page: A MediaWiki page
Constructor has two required parameters:
- 1) The wikimedia Site on which the page resides [note that, if the
+ 1) The wiki Site on which the page resides [note that, if the
title is in the form of an interwiki link, the Page object may
have a different Site than this]
2) The title of the page as a unicode string
Optional parameters:
- insite - the wikimedia Site where this link was found (to help decode
+ insite - the wiki Site where this link was found (to help decode
interwiki links)
defaultNamespace - A namespace to use if the link does not contain one
@@ -1403,8 +1403,7 @@
# just check for HTTP Status 500 (Internal Server Error)?
if ("<title>Wikimedia Error</title>" in data or "has a problem</title>" in data) \
or response.status == 500:
- output(
- u"Wikimedia has technical problems; will retry in %i minute%s."
+ output(u"Server error encountered; will retry in %i minute%s."
% (retry_delay, retry_delay != 1 and "s" or ""))
time.sleep(60 * retry_delay)
retry_delay *= 2
@@ -3721,7 +3720,7 @@
Constructor takes four arguments; only code is mandatory:
code language code for Site
- fam Wikimedia family (optional: defaults to configured).
+ fam Wiki family (optional: defaults to configured).
Can either be a string or a Family object.
user User to use (optional: defaults to configured)
persistent_http Use a persistent http connection. An http connection
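The retry loop touched by the second hunk of this diff waits `retry_delay` minutes after a server error and then doubles the delay, i.e. classic exponential backoff. A minimal sketch of that policy (names are illustrative, not the wikipedia.py internals):

```python
def next_retry_delays(initial_minutes, attempts):
    """Yield the successive wait times, in minutes, across retries.

    Mirrors the pattern in the diff: sleep for retry_delay, then
    double retry_delay before the next attempt.
    """
    delay = initial_minutes
    for _ in range(attempts):
        yield delay
        delay *= 2
```

Starting from 1 minute, four failed attempts would wait 1, 2, 4, and 8 minutes respectively, which keeps retry pressure on a struggling server low.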