Bugs item #1980112, was opened at 2008-05-31 05:29
Message generated for change (Comment added) made by melancholie
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=1980112&group_…
Please note that this message will contain a full copy of the comment thread,
including the initial issue submission, for this request,
not just the latest update.
Category: General
Group: None
Status: Open
Resolution: None
Priority: 9
Private: No
Submitted By: Melancholie (melancholie)
Assigned to: Nobody/Anonymous (nobody)
Summary: Fresh login not possible anymore for SUL accounts
Initial Comment:
Fresh login not possible anymore for SUL accounts!
With a fresh SVN checkout (no stored login data), login is no longer possible if the bot account is unified (SUL).
Error message:
Logging in to wikipedia:en
Login failed. Wrong password or CAPTCHA answer?
No CAPTCHA was asked for, and the password is correct. Logging in with Firefox works for this account, and login with a non-SUL account works as well.
----------------------------------------------------------------------
>Comment By: Melancholie (melancholie)
Date: 2008-05-31 05:55
Message:
Logged In: YES
user_id=2089773
Originator: YES
See also: https://bugzilla.wikimedia.org/show_bug.cgi?id=14335
----------------------------------------------------------------------
Revision: 5478
Author: btongminh
Date: 2008-05-30 14:58:37 +0000 (Fri, 30 May 2008)
Log Message:
-----------
Cache Family objects using WeakValueDictionary
Modified Paths:
--------------
trunk/pywikipedia/wikipedia.py
Modified: trunk/pywikipedia/wikipedia.py
===================================================================
--- trunk/pywikipedia/wikipedia.py 2008-05-30 14:44:23 UTC (rev 5477)
+++ trunk/pywikipedia/wikipedia.py 2008-05-30 14:58:37 UTC (rev 5478)
@@ -128,6 +128,7 @@
from BeautifulSoup import *
import simplejson
import diskcache
+import weakref
# Set the locale to system default. This will ensure correct string
# handling for non-latin characters on Python 2.3.x. For Python 2.4.x it's no
@@ -3753,8 +3754,11 @@
found = False
return result
-
-def Family(fam = None, fatal = True):
+# Warning! _familyCache does not necessarily have to be consistent between
+# two statements. Always ensure that a local reference is created when
+# accessing Family objects
+_familyCache = weakref.WeakValueDictionary()
+def Family(fam = None, fatal = True, force = False):
"""
Import the named family.
@@ -3763,6 +3767,11 @@
"""
if fam == None:
fam = config.family
+
+ family = _familyCache.get(fam)
+ if family and not force:
+ return family
+
try:
# search for family module in the 'families' subdirectory
sys.path.append(config.datafilepath('families'))
@@ -3778,7 +3787,10 @@
sys.exit(1)
else:
raise ValueError("Family %s does not exist" % repr(fam))
- return myfamily.Family()
+
+ family = myfamily.Family()
+ _familyCache[fam] = family
+ return family
class Site(object):
Revision: 5473
Author: nicdumz
Date: 2008-05-30 12:00:07 +0000 (Fri, 30 May 2008)
Log Message:
-----------
tools.wikimedia.de is obsolete, changing it to toolserver.org
Modified Paths:
--------------
trunk/pywikipedia/add_text.py
trunk/pywikipedia/checkimages.py
trunk/pywikipedia/family.py
trunk/pywikipedia/imagerecat.py
trunk/pywikipedia/udp-log.py
Modified: trunk/pywikipedia/add_text.py
===================================================================
--- trunk/pywikipedia/add_text.py 2008-05-30 11:39:10 UTC (rev 5472)
+++ trunk/pywikipedia/add_text.py 2008-05-30 12:00:07 UTC (rev 5473)
@@ -88,13 +88,13 @@
return text
def untaggedGenerator(untaggedProject, limit = 500):
- """ Function to get the pages returned by this tool: http://tools.wikimedia.de/~daniel/WikiSense/UntaggedImages.php """
+ """ Function to get the pages returned by this tool: http://toolserver.org/~daniel/WikiSense/UntaggedImages.php """
lang = untaggedProject.split('.', 1)[0]
project = '.' + untaggedProject.split('.', 1)[1]
if lang == 'commons':
- link = 'http://tools.wikimedia.de/~daniel/WikiSense/UntaggedImages.php?wikifam=comm…'
+ link = 'http://toolserver.org/~daniel/WikiSense/UntaggedImages.php?wikifam=commons.…'
else:
- link = 'http://tools.wikimedia.de/~daniel/WikiSense/UntaggedImages.php?wikilang=' + lang + '&wikifam=' + project + '&order=img_timestamp&max=' + str(limit) + '&ofs=0&max=' + str(limit)
+ link = 'http://toolserver.org/~daniel/WikiSense/UntaggedImages.php?wikilang=' + lang + '&wikifam=' + project + '&order=img_timestamp&max=' + str(limit) + '&ofs=0&max=' + str(limit)
text = pageText(link)
#print text
regexp = r"""<td valign='top' title='Name'><a href='http://.*?\.org/w/index\.php\?title=(.*?)'>.*?</a></td>"""
Modified: trunk/pywikipedia/checkimages.py
===================================================================
--- trunk/pywikipedia/checkimages.py 2008-05-30 11:39:10 UTC (rev 5472)
+++ trunk/pywikipedia/checkimages.py 2008-05-30 12:00:07 UTC (rev 5473)
@@ -42,7 +42,7 @@
-url[:#] - Define the url where are the images
- -untagged[:#] - Use daniel's tool as generator ( http://tools.wikimedia.de/~daniel/WikiSense/UntaggedImages.php )
+ -untagged[:#] - Use daniel's tool as generator ( http://toolserver.org/~daniel/WikiSense/UntaggedImages.php )
---- Istructions for the real-time settings ----
* For every new block you have to add:
@@ -648,9 +648,9 @@
lang = untaggedProject.split('.', 1)[0]
project = '.%s' % untaggedProject.split('.', 1)[1]
if lang == 'commons':
- link = 'http://tools.wikimedia.de/~daniel/WikiSense/UntaggedImages.php?wikifam=comm…'
+ link = 'http://toolserver.org/~daniel/WikiSense/UntaggedImages.php?wikifam=commons.…'
else:
- link = 'http://tools.wikimedia.de/~daniel/WikiSense/UntaggedImages.php?wikilang=%s&…' % (lang, project, limit, limit)
+ link = 'http://toolserver.org/~daniel/WikiSense/UntaggedImages.php?wikilang=%s&wiki…' % (lang, project, limit, limit)
text = self.site.getUrl(link, no_hostname = True)
regexp = r"""<td valign='top' title='Name'><a href='http://.*?\.org/w/index\.php\?title=(.*?)'>.*?</a></td>"""
results = re.findall(regexp, text)
Modified: trunk/pywikipedia/family.py
===================================================================
--- trunk/pywikipedia/family.py 2008-05-30 11:39:10 UTC (rev 5472)
+++ trunk/pywikipedia/family.py 2008-05-30 12:00:07 UTC (rev 5473)
@@ -2161,7 +2161,7 @@
# A dictionary where keys are family codes that can be used in
# inter-family interwiki links. Values are not used yet.
- # Generated from http://tools.wikimedia.de/~daniel/interwiki-en.txt:
+ # Generated from http://toolserver.org/~daniel/interwiki-en.txt:
# remove interlanguage links from file, then run
# f = open('interwiki-en.txt')
# for line in f.readlines():
Modified: trunk/pywikipedia/imagerecat.py
===================================================================
--- trunk/pywikipedia/imagerecat.py 2008-05-30 11:39:10 UTC (rev 5472)
+++ trunk/pywikipedia/imagerecat.py 2008-05-30 12:00:07 UTC (rev 5473)
@@ -92,7 +92,7 @@
Get category suggestions from commonshelper. Parse them and return a list of suggestions.
'''
parameters = urllib.urlencode({'i' : imagepage.titleWithoutNamespace(), 'r' : 'on', 'go-clean' : 'Find+Categories'})
- commonsHelperPage = urllib.urlopen("http://tools.wikimedia.de/~daniel/WikiSense/CommonSense.php?%s" % parameters)
+ commonsHelperPage = urllib.urlopen("http://toolserver.org/~daniel/WikiSense/CommonSense.php?%s" % parameters)
commonsenseRe = re.compile('^#COMMONSENSE(.*)#USAGE(\s)+\((?P<usage>(\d)+)\)(.*)#KEYWORDS(\s)+\((?P<keywords>(\d)+)\)(.*)#CATEGORIES(\s)+\((?P<catnum>(\d)+)\)\s(?P<cats>(.*))\s#GALLERIES(\s)+\((?P<galnum>(\d)+)\)(.*)#EOF$', re.MULTILINE + re.DOTALL)
matches = commonsenseRe.search(commonsHelperPage.read())
Modified: trunk/pywikipedia/udp-log.py
===================================================================
--- trunk/pywikipedia/udp-log.py 2008-05-30 11:39:10 UTC (rev 5472)
+++ trunk/pywikipedia/udp-log.py 2008-05-30 12:00:07 UTC (rev 5473)
@@ -8,7 +8,7 @@
import sys, re, socket
__version__ = '$Id$'
-TARGET_HOST = 'tools.wikimedia.de'
+TARGET_HOST = 'toolserver.org'
TARGET_PORT = 42448
input = sys.stdin.read()
Revision: 5472
Author: filnik
Date: 2008-05-30 11:39:10 +0000 (Fri, 30 May 2008)
Log Message:
-----------
Using getUrl() also for the toolserver's script
Modified Paths:
--------------
trunk/pywikipedia/checkimages.py
Modified: trunk/pywikipedia/checkimages.py
===================================================================
--- trunk/pywikipedia/checkimages.py 2008-05-30 11:32:38 UTC (rev 5471)
+++ trunk/pywikipedia/checkimages.py 2008-05-30 11:39:10 UTC (rev 5472)
@@ -71,7 +71,7 @@
#
# (C) Kyle/Orgullomoore, 2006-2007 (newimage.py)
-# (C) Siebrand Mazeland, 2007
+# (C) Siebrand Mazeland, 2007
# (C) Filnik, 2007-2008
#
# Distributed under the terms of the MIT license.
@@ -401,28 +401,6 @@
time_zone = unicode(time.strftime(u"%d %b %Y %H:%M:%S (UTC)", time.gmtime()))
wikipedia.output(u"%s%s" % (message, time_zone))
-def pageText(url):
- """ Function used to get HTML text from every reachable URL """
- # When the page is not a wiki-page (as for untagged generator) you need that function
- try:
- request = urllib2.Request(url)
- user_agent = 'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.7.12) Gecko/20050915 Firefox/1.0.7'
- request.add_header("User-Agent", user_agent)
- response = urllib2.urlopen(request)
- text = response.read()
- response.close()
- # When you load to many users, urllib2 can give this error.
- except urllib2.HTTPError:
- printWithTimeZone(u"Server error. Pausing for 10 seconds... ")
- time.sleep(10)
- request = urllib2.Request(url)
- user_agent = 'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.7.12) Gecko/20050915 Firefox/1.0.7'
- request.add_header("User-Agent", user_agent)
- response = urllib2.urlopen(request)
- text = response.read()
- response.close()
- return text
-
def returnOlderTime(listGiven, timeListGiven):
""" Get some time and return the oldest of them """
#print listGiven; print timeListGiven
@@ -673,7 +651,7 @@
link = 'http://tools.wikimedia.de/~daniel/WikiSense/UntaggedImages.php?wikifam=comm…'
else:
link = 'http://tools.wikimedia.de/~daniel/WikiSense/UntaggedImages.php?wikilang=%s&…' % (lang, project, limit, limit)
- text = pageText(link)
+ text = self.site.getUrl(link, no_hostname = True)
regexp = r"""<td valign='top' title='Name'><a href='http://.*?\.org/w/index\.php\?title=(.*?)'>.*?</a></td>"""
results = re.findall(regexp, text)
if results == []:
@@ -1090,7 +1068,7 @@
generator = pagegenerators.NewimagesPageGenerator(number = limit, site = site)
# if urlUsed and regexGen, get the source for the generator
if urlUsed == True and regexGen == True:
- textRegex = pagetext(regexPageUrl)
+ textRegex = site.getUrl(regexPageUrl, no_hostname = True)
# Not an url but a wiki page as "source" for the regex
elif regexGen == True:
pageRegex = wikipedia.Page(site, regexPageName)
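The pageText() helper removed in r5472 amounted to a fetch-with-one-retry pattern: request the URL, and on a server error pause ten seconds and try once more. A library-agnostic sketch of that idea, with a hypothetical fetch_with_retry name and a fake fetcher standing in for the urllib2 call:

```python
import time

def fetch_with_retry(fetch, url, retries=1, pause=10):
    """Call fetch(url); on failure, pause and retry up to `retries` times.

    `fetch` is any callable that returns the page text or raises an
    exception (urllib2.HTTPError in the original pageText helper).
    """
    for attempt in range(retries + 1):
        try:
            return fetch(url)
        except Exception:
            if attempt == retries:
                raise  # out of retries: propagate the last error
            time.sleep(pause)

# Example with a fake fetcher that fails once, then succeeds:
calls = []
def flaky(url):
    calls.append(url)
    if len(calls) == 1:
        raise IOError("server error")
    return "<html>ok</html>"

text = fetch_with_retry(flaky, "http://example.org", pause=0)
assert text == "<html>ok</html>"
```

Routing these fetches through site.getUrl() instead, as the commit does, centralizes User-Agent handling and error recovery rather than duplicating them per script.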