jayvdb added a comment.
I just received a DBQueryError on this test.
```
2014-11-29 01:41:35 threadedhttp.py, 219 in request: DEBUG ('https://en.wikipedia.org/w/api.php?inprop=protection&giulimit=5&maxl...', 'GET', None, {'cookie': 'forceHTTPS=true; enwikiUserName=JVbot; enwikiSession=....; enwikiUserID=5197800; GeoIP=ID:Jakarta:...:v4; centralauth_Token=....; centralauth_User=JVbot; forceHTTPS=1', 'connection': 'keep-alive', 'Content-Type': 'application/x-www-form-urlencoded', 'user-agent': u'python -m unittest (wikipedia:en; User:JVbot) Pywikibot/2.0b2 (g4649) httplib2/0.9 Python/2.7.5.final.0'}, 5, None)
2014-11-29 01:42:35 api.py, 966 in submit: DEBUG API response received from wikipedia:en: {"servedby":"mw1221","error":{"code":"internal_api_error_DBQueryError","info":"Database query error","*":""}}
2014-11-29 01:42:35 api.py, 1043 in submit: ERROR Detected MediaWiki API exception DBQueryError; retrying
2014-11-29 01:42:35 api.py, 1049 in submit: VERBOSE MediaWiki exception DBQueryError details: query= {'action': [u'query'], 'continue': [u''], 'format': ['json'], 'generator': [u'imageusage'], 'giufilterredir': [u'redirects'], u'giulimit': [u'5'], 'giutitle': [u'File:Wiktionary-logo-en.svg'], 'iiprop': [u'timestamp', u'user', u'comment', u'url', u'size', u'sha1', u'metadata'], 'indexpageids': [u''], 'inprop': [u'protection'], 'maxlag': ['5'], 'meta': ['userinfo'], 'prop': [u'info', u'imageinfo', u'categoryinfo'], 'uiprop': ['blockinfo', 'hasmsg']} response= {u'servedby': u'mw1221', u'error': {'help': u''}}
2014-11-29 01:42:35 api.py, 1081 in wait: WARNING Waiting 5 seconds before retrying.
2014-11-29 01:42:40 threadedhttp.py, 92 in pop_connection: DEBUG Retrieved connection from 'https:en.wikipedia.org' pool.
2014-11-29 01:42:40 threadedhttp.py, 219 in request: DEBUG ('https://en.wikipedia.org/w/api.php?inprop=protection&giulimit=5&maxl...', 'GET', None, {'cookie': 'forceHTTPS=true; enwikiUserName=JVbot; enwikiSession=...; enwikiUserID=5197800; GeoIP=ID:Jakarta:...:v4; centralauth_Token=...; centralauth_User=JVbot; forceHTTPS=1', 'connection': 'keep-alive', 'Content-Type': 'application/x-www-form-urlencoded', 'user-agent': u'python -m unittest (wikipedia:en; User:JVbot) Pywikibot/2.0b2 (g4649) httplib2/0.9 Python/2.7.5.final.0'}, 5, None)
2014-11-29 01:44:40 api.py, 959 in submit: ERROR Traceback (most recent call last):
  File "pywikibot/data/api.py", line 937, in submit
    headers=headers, body=body)
  File "pywikibot/tools.py", line 679, in wrapper
    return obj(*__args, **__kw)
  File "pywikibot/comms/http.py", line 261, in request
    raise request.data
SSLError: The read operation timed out
```
> In T73971#755541, @Mpaa wrote:
>> (In reply to John Mark Vandenberg from comment #13)
>> Changes to the tests have been merged. The underlying problem hasn't been fixed.
> I do not think there is an underlying problem in test_image_usage_in_redirects. The API is already slow by itself when called with those parameters.
Yes, I had hoped that mediawiki devs might help diagnose this problem.
>> One solution would be to change the logic that we currently use for setting the query limits.
> I think this can be avoided by just raising the number of retrieved items from 5 to 500, instead of putting @expectedFailureIf(TRAVIS=true) on the imageusage test.
This sounds like a good approach. For queries we know from experience are problematic, QueryGenerator could adjust the limit used in the API query (maybe even try unlimited), and enforce the requested limit internally.
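The idea above can be sketched in plain Python. This is a hypothetical illustration, not the actual pywikibot QueryGenerator code: the generator requests large batches from the backend (analogous to raising giulimit to 500) while still yielding only as many items as the caller asked for. The names `capped_query` and `fetch_batch` are assumptions for this sketch.

```python
def capped_query(fetch_batch, caller_limit, api_batch_size=500):
    """Yield at most caller_limit items, fetching api_batch_size per request.

    fetch_batch(offset, size) must return a list of items; an empty
    list signals that the backend is exhausted.
    """
    yielded = 0
    offset = 0
    while yielded < caller_limit:
        batch = fetch_batch(offset, api_batch_size)
        if not batch:
            break  # backend has no more results
        for item in batch:
            if yielded >= caller_limit:
                return  # caller's limit reached; stop early
            yield item
            yielded += 1
        offset += len(batch)


# Usage with a fake backend holding 1000 items:
data = list(range(1000))

def fake_fetch(offset, size):
    return data[offset:offset + size]

# Caller asks for 5 items; one 500-item batch is fetched,
# but only 5 items are yielded.
items = list(capped_query(fake_fetch, caller_limit=5))
```

The point is that the expensive API call runs with a batch size that performs well server-side, while the per-test (or per-caller) limit is enforced client-side, so slow small-limit queries like giulimit=5 are avoided.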
TASK DETAIL https://phabricator.wikimedia.org/T73971
REPLY HANDLER ACTIONS Reply to comment or attach files, or !close, !claim, !unsubscribe or !assign <username>.
To: jayvdb Cc: pywikipedia-bugs, Legoktm, valhallasw, XZise, jayvdb, Mpaa