[Labs-l] Someone using a Python framework is relentlessly hammering Harvard sites, resulting in an IP range ban.
Maximilian Doerr
maximilian.doerr at gmail.com
Sun Dec 4 18:09:21 UTC 2016
Sorry, I just realized I responded to the wrong person.
Cyberpower678
English Wikipedia Account Creation Team
ACC Mailing List Moderator
Global User Renamer
> On Dec 4, 2016, at 13:06, Martin Urbanec <martin.urbanec at wikimedia.cz> wrote:
>
> I'm not sure what you mean. https://phabricator.wikimedia.org/F4978348 is a file, not a bug, and I assume it's the access log. Can you clarify for me, please?
>
> Martin
>
> On Sun, 4 Dec 2016 at 19:04, Maximilian Doerr <maximilian.doerr at gmail.com> wrote:
> No, it wasn't. I set the custom policy and instructed it to deny everyone but the given users. This seems to be a Phabricator bug.
>
> Cyberpower678
> English Wikipedia Account Creation Team
> ACC Mailing List Moderator
> Global User Renamer
>
>> On Dec 4, 2016, at 12:53, Martin Urbanec <martin.urbanec at wikimedia.cz> wrote:
>>
>> Hi,
>> if it wasn't me, that's good. Could you please give me a link to that Phabricator task once it's filed? My Phabricator nickname is Urbanecm. Thanks in advance.
>>
>> Best,
>> Martin
>>
>> On Sun, 4 Dec 2016 at 18:44, Merlijn van Deen (valhallasw) <valhallasw at arctus.nl> wrote:
>> Hi Martin,
>>
>> On 4 December 2016 at 18:29, Martin Urbanec <martin.urbanec at wikimedia.cz> wrote:
>> I was running weblinkchecker.py over the whole cswiki (the job was submitted to the grid on Sun, 20 Nov 2016 16:54:24 GMT) because I wanted a list of dead links. This may correspond with the UA (since I ran a script named weblinkchecker.py). I trusted this script not to do anything wrong, because it was and still is part of the standard core package. I'm using the 3.0-dev version of Pywikibot and Python 2.7.6.
>>
>>
>> It probably wasn't you, but it was indeed the standard weblinkchecker causing this. Apparently no throttling is implemented -- just a maximum number of parallel connections. Many parallel connections are fine... but not to the same host. This bot was running on eswiki, and eswiki has thousands of links to http://www.minorplanetcenter.net/.
>>
>> I have contacted the user, and will file a bug for Pywikibot to get this solved on that end.
>>
>> Best,
>> Merlijn
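
For illustration, here is a rough Python sketch of the kind of per-host throttling described above as missing: cap concurrent requests per host and wait a minimum interval between requests to the same host. This is not Pywikibot's actual code; PerHostThrottle, max_per_host, and min_delay are hypothetical names used only for this sketch.

import threading
import time
from urllib.parse import urlparse

class PerHostThrottle:
    """Allow at most max_per_host concurrent requests per host and wait
    at least min_delay seconds between consecutive requests to one host."""

    def __init__(self, max_per_host=2, min_delay=5.0):
        self.max_per_host = max_per_host
        self.min_delay = min_delay
        self.lock = threading.Lock()
        self.semaphores = {}     # host -> Semaphore limiting concurrency
        self.last_request = {}   # host -> timestamp of the last request

    def acquire(self, url):
        # Block until this host has a free slot and enough time has passed
        # since the previous request to the same host.
        host = urlparse(url).netloc
        with self.lock:
            sem = self.semaphores.setdefault(
                host, threading.Semaphore(self.max_per_host))
        sem.acquire()
        with self.lock:
            wait = self.min_delay - (time.time() - self.last_request.get(host, 0.0))
        if wait > 0:
            time.sleep(wait)
        with self.lock:
            self.last_request[host] = time.time()
        return host

    def release(self, host):
        self.semaphores[host].release()

A checker thread would then wrap each fetch in acquire()/release(), for example:

throttle = PerHostThrottle(max_per_host=2, min_delay=5.0)
host = throttle.acquire('http://www.minorplanetcenter.net/some/page')
try:
    pass  # perform the actual HTTP request here
finally:
    throttle.release(host)

With something like this in place, thousands of links pointing at a single host are checked slowly, while links spread across many hosts can still be checked in parallel.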
>> _______________________________________________
>> Labs-l mailing list
>> Labs-l at lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/labs-l
>
> _______________________________________________
> Labs-l mailing list
> Labs-l at lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/labs-l