Brion already said that this wouldn't be implemented and discussion was over. You now bring it up on Foundation-l. This is known as forum shopping.
Also known as "asking the other parent."
-Chad
On Mon, Apr 28, 2008 at 2:58 PM, White Cat <wikipedia.kawaii.neko@gmail.com> wrote:
I beg your pardon? Forum shopping on foundation-l? Seems self-contradictory...
On Mon, Apr 28, 2008 at 7:34 PM, Chad <innocentkiller@gmail.com> wrote:
Forum shopping for this after the lead developer and CTO has said no is not the way to go about it.
From a technical standpoint: I agree with Brion. There are a whole host of reasons why an edit might fail (locked DBs, protected pages, or even the server dying), and the bot needs to be designed to deal with that. If your bot crashes because an edit failed, that's your fault as a developer.
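To illustrate (a rough Python sketch; the endpoint, error codes, and retry policy below are assumptions for the example, not an exhaustive list of what the API can return), a bot can sort failures into retryable and skippable instead of crashing:

# Minimal sketch of defensive edit handling against a MediaWiki action API.
# The error codes and backoff policy here are illustrative assumptions.
import time
import requests

API = "https://en.wikipedia.org/w/api.php"  # assumed endpoint

RETRYABLE = {"readonly", "maxlag", "ratelimited"}   # transient: back off, retry
SKIPPABLE = {"protectedpage", "spamblacklist"}      # persistent: log, move on

def try_edit(session, title, text, token, retries=3):
    for attempt in range(retries):
        reply = session.post(API, data={
            "action": "edit", "title": title, "text": text,
            "token": token, "format": "json",
        }).json()
        code = reply.get("error", {}).get("code")
        if code is None:
            return True                      # edit went through
        if code in RETRYABLE:
            time.sleep(2 ** attempt)         # exponential backoff, then retry
            continue
        if code in SKIPPABLE:
            print("skipping %s: %s" % (title, code))  # leave for a human
            return False
        raise RuntimeError("edit failed on %s: %s" % (title, code))
    return False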
-Chad
On Mon, Apr 28, 2008 at 11:17 AM, White Cat <wikipedia.kawaii.neko@gmail.com> wrote:
https://bugzilla.wikimedia.org/show_bug.cgi?id=13706
Perhaps a community discussion is necessary on the matter; I hereby initiate it.

When a person tries to edit a page containing a URL that matches a spam autoblocker regex, the edit is rejected until the spam link is removed. The spam autoblocker was intended to prevent the addition of new spam.
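To illustrate the mechanic with a toy sketch (the patterns and helper below are made up; the real extension's matching is more involved): the blacklist regexes are checked against URLs in the *submitted* text, with no memory of whether a link predates the blacklist entry:

# Toy illustration only: invented patterns, simplified URL extraction.
import re

blacklist = [re.compile(p, re.IGNORECASE) for p in [
    r"evil-pharma\.example\.com",
    r"cheap-\w+-pills\.example\.net",
]]

def edit_is_blocked(new_wikitext):
    urls = re.findall(r"https?://\S+", new_wikitext)
    return any(p.search(u) for p in blacklist for u in urls)

page_text = "Interwiki fix. Old spam: http://evil-pharma.example.com/buy"
print(edit_is_blocked(page_text))  # True -- the routine edit is rejected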
Consider the scenario where a spambot adds spam links to Wikipedia and the URL is added to the spam blacklist only afterwards: a user who later edits a page still carrying the old spam is blocked by the blacklist. For a human this isn't a big deal to deal with; it is, however, a different story when it comes to bots.
Suppose you are operating a bot that makes non-controversial, routine maintenance edits on a regular basis. The spam autoblocker would prevent such edits. If your bot's task is dealing with images renamed or deleted on Commons, or with interwiki links, this is particularly problematic. Interwiki bots and Commons delinking bots often edit hundreds of pages a day on hundreds of wikis. That's a lot of logs. So the suggestion that I should spend perhaps hours per day reading log files for spam on pages in languages I cannot even understand (or necessarily read the ?'s and %'s of) is quite unreasonable. This is a task better dealt with by the locals (humans) of each wiki community than by bots performing mindless, routine, and non-controversial tasks.
There are also legitimate reasons to include such links on pages, for instance archived discussions of a spambot attack where example URLs were posted before they made their way into the spam autoblocker.
- White Cat
foundation-l mailing list
foundation-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/foundation-l