Phil Boswell wrote:
Would this be checking for identical URLs, or simply multiple URLs
referencing the same site?
Because I can think of one concrete example where I have been adding many
URLs: references to Placeopedia. Does the fact that I am using a small
template help?
Sometimes, you might have a WikiProject which is bringing their articles
up to a standard which includes adding references, and if this means that
a bunch of articles get similar URLs added in a short space of time, this
might trigger your filter.
Phil, just to be clear, at this point I'm proposing a feasibility analysis
- but you make some very good points/questions. This would be for identical
URLs posted to different pages. The goal is to develop a filter that could
block spammy behaviour even when the edits come from multiple users/IPs. I
think if you've posted a template, that would NOT trigger it (there's a
hole - a spammer could create a spam template - but at least you could
quickly negate the effects by deleting the template... nothing's perfect,
I guess); only multiple posts of the same URL would.
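Roughly, the check I have in mind could look something like this - a
throwaway Python sketch, where the thresholds and names are made up, and a
real filter would need shared storage rather than an in-memory dict:

    import time
    from collections import defaultdict

    # Made-up thresholds - tuning these is part of the feasibility question.
    MAX_DISTINCT_PAGES = 5      # same URL added to more than 5 pages...
    WINDOW_SECONDS = 3600       # ...within one hour looks spammy

    # url -> list of (timestamp, page_title) for recent additions
    recent_additions = defaultdict(list)

    def url_addition_is_spammy(url, page_title, now=None):
        """Record that url was just added to page_title, and report
        whether that URL has hit too many distinct pages recently."""
        now = time.time() if now is None else now
        events = recent_additions[url]
        # Drop anything that has fallen out of the sliding window.
        events[:] = [(t, p) for (t, p) in events if now - t <= WINDOW_SECONDS]
        events.append((now, page_title))
        return len({p for (_, p) in events}) > MAX_DISTINCT_PAGES

    # Simulated burst: the same URL added to seven pages in quick succession.
    for i in range(7):
        spammy = url_addition_is_spammy("http://spam.example/", "Page_%d" % i)
    print(spammy)  # True once the distinct-page threshold is exceeded

Note that template transclusions wouldn't feed into this at all - the URL
is only "added" once, in the template itself - which is both why templates
wouldn't trigger it and exactly where the hole is.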
I can't imagine a project that would cause you to post the exact same URL
to a whole bunch of pages at once, but I'm sure it's possible. Maybe we'd
only want to apply the filter to anonymous edits. Or maybe triggering the
filter would invoke Brion's captcha? I'm not trying to nail down the
implementation at this point, just the feasibility. But this is a good
discussion.
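For concreteness, the response to a trigger could be a thin policy layer
on top of the URL check - again just a sketch with made-up names, since I
don't know what hooks Brion's captcha code actually exposes:

    def handle_trigger(is_anonymous, captcha_available):
        """Hypothetical policy layer: only act on anonymous edits,
        and prefer a captcha challenge over a hard block."""
        if not is_anonymous:
            return "allow"      # logged-in edits pass untouched
        if captcha_available:
            return "captcha"    # challenge instead of blocking outright
        return "block"          # hard block only as a last resort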
Thanks,
Aerik