[Mediawiki-l] spam attack avoids captcha

Steven Dick kg4ydw at gmail.com
Sat Mar 19 11:59:06 UTC 2011


On Wed, Mar 16, 2011 at 8:15 PM, Daniel Friesen
<lists at nadir-seen-fire.com>wrote:
[...]

> Sure enough after a bit I started seeing the same spam,
>

Perhaps it is a bit draconian, but on my wiki, when I get spam, I use
CheckUser to get the spammer's IP and block their whole /24 subnet.  I then
go look at the spam they left me and blacklist every domain in the URLs.
Most spammers spam a single domain, but if they come back with different
URLs, I look at the phrasing of the surrounding text, pick something
distinctive (like a recurring grammatical error), and blacklist that by
regular expression.
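The post doesn't name the tools, but on MediaWiki the domain blacklisting is
typically done with the SpamBlacklist extension, and the phrase-based
blocking with core's $wgSpamRegex.  A minimal sketch, with hypothetical
patterns (circa-2011 extension loading style):

```php
// LocalSettings.php

// SpamBlacklist extension: entries on the MediaWiki:Spam-blacklist page
// are regex fragments matched against URLs added in each edit, one per
// line, e.g.:
//   \bcheap-pills-online\.example\b
require_once "$IP/extensions/SpamBlacklist/SpamBlacklist.php";

// $wgSpamRegex is core MediaWiki: a single regex matched against the
// full text of every saved edit -- useful for blocking a recurring
// phrase or characteristic typo rather than a domain.
$wgSpamRegex = '/buy (cheap|dicount) watches/i';
```

Edits matching either pattern are rejected at save time, so the spammer sees
the failure immediately instead of the spam being cleaned up afterwards.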

On my wiki, if they come back, it is at least not with the same spam.  They
might come back once or twice, but I don't often see many repeats.

Their whole effort is pointless anyway, because most of the time they edit
pages that are in the ban list in robots.txt, so Google would never even
download those pages.  And if they do occasionally edit a page that Google
would crawl, I've turned on the option that tells search engines not to
count those links toward PageRank, so it won't help them with SEO.
Occasionally I put that message in the ban text when I get a repeat spammer.
Maybe they read it and don't come back, but I doubt they are even that
clever.
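For a stock MediaWiki install, the robots.txt ban list and the link option
referred to above might look like this (paths assume a default layout;
$wgNoFollowLinks is the setting that adds rel="nofollow" to external links,
and it is on by default):

```
# robots.txt -- keep dynamic and editable views out of crawlers
User-agent: *
Disallow: /index.php

# LocalSettings.php -- external links carry rel="nofollow", so spam
# links pass no PageRank (MediaWiki's default behaviour)
$wgNoFollowLinks = true;
```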

Maybe my wiki is just too small for them to care enough to come back, and I
just got caught in a huge list they were spamming.  My wiki is certainly
small enough that I don't care if some ISP's subnet gets blocked
accidentally; I can afford to be draconian.

