If your wiki's robots.txt is configured correctly, then search engines
should not crawl past revisions anyway. However, if you really want to
delete the revisions wholesale, then the Oversight extension may be for you.
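For reference, a minimal robots.txt sketch along these lines — this assumes
the common "short URL" layout where readable pages live under /wiki/ and
edit, history, and oldid views all go through /index.php; adjust the paths
to your own install:

```
# Keep crawlers out of index.php, which serves history/diff/oldid views
# (assumes articles are served from /wiki/ instead)
User-agent: *
Disallow: /index.php
```

With that in place, spammed old revisions are unlikely to be indexed even
before you get around to deleting them.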
I'm not sure I understand your question about user blocking. Are you
looking for something more powerful than the usual Special:Blockip
functionality which is available to every admin?
Soo
Andre-John Mas wrote:
Hi,
Our wiki has become prey to spammers, even after having added e-mail
confirmation for accounts. I am busy trying to clean up the mess, and
now I am trying to find solutions that will make the life of spammers
harder, without penalising legitimate users. The first thing to do is
find out how to block the user account causing the issue. Is there any way
to do this without having to open up the database?
The main anti-spammer solution I am looking at is adding a captcha,
but I would also be interested in other solutions. Additionally I want
to ensure that the text from the spammers is removed from the
database, so that they don't get the benefit of having their links
indexed by search engines, via our site. Any suggestions for this?
Andre
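For the captcha part, ConfirmEdit is the usual extension for this. A
minimal LocalSettings.php sketch — the trigger names below are from stock
ConfirmEdit, but treat the exact paths and values as illustrative and check
the extension's documentation for your version:

```
// In LocalSettings.php, assuming ConfirmEdit is unpacked under extensions/
require_once( "$IP/extensions/ConfirmEdit/ConfirmEdit.php" );

// Only challenge the actions spammers care about, to spare regular editors:
$wgCaptchaTriggers['edit']          = false; // no captcha on ordinary edits
$wgCaptchaTriggers['addurl']        = true;  // captcha when an edit adds a new external link
$wgCaptchaTriggers['createaccount'] = true;  // captcha on account creation
```

Triggering on 'addurl' rather than every edit tends to stop link spam
while leaving legitimate users mostly unbothered.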
_______________________________________________
Wikitech-l mailing list
Wikitech-l(a)lists.wikimedia.org
http://lists.wikimedia.org/mailman/listinfo/wikitech-l