Hi,
Our wiki has fallen prey to spammers, even after adding e-mail confirmation for accounts. I am busy trying to clean up the mess, and now I am looking for solutions that will make the life of spammers harder without penalising legitimate users. The first thing to do is find out how to block the user account causing the trouble. Is there any way to do this without having to open up the database?
The main anti-spam solution I am looking at is adding a CAPTCHA, but I would also be interested in other solutions. Additionally, I want to ensure that the spammers' text is removed from the database, so that they don't get the benefit of having their links indexed by search engines via our site. Any suggestions for this?
Andre
Andre-John Mas wrote:
The first thing to do is find out how to block the user account causing the trouble. Is there any way to do this without having to open up the database?
Of course. Go to the [[Special:Blockip]] page and fill out the form. Blocking users requires sysop rights.
The main anti-spam solution I am looking at is adding a CAPTCHA, but I would also be interested in other solutions. Additionally, I want to ensure that the spammers' text is removed from the database, so that they don't get the benefit of having their links indexed by search engines via our site. Any suggestions for this?
Well, old revisions are not indexed by search engines, so simply reverting the vandalism edits should be enough. If a vandalism edit contains stuff you don't want to be accessible, not even through the page history, try Oversight [1]. Adding a CAPTCHA can be done easily with ConfirmEdit [2].
Roan Kattouw (Catrope)
[1] http://www.mediawiki.org/wiki/Extension:Oversight
[2] http://www.mediawiki.org/wiki/Extension:ConfirmEdit
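For reference, a minimal sketch of what enabling ConfirmEdit in LocalSettings.php typically looks like; the file path and trigger names below follow the extension's documented defaults, but check the ConfirmEdit page above for the instructions matching your MediaWiki version:

    // Load the ConfirmEdit extension (assumes the standard extensions directory)
    require_once( "$IP/extensions/ConfirmEdit/ConfirmEdit.php" );

    // Only challenge the actions spammers typically abuse, to avoid
    // annoying legitimate editors with a CAPTCHA on every edit
    $wgCaptchaTriggers['edit']          = false; // not on ordinary edits
    $wgCaptchaTriggers['addurl']        = true;  // edits that add external links
    $wgCaptchaTriggers['create']        = true;  // creation of new pages
    $wgCaptchaTriggers['createaccount'] = true;  // new account registration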
If your wiki's robots.txt is configured correctly, search engines should not crawl past revisions anyway. However, if you really want to delete the revisions wholesale, the Oversight extension may be for you.
I'm not sure I understand your question about user blocking. Are you looking for something more powerful than the usual Special:Blockip functionality which is available to every admin?
Soo
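As a side note on the sysop requirement mentioned above: the block permission is an ordinary user-group right, so if you ever want a lighter-weight group to be able to block spammers without full sysop access, a hedged LocalSettings.php sketch along these lines should work (the 'moderator' group name is purely an example):

    // Grant the 'block' right to a hypothetical 'moderator' group;
    // the 'sysop' group already has this right by default
    $wgGroupPermissions['moderator']['block'] = true;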
On 3-Feb-08, at 13:53, Soo Reams wrote:
If your wiki's robots.txt is configured correctly, search engines should not crawl past revisions anyway. However, if you really want to delete the revisions wholesale, the Oversight extension may be for you.
Do you have an example of correctly configured robots.txt for this?
I'm not sure I understand your question about user blocking. Are you looking for something more powerful than the usual Special:Blockip functionality which is available to every admin?
I was not aware of that, though I have now started using it on the users in question.
Andre-John
AM> Do you have an example of correctly configured robots.txt for this?
http://www.mediawiki.org/wiki/Robots.txt
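The gist of that page, assuming the common layout where article views are served from /wiki/ and index.php (which handles edit, history and old-revision URLs) lives under /w/, is a robots.txt along these lines; adjust the paths to match your own installation:

    # Keep crawlers out of the script path so that edit, history and
    # oldid URLs are never fetched; normal /wiki/ article views stay crawlable
    User-agent: *
    Disallow: /w/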