Having discovered a backlog of requests on the Meta spam blacklist, Aphaia & I have put some work into clearing it and now have it under control (I hope!).
However, in the course of this I have learnt a lot, and some (to me) Foundation-level questions arise.
The policy at Meta has been to blacklist only those sites with a persistent cross-wiki record of link placement. However, some of the sites that have been requested for blocking are fairly obviously undesirable whether they are currently troubling just one wiki or many (porn being the obvious example, but there are plenty more on which I think consensus about undesirability would be found).
So should we be rejecting requests for such sites, saying that the requesters should use their own local lists? I am aware that preventative blocking of anything or anyone can be frowned upon (personally, if I can see trouble coming I prefer to act before it arrives rather than clean up afterwards). It may be that, in asking en.wp for example to block locally, we are merely missing an opportunity to avoid problems across wikis in the near future. Spammers are adept at exploiting any opening they can find.
There has been a tendency to treat the Meta blacklist as a place of last resort. I question whether a more thoughtful approach to keeping Foundation sites clean might be desirable.
Herby [[user:Herbythyme]] most places
Any link blacklist will involve a certain amount of arguing and bureaucracy -- the meta blacklist affects all languages and all projects (the stakes are higher), and the relationship between the people making requests and the people maintaining the list is different. It's my understanding that local blacklists let us manage things "in house" and hopefully avoid a lot of the policy arguments from meta -- a restrictive policy on one wiki is no longer a huge concern if it doesn't directly affect the other wikis, for example.
If we keep things local, we keep things simple for the "end user" (editor), which I guess is the way to go. Treating meta as a "last resort" is probably a bit much, as you've said, but I don't know that we gain anything by using the central blacklist all that often.
How often do spammers go after multiple wikis? Multiple languages? How quickly will we notice, if they do? Those questions seem important to me in terms of deciding how often to use/consider the meta blacklist. Unfortunately, I'm not aware of statistics on the first two. As for "how quickly will we notice?", I'm guessing we're counting on people to notice these things (people heavily active in multiple projects and languages, especially), which probably means not very quickly. It might be helpful to have some software try to keep track of cross-wiki spamming -- beyond my ability, but I bet it'd be helpful here.
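For instance, here is a rough sketch of what such a tool might look like (assuming the MediaWiki list=exturlusage API is available on the wikis in question; the wiki list and the spam domain below are only placeholders):

import requests

# Placeholder list of wikis to check -- a real tool would cover many more.
WIKIS = [
    "https://en.wikipedia.org/w/api.php",
    "https://fr.wikipedia.org/w/api.php",
    "https://es.wikipedia.org/w/api.php",
]

def pages_linking_to(api_url, domain, limit=50):
    """Return page titles on one wiki whose external links match `domain`,
    using the MediaWiki exturlusage list."""
    params = {
        "action": "query",
        "list": "exturlusage",
        "euquery": domain,
        "eulimit": limit,
        "format": "json",
    }
    data = requests.get(api_url, params=params, timeout=30).json()
    return [hit["title"] for hit in data.get("query", {}).get("exturlusage", [])]

if __name__ == "__main__":
    domain = "example-spam-domain.com"   # hypothetical suspect domain
    for api in WIKIS:
        hits = pages_linking_to(api, domain)
        if hits:
            print(f"{api}: {len(hits)} page(s) link to {domain}")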
Just a thought. Thanks for bringing it up. -Luna
Hello,
Luna wrote:
How often do spammers go after multiple wikis? Multiple languages? How quickly will we notice, if they do?
Most often, they spam several languages and several projects at once, both within and outside Wikimedia.
I think we need another feature to prevent spamming: the ability to block the creation of pages whose titles match a certain pattern, like .*/w/index.php, which are only created by spammers. A central blacklist of these patterns would be useful.
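A rough sketch of how such a central pattern list might work (the pattern list, names, and matching rules below are illustrative, not an existing feature):

import re

# Hypothetical central list of title patterns that only spambots create.
TITLE_BLACKLIST = [
    r".*/w/index\.php",     # spam-bot style titles mentioned above
    r".*\.php\?.*",         # another hypothetical pattern
]

COMPILED = [re.compile(p, re.IGNORECASE) for p in TITLE_BLACKLIST]

def creation_allowed(title: str) -> bool:
    """Return False if the proposed page title matches a blacklisted pattern."""
    return not any(rx.fullmatch(title) for rx in COMPILED)

print(creation_allowed("Foo/w/index.php"))   # False - creation would be blocked
print(creation_allowed("Regular article"))   # True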
Regards,
Yann
On Thu, 23 Aug 2007 19:58:32 -0700, "Luna" lunasantin@gmail.com said:
How often do spammers go after multiple wikis? Multiple languages? How quickly will we notice, if they do?
At least a couple of sites I've blocked have spammed 20+ wikis. There are some folk on the en.wp spam project who are very hot on checking/chasing cross-wiki spammers. I followed one across the es, it & fr wikis a week ago. There are some great tools out there for cross-wiki contributions (http://tools.wikimedia.de/~luxo/contributions/contributions.php?lang=en) and links (Eagle's one - can't find the link instantly).
It certainly happens, and a few folk are very dedicated to keeping Foundation sites as clean as possible. I know it isn't a mainstream aspect of the project, but keeping porn, gambling, and finance links out of pages seems a worthwhile activity, and one worth finding a "good" or even "best" way to deal with.
Cheers Herby