This is something that has to be discussed *on the projects themselves*,
not on mailing lists that have (comparatively) very low participation by
active editors. Sending to another mailing list, even a broader one than
this, isn't going to get the buy-in needed from the people who will have to
clean up the messes. You will need buy-in from at least the following
groups:
- A significant number of editors from the project involved in the trial
- Stewards
- Global sysops/global rollbackers
- Checkusers
You will also have to absolutely guarantee that the trial will end on the
date stated *regardless of what happens during the trial*, and that there
will be non-project support for the collection and analysis of data. One
of the reasons projects tend not to want to participate in trials is an
unwillingness to return to the status quo ante once someone (developers,
the WMF, etc.) has unilaterally decided that the results were favourable
without any analysis of actual data. Frankly, we've experienced this so
often on English Wikipedia that it's resulted in major showdowns with the
WMF that have had a real and ongoing impact on the WMF's ability to develop
and improve software. (Don't kid yourself, this will be seen as a WMF
proposal even though it may be coming from volunteer developers.)
Edit filters are developed project-by-project, and cannot be relied upon to
catch problem edits; even with the huge number of edit filters on enwiki,
there is still significant spamming and vandalism happening. Many of the
projects most severely impacted by inappropriate editing are smaller
projects with comparatively few active editors and few edit filters, where
recent changes are not routinely reviewed; stewards and global
sysops/rollbackers are often the people who clean up the messes there.
There also needs to be a good answer to the "attribution problem" that has
long been identified as a secondary concern related to Tor and other proxy
systems. The absence of a good answer to this issue may be sufficient in
itself to derail any proposed trial.
Not saying a trial can't happen... just making it clear that it's not
something that is within the purview of developers (volunteer or staff)
because the blocking of Tor has always been directly linked to behaviour
and core policy, not to technical issues. I very much disagree that this
is a technical issue; Tor's blocking is a technical solution to a genuine
policy/behaviour problem.
Risker/Anne
On 1 October 2014 09:05, Derric Atzrott <datzrott(a)alizeepathology.com>
wrote:
If, as it seems right now, the problem is technical (weed out the bots
and vandals) rather than ideological (we allow anonymous contributions,
after all), we can find a way to let people edit any Wikipedia via Tor
while minimizing the amount of vandalism allowed.
Of course, let's not kid ourselves: it will probably require some
special measures, and editing via Tor would probably end up not being
as easy as editing via a public-facing IP. We might, for example,
restrict publishing via Tor to users who have logged in and have made
five "good" edits reviewed by others, or we could use modern
bot-detection techniques in that case; those are just ideas.
I would be curious to see what percentage of problematic edits would be
caught by running all prospective edits through AbuseFilter and
ClueBotNG. I suspect those two tools would catch a large percentage of
the vandalism; I understand they catch most such edits made by regular
IP users. This would be a good start and would give us some data about
what other sorts of measures might need to be taken to make this sort
of thing work.
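The kind of measurement such a trial would produce can be sketched with
a toy calculation like the one below. Everything here is an assumption
for illustration: the tag names ("tor-edit", "abusefilter-flagged",
"cluebot-flagged") are invented placeholders, the sample data is made
up, and the "bad" field stands in for a human reviewer's judgment of
whether an edit was actually problematic.

```python
# Toy estimate of the fraction of problem edits the filters would catch
# during a trial. Tag names and sample data are hypothetical
# placeholders, not real wiki tags or real results.

FILTER_TAGS = {"abusefilter-flagged", "cluebot-flagged"}

def catch_rate(edits):
    """edits: list of dicts with 'tags' (a set of tag strings) and
    'bad' (bool, ground truth from human review). Returns the fraction
    of bad edits that carried at least one filter tag."""
    bad = [e for e in edits if e["bad"]]
    if not bad:
        return 0.0
    caught = [e for e in bad if e["tags"] & FILTER_TAGS]
    return len(caught) / len(bad)

sample = [
    {"tags": {"tor-edit", "abusefilter-flagged"}, "bad": True},
    {"tags": {"tor-edit", "cluebot-flagged"},     "bad": True},
    {"tags": {"tor-edit"},                        "bad": True},
    {"tags": {"tor-edit"},                        "bad": False},
]
print(catch_rate(sample))  # 2 of the 3 bad edits carry a filter tag
```

In a real trial the "bad" labels would come from the post-hoc human
review of tagged edits, which is exactly the data-collection support
the trial would need.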
AbuseFilter has the ability to tag edits for further review, so we
could leverage that functionality to tag Tor edits during a trial.
I could reach out to the maintainer of ClueBotNG and see whether it
could interface with AbuseFilter so that any edits it judges
unconstructive are tagged; if that isn't possible, maybe it could
simply log such edits somewhere separate.
We've had this conversation a few times and
I'd love to see creative
approaches to a trial/pilot with data driving future decisions.
If I approached Wikimedia-l with the idea of a limited trial with
the above approach for maybe two weeks' time with all Tor edits
being tagged, do you think they might bite?
It clearly is the kind of problem where people do like to _look_ for
clever technical fixes, which is why it's a recurring topic on this
list.
I suspect one exists somewhere. I'll reach out to the folks at the
Tor Project and see if they have any suggestions for ways to prevent
abuse from a technical standpoint, especially with regard to
sockpuppet abuse. I agree with Giuseppe that the measures that will
need to be put in place will make editing via Tor more difficult than
editing without Tor, but that's acceptable so long as it is not
prohibitively difficult, as it is at present.
Without having spoken to the Tor Project yet, though, the Nymble
approach seems like a reasonable way to go to me. The protocol could
potentially be modified to accept some sort of proof of work rather
than the user's public-facing IP address. If, in order to be issued a
certificate in Nymble, you had to complete a proof of work that took
perhaps several hours of computation, and the certificate was valid
for a week, that might be a sufficient barrier to stop most socks,
though definitely more data needs to be gathered.
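The proof-of-work idea can be sketched generically in Python. This is a
Hashcash-style toy, not part of the actual Nymble protocol; the
`solve`/`verify` functions and the difficulty numbers are assumptions
for illustration, and a real deployment would tune the difficulty up to
hours of computation rather than the milliseconds used here.

```python
# Hashcash-style proof of work: find a nonce such that
# sha256(challenge || nonce) has at least `difficulty` leading zero
# bits. Expected cost for the prover grows as 2**difficulty, while
# verification is a single hash.
import hashlib
import os

def leading_zero_bits(digest: bytes) -> int:
    """Count leading zero bits in a byte string."""
    bits = 0
    for byte in digest:
        if byte == 0:
            bits += 8
        else:
            bits += 8 - byte.bit_length()
            break
    return bits

def solve(challenge: bytes, difficulty: int) -> int:
    """Brute-force a nonce meeting the difficulty target."""
    nonce = 0
    while True:
        digest = hashlib.sha256(challenge + nonce.to_bytes(8, "big")).digest()
        if leading_zero_bits(digest) >= difficulty:
            return nonce
        nonce += 1

def verify(challenge: bytes, nonce: int, difficulty: int) -> bool:
    """Check a claimed solution with one hash computation."""
    digest = hashlib.sha256(challenge + nonce.to_bytes(8, "big")).digest()
    return leading_zero_bits(digest) >= difficulty

challenge = os.urandom(16)    # issued by the certificate authority
nonce = solve(challenge, 12)  # ~4096 hashes on average; a toy setting
assert verify(challenge, nonce, 12)
```

The asymmetry is the point: the editor pays hours of computation once
per weekly certificate, while the server verifies the work (and the
certificate) in microseconds, which is what would make mass sock
creation expensive.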
Thank you,
Derric Atzrott
_______________________________________________
Wikitech-l mailing list
Wikitech-l(a)lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l