I wonder if the refactor described in https://www.mediawiki.org/wiki/Requests_for_comment/Support_for_user-specifi... could be adapted to help with this use case. I suspect it would be highly useful to be able to create publicly viewable watchlists of suspicious edits.
With a little tweaking, perhaps only the suspicious edits would show when viewing the watchlist, and users could collectively 'unwatch' them once reviewed?
On Fri, Sep 27, 2013 at 7:42 AM, Petr Bena benapetr@gmail.com wrote:
Hi,
I think you perfectly summarized this issue. I like the first solution (a 3rd-party provider on Wikimedia Labs with a well-documented API interface), but I must admit that identity sharing might be a bit of a problem: if some troll figured out this system and we weren't using any identification at all, they could easily wipe all edits.
Having this directly in MW as tags that can be applied by users would probably be the best solution, but I am afraid it's going to take ages for this to happen.
On Fri, Sep 27, 2013 at 4:21 PM, Aaron Halfaker ahalfaker@wikimedia.org wrote:
I've got to say that this problem seems pretty straightforward. Essentially, we need something lighter than 'revert' for edits that need a second set of eyes.
What we really want is a queue of suspect revisions that allows Wikipedians to flag new revisions, query current flagged revisions and remove revisions from the list after review.
I see two clear options:
*3rd party tool.* A queue of suspect revisions can be created as a 3rd-party tool (e.g. webapp + API running on labs). Then gadgets and other 3rd-party tools make use of the API to add, remove, update & query the set of flagged edits. I worry about this option due to the lack of good identity sharing between Wikipedia and 3rd-party wiki tools, but otherwise, it seems trivial to implement.
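To make the idea concrete, here is a minimal in-memory sketch of the queue such a labs webapp would expose over its API: flag a revision, query the pending set, and unflag after review. All class and method names here are hypothetical illustrations, not an existing tool.

```python
# Hypothetical sketch of the "suspect revision queue" a labs webapp
# could wrap in an HTTP API. Flagger identity is recorded per revision,
# which is where the identity-sharing concern comes in.
from dataclasses import dataclass, field


@dataclass
class SuspectQueue:
    # rev_id -> username of the person who flagged it
    _flags: dict = field(default_factory=dict)

    def flag(self, rev_id: int, flagged_by: str) -> None:
        """Add a revision that needs a second set of eyes."""
        self._flags[rev_id] = flagged_by

    def unflag(self, rev_id: int) -> None:
        """Remove a revision from the queue once it has been reviewed."""
        self._flags.pop(rev_id, None)

    def pending(self) -> list:
        """Query the currently flagged revision IDs, oldest ID first."""
        return sorted(self._flags)
```

A real deployment would persist this and authenticate callers, which is exactly the part that is hard without shared identity between the wikis and labs.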
*Make use of infrastructure in MediaWiki. *We can either build on top of the features currently deployed or on top of new features in the pipeline.
- Current MW: Someone brought up the example of adding a template to articles that have recent revisions needing review. Such templates could appear on the talk page so as not to clutter the article. I've got to admit that this sounds messy, but the user warning level system employed by Huggle, ClueBot NG, Twinkle, etc. is equally messy.
- New Features: If arbitrary tags could be manually added to revisions and queried from MediaWiki (preferably via the API), the functionality of the third-party tool described above could be captured without the need for an external tool. This might require a little gadget support for common actions taken on the "suspicious edit queue".
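As a sketch of what such a gadget's query could look like: `list=recentchanges` with its `rctag` and `rcshow` parameters are existing MediaWiki API features, while the "suspicious" tag itself is the hypothetical part that would need the new manual-tagging feature.

```python
# Sketch of the API request a gadget could build to fetch the
# "suspicious edit queue": edits carrying a (hypothetical) manual tag
# that have not yet been patrolled.
from urllib.parse import urlencode


def suspect_queue_params(tag: str = "suspicious") -> dict:
    """Build MediaWiki API parameters for unpatrolled edits with a tag."""
    return {
        "action": "query",
        "list": "recentchanges",
        "rctag": tag,            # only changes carrying this tag
        "rcshow": "!patrolled",  # drop edits already reviewed
        "rcprop": "ids|title|user|timestamp",
        "format": "json",
    }


# A gadget would append this to api.php and render the result.
query_string = urlencode(suspect_queue_params())
```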
On Fri, Sep 27, 2013 at 8:59 AM, John Vandenberg jayvdb@gmail.com wrote:
On Fri, Sep 27, 2013 at 8:52 PM, Petr Bena benapetr@gmail.com wrote:
Not really; I can't see how tags help here at all. We are talking about any kind of edit (nothing that can be matched by regex) that seems suspicious to a human vandal-fighter, who can't be sure whether it's vandalism or not. Neither the abuse filter nor patrolled edits can help here (unless we mark every single edit as patrolled, and people who see a suspicious edit mark it as un-patrolled, or something like that).
If I understand correctly, you want user-applied tags on revisions, which is bug 1189.
https://bugzilla.wikimedia.org/show_bug.cgi?id=1189
All edits start out unpatrolled.
If your interface shows the union of unpatrolled edits and a huggle-user-selected-tag (be it tor, abusefilter, or manually added tag) ..
'Experts' edit the page if required, and then mark the revision as patrolled so it no longer appears in the queue.
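The review step described above maps onto the existing `action=patrol` API module; a sketch of the request a tool would build (the token value below is just a placeholder fetched separately in practice):

```python
# Sketch of marking one revision as patrolled so it leaves the queue.
# action=patrol with revid/token are real MediaWiki API parameters;
# the token string here is a placeholder.
def patrol_params(rev_id: int, token: str) -> dict:
    """Build MediaWiki API parameters to mark a revision as patrolled."""
    return {
        "action": "patrol",
        "revid": str(rev_id),
        "token": token,  # a patrol token, fetched via action=query&meta=tokens
        "format": "json",
    }
```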
-- John Vandenberg
Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l