Hi Petr,
I can see the value, although I'm not entirely sure how you were planning to identify these edits. Are you thinking all our current tools would gain another classification (like "more review needed") and submit them? Or would these be identified by a separate, new bot?
On Thu, Sep 26, 2013 at 6:06 AM, Petr Bena benapetr@gmail.com wrote:
Hi,
I noticed that there is a large number of suspicious edits that may be vandalism but were never reverted, because the people who were dealing with vandals (using some automated tool) at that moment weren't able to decide whether the edit was vandalism or not. For example: "smart" changes to statistical data, dates, or football scores, and changes that look weird but aren't clearly vandalism. These edits should be reviewed by an expert on the topic, but at the moment they aren't collected anywhere.
I think we should create a new service (on Tool Labs?) that would allow these tools to insert such edits into a queue (or database) of "suspicious edits" for later review. This categorized database/queue could be browsed by people who are experts on the given topics, and the edits could be reviewed or reverted by them.
The database would need to be periodically scanned, and any changes that have since been reverted would be removed from it. People reviewing the edits could also flag them as "ok".
This way we could improve the effectiveness of anti-vandalism tools by catching the edits that are currently ignored or skipped.
Any suggestions or ideas on how to implement such a feature?
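One possible shape for such a service is sketched below, assuming a simple SQLite backing store; the table, column, and function names are hypothetical illustrations, not an existing tool's API. Patrol tools would call something like submit() when they can't decide on an edit, experts would browse pending() for their topic and call review(), and a periodic job would mark edits that have since been reverted.

```python
import sqlite3

# Hypothetical schema for the "suspicious edits" queue (a sketch, not a spec).
SCHEMA = """
CREATE TABLE IF NOT EXISTS suspicious_edits (
    rev_id     INTEGER PRIMARY KEY,             -- revision flagged by the patrol tool
    page_title TEXT NOT NULL,
    topic      TEXT NOT NULL,                   -- category used to route edits to experts
    status     TEXT NOT NULL DEFAULT 'pending'  -- 'pending', 'ok', or 'reverted'
);
"""

def submit(conn, rev_id, page_title, topic):
    """Called by an anti-vandalism tool that can't decide whether an edit is vandalism."""
    conn.execute(
        "INSERT OR IGNORE INTO suspicious_edits (rev_id, page_title, topic) VALUES (?, ?, ?)",
        (rev_id, page_title, topic),
    )

def review(conn, rev_id, verdict):
    """An expert flags the edit as 'ok', or 'reverted' once they have dealt with it."""
    conn.execute(
        "UPDATE suspicious_edits SET status = ? WHERE rev_id = ?",
        (verdict, rev_id),
    )

def pending(conn, topic):
    """Browse the queue of unreviewed edits for a given topic."""
    rows = conn.execute(
        "SELECT rev_id FROM suspicious_edits WHERE topic = ? AND status = 'pending'",
        (topic,),
    )
    return [r[0] for r in rows]

conn = sqlite3.connect(":memory:")
conn.executescript(SCHEMA)
submit(conn, 101, "2012 FA Cup Final", "football")
submit(conn, 102, "Pi", "mathematics")
review(conn, 101, "ok")            # expert decides the football edit was fine
print(pending(conn, "football"))   # []
print(pending(conn, "mathematics"))  # [102]
```

The periodic scan mentioned above would simply be a job that checks each pending rev_id against the wiki's revision history and calls review(conn, rev_id, "reverted") for edits that are no longer live.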
Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l