Very cool! Is there any way for users of this tool to help train it? For example, the first four things it flagged in my watchlist were all false positives (the next 5-6 were correctly flagged). It'd be nice to be able to contribute to training the model somehow when we see these false positives.

On Tue, Aug 23, 2016 at 11:10 AM Amir Ladsgroup <ladsgroup@gmail.com> wrote:

We (The Revision Scoring Team) are happy to announce the deployment of the ORES review tool as a beta feature on English Wikipedia. Once enabled, ORES highlights edits that are likely to be damaging in Special:RecentChanges, Special:Watchlist and Special:Contributions to help you prioritize your patrolling work. ORES detects damaging edits using a basic prediction model based on past damage. ORES is an experimental technology. We encourage you to take advantage of it but also to be skeptical of the predictions made. It's a tool to support you – it can't replace you. Please reach out to us with your questions and concerns.
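To make the flagging behavior concrete, here is a minimal, hypothetical sketch (not ORES's actual implementation): a damage-detection model assigns each revision a probability of being damaging, and the review tool highlights revisions whose probability meets a chosen threshold. All revision IDs, scores, and the threshold below are made-up illustrative values.

```python
# Illustrative sketch only -- not the actual ORES code.
# A damage model returns, per revision, a probability that the edit is
# damaging; the review tool highlights edits above a cutoff.

def flag_damaging(scores, threshold=0.8):
    """Return revision IDs whose damaging probability meets the threshold.

    `scores` maps revision ID -> model probability (hypothetical values).
    """
    return [rev_id for rev_id, p in scores.items() if p >= threshold]

# Hypothetical scores for five revisions:
example_scores = {
    101: 0.95,  # likely damaging -> would be highlighted
    102: 0.10,
    103: 0.85,  # likely damaging -> would be highlighted
    104: 0.40,
    105: 0.79,  # just under the threshold -> not highlighted
}

print(flag_damaging(example_scores))  # -> [101, 103]
```

In practice the probabilities come from the ORES web service rather than local code; see the documentation links below for the real API.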

Documentation
mw:ORES review tool, mw:Extension:ORES, and m:ORES
Bugs & feature requests
https://phabricator.wikimedia.org/tag/revision-scoring-as-a-service-backlog/
IRC
#wikimedia-aiconnect

Sincerely,
Amir from the Revision Scoring team
_______________________________________________
AI mailing list
AI@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/ai