Very cool! Is there any way for users of this tool to help train it? For example, the first four things it flagged in my watchlist were all false positives (the next 5–6 were correctly flagged). It'd be nice to be able to contribute to training the model somehow when we see these false positives.
On Tue, Aug 23, 2016 at 11:10 AM Amir Ladsgroup <ladsgroup@gmail.com> wrote:
We (the Revision Scoring Team: https://meta.wikimedia.org/wiki/Research:Revision_scoring_as_a_service#Team) are happy to announce the deployment of the ORES (https://meta.wikimedia.org/wiki/ORES) review tool (https://www.mediawiki.org/wiki/ORES_review_tool) as a beta feature (https://en.wikipedia.org/wiki/Special:Preferences#mw-prefsection-betafeatures) on *English Wikipedia*. Once enabled, ORES highlights edits that are likely to be damaging in Special:RecentChanges (https://en.wikipedia.org/wiki/Special:RecentChanges), Special:Watchlist (https://en.wikipedia.org/wiki/Special:Watchlist), and Special:Contributions (https://en.wikipedia.org/wiki/Special:Contributions) to help you prioritize your patrolling work. ORES detects damaging edits using a basic prediction model trained on past damage (https://meta.wikimedia.org/wiki/Research:Automated_classification_of_edit_quality).

ORES is an experimental technology. We encourage you to take advantage of it, but also to be skeptical of the predictions it makes. It's a tool to support you – it can't replace you. Please reach out to us with your questions and concerns.

Documentation: mw:ORES review tool (https://www.mediawiki.org/wiki/ORES_review_tool), mw:Extension:ORES (https://www.mediawiki.org/wiki/Extension:ORES), and m:ORES (https://meta.wikimedia.org/wiki/ORES)
Bugs & feature requests: https://phabricator.wikimedia.org/tag/revision-scoring-as-a-service-backlog/
IRC: #wikimedia-ai (connect: http://webchat.freenode.net/?channels=#wikimedia-ai)

Sincerely,
Amir, from the Revision Scoring team

_______________________________________________
AI mailing list
AI@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/ai
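For readers who want to look at the predictions behind the highlighting, ORES also exposes a public scoring API. The sketch below is a minimal example, assuming the v3 endpoint at https://ores.wikimedia.org/v3/scores/ and the response shape documented on mediawiki.org (both should be verified against the current API docs; the sample values are made up): it builds a score-request URL for the "damaging" model and reads a prediction out of a response.

```python
# Sketch: asking ORES to score one revision with the "damaging" model.
# Endpoint and response shape are assumptions based on the documented
# ORES v3 API (https://www.mediawiki.org/wiki/ORES); check the live docs.

import json
from urllib.parse import urlencode

ORES_BASE = "https://ores.wikimedia.org/v3/scores"  # assumed v3 endpoint


def score_url(wiki, rev_id, model="damaging"):
    """Build the URL that requests a score for one revision from one model."""
    query = urlencode({"models": model, "revids": rev_id})
    return f"{ORES_BASE}/{wiki}/?{query}"


def parse_damaging(response_json, wiki, rev_id):
    """Extract the 'damaging' prediction and its probability from a response."""
    score = response_json[wiki]["scores"][str(rev_id)]["damaging"]["score"]
    return score["prediction"], score["probability"]["true"]


# A response shaped like the assumed v3 format (values invented for the demo):
sample = json.loads(
    '{"enwiki": {"scores": {"123456": {"damaging": {"score": '
    '{"prediction": false, "probability": {"false": 0.93, "true": 0.07}}}}}}}'
)

url = score_url("enwiki", 123456)
prediction, p_damaging = parse_damaging(sample, "enwiki", 123456)
```

Fetching `url` with any HTTP client would return JSON in the shape of `sample`; a patroller's script could then surface only edits whose `p_damaging` exceeds some threshold.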