Hey all,

A reminder that the livestream of our monthly research showcase will start in about 2 hours (11:30 PT / 18:30 UTC), with our collaborators from Jigsaw and Cornell as guest speakers. You can follow the stream on YouTube: https://www.youtube.com/watch?v=m4vzI0k4OSg and join the live Q&A on IRC in the #wikimedia-research channel.

Looking forward to seeing you there!

Dario


On Thu, May 31, 2018 at 5:07 PM Dario Taraborelli <dtaraborelli@wikimedia.org> wrote:
Hey everyone,

We're hosting a dedicated session in June on our joint work with Cornell and Jigsaw on predicting conversational failure on Wikipedia talk pages. This is part of our contribution to WMF's Anti-Harassment program.

The showcase will be live-streamed on Monday, June 18, 2018 at 11:30 AM (PDT) / 18:30 (UTC). (Please note that this month's showcase falls on a Monday.)

Conversations Gone Awry: Detecting Early Signs of Conversational Failure
By Justine Zhang and Jonathan Chang, Cornell University
One of the main challenges online social systems face is the prevalence of antisocial behavior, such as harassment and personal attacks. In this work, we introduce the task of predicting from the very start of a conversation whether it will get out of hand. As opposed to detecting undesirable behavior after the fact, this task aims to enable early, actionable prediction at a time when the conversation might still be salvaged. To this end, we develop a framework for capturing pragmatic devices—such as politeness strategies and rhetorical prompts—used to start a conversation, and analyze their relation to its future trajectory. Applying this framework in a controlled setting, we demonstrate the feasibility of detecting early warning signs of antisocial behavior in online discussions.


Building a rich conversation corpus from Wikipedia Talk pages
We present a corpus of conversations that encompasses the complete history of interactions between contributors to English Wikipedia's Talk Pages. This captures a new view of these interactions by containing not only the final form of each conversation but also detailed information on all the actions that led to it: new comments, as well as modifications, deletions and restorations. This level of detail supports new research questions pertaining to the process (and challenges) of large-scale online collaboration. As an example, we present a small study of removed comments highlighting that contributors successfully take action on more toxic behavior than was previously estimated.


As usual, you can join the conversation on IRC at #wikimedia-research. And you can watch our past research showcases here.

Hope to see you there on June 18!
Dario


--

Dario Taraborelli, Director, Head of Research, Wikimedia Foundation
research.wikimedia.org • nitens.org • @readermeter