Hey all,
a reminder that the livestream of our monthly research showcase starts in 45 minutes (11:30 PT)
- Video: https://www.youtube.com/watch?v=L-1uzYYneUo
- IRC: #wikimedia-research
- Abstracts: https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase#January_2018
Dario
On Tue, Jan 16, 2018 at 9:45 AM, Lani Goto lgoto@wikimedia.org wrote:
Hi Everyone,
The next Research Showcase will be live-streamed this Wednesday, January 17, 2018, at 11:30 AM PST (19:30 UTC).
YouTube stream: https://www.youtube.com/watch?v=L-1uzYYneUo
As usual, you can join the conversation on IRC at #wikimedia-research. You can also watch our past research showcases here.
This month's presentation:
*What motivates experts to contribute to public information goods? A field experiment at Wikipedia*
By Yan Chen, University of Michigan

Wikipedia is among the most important information sources for the general public. Motivating domain experts to contribute to Wikipedia can improve the accuracy and completeness of its content. In a field experiment, we examine the incentives that might motivate scholars to contribute their expertise to Wikipedia. We vary the mention of a likely citation, public acknowledgement, and the number of views an article receives. We find that experts are significantly more interested in contributing when the citation benefit is mentioned. Furthermore, the cosine similarity between a Wikipedia article and the expert's paper abstract is the most significant factor leading to more and higher-quality contributions, indicating that better matching is crucial in motivating contributions to public information goods. Other factors correlated with contribution include social distance and researcher reputation.
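(For context on the matching metric mentioned in the abstract, here is a minimal illustrative sketch of computing cosine similarity between two texts using TF-IDF vectors in Python with scikit-learn. This is only an assumption about how such a score could be computed, not the study's actual pipeline, and the example texts are placeholders.)

# Illustrative sketch only: cosine similarity between an article and an
# abstract via TF-IDF vectors. Not the study's actual method.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

article_text = "Placeholder text of a Wikipedia article ..."
abstract_text = "Placeholder text of an expert's paper abstract ..."

vectorizer = TfidfVectorizer(stop_words="english")
tfidf = vectorizer.fit_transform([article_text, abstract_text])

# cosine_similarity of the two TF-IDF rows gives a 1x1 matrix;
# a value near 1.0 means the two documents share very similar term profiles.
score = cosine_similarity(tfidf[0], tfidf[1])[0, 0]
print(f"cosine similarity: {score:.3f}")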
*Wikihounding on Wikipedia*
By Caroline Sinders, WMF

Wikihounding (a form of digital stalking on Wikipedia) is both qualitative and quantitative. What makes wikihounding different from mentoring? The context of the action, or the intention. However, every interaction inside a digital space has a quantitative aspect to it: every comment, revert, etc. is a data point. By analyzing data points comparatively across wikihounding cases and reading some of the cases, we can create a baseline for the actual overlapping similarities among wikihounding cases and study what makes up wikihounding.

Wikihounding currently has a fairly loose definition. As defined by the harassment policy on en:wp, it is: “the singling out of one or more editors, joining discussions on multiple pages or topics they may edit or multiple debates where they contribute, to repeatedly confront or inhibit their work. This is with an apparent aim of creating irritation, annoyance or distress to the other editor. Wikihounding usually involves following the target from place to place on Wikipedia.” This definition doesn't outline parameters around cases such as frequency of interaction, duration, or minimum reverts, nor is much known about what a standard or canonical case of wikihounding looks like. What does the average wikihounding case look like? This talk will cover the approaches that I and members of the research team (Diego Saez-Trumper, Aaron Halfaker, and Jonathan Morgan) are taking in starting this research project; an illustrative sketch of the kind of per-case data points involved follows below.
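(As an illustration of the candidate parameters the abstract names, here is a minimal Python sketch that summarizes a list of hypothetical interaction records between a pair of editors into frequency, duration, and revert counts. The data and field names are invented for illustration; this is not the project's actual methodology.)

# Illustrative sketch only: summarize hypothetical editor-pair interactions
# into candidate parameters (frequency of interaction, duration, reverts).
from datetime import datetime

# Hypothetical records: (timestamp, page, action)
interactions = [
    (datetime(2018, 1, 2, 9, 15), "Talk:Example", "comment"),
    (datetime(2018, 1, 3, 14, 2), "Example", "revert"),
    (datetime(2018, 1, 9, 20, 40), "Another page", "revert"),
]

timestamps = sorted(ts for ts, _, _ in interactions)
duration_days = (timestamps[-1] - timestamps[0]).days or 1

summary = {
    "interactions": len(interactions),
    "distinct_pages": len({page for _, page, _ in interactions}),
    "reverts": sum(1 for _, _, action in interactions if action == "revert"),
    "duration_days": duration_days,
    "interactions_per_day": len(interactions) / duration_days,
}
print(summary)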
--
Lani Goto
Project Assistant, Engineering Admin