The next Research Showcase will be live-streamed this Wednesday,
December 21, 2016 at 11:30 AM (PST) 19:30 (UTC).
YouTube stream: https://www.youtube.com/watch?v=nmrlu5qTgyA
As usual, you can join the conversation on IRC at #wikimedia-research. And,
you can watch our past research showcases here
The December 2016 Research Showcase includes:
English Wikipedia Quality Dynamics and the Case of WikiProject Women
Scientists
By Aaron Halfaker <https://meta.wikimedia.org/wiki/User:Halfak_(WMF)>
With every productive
edit, Wikipedia is steadily progressing towards higher and higher quality.
In order to track quality improvements, Wikipedians have developed an
article quality assessment rating scale that ranges from "Stub" at the
bottom to "Featured Articles" at the top. While this quality scale has the
promise of giving us insights into the dynamics of quality improvements in
Wikipedia, it is hard to use due to the sporadic nature of manual
re-assessments. By developing a highly accurate prediction model (based on
work by Warncke-Wang et al.), we've developed a method to assess an
article's quality at any point in history. Using this model, we explore
general trends in quality in Wikipedia and compare these trends to those of
an interesting cross-section: Articles tagged by WikiProject Women
Scientists. Results suggest that articles about women scientists were lower
quality than the rest of the wiki until mid-2013, after which a dramatic
shift occurred towards higher quality. This shift may correlate with (and
even be caused by) this WikiProject's initiatives.
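The historical-assessment method described above relies on a model that predicts, for a given revision, a probability distribution over the enwiki assessment classes. A minimal sketch (not necessarily the authors' exact method) of reducing such a prediction to a single quality score is a probability-weighted sum of ordinal class values; the example probabilities below are invented for illustration:

```python
# enwiki 1.0 assessment classes, ordered from lowest to highest quality.
QUALITY_CLASSES = ["Stub", "Start", "C", "B", "GA", "FA"]

def weighted_quality(probabilities):
    """Expected quality: probability-weighted sum of ordinal class values.

    `probabilities` maps class name -> predicted probability; the values
    should sum to 1.
    """
    if abs(sum(probabilities.values()) - 1.0) > 1e-6:
        raise ValueError("probabilities must sum to 1")
    return sum(QUALITY_CLASSES.index(c) * p for c, p in probabilities.items())

# Invented example: a prediction for a mid-quality ("C"-ish) article.
probs = {"Stub": 0.05, "Start": 0.15, "C": 0.4, "B": 0.3, "GA": 0.07, "FA": 0.03}
score = weighted_quality(probs)  # 0*0.05 + 1*0.15 + 2*0.4 + 3*0.3 + 4*0.07 + 5*0.03 = 2.28
```

A continuous score like this makes it possible to plot quality trends over time even though manual re-assessments are sporadic.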
Privacy, Anonymity, and Perceived Risk in Open Collaboration: A Study of
Tor Users and Wikipedians
By Andrea Forte
In a recent qualitative study to
be published at CSCW 2017, collaborators Rachel Greenstadt, Naz Andalibi,
and I examined privacy practices and concerns among contributors to open
collaboration projects. We collected interview data from people who use the
anonymity network Tor who also contribute to online projects and from
Wikipedia editors who are concerned about their privacy to better
understand how privacy concerns impact participation in open collaboration
projects. We found that risks perceived by contributors to open
collaboration projects include threats of surveillance, violence,
harassment, opportunity loss, reputation loss, and fear for loved ones. We
explain participants’ operational and technical strategies for mitigating
these risks and how these strategies affect their contributions. Finally,
we discuss chilling effects associated with privacy loss, the need for open
collaboration projects to go beyond attracting and educating participants
to consider their privacy, and some of the social and technical approaches
that could be explored to mitigate risk at a project or community level.
Sarah R. Rodlund
Senior Project Coordinator-Engineering, Wikimedia Foundation
This feature article from Bloomberg BusinessWeek does, IMO, a great job of
exploring and contextualizing Wikipedia's diversity issues. The reporters
really did their homework on this one, taking the time to explore all
There are certainly a few factual errors; the most egregious are probably
the confusion between policy and guideline on paid editing, and the
conflation of Wikipedia and WMF in a couple of places.
Is Wikipedia Woke?
The ubiquitous reference site tries to expand its editor ranks beyond the
Comic Con set.
by Dimitra Kessenides and Max Chafkin
December 22, 2016
For your consideration. It is important to have more diversity in
membership of the committee and those giving feedback about the nominees.
---------- Forwarded message ----------
From: "Maor Malul" <maor_x(a)zoho.com>
Date: Dec 17, 2016 11:44 AM
Subject: [Affiliates] AffCom call for new members
To: "Wikimedia Mailing List" <wikimedia-l(a)lists.wikimedia.org>, "Affiliates
discussion list" <affiliates(a)lists.wikimedia.org>
The Affiliations Committee – the committee responsible for guiding
volunteers in establishing Wikimedia chapters, thematic organizations, and
user groups – is looking for new members! If you are interested in joining
us, please submit an application by 31 December 2016.
Based on the successful experience of the last call, this one will also
include a public review and comment period. All applications will be posted on
Meta, and all members of the community are invited to provide comments and
feedback about each candidate, always keeping a friendly tone.
The full call for candidates, which contains more information on the
committee's work, membership criteria, and instructions for how to apply,
can be found on Meta at https://meta.wikimedia.org/
We invite all community members to consider applying, or to encourage
members you think would serve well as AffCom members. We make a special
call to women, as AffCom is keen to maintain or increase the
representation of women among its members.
If you have any questions, please don't hesitate to contact me or any other
member of the committee.
"*Jülüjain wane mmakat* ein kapülain tü alijunakalirua jee wayuukanairua
junain ekerolaa alümüin supüshuwayale etijaanaka. Ayatashi waya junain."
Socio, A.C. Wikimedia Venezuela | RIF J-40129321-2 | www.wikimedia.org.ve
Member, Wikimedia Israel | www.wikimedia.org.il <http://wikimedia.org.il>
Chair, Wikimedia Foundation Affiliations Committee
Phone: +972-52-4869915
Affiliates mailing list
The Berkman Klein Center for Internet and Society at Harvard University has
four new publications online. https://cyber.harvard.edu/node/99716
“Understanding Harmful Speech Online: Research Note” is a summary of
current research, with several pages of links at the end. One phrase that
stood out: "Munger also recently conducted an experiment among groups of
users on Twitter considered harassers on the platform and found that
counter speech using automated bots can impact and reduce instances of
racist speech if 'that subjects… were sanctioned by a high-follower white
male.'"
Two papers are from the Global South. "Grassroots Perspectives on Hate
Speech, Race, & Inequality in Brazil & Colombia" has an entire section on
"counter-speech", or counter narratives, a term that seems to be gaining
some currency. "Preliminary Findings on Online Hate Speech and the Law in
India" talks about inciting sectarian violence with fake news.
Finally, for an understanding of the definitions of hate speech, forget the
Wikipedia article, which embarrassingly uses the words "politically
correct" and "Newspeak" in the introductory paragraphs, sourced to opinion
pieces by two bloggers who did not even use the words. The paper “Defining
Hate Speech” gives a thought-provoking overview of various approaches to
identifying hate speech in a text. One such framework developed by Parekh
noted “three essential features” of hate speech: (1) “it is directed
against a specified or easily identifiable individual or, more commonly, a
group of individuals based on an arbitrary or normatively irrelevant
feature;” (2) the speech “stigmatizes the target group by implicitly or
explicitly ascribing to it qualities widely regarded as undesirable;” and
(3) “because of its negative qualities, the target group is viewed as an
undesirable presence and a legitimate object of hostility.” Also this,
food for thought about criteria for communication on Wikipedia's talk
pages: "...Ward’s definition, noting that a speaker should be seen as
employing hate speech if 'their attacks are so virulent that an observer
would have great difficulty separating the message delivered from the
attack against the victim'.”
This email might interest others who would like to know what next steps WMF
is considering to address this set of issues, in terms of policies,
practices, and technical developments.
---------- Forwarded message ----------
From: Patrick Earley <pearley(a)wikimedia.org>
Date: Thu, Dec 8, 2016 at 12:30 PM
Subject: Re: [Wikimedia-l] Statement by Wikimedia Board on Healthy
Community Culture, Inclusivity, and Safe Spaces
To: Wikimedia Mailing List <wikimedia-l(a)lists.wikimedia.org>
I want to thank the Board for this letter, and for their focus on this issue.
What specific work should we be doing to make progress around this issue?
Harassment is a complex problem, and there are no easy solutions. Nor is
there likely to be a single solution; improvement will have to be made
through a number of initiatives and coordinated approaches.
Wikimedia volunteers have offered many different approaches to the problem,
through consultations, workshops, the Inspire campaign, conference
sessions, and discussion. The Support and Safety team has been collating
these ideas, exploring the issue in the broader context of online
communities, and delving into academic research on the topic.
From these conversations and research, we have identified some
categories/areas for improvement:
- Better blocking tools and detection - the Wikimedia community works
hard on the front lines keeping our users safe from harassment, through
monitoring noticeboards and recent changes for problems, investigating
“sock” accounts used to abuse contributors, and placing blocks on
problematic users. Improvements to blocking tools, and the ability to
detect harassing comments sooner can empower contributors to be more
effective at these tasks.
- Reporting and evaluation tools - The current systems for reporting
harassment are overburdened and can be unclear to users, and there are
limited tools that admins and stewards can use to evaluate the cases and
make good decisions. New tools, developed in collaboration with
functionaries and communities, can improve the experience of reporting,
investigating and managing harassment cases.
- Training for better handling of both in-person and online harassment -
Better training can give contributors the tools and skills to handle
harassment situations quickly and empathetically, document cases, and
provide good advice to targets of harassment.
- Policy and enforcement - Wikimedia communities have developed a
variety of processes, policies, and approaches to dealing with
problems. As a movement, we need to identify which are working well, and
share those successes. We also need to identify where our approaches are
not working well, identify the problems, and try new solutions based on
research and data.
- Coordination with other platforms on harassment approaches and tools,
and keeping up with current academic research - Our communities are not
the only ones struggling with the problem of online harassment. We need to
work more closely with other platforms, researchers, online communities,
and experts to make sure we are aware of successful techniques, new
research, and useful tools.
The above areas are not the only areas where improvement can be made -
right now, contributors are brainstorming other approaches through the
Community Wishlist process. We also encourage contributors to reach out
to the Support and Safety team at ca(a)wikimedia.org with ideas, or contact
me privately at pearley(a)wikimedia.org.
On Thu, Dec 8, 2016 at 12:26 PM, Sydney Poore <sydney.poore(a)gmail.com>
> Thank you Christophe and the rest of the Wikimedia Foundation trustees
> for dedicating time and thought to this important topic.
> I'm optimistic that if we collaborate together as a community we can
> make a difference in the level of harassment on Wikimedia projects and
> maybe even other parts of the internet.
> Sydney Poore
> On Thu, Dec 8, 2016 at 3:18 PM, Christophe Henner <chenner(a)wikimedia.org>
> > Hello everyone,
> > As many of you know, over the past couple of years the Wikimedia
> > movement has taken a focused look at community health—particularly in
> > regards to harassment. The Foundation's Board has been monitoring and
> > discussing this issue over the past year with great interest. We have
> > prepared a statement offering our thoughts on this topic, and providing
> > a clear mandate for the Foundation's leadership to fully engage on this
> > issue.
> > Our statement is below and has been posted on Meta-Wiki, where it is set
> > for translation:
> > https://meta.wikimedia.org/wiki/Wikimedia_Foundation_
> > Since the Foundation was established, we have been invested in building
> > positive community culture. As part of these efforts, we have monitored
> > projects for instances of harassment, escalating our capacity to respond
> > recent years. Thanks to the work of the Foundation's Support and Safety
> > Team, we now have data in the form of the 2015 Harassment Survey on
> > the nature of the issue. This has enabled us to identify key areas of
> > concern, and step up our response appropriately. This research shows
> > that harassment has a negative impact on participation in our projects.
> > This has implications for our ability to collect, share, and
> > disseminate free knowledge in support of the Wikimedia vision. Our
> > statement speaks to the Board's duty to help the Foundation fulfill its
> > mission.
> > The Board is committed to making our communities safer and will not
> > tolerate harassment and toxic behavior on Wikimedia projects. We
> > believe this issue deserves the Foundation's attention and resources,
> > and have confirmed this responsibility at our latest Board meeting on
> > November 13th. The questions that lie before us all now are how to best
> > address this threat, rather than if we should attempt to do so.
> > The Board especially appreciates and applauds the work being done to
> > address this important issue by many community leaders across the
> > movement and teams within the Foundation. We look forward to seeing
> > this work not only continue, but expand. Finally, we encourage everyone
> > who is interested in helping the Foundation address this threat to our
> > vision and mission to engage in the upcoming discussions around this
> > issue.
> > On behalf of the Wikimedia Foundation Board of Trustees,
> > Christophe Henner, Board Chair
> > María Sefidari, Board Vice Chair
> >  https://meta.wikimedia.org/wiki/Research:Harassment_survey_2015
> > Statement by the Wikimedia Board on Healthy Community Culture,
> > and Safe Spaces
> > At our Board meeting on November 13, and in Board meetings in September
> > and June, we spent considerable time discussing the issues of
> > harassment and hostility on the internet generally, and more
> > specifically on the Wikimedia projects.
> > This is an important issue. Approximately 40% of internet users, and
> > 70% of women internet users, have personally experienced harassment. Of
> > those who have reported experiencing harassment on Wikimedia projects,
> > more than 50% reported decreasing their participation in our community.
> > Based on this and other research, we conclude that harassment and toxic
> > behavior on the Wikimedia projects negatively impact the ability of the
> > Wikimedia projects to collect, share, and disseminate free knowledge.
> > This impact is contrary to our vision and mission.
> > Our communities deserve safe spaces in which they can contribute
> > productively and debate constructively. It is our belief that the
> > Foundation should be proactively engaged in eliminating harassment,
> > promoting inclusivity, ensuring a healthier culture of discourse, and
> > improving the safety of Wikimedia spaces. We request management to
> > dedicate appropriate resources to this end.
> > We urge every member of the Wikimedia communities to collaborate in a
> > way that models the Wikimedia values of openness and diversity, step
> > forward and do their part to stop hostile and toxic behavior, support
> > people who have been targeted by such behavior, and help set clear
> > expectations for all contributors.
> >  2014 Pew Research Center Study, found at:
> > http://www.pewinternet.org/2014/10/22/online-harassment/
> >  2015 WMF Harassment Survey, found at:
> > https://upload.wikimedia.org/wikipedia/commons/5/52/
> > Christophe HENNER
> > Chair of the board of trustees
> > chenner(a)wikimedia.org
> > +33650664739
> > twitter *@schiste* skype *christophe_henner*
> > _______________________________________________
> > Wikimedia-l mailing list, guidelines at: https://meta.wikimedia.org/
> > New messages to: Wikimedia-l(a)lists.wikimedia.org
> > Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l,
Senior Community Advocate
(1) 415 975 1874
Thought this would be of interest.
---------- Forwarded message ----------
From: "Amy Lee, USA" <amy.lee(a)jhu.edu>
Date: Dec 6, 2016 1:21 AM
Subject: [hifa] Are there gender differences in access and use of digital
To: "HIFA - Healthcare Information For All" <HIFA(a)dgroups.org>
When planning for a digital health program, people often assume that women
have less access to the internet or are less likely to use technology than
men. The Knowledge for Health (K4Health) Project [https://www.k4health.org/]
wondered if this assumption is actually true and surveyed Global Health
eLearning (GHeL) Center [https://www.globalhealthlearning.org/] users to
understand how gender plays a role in GHeL online engagement.
I recently wrote about my experience working on this activity and our
findings in a new post on The Exchange, K4Health’s Medium Publication –
Online Learning: Are There Really Differences Between Men and Women? [
http://bit.ly/2geRpc1] The main takeaway for me is that there were more
similarities than differences between men and women. For example, the top
three reasons both men and women gave for taking courses were interest in
topic, desire for technical knowledge, and interest in improving job performance.
We’d love to hear your own experience looking at gender and digital health.
Have you found that men and women are similarly engaged in your programs
and activities? Or do some types of tools and services show a difference in
access and use?
Please feel free to share with relevant contacts and colleagues.
Program Specialist, Knowledge for Health (K4Health)
Johns Hopkins Center for Communication Programs
111 Market Place, Suite 310
Baltimore, Maryland 21202
Web: ccp.jhu.edu| www.k4health.org
HIFA profile: Amy Lee is a Program specialist at The Johns Hopkins
University Center for Communication Programs in the United States of
America. amy.lee AT jhu.edu
Thank you to all HIFA Financial Supporters in 2016: British Medical
Association (lead funder), Africa Health, Afro-European Medical & Research
Network, Asia Pacific Association of Medical Journal Editors, Chartered
Society of Physiotherapy, Commonwealth Nurses Federation, Council of
International Neonatal Nurses, ecancer, Elsevier, Foundation of Mother &
Child Health, Global Health Media Project, Haiti Nursing Foundation,
International Child Health Group (RCPCH), International Foundation for
Dermatology, International League of Dermatological Societies,
International Society for Social Paediatrics and Child Health, Joanna
Briggs Institute, The Lancet, LiveWell Initiative (Nigeria), mPowering
Frontline Health Workers, Medical Education Cooperation with Cuba, Medic
Mobile, Network for Information and Digital Access, Next GenU, Palliative
Drugs, Partnerships in Health Information, Physicians for Haiti, Public
Library of Science (PLOS), Research Square, Royal College of Midwives, The
Mother and Child Health and Education Trust, WHO Collaborating Centre for
Knowledge Translation & Health, Wiki Project Med Foundation, Your MD,
Zambia UK Healthworkforce Alliance
To send a message to the HIFA forum, simply send an email to:
HIFA: Healthcare Information For All: www.hifa.org
HIFA Voices database: www.hifavoices.org
You are receiving this message because you're a member of the community
HIFA - Healthcare Information For All.
View this contribution on the web site https://dgroups.org/_/r67bl4pa
A reply to this message will be sent to all members of HIFA - Healthcare
Information For All.
To reply to sender, send a message to amy.lee(a)jhu.edu.
To unsubscribe, send an email to leave.HIFA(a)dgroups.org
In light of Trump boasting of grabbing women "by the pussy" and of him
calling Clinton a "nasty woman" on national TV during a formal presidential
debate, I wonder what effect, if any, his election may have on the WP
Since the election there have been numerous reports of women and minorities
being openly attacked by emboldened men and women who supported Trump
and his... values. Apparently this was the case in England post-Brexit, too?
Has anyone else thought about this?