Congratulations, Abraham, and congratulations, WMDE!
On Jan 30, 2017 7:57 AM, "Tim Moritz Hector" <tim-moritz.hector(a)wikimedia.de> wrote:
It is our great pleasure to announce that during last weekend’s Board
retreat, we voted to appoint Abraham Taherivand as Executive Director of
Wikimedia Deutschland with immediate effect.
Abraham joined Wikimedia Deutschland in 2012, has been the director of
our Software Development department, and has served as interim ED for the
past two months.
In all his roles he has shown vast experience and qualifications, as well
as the much-needed, deep commitment to Free Knowledge. We are convinced that
Abraham is the right person at the right time for Wikimedia Deutschland and
have great confidence that the management of the office is in good hands
with him. Abraham will continue to lead the Software Development department
on an interim basis until we have been able to fill this position with a
new permanent director.
Together with Abraham, WMDE staff, our members and communities as well as
other interested parties, the board will analyse and – where applicable –
revise the composition of leadership and decision making structures at WMDE
in 2017. Kurt Jansson, Sebastian Moleski and I will be steering this
process and are available for your questions and feedback via email (
We wish Abraham the very best in this role, and the Board looks forward to
continuing to work with him. Please join us in congratulating Abraham!
For the Supervisory Board
Tim Moritz Hector
Wikimedia-l mailing list, guidelines at: https://meta.wikimedia.org/
New messages to: Wikimedia-l(a)lists.wikimedia.org
Today I read in the (international) news that websites with knowledge on
the topic of climate change are disappearing from the internet as a result
of the Trump administration. The second thing I read is that before
anything can be published about this topic, the government needs to
approve it.
Do you realise what the right word for this is? Censorship.
Even if it is only partial.
Luckily there are many scientists working on getting all the data abroad,
out of the US, to ensure the research data is preserved, including on
servers in the Netherlands where Trump (hopefully) has no reach.
In the past week I was reading about the Internet Archive, which is making
a backup in Canada because of the Trump administration. I did not
understand this at the time (you may call me naive), but now I do;
apparently they have some visionary people at the Internet Archive.
I miss a good answer to this situation from the Wikimedia Foundation.
Trump has been promoting harassment and disrespect for some time now.
What signal is sent to the rest of the world if an America-based
organisation spreads the idea of a harassment-free Wikipedia and free
speech, while the president of the US promotes harassment, disrespect and
censorship on a massive scale?
This is just the first week of this president!
I am 100% sure everyone in the Wikimedia movement, including the WMF, is
willing to make sure Wikimedia faces no damage whatsoever, but to me this
is starting to get concerning.
If we as a Wikimedia movement think that free knowledge, free speech,
freedom of information, etc. are important, I would think that the country
where the organisation is based should be the one where liberty is
greatest. I do not know where that is, but it is definitely not the US.
My impression is that the WMF is stuck in the US, so I do not believe it
would actually move when the danger grows.
But I think it is possible to make sure the risks are spread across the
world, certainly as we are an international movement that intends to cover
the knowledge of the whole of human civilisation.
To come to a conclusion, I think the WMF and the Wikimedia movement should
think about a back-up plan in case things actually go wrong.
If you do not agree with me, that is perfectly fine; that is your right,
and it should be protected.
>... there is zero chance that the president will be able to censor
> the private sector.
If you mean the U.S. private sector, you're right. But otherwise, the
U.S. President is allowed to take a whole lot of actions which can
effectively censor non-citizens, and I've got some bad news about one in
particular: compliance with European privacy regulations could potentially
result in the deletion of records, including the accounts of European
citizens, from hosting providers such as Google, Apple, Facebook, Twitter,
etc. Please see:
"Enforcing privacy policies that specifically 'exclude persons who are
not United States citizens or lawful permanent residents,' while aimed
at enhancing domestic immigration laws, effectively invalidates
America's part of the Privacy Shield agreement, opens the current
administration up to sanctions by the EU and could lead our allies
across the Atlantic to suspend the agreement outright."
If Google is forced to delete all the personally identifying
information of European citizens because the President ordered U.S.
federal agencies to stop enforcing privacy policies, that would
effectively be an act of censorship on a scale without historical
precedent, would it not?
> there are a lot of resources based in the US that may
> need to be distributed on a wider base including
> personal/private data already collected by the WMF
For editors, but not readers. On November 8 a top
foundation official tweeted that the Foundation would
not store personally identifying information in reader
logs. Samuel Klein retweeted the statement, but the
tweet has since been deleted.
According to this recently created document:
the Foundation keeps full reader logs with IP addresses
"in order to allow buffer time to be able to rerun
metrics due to bugs or data issues." Although the
analytics team has been responsive to many of my
questions, nobody has been able, after repeated
requests, to give any reason why it is more important
to "rerun metrics" than to protect the privacy of
readers, or even the basis upon which such a decision
about the trade-off is made. Furthermore, while that
document states that it is "imperative for Ops to be
able to examine raw IP addresses," after at least half
a dozen requests nobody in analytics or Ops has said
whether Ops could use the request logs with IP
addresses but not article names.
This is a technical approach to a psychological/sociological problem. I
am not in the business of whack-a-mole, but there are a few things to
consider. As far as I know, not much of this is supported by good
sociological or psychological understanding. To the best of my
understanding, given that what we do is about groups and group dynamics,
about the interaction of all kinds of people who are often obsessively
active in what they do, we as an organisation lack support from people who
have a good understanding of all the issues that are involved.
It has been said in the past that Wikipedia is not therapy. The people who
say so are right, up to a point. Wikipedia is therapeutic for a sizeable
group of people. Other people have twisted minds, and Wikipedia has become
their favourite playground. We have been in denial about both aspects. The
latter is something we cannot handle, and the former is something we could
handle better.
When we accept that the current grant will attempt to step up the fight
against those who make our environment toxic, we probably do so without
enough understanding of the dynamics at play: why the people who make our
community toxic do this, and how they became this way. The fact of the
matter is that as a community we are not dedicated to "sharing in the sum
of all knowledge"; we are much more into Wikipedia. This is understandable
when you consider sociodynamics, but it is vital to appreciate it when the
current objective is to arm "us" against "them", and when you want to
understand many of the conflicts we have.
Yes, there is a need to improve our strategies against the negative
influence of some. But please do this with an understanding of what is
happening and model the strategies on this understanding.
On 27 January 2017 at 03:30, Samantha Lien <slien(a)wikimedia.org> wrote:
> This press release is also available online here:
> And as a blog post on the Wikimedia blog here:
> Wikimedia Foundation receives $500,000 from the Craig Newmark Foundation
> and craigslist Charitable Fund to support a healthy and inclusive Wikimedia
> community
> Grant supports development of more advanced tools for volunteers and staff
> to reduce harassing behavior on Wikipedia and block harassers from the site
> SAN FRANCISCO — January 26, 2017 — Today, the Wikimedia Foundation
> announced the launch of a community health initiative to address harassment
> and toxic behavior on Wikipedia, with initial funding of US$500,000 from
> the Craig Newmark Foundation and craigslist Charitable Fund. The two seed
> grants, each US$250,000, will support the development of tools for
> volunteer editors and staff to reduce harassment on Wikipedia and block
> harassers.
> Approximately 40% of internet users
> <http://www.pewinternet.org/2014/10/22/online-harassment/>, and as many
> as 70% of younger users, have personally experienced harassment online,
> with regional studies showing rates as high as 76%
> for young women. While harassment differs across the internet, on Wikipedia
> and other Wikimedia projects, harassment has been shown to reduce
> participation on the sites. More than 50%
> of people who reported experiencing harassment also reported decreasing
> their participation in the Wikimedia community.
> Volunteer editors on Wikipedia are often the first line of response for
> finding and addressing harassment on Wikipedia. "Trolling
> <https://en.wikipedia.org/wiki/Internet_troll>," "doxxing
> <https://en.wikipedia.org/wiki/Doxing>," and other menacing behaviors are
> burdens to Wikipedia's contributors, impeding their ability to do the
> writing and editing that makes Wikipedia so comprehensive and useful. This
> program seeks to respond to requests from editors over the years for better
> tools and support for responding to harassment and toxic behavior.
> “To ensure Wikipedia’s vitality, people of good will need to work together
> to prevent trolling, harassment and cyber-bullying from interfering with
> the common good,” said Craig Newmark, founder of craigslist. “To that end,
> I'm supporting the work of the Wikimedia Foundation towards the prevention
> of harassment.”
> The initiative is part of a commitment to community health at the
> Wikimedia Foundation, the non-profit organization that supports Wikipedia
> and the other Wikimedia projects, in collaboration with the global
> community of volunteer editors. In 2015, the Foundation published its first
> Harassment Survey
> <https://meta.wikimedia.org/wiki/Research:Harassment_survey_2015> about
> the nature of the issue in order to identify key areas of concern. In
> November 2016, the Wikimedia Foundation Board of Trustees issued a
> statement of support
> calling for a more “proactive” approach to addressing harassment as a
> barrier to healthy, inclusive communities on Wikipedia.
> "If we want everyone to share in the sum of all knowledge, we need to make
> sure everyone feels welcome,” said Katherine Maher, Executive Director of
> the Wikimedia Foundation. “This grant supports a healthy culture for the
> volunteer editors of Wikipedia, so that more people can take part in
> sharing knowledge with the world."
> The generous funding from the Craig Newmark Foundation and craigslist
> Charitable Fund will support the initial phase of a program
> <https://meta.wikimedia.org/wiki/Community_health_initiative> to
> strengthen existing tools and develop additional tools to more quickly
> identify potentially harassing behavior, and help volunteer administrators
> evaluate harassment reports and respond effectively. These improvements
> will be made in close collaboration with the Wikimedia community to
> evaluate, test, and give feedback on the tools as they are developed.
> This initiative addresses the major forms of harassment reported on the
> Wikimedia Foundation’s 2015 Harassment Survey
> which covers a wide range of different behaviors: content vandalism,
> stalking, name-calling, trolling, doxxing, discrimination—anything that
> targets individuals for unfair and harmful attention. From research and
> community feedback, four areas have been identified where new tools could
> be beneficial in addressing and responding to harassment:
> * Detection and prevention - making it easier and faster for editors to
> identify and flag harassing behavior
> * Reporting - providing victims and respondents of harassment improved
> ways to report instances that offer a clearer, more streamlined approach
> * Evaluating - supporting tools that help volunteers better evaluate
> harassing behavior and inform the best way to respond
> * Blocking - making it more difficult for someone who is blocked from the
> site to return
> For more information, please visit: https://meta.wikimedia.org/
> About the Wikimedia Foundation
> The Wikimedia Foundation is the non-profit organization that supports and
> operates Wikipedia and its sister projects. More than a billion unique
> devices access the Wikimedia sites each month. Roughly 75,000 people edit
> Wikipedia and its sister projects every month, collectively creating and
> improving its more than 40 million articles across hundreds of languages –
> this all makes Wikipedia one of the most popular web properties in the
> world. Based in San Francisco, California, the Wikimedia Foundation is a
> 501(c)(3) charity that is funded primarily through donations and grants.
> About Wikipedia
> Wikipedia is the world’s free knowledge resource. It is a collaborative
> creation that has been added to and edited by millions of people from
> around the globe since it was created in 2001: anyone can edit it, at any
> time. Wikipedia is offered in hundreds of languages containing more than 40
> million articles. Wikipedia and its sister projects are collectively
> visited by more than a billion unique devices each month.
> Harassment takes different forms on Wikipedia than it does on other major
> websites. Unlike other platforms, Wikipedia editors generally don’t write
> about their personal lives. Instead, on Wikipedia, harassment usually
> begins as a content dispute between editors that results in an attack on an
> editor’s personal attributes—their gender, race, religion, sexual
> orientation, political affiliation—based on something that they’ve shared,
> or an assumption based on the user’s edit history.
> About the Craig Newmark Foundation
> The Craig Newmark Foundation (CNF) is a private foundation created by
> craigslist founder Craig Newmark in 2016 to support and connect nonprofit
> communities and drive powerful civic engagement. The Foundation’s
> priorities include Trustworthy Journalism, Veterans and Military Families,
> Voter Protection and Education, Consumer Protection and Education, Public
> Diplomacy, Government Transparency, Micro-Lending to Alleviate Poverty, and
> Women in Tech.
> About craigslist Charitable Fund
> The craigslist Charitable Fund (CCF) provides millions of dollars each
> year in one-time and recurring grants to hundreds of partner organizations
> addressing four broad areas of interest including Environment and
> Transportation; Education, Rights, Justice, and Reason; Nonviolence,
> Veterans and Peace; and Journalism, Open Source, and Internet.
> Press contacts
> Craig Newmark Foundation
> Bruce Bonafede
> Wikimedia Foundation
> Juliet Barbara
> (415) 839-6885
> *Samantha Lien*
> Communications Manager
> Wikimedia Foundation
> 149 New Montgomery Street
> San Francisco, CA 94105
> (To be unsubscribed from this press release distribution list, please
> reply to communications(a)wikimedia.org with 'UNSUBSCRIBE' in the subject
> line.)
> Please note: all replies sent to this mailing list will be immediately
> directed to Wikimedia-l, the public mailing list of the Wikimedia
> community. For more information about Wikimedia-l:
> WikimediaAnnounce-l mailing list
(this is an announcement in my capacity as a volunteer.)
Inspired by a lightning talk at the recent CEE Meeting by our colleague
Lars Aronsson, I made a little command-line tool to automate batch
recording of pronunciations of words by native speakers, for uploading to
Commons and integration into Wiktionary etc. It is called *pronuncify*, is
written in Ruby and uses the sox(1) tool, and should work on any modern
Linux (and possibly OS X) machine. It is available here, with instructions.
I was then asked about a Windows version, and agreed to attempt one. This
version is called *pronuncify.net <http://pronuncify.net>*, and is a .NET
GUI version of the same tool, with slightly different functions. It
is available here, with instructions.
Both tools require word-list files in plaintext, with one word (or phrase)
per line. Both tools name the files according to the standard established
in [[commons:Category:Pronunciation]], and convert them to Ogg Vorbis for
you, so they are ready to upload.
In the future, I may add OAuth-based direct uploading to Commons. If you
run into difficulties, please file issues on GitHub, for the appropriate
tool. Feedback is welcome.
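For readers curious about the mechanics, the batch loop is roughly the
following (a minimal sketch of my own, not the actual tool's code; the
language prefix, sample words, and recording length are illustrative
assumptions, and the sox commands are commented out so the sketch runs
without audio hardware):

```shell
#!/bin/sh
# Sketch of a pronuncify-style batch loop: read a plaintext word list,
# one word (or phrase) per line, and derive a Commons-style file name
# such as "He-shalom.ogg" for each recording.
prefix="He"                                 # illustrative language prefix
printf 'shalom\ntoda\n' > wordlist.txt      # sample word list, one word per line

: > planned.txt
while IFS= read -r word; do
    [ -z "$word" ] && continue              # skip blank lines
    out="${prefix}-${word}.ogg"             # e.g. He-shalom.ogg
    echo "$out" >> planned.txt
    # rec -c 1 "raw/${word}.wav" trim 0 3   # record up to 3 s with sox's rec(1)
    # sox "raw/${word}.wav" "$out"          # convert the recording to Ogg Vorbis
done < wordlist.txt
cat planned.txt
```

The real tools handle the naming and conversion details for you; see each
repository's instructions for specifics.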
We have an update on the community health initiative mentioned following
the Board's Statement on Healthy Community Culture, Inclusivity, and Safe
Spaces.
As Patrick Earley from the Support and Safety team noted on Wikimedia-l
last month, we’ve been developing a community health initiative to help
address the harassment issues discussed in the Board's statement. We
believe an important aspect of our efforts to combat harassment is
providing the volunteer community with better tools to more effectively
respond to instances of harassment as they arise.
We’re excited to announce that the Craig Newmark Foundation and craigslist
Charitable Fund have agreed to provide initial funding to help the
Wikimedia Foundation begin this work. The two seed restricted grants,
collectively a gift of $500,000, will enable the Foundation to scale up our
support of these efforts and provide us with the resources to do it right.
In preparing for this work, we’ve been discussing issues with the current
tools and processes with active administrators and functionaries. These
discussions have resulted in requested improvements in several key areas
where admins and functionaries see immediate needs—better reporting systems
for volunteers, smarter ways to detect and address problems early, and
improved tools and workflows related to the blocking process.
In the coming months, the Community Tech team, working with the Support and
Safety team, will be expanding their work on development of these tools.
The long-term goal for this effort is to build up the toolbox that
volunteers can use to combat harassment and other disruptive behavior on
the projects.
Specifically, there are four areas where we think new tools will help:
1. Detection - Improve our detection and prevention tools, like
AbuseFilter, and build new features to detect aggressive behavior.
2. Reporting - Design ways to report harassment that are less chaotic, more
respectful of privacy, and less stressful than the current workflow.
3. Evaluating - Offer admins tools that make evaluating harassment reports
easier, so that they can make good decisions.
4. Blocking - When someone is blocked from the site, we can make it more
difficult for them to return under a different name or IP address.
Of course, these improvements need to be made with the participation and
support of the volunteers who will be using the tools. We don't want to
create new systems and workflows that create more work for an already
overburdened team of wiki administrators. We want to make these tasks less
grueling, and to help them more consistently produce effective outcomes.
Work in other areas - such as project policies and better training for
administrators and functionaries - still needs to be done in order to
comprehensively tackle the overall issue of harassment and harmful behavior
on the projects. However, we believe that improving and building better
tools for volunteers currently most engaged in this effort is a necessary
first step.
We welcome your feedback on this approach, and invite you to join us in
thanking Craig and his charitable organizations for their support of this
work.
We’ll be sharing regular updates about the progress of this work in the
coming months. If you have any questions in the meantime, please reach out
to us on the talk page of the Meta-Wiki page, where you can find more
information.
You can also find more details about this announcement in this blog post:
Danny Horn (Product Manager, Community Tech) and Patrick Earley (Manager,
Support & Safety)
Today marks my 1 year anniversary with the WMF. What a ride it's been!
A little clarification, or a timeline if you will.
Work on Interactive was led by very energetic and talented technical folks
for a good chunk of time without a lot of structure around the work. Then,
about a quarter ago, the team tried to start with more planning - a
roadmap, team roles, checklists for deployment - the usual stuff. It didn't
go well. The structure was too burdensome for some team members and lacking
for others. It caused a bit of stress to all members of the team, myself
included. But, to be good stewards of our resources we need some structure.
Lacking that structure it was decided to put a pause on things, rethink the
approach, and figure out how this all fits into planning and strategy for
the coming months/years. In the middle of this was the holiday, dev summit
and WMF all-hands (a solid week away from the office for the WMF), then
Yuri's departure, and Katie's scheduled (and deserved) vacation. Looking
back at it, a mess of bad timing.
So, Dan posted the message on discovery-l. I can't speak for him, but my
interpretation was, "Hey, just a heads-up. We're going to pause things
while we work some stuff out and we'll let you know more in the future". An
honest attempt to do what so many of us ask for - quick communication in
the open. Personal note, I really didn't expect so many people would
care/notice! I'm happy to see that I was wrong.
It's hard to talk about these things. It's a sign of vulnerability to do so
and this information puts you at risk for criticism and embarrassment
(deserved or otherwise!). We're professionals, we never make mistakes,
right? Nope. We do, and it's hard to talk about. It's also hard for
professional and legal reasons when talking about individual staff. Heck,
even writing this reply I'm worried I'm going to say something wrong. :)
Now a few of the folks in Discovery are coming up with a plan, to be
discussed with our director upon their return from vacation. I personally
believe it would be wrong to make a decision without their involvement. I,
like many of you, personally hope we figure out a good way to keep the work
the interactive team has done moving forward. Once we have a plan, we'll
let you all know.
At this point I feel like I'm repeating what others have already said. :) I
don't expect this will put all minds at ease, but I too ask for your
patience and assume good faith.
I’ve always respected Derk-Jan's perspective and thoughts in the community.
I appreciate his concerns and I hope he continues to bring them forth.
Community Liaison - Discovery