There are some people who repeatedly argue that we raise far too much money. Given a certain set of assumptions, an argument can be constructed to make this point. In my opinion there is little merit to it. We do need money to operate the Wikimedia projects, and a positive outcome in one year enables us to do more the next. I have some ideas about raising money and raising expectations.
- We want to raise less money in the Anglo-Saxon world. When people everywhere donate money, they too gain a sense of ownership; this sense of ownership should be distributed more equally around the globe.
- With our projects owned more equitably around the globe, the notion that "any nine-year-old child can find pictures on Commons" becomes reasonable and self-evident; the world pays for results that are globally relevant.
- We need a delivery manager, whose task is to research and define what our projects deliver to their public. The objective is to increase both the quantity and quality of what a project delivers, and to discuss with project communities what can be done to improve the service to its public. Commons provides material to Wikipedia; that is good, but not enough.
Both the Wikimedia Foundation and the Internet Archive have projects to document all scientific papers and output. The Internet Archive provides an important service to the Wikimedia Foundation; we can integrate the two projects, reduce costs, and have the WMF pay the IA for its services. Closer ties with the Internet Archive provide many other benefits, one of which is that we can bring Wikipedia references into the modern age.
For Wikidata there is a technical limit to what we can achieve on the current platform. Because of Wikidata, the WMF is a very big fish in the data pond. We need (imho) to pick up the challenge and develop our own software. This will be a significant cost, and it demonstrates that we accept that Free software is not free as in beer. The IA may be a partner in this endeavour.
The notion that we raise too much money, the notion that there is no urgency, is a fallacy. It is all too easy to identify where our service is lacking and where we can improve it. The argument that the WMF raises too much money assumes that there is only one project, their project, and that its status quo suffices. The question is: sufficient for whom, for what, and for how long?
Please accept my apologies if you are receiving this a number of times
today. We have sent it out to multiple mailing lists in order to reach as
many community members as possible. Please feel free to forward this to any
other community mailing lists you believe are relevant.
It's coming close to time for the annual appointment of community members to serve on the Ombuds commission (OC). This commission works across all Wikimedia projects, especially on the use of the CheckUser and Oversight tools, and mediates between the complaining party and the individual whose work is being investigated.
They may also assist the General Counsel, the Executive Director or the
Board of Trustees in investigations of these issues. For more on their
duties and roles, see http://meta.wikimedia.org/wiki/Ombuds_commission
This is a call for community members interested in volunteering for
appointment to this commission. Volunteers serving in this role should be
experienced Wikimedians, active on any project, who have previously used
the CheckUser tool OR who have the technical ability to understand the
CheckUser tool and the willingness to learn it. They are expected to be
able to engage neutrally in investigating these concerns and to know when
to recuse when other roles and relationships may cause conflict.
Commissioners are required to sign the Access to Non-Public Information NDA and must be willing to comply with the appropriate Wikimedia Foundation board policies (such as the access to non-public data policy). The position requires a high level of discretion and trust. Commissioners must also be over 18 years of age.
If you are interested in serving on this commission, please write me an email off-list detailing your experience on the projects, your thoughts on the commission, and what you hope to bring to the role. The commission
consists of twelve members; all applications are appreciated and will be
carefully considered. The deadline for applications is the end of day on
31 December, 2021.
Please feel free to pass this invitation along to any users who you think
may be qualified and interested.
On behalf of the Wikimedia Foundation Trust & Safety team
Note: this deadline is flexible in terms of time zones; as long as your application is in while it is still 31 December somewhere in the world, it will be accepted.
Trust & Safety Specialist
I am pleased to announce that the Wikimedia Foundation Board of Trustees has appointed Maryana Iskander as the new CEO of the Wikimedia Foundation.
Since 2013, Maryana has served as the CEO of Harambee Youth Employment Accelerator, a South African non-profit social enterprise focused on building African solutions for the global challenge of youth unemployment. Prior to this, she spent six years as Chief Operating Officer of Planned Parenthood Federation of America, a volunteer-led social movement focused on access to women’s healthcare. Maryana has also worked in academia as the Advisor to the President of Rice University, an international research university based in the United States.
Her professional career has been motivated by breaking down systemic
barriers, creating opportunities for collaborative solution-building, and
community empowerment. She has a proven track record of leading complex organisations shaped by shared decision-making.
In looking for the next CEO, we on the Board convened a Transition Committee, primarily to guide us in finding the right person for this critical role and secondarily to oversee the executive Transition Team. The
Transition Committee conducted a far-reaching and competitive global
search, receiving around 400 recommendations and speaking to about 50
potential candidates. Throughout this selection process, Maryana impressed
us as someone who is deeply inspired by the Wikimedia vision and who
embodies the values of equity and community that inform all Wikimedia work.
She has extensive leadership experience working with volunteer-led
initiatives and building partnerships across public, private and social
sectors. Maryana also brings expertise in technology-led innovation to
accelerate meaningful social change. She does this with a global
perspective: Maryana was born in the Middle East, educated in the United
States and the United Kingdom, and has spent the last decade living and
working on the African continent.
Maryana joins the Wikimedia Foundation at a crucial time. The movement is
larger than ever, and it has never been more relevant or more trusted. This
is an inflection point, as decisions need to be made to execute a shared
vision for where the Movement wants to be in 2030. We believe that Maryana
is the right person to help lead the Foundation at this moment.
As Maryana begins, her priorities will include supporting movement efforts
to implement the Wikimedia 2030 recommendations, such as the development of
a Movement Charter and the finalization of a Universal Code of Conduct. She
will continue the Foundation’s focus on knowledge equity and exploring ways
to address the gaps in content and the diversity of contributors to
Wikimedia projects. She will be supported by the Board in this journey.
Maryana will officially start at the Wikimedia Foundation on January 5,
2022, as she transitions from her current job. Until then, the Foundation
will continue to be led by the Transition Team, with guidance from the
Board. In my conversations with her, I have seen that Maryana is a fan of
direct communication and excited to learn from the movement. In the coming
weeks, she will share ways to connect. Please join me in welcoming Maryana
(CCed) to the Foundation!
PS. For translations of this message, or to help translate it into more
languages, please visit Meta-Wiki 
antanana / Nataliia Tymkiv
Acting Chair, Wikimedia Foundation Board of Trustees
*NOTICE: You may have received this message outside of your normal working hours/days, as I usually can work more as a volunteer during weekends. You should not feel obligated to answer it during your days off. Thank you in advance.*
The #WPWPCampaign 2021 International Team is looking for experienced volunteer Wikipedia editors to join the campaign's international jury team. Interested persons should email the international team via their mailing list: wpwp-organizers(a)lists.wikimedia.org.
Euphemia and Rajeeb
For: Wikipedia Pages Wanting Photos International Jury.
Dear fellow Wikimedians,
This is an update from the Movement Charter
<https://meta.wikimedia.org/wiki/Movement_Charter> process. We closed the call for candidates for the Drafting Committee on September 14 (AoE
<https://en.wikipedia.org/wiki/Anywhere_on_Earth>) and now have a pool of candidates with diverse backgrounds to choose from.
The 15-member committee will be selected through a 3-step process:
1. An election process for project communities to elect 7 members of the committee.
2. A selection process for affiliates to select 6 members of the committee.
3. A Wikimedia Foundation process to appoint 2 members of the committee.
Communities elect 7 members
This announcement is related to the community elections, which will take place over two weeks, from October 11 to October 24. We look forward to wide participation across the communities to create the committee that will curate the writing of the Movement Charter. The election results will be published on November 1.
Affiliates select 6 members
In parallel to the election process, all affiliates are asked to contribute as well. All affiliates were divided into eight geographic regions and one ‘thematic’ region (check the list), and each region chooses one person to act as a selector for that region. These 9 selectors will come together to select 6 members of the committee (from the same pool of candidates). The selection results will be published on November 1.
Wikimedia Foundation appoints 2 members
Finally, the Wikimedia Foundation will appoint two members to the committee
by November 1.
All three processes will be concluded by November 1, 2021, so that the
Movement Charter Drafting Committee can start working by then.
For the full context of the Movement Charter, its role, as well as the process for its creation, please have a look at Meta. You can also contact us at any time on Telegram or via email (wikimedia2030(a)wikimedia.org).
Kaarel Vaidla (he/him)
Movement Strategy <https://meta.wikimedia.org/wiki/Strategy/2030>
Wikimedia Foundation <https://wikimediafoundation.org/>
Wikimedia is participating in the winter edition of this year's Outreachy <
https://www.outreachy.org/>  (December 2021–February 2022)! The deadline
to submit projects on the Outreachy website is *September 30th, 2021*.
If you would like to share an idea for a project that you would like to mentor, or if you are not familiar with the program and want to learn more about it, feel free to reply to this email or leave a note on <
*About the Outreachy program*:
Outreachy offers three-month internships working remotely on Free and Open Source Software (FOSS) coding and non-coding projects with experienced mentors. These internships run twice a year: from May to August and from December to March. Interns are paid a stipend of USD 5,500 for the three months of work. They also receive a USD 500 stipend to travel to conferences and events.
Interns often find employment after their internship with Outreachy
sponsors or jobs that use the skills they learned during their internship.
This program is open to both students and non-students. Outreachy expressly
invites the following people to apply:
- Women (both cis and trans), trans men, and genderqueer people.
- Anyone who faces under-representation, systematic bias, or
discrimination in the technology industry in their country of residence.
- Residents and nationals of the United States of any gender who are
Black/African American, Hispanic/Latinx, Native American/American Indian,
Alaska Native, Native Hawaiian, or Pacific Islander.
See a blog post highlighting experiences and outcomes of interns who
participated in a previous round of Outreachy with Wikimedia <
Some tips for mentors for proposing projects:
- Follow this task description template when you propose a project, and add the #Outreachy (Round 23) tag to it.
- Remember, the project should take an experienced developer ~15 days to complete and a newcomer ~3 months.
- Each project should have at least two mentors, and at least one of them should have a technical background.
- When it comes to picking a project, you could propose one that is:
  - Relevant for your language community, or one that brings impact to the Wikimedia ecosystem in the future.
  - Welcoming and newcomer-friendly, with a moderate learning curve.
  - A new idea you are passionate about, with no deadlines attached; something you always wanted to see happen but couldn't due to lack of resources.
  - About developing a standalone tool (possibly hosted on Wikimedia Toolforge) with fewer dependencies on Wikimedia's core infrastructure, which doesn't necessarily require a specific programming language.
See roles and responsibilities of an Outreachy mentor <
We look forward to your participation!
(On behalf of the organization team)
Senior Developer Advocate
Wikimedia Foundation <https://wikimediafoundation.org/>
Unfortunately (but not surprisingly) it is not a very in-depth interview.
*Give us a sense of your direction and vision for Wikimedia, especially in
such a fraught information landscape and in this polarized world.*
There are a few core principles of Wikimedia projects, including Wikipedia,
that I think are important starting points. It’s an online encyclopedia.
It’s not trying to be anything else. It’s certainly not trying to be a
traditional social media platform in any way. It has a structure that is
led by volunteer editors. And as you may know, the foundation has no
editorial control. This is very much a user-led community, which we support.
The lessons to learn from, not just with what we’re doing but how we
continue to iterate and improve, start with this idea of radical
transparency. Everything on Wikipedia is cited. It’s debated on our talk
pages. So even when people may have different points of view, those debates
are public and transparent, and in some cases really allow for the right
kind of back and forth. I think that’s the need in such a polarized society
— you have to make space for the back and forth. But how do you do that in
a way that’s transparent and ultimately leads to a better product?
And the last thing that I’ll say is, you know, this is a community of
extremely humble and honest people. As we look to the future, how do we
build on those attributes in terms of what this platform can continue to
offer society and provide free access to knowledge? How do we make sure
that we are reaching the full diversity of humanity in terms of who is
invited to participate, who is written about? How are we really making sure
that our collective efforts reflect more of the global south, reflect more
women and reflect the diversity of human knowledge, to be more reflective of the world?
*What is your take on how Wikipedia fits into the widespread problem of misinformation?*
Many of the core attributes of this platform are very different from those of the traditional social media platforms. If you take misinformation
around Covid, the Wikimedia Foundation entered into a partnership with the
World Health Organization. A group of volunteers came together around what
was called WikiProject Medicine, which is focused on medical content and
creating articles that then are very carefully monitored because these are
the kinds of topics that you want to be mindful around misinformation.
Another example is that the foundation put together a task force ahead of
the U.S. elections, again, trying to be very proactive. [The task force
supported 56,000 volunteer editors watching and monitoring key election
pages.] And the fact that there were only 33 reversions on the main U.S.
election page was an example of how to be very focused on key topics where
misinformation poses real risks.
Then another example that I just think is really cool is there’s a podcast
called “The World According to Wikipedia.” And on one of the episodes,
there’s a volunteer who is interviewed, and she really has made it her job
to be one of the main watchers of the climate change pages.
We have tech that alerts these editors when changes are made to any of the
pages so they can go see what the changes are. If there’s a risk that,
actually, misinformation may be creeping in, there’s an opportunity to
temporarily lock a page. Nobody wants to do that unless it’s absolutely
necessary. The climate change example is useful because the talk pages
behind that have massive debate. Our editor is saying: “Let’s have the
debate. But this is a page I’m watching and monitoring carefully.”
*One big debate that is currently happening on these social media platforms
is this issue of the censorship of information. There are people who claim
that biased views take precedence on these platforms and that more
conservative views are taken down. As you think about how to handle these
debates once you’re at the head of Wikipedia, how do you make judgment
calls with this happening in the background?*
For me, what’s been inspiring about this organization and these communities
is that there are core pillars that were established on Day 1 in setting up
Wikipedia. One of them is this idea of presenting information with a
neutral point of view, and that neutrality requires understanding all sides
and all perspectives.
It’s what I was saying earlier: Have the debates on talk pages on the side,
but then come to an informed, documented, verifiable, citable conclusion in the articles. I think this is a core principle that, again,
could potentially offer something to others to learn from.
*Having come from a progressive organization fighting for women’s rights,
have you thought much about misinformers weaponizing your background to say
it may influence the calls you make about what is allowed on Wikipedia?*
I would say two things. I would say that the really relevant aspect of the work that I’ve done in the past is volunteer-led movements, which is probably a lot harder than others might think, and that I played a really
operational role in understanding how to build systems, build culture and
build processes that I think are going to be relevant for an organization
and a set of communities that are trying to increase their scale and reach.
The second thing that I would say is, again, I’ve been on my own learning
journey and invite you to be on a learning journey with me. How I choose to
be in the world is that we interact with others with an assumption of good
faith and that we engage in respectful and civilized ways. That doesn’t
mean other people are going to do that. But I think that we have to hold on
to that as an aspiration and as a way to, you know, be the change that we
want to see in the world as well.
*When I was in college, I would do a lot of my research on Wikipedia, and
some of my professors would say, ‘You know, that’s not a legitimate
source.’ But I still used it all the time. I wondered if you had any
thoughts about that!*
I think now most professors admit that they sneak onto Wikipedia as well to
look for things!
You know, we’re celebrating the 20th year of Wikipedia this year. On the
one hand, here was this thing that I think people mocked and said wouldn’t
go anywhere. And it’s now become legitimately the most referenced source in
all of human history. I can tell you just from my own conversations with
academics that the narrative around the sources on Wikipedia and using
Wikipedia has changed.
This summer we kicked off the second edition of the Wikimedia Accelerator UNLOCK with five project teams from across Europe (and one participant from India) and intensively accompanied them with coaching, mentoring and tons of input from experts within our movement and beyond, helping them bring their ideas to life.
In no more than three months they have developed prototypes that offer concrete and scalable solutions for free knowledge and this year’s challenge of (re)building trust in the digital age, ranging from a crowdsourced directory that will help you find every official governmental online account around the globe, to a tool exposing and educating on technology supply chains, to an open toolkit for language archivists on how to create permanent, accessible and inclusive audiovisual archives.
You are invited to meet and learn from the teams as they showcase their prototypes on *October 6th, 2021, from 4:00 to 5:30 pm CEST*. Join us online
*What to expect on UNLOCK Demo Day?* All eyes on the project teams and their prototypes! The first part will give you the chance to find out what the participants have achieved in the last few months, as well as the ups and downs they have faced. There will be room and time for exchange and/or exploring synergies (i.e. maybe the projects are interesting for your communities, other ideas can be discussed for further development, or the projects can be adapted to your particular context, etc.). We’ll conclude the showcase with a discussion panel, "Glimpse into the Future", in which we will discuss with the teams their next steps and exchange ideas.
We would be more than happy to see you at the Demo Day. If you still have open questions regarding the Demo Day, head to our blog post for answers, or feel free to contact me. Also, we would appreciate it if you could forward this email and the attached invite to your network.
Looking forward to your participation!
👉 Stay up to date as we count down to Demo Day: Follow along on Twitter
Kannika Thaimai (she/her), Lead, UNLOCK Accelerator
UNLOCK Accelerator: We accelerate your ideas. Together we build the
future of Free Knowledge.
Keep up to date! Current news and exciting stories about Wikimedia,
Wikipedia and Free Knowledge in our newsletter (in German):
Wikimedia Deutschland e. V. | Tempelhofer Ufer 23-24 | 10963 Berlin
Tel. (030) 219 158 26-0
I have a psephological and election historical observation that I would like to share with Wikimedia.
Low-brow, crass, and manipulative political advertising and marketing, along with various hot-button, third-rail, dog-whistle, and wedge issues, have been deployed by candidates, campaigns, and political actors and organizations during American election seasons. These tactics are very much a part of our elections, yet they appear to be subsequently omitted from encyclopedic (e.g., Wikipedia) and historical coverage of those elections (e.g., 2000–2020).
How low have election campaigns gone? Very. Yet, for some reason, American encyclopedists and historians appear to be almost complicit, glossing over these problematic campaign tactics. Each historical election appears to be reduced to a single encyclopedia article or a small cluster of such articles; only some of these articles attempt to list election issues, and none mentions the campaign advertising and marketing themes and tactics deployed by campaigns, political actors, and organizations on radio, television, the Web, or social media.
I propose that encyclopedists, scholars, and scientists seek to attend to, remember, and record election campaign mass media tactics and manipulations lest we, the American people, be doomed to repeat them in future elections. Perhaps by remembering the election campaign advertising and marketing tactics utilized, including on social media, and listing them encyclopedically, a buoyant pressure can be created with which to elevate our American politics.
Thank you for your time and for considering these ideas with which to improve encyclopedic coverage of American elections.