Wikimedia is participating in the winter edition of this year's Outreachy <
https://www.outreachy.org/>  (December 2021–February 2022)! The deadline
to submit projects on the Outreachy website is *September 30th, 2021*.
If you would like to share an idea for a project that you would like to
mentor, or if you are not familiar with the program and want to learn
more about it, feel free to reply to this email or leave a note on <
*About the Outreachy program*:
Outreachy offers three-month internships to work remotely in Free and Open
Source Software (FOSS), coding and non-coding projects with experienced
mentors. These internships run twice a year–from May to August and December
to March. Interns are paid a stipend of USD 5,500 for the three months of
work. They also have a USD 500 stipend to travel to conferences and events.
Interns often find employment after their internship with Outreachy
sponsors or jobs that use the skills they learned during their internship.
This program is open to both students and non-students. Outreachy expressly
invites the following people to apply:
- Women (both cis and trans), trans men, and genderqueer people.
- Anyone who faces under-representation, systematic bias, or
discrimination in the technology industry in their country of residence.
- Residents and nationals of the United States of any gender who are
Black/African American, Hispanic/Latinx, Native American/American Indian,
Alaska Native, Native Hawaiian, or Pacific Islander.
See a blog post highlighting experiences and outcomes of interns who
participated in a previous round of Outreachy with Wikimedia <
Some tips for mentors for proposing projects:
- Follow this task description template when you propose a project in
Add the #Outreachy (Round 23) tag to it.
- Remember, the project should require an experienced developer ~15 days
to complete and a newcomer ~3 months.
- Each project should have at least two mentors, at least one of whom
should have a technical background.
- When it comes to picking a project, you could propose one that is:
- Relevant for your language community or brings impact to the
Wikimedia ecosystem in the future.
- Welcoming and newcomer-friendly and has a moderate learning curve.
- A new idea you are passionate about with no deadlines attached;
something you always wanted to see happen but couldn't due to a lack
of resources.
- About developing a standalone tool (possibly hosted on Wikimedia
Toolforge) with fewer dependencies on Wikimedia's core software; such
a tool doesn't necessarily require a specific programming language.
See roles and responsibilities of an Outreachy mentor <
We look forward to your participation!
(On behalf of the organization team)
Senior Developer Advocate
Wikimedia Foundation <https://wikimediafoundation.org/>
Unfortunately (but not surprisingly) not a very in-depth interview.
*Give us a sense of your direction and vision for Wikimedia, especially in
such a fraught information landscape and in this polarized world.*
There are a few core principles of Wikimedia projects, including Wikipedia,
that I think are important starting points. It’s an online encyclopedia.
It’s not trying to be anything else. It’s certainly not trying to be a
traditional social media platform in any way. It has a structure that is
led by volunteer editors. And as you may know, the foundation has no
editorial control. This is very much a user-led community, which we support.
The lessons to learn from, not just with what we’re doing but how we
continue to iterate and improve, start with this idea of radical
transparency. Everything on Wikipedia is cited. It’s debated on our talk
pages. So even when people may have different points of view, those debates
are public and transparent, and in some cases really allow for the right
kind of back and forth. I think that’s the need in such a polarized society
— you have to make space for the back and forth. But how do you do that in
a way that’s transparent and ultimately leads to a better product and
And the last thing that I’ll say is, you know, this is a community of
extremely humble and honest people. As we look to the future, how do we
build on those attributes in terms of what this platform can continue to
offer society and provide free access to knowledge? How do we make sure
that we are reaching the full diversity of humanity in terms of who is
invited to participate, who is written about? How are we really making sure
that our collective efforts reflect more of the global south, reflect more
women and reflect the diversity of human knowledge, to be more reflective
*What is your take on how Wikipedia fits into the widespread problem of
Many of the core attributes of this platform are very different than some
of the traditional social media platforms. If you take misinformation
around Covid, the Wikimedia Foundation entered into a partnership with the
World Health Organization. A group of volunteers came together around what
was called WikiProject Medicine, which is focused on medical content and
creating articles that then are very carefully monitored because these are
the kinds of topics that you want to be mindful around misinformation.
Another example is that the foundation put together a task force ahead of
the U.S. elections, again, trying to be very proactive. [The task force
supported 56,000 volunteer editors watching and monitoring key election
pages.] And the fact that there were only 33 reversions on the main U.S.
election page was an example of how to be very focused on key topics where
misinformation poses real risks.
Then another example that I just think is really cool is there’s a podcast
called “The World According to Wikipedia.” And on one of the episodes,
there’s a volunteer who is interviewed, and she really has made it her job
to be one of the main watchers of the climate change pages.
We have tech that alerts these editors when changes are made to any of the
pages so they can go see what the changes are. If there’s a risk that,
actually, misinformation may be creeping in, there’s an opportunity to
temporarily lock a page. Nobody wants to do that unless it’s absolutely
necessary. The climate change example is useful because the talk pages
behind that have massive debate. Our editor is saying: “Let’s have the
debate. But this is a page I’m watching and monitoring carefully.”
*One big debate that is currently happening on these social media platforms
is this issue of the censorship of information. There are people who claim
that biased views take precedence on these platforms and that more
conservative views are taken down. As you think about how to handle these
debates once you’re at the head of Wikipedia, how do you make judgment
calls with this happening in the background?*
For me, what’s been inspiring about this organization and these communities
is that there are core pillars that were established on Day 1 in setting up
Wikipedia. One of them is this idea of presenting information with a
neutral point of view, and that neutrality requires understanding all sides
and all perspectives.
It’s what I was saying earlier: Have the debates on talk pages on the side,
but then come to an informed, documented, verifiable, citable kind of
conclusion on the articles. I think this is a core principle that, again,
could potentially offer something to others to learn from.
*Having come from a progressive organization fighting for women’s rights,
have you thought much about misinformers weaponizing your background to say
it may influence the calls you make about what is allowed on Wikipedia?*
I would say two things. I would say that the really relevant aspect of the
work that I've done in the past is volunteer-led movements, which is
probably a lot harder than others might think, and that I played a really
operational role in understanding how to build systems, build culture and
build processes that I think are going to be relevant for an organization
and a set of communities that are trying to increase their scale and reach.
The second thing that I would say is, again, I’ve been on my own learning
journey and invite you to be on a learning journey with me. How I choose to
be in the world is that we interact with others with an assumption of good
faith and that we engage in respectful and civilized ways. That doesn’t
mean other people are going to do that. But I think that we have to hold on
to that as an aspiration and as a way to, you know, be the change that we
want to see in the world as well.
*When I was in college, I would do a lot of my research on Wikipedia, and
some of my professors would say, ‘You know, that’s not a legitimate
source.’ But I still used it all the time. I wondered if you had any
thoughts about that!*
I think now most professors admit that they sneak onto Wikipedia as well to
look for things!
You know, we’re celebrating the 20th year of Wikipedia this year. On the
one hand, here was this thing that I think people mocked and said wouldn’t
go anywhere. And it’s now become legitimately the most referenced source in
all of human history. I can tell you just from my own conversations with
academics that the narrative around the sources on Wikipedia and using
Wikipedia has changed.
This summer we kicked off the second edition of the Wikimedia Accelerator
UNLOCK  with five project teams from across Europe (and one participant
from India) and intensively accompanied them with coaching, mentoring and
tons of input from experts within our movement and beyond, which helped them
bring their ideas to life.
In no more than three months they have developed prototypes that offer
concrete and scalable solutions for free knowledge and for this year's
challenge of (re)building trust in the digital age, ranging from a
crowdsourced directory that will help you find every official governmental
online account around the globe, to a tool exposing and educating about
technology supply chains, to an open toolkit for language archivists on how
to create permanent, accessible, and inclusive audiovisual archives.
You are invited to meet and learn from the teams as they showcase their
prototypes on *October 6th, 2021 at 4:00 - 5:30pm CEST*. Join us online
*What to expect on UNLOCK Demo Day?* All eyes on the project teams and the
prototypes! The first part will give you the chance to find out what the
participants have achieved in the last few months as well as the ups and
downs they have faced. There will be room and time for exchange and for
exploring synergies (e.g. the projects may be interesting for your
communities, other ideas can be discussed for further development, or the
projects could be adapted to your particular context). We'll conclude
the showcase with a discussion panel "Glimpse into the Future", in which we
will discuss with the teams their next steps and exchange ideas.
We would be more than happy to see you at the Demo Day. If you still have
open questions regarding the Demo Day, head to our blog post for answers
or feel free to contact me. Also, we would appreciate it if you could forward
this email and the attached invite to your network.
Looking forward to your participation!
👉 Stay up to date as we count down to Demo Day: Follow along on Twitter
Kannika Thaimai (she/her)
Lead, UNLOCK Accelerator
UNLOCK Accelerator: We accelerate your ideas. Together we build the
future of Free Knowledge.
Keep up to date! Current news and exciting stories about Wikimedia,
Wikipedia and Free Knowledge in our newsletter (in German):
Wikimedia Deutschland e. V. | Tempelhofer Ufer 23-24 | 10963 Berlin
Tel. (030) 219 158 26-0
We are writing to invite you to take part in a new project that the
Wikimedia Foundation Trust & Safety Team is creating. This project aims to
maintain and expand peer-support networks within the Wikimedia communities,
improve the working atmosphere on our projects, and aid in the fight
In the Foundation, we believe that your participation and opinion are
important, as they will enable us to gain diverse perspectives and new
information. This Peer Support Project endeavors to connect volunteers in a
way where they can support each other, and your knowledge, skills and
experience would be highly appreciated. For more information on the
project, please visit our Meta page
If you'd like to lend your expertise here, we are organizing focus group
meetings as a next step. In those meetings, we hope to learn about already
existing networks, ideas on how volunteers could connect more and better,
and what support they would need to do so. These meetings will cover three
different time zones, and each meeting will last two hours. Feel free to
sign up for the meeting that is most comfortable for you!
Focus group meetings will be held on Google Meet on the following days and times:
1 October 2021, 7:30 - 9:30 UTC
15 October 2021, 15:00 - 17:00 UTC
22 October 2021, 19:00 - 21:00 UTC
Please feel free to encourage more members of your community or other
wikifriends to reach out to us if you believe that this project will be of
interest to them.
Do let us know if you are available to participate in one of those meetings
listed above and we will send out calendar invites to you. If you are
interested, but are unavailable in the time slots listed above, let us
know and we will look for ways to connect.
Please be aware that Trust & Safety will also reach out soon with a similar
offer to talk about training around the UCoC. Feel free to concentrate on
the issue that is most important to you, but of course you are also very
welcome to take part in both discussions. We are aware that capacity is
limited and will highly value any kind of participation!
Looking forward to hearing from you,
Christel Steigenberger (she/her)
Senior Trust and Safety Specialist
Wikimedia Foundation <https://wikimediafoundation.org/>
I have a psephological and election historical observation that I would like to share with Wikimedia.
Low-brow, crass, and manipulative political advertising and marketing, along with various hot-button, third-rail, dog-whistle, and wedge issues, have long been deployed by candidates, campaigns, and political actors and organizations during American election seasons. These tactics are very much a part of our elections, yet they appear to be subsequently omitted from encyclopedic (e.g., Wikipedia) and historical coverage of those elections (e.g., 2000–2020).
How low have election campaigns gone? Very. Yet, for some reason, American encyclopedists and historians appear almost complicit, glossing over these problematic campaign tactics. Each historical election tends to be reduced to a single encyclopedia article or a small cluster of articles; only some of those articles attempt to list election issues, and none mentions the advertising and marketing themes and tactics deployed by campaigns, political actors, and organizations on radio, television, the Web, or social media.
I propose that encyclopedists, scholars, and scientists seek to attend to, remember, and record election campaign mass media tactics and manipulations lest we, the American people, be doomed to repeat them in future elections. Perhaps by remembering the election campaign advertising and marketing tactics utilized, including on social media, and listing them encyclopedically, a buoyant pressure can be created with which to elevate our American politics.
Thank you for your time and for considering these ideas with which to improve encyclopedic coverage of American elections.
Join the Conversation with the Wikimedia Foundation Board of Trustees
October 20, 2021 at 11:00 UTC (check for your local time)
During the meeting, you will:
* Meet the incoming CEO, Maryana Iskander.
* Meet the candidates nominated through the latest Community-Selection
process to the Wikimedia Foundation’s Board of Trustees
* Learn about Board Committees’ work
* Engage through a Question & Answers session (with questions partly sent in advance, and partly asked live)
Now in more detail:
What are we announcing?
The Wikimedia Foundation’s Board of Trustees’ Community Affairs Committee
(CAC) is hosting its second Conversation with the Wikimedia Foundation
Board of Trustees (formerly called an Office Hour), which is an open forum
for the community to directly engage with Trustees.
Come and meet the Board of Trustees, including the candidates nominated for
the Board through the latest Community-Selection process, and the incoming
CEO of the Wikimedia Foundation; learn about what the Board of Trustees has
been up to lately; and engage through the Question & Answers section!
When & Where?
The meeting will be held on October 20, 2021 at 11:00 UTC (check for your
local time <https://zonestamp.toolforge.org/1634724017>)!
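The zonestamp link above does the conversion for you; purely as an illustration (a generic Python sketch using only the standard library, not a Wikimedia tool), the announced 11:00 UTC time can be converted to your local timezone like this:

```python
from datetime import datetime, timezone

# Meeting time as announced: 20 October 2021, 11:00 UTC
meeting_utc = datetime(2021, 10, 20, 11, 0, tzinfo=timezone.utc)

# astimezone() with no argument converts an aware datetime
# to the local timezone of the machine running the script
meeting_local = meeting_utc.astimezone()
print(meeting_local.strftime("%Y-%m-%d %H:%M %Z"))
```

For example, on a machine set to CEST this prints "2021-10-20 13:00 CEST"; the same approach works for any of the other UTC times announced in these emails.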
We promised at the last meeting to alternate the timings of these meetings
in order to allow people in a wider range of time zones to join in.
At least 3 Trustees and relevant Wikimedia Foundation staff will be in attendance.
The session will be streamed live and recorded, so those who cannot
participate live will be able to watch later.
Those who cannot attend are welcome to send questions in advance to the
session (details below).
How will it work?
The meeting will last for 90 minutes.
The first 40 minutes will include a short introduction to the session, and an
update on what the Board has been working on lately.
It will be followed by 50 minutes of Questions & Answers: around 20-30
minutes of answering questions sent in advance, and an additional 20-30
minutes of live questions.
We will be monitoring YouTube, the Wikimedia General Chat Telegram group
<https://t.me/WikimediaGeneral> and the Meta talk page for live questions.
We would like to encourage everyone, especially those who cannot
participate at the designated time, to send questions for the Board in
advance of the meeting. The structure is meant to enable the CAC to not
only update the community on current matters the Board is working on, but
also hear directly from the community. This will both increase the
transparency around the Board's work and help inform the CAC's future work.
Setting the agenda with your Questions
In order to be as efficient as possible, and since we anticipate that some
questions will require answers from Wikimedia Foundation staff, we are
encouraging community members to send questions in advance. Please send all questions
to askcac(a)wikimedia.org, by Wednesday, October 13 (midnight, whatever time
zone you may be in).
The Q & A section in the meeting agenda will be based on the main topics
related to the questions received. We will share more exact topics on Meta
3 days before the meeting, including final names of Trustees in attendance.
Please note --
* Participants will still be able to ask questions live if they prefer.
* If you miss the deadline for sending questions (October 13th), you can
still send them; if they are not addressed during the meeting, we will
follow up in writing after the meeting.
Registering to attend the meeting
For security reasons and specifically to avoid Zoombombing
<https://en.wikipedia.org/wiki/Zoombombing>, we will be sending the Zoom
link only to people who have registered in advance close to the meeting. In
order to register, please send an email to askcac(a)wikimedia.org. The title
should be: “Registration for the October 20 Conversation with Trustees”,
and it should indicate your name, username, affiliation if you have any,
and whether you will require live translation.
Requests for live translation must be sent at least a week before the
session, by October 13th. As live translation is costly, a minimum of 5
registered attendees must request translation into a specific language for
it to be provided. If you miss the deadline for requesting translation, or
if fewer than 5 people request it, you will still be able to read the main
points after the meeting, which will be translated into Arabic, Spanish,
French, Russian, Chinese, German, and English.
Please note --
The session will be moderated based on the Universal Code of Conduct.
Globally banned users will not be allowed in the Zoom room, but can
still participate by watching live and sending questions in real time.
Please help us spread the word by sharing this message with your community.
Hoping to see as many of you as possible,
Shani Evenstein Sigalov, on behalf of the CAC.
Shani Evenstein Sigalov
Acting Vice Chair, Board of Trustees
Wikimedia Foundation <https://wikimediafoundation.org/>
We hope that this email finds you well.
The Afrocine project core team is happy to inform you that the “Months of
African Cinema” Contest is happening again this year in October and
November. We invite Wikimedians all over the world to join in improving
content related to African cinema on Wikipedia!
Please list your username under the participants’ section
of the contest page to indicate your interest in participating in this
contest. The term "African", in the context of this contest, includes
people of African descent from all over the world, including the
diaspora and the Caribbean.
The following prizes will be awarded at the end of the contest as gift
1st Prize - $500
2nd Prize - $200
3rd Prize - $100
Diversity Winner - $100
Gender-gap Filler - $100
Language Winners - up to $100*
For further information about the contest, the prizes, and how to
participate, please visit the contest page here.  For further inquiries,
please leave comments on the contest talk page. We look forward to your participation!
Community Liaison, Afrocine project
(On-wiki: ; a professional Chinese translation of this email appears
below - 中文翻譯見下文)
I’m Maggie Dennis, the Wikimedia Foundation’s Vice President of Community
Resilience & Sustainability. I’m reaching out to you today to talk about
a series of actions the Foundation has recently taken to protect
communities across the globe.
I apologize in advance for the length and the ambiguity in certain areas.
These are complicated issues, and I will try to summarize a lot of what may
be unfamiliar information to some of you succinctly. I will answer
questions to the best of my ability within safety parameters, and I will be
hosting an office hour in a few weeks where I can discuss these issues in
more depth. We’re currently getting that set up with regard to the
availability of support staff and will announce it on Wikimedia-L and Meta as soon as
that information is prepared.
Many of you are already aware of recent changes that the Foundation has
made to its NDA policy. These changes have been discussed on Meta, and I
won’t reiterate all of our disclosures there, but I will briefly
summarize that due to credible information of threat, the Foundation has
modified its approach to accepting “non-disclosure agreements” from
individuals. The security risk relates to information about infiltration of
Wikimedia systems, including positions with access to personally
identifiable information and elected bodies of influence. We could not
pre-announce this action, even to our most trusted community partner groups
(like the stewards), without fear of triggering the risk to which we’d been
alerted. We restricted access to these tools immediately in the
jurisdictions of concern, while working with impacted users to determine if
the risk applied to them.
I want to pause to emphasize that we do not mean to accuse any specific
individual whose access was restricted by that policy change of bad intent.
Infiltration can occur through multiple mechanisms. What we have seen in
our own movement includes not only people deliberately seeking to
ingratiate themselves with their communities in order to obtain access and
advance an agenda contrary to open knowledge goals, but also individuals
who have become vulnerable to exploitation and harm by external groups
because they are already trusted insiders. This policy primarily served to
address the latter risk, to reduce the likelihood of recruitment or (worse)
extortion. We believe that some of the individuals impacted by this policy
change were also themselves in danger, not only the people whose personal
information they could have been forced to access.
Today, the Foundation has rolled out a second phase of addressing
infiltration concerns, which has resulted in sweeping actions in one of the
two currently affected jurisdictions. We have banned seven users and
desysopped a further 12 as a result of long and deep investigations into
activities around some members of the unrecognized group Wikimedians of
Mainland China. We have also reached out to a number of other editors
with explanations around canvassing guidelines and doxing policies and
requests to modify their behaviors.
When it comes to office actions, the Wikimedia Foundation typically
defaults to little public communication, but this case is unprecedented in
scope and nature. While there remain limits to what we can reveal in order
to protect the safety and privacy of users in that country and in that
unrecognized group, I want to acknowledge that this action is a radical one
and that this decision was not easily made. We struggled with not wanting
to discourage and destroy the efforts of good faith users in China who have
worked so hard to fight for free and open knowledge, including some of
those involved in this group. We do not want them to fear that their
contributions are unwelcome. We also could not risk exposing them to danger
by doing nothing to protect them after we became aware of credible threats
to their safety.
While some time ago we limited the exposure of personal information to
users in mainland China, we know that there has been the kind of
infiltration we describe above in the project. And we know that some users
have been physically harmed as a result. With this confirmed, we have no
choice but to act swiftly and appropriately in response.
I take it as both a triumph and a challenge that in the years of my own
involvement I have seen Wikimedia go from a suspect non-mainstream website
to a highly trusted and widely relied upon source across the world. When I
first started editing the projects in about 2007, I already believed
Wikimedia had the capacity to be one of the greatest achievements of the
world: collective knowledge, at your fingertips. What an amazing gesture of
goodwill on the part of all of its many editors. It didn’t take me long
after I started editing to realize how entrenched the battles could be over
how to present information and how that can be exploited to achieve
specific ends. I’m not trying to suggest that I was astonishingly
prescient; I think there were many who realized that risk long before I
stumbled naively on the scene. I do think that the risk is greater than
ever now, when Wikimedia projects are widely trusted, and when the stakes
are so high for organized efforts to control the information they share.
Community “capture” is a real and present threat. For years, the movement
has been widely aware of challenges in the Croatian Wikipedia, with
documentation going back nearly a decade. The Foundation recently set up a
disinformation team, which is still finding its footing and assessing the
problem, but which began by contracting an external researcher to review
that project and the challenges and help us understand potential causes and
solutions for such situations. We have also recently staffed a human
rights team to deal with urgent threats to the human rights of communities
across the globe as a result of such organized efforts to control
information. The situation we are dealing with today has shown me how much
we need as a movement to grapple with the hard questions of how we remain
open to editing by anyone, anywhere, while ensuring that individuals who
take us up on that offer are not harmed by those who want to silence them.
With respect to the desysopping, we hope to connect with the international
Chinese language community in the near future to talk about approaches to
elections that avoid the risk of project capture and ensure that people are
and feel safe contributing to the Chinese language Wikipedia. We need to
make sure that the community can hold fair elections, without canvassing or
fraud. We hope that helping to establish such a fair approach to elections
will allow us to reinstate CheckUser rights in time.
I want to close this email by noting that I am personally deeply sorry to
those of you for whom this will be a shock. This will undoubtedly include
those who wonder if they should fear that their personal information has
been exposed (we do not believe so; we believe we acted in time to prevent
that) and also those who fear that further such bold action is in the works
which may disrupt them and their work and their communities (at this point,
with this action, we believe the identified risks have been contained in
the short to medium term). I am also truly sorry to those communities who
have been uneasy in the shadow of such threats for some time. The
Foundation continues to build our capacity to support every community that
wants or needs its support - and we are still learning how to do so well
when we do. One of the key areas where we seek improvement is in our ability to
understand our human rights impact and in our ability to address those
challenges. You have not had the service you’ve deserved. We can’t fix
things immediately, but we are working to improve, actively, intentionally,
and with focus.
To the 4,000 active Chinese language Wikimedians distributed across the
world and serving readers on multiple continents, I would like to
communicate my sorrow and regret. I want to assure you that we will do
better. The work you do in sharing knowledge with Chinese readers everywhere
has great meaning, and we are committed to supporting you in doing this
work into the future, with the tools you need to succeed in a safe, secure,
and productive environment.
Again, I will answer what questions I can, also relying on the support of
others in Legal and perhaps beyond. We’re setting up a page on Meta to
talk, and I will be hosting an office hour in coming weeks.
我是 Maggie Dennis, 维基媒体基金会社团及延续性的领导。 今天我想和大家分享维基媒体基金会在全球保护社团采取的一系列办事行动。
office hour 在和大家更详细的研讨。我们正在设置有关于人力资源上的问题并会在 Wikimedia-L 和 Meta 发布讯息。
相信大家已经知道基金会在几周前对 NDA 政策的改变。这些改变已经在 Meta
今天，维基媒体基金会在两个受影响的区域之一，推出了第二阶段寻址浸入风险的扫荡行动。经过了深入调查非附属团体 Wikimedians of
Mainland China 的活动， 我们禁止了七个用户和删除了十二个管理员权限。
我们也必须确认中文项目的用户可以举办公平的选举，没有拉票或欺诈。 我们希望建设这些公平的法则来维持选举能够让我们在未来恢复 CheckUser 权利.
同样，我将回答我能回答的问题，也依赖于法律领域甚至其他领域的其他人的支持。我们正在 Meta 上建立一个页面来讨论，我将在未来几周内主办 office
Vice President, Community Resilience & Sustainability
Wikimedia Foundation, Inc.
Dear Community members,
Transitioning WikiIndaba online this year offers an unparalleled
opportunity to expand representation, but it also requires participants to
have access to a quality internet connection to engage and contribute
meaningfully and fully.
THE CONNECTIVITY SCHOLARSHIP
For Wiki Indaba 2021, the first virtual edition of the conference, we
are launching a Connectivity Scholarship to provide direct financial
support for participants to connect and engage.
Session organizers, speakers, and participants can apply for 2 levels of
funding support ($50 USD or $100 USD), which they can use for:
- Direct connecting: Purchasing internet data or contributing to the
cost of internet access and use;
- Support connecting: Travel to access computers and internet points;
- Secure connecting: Purchasing privacy screens or VPNs;
- Other support: Cost of childcare or other services that enable active participation.
You can apply to receive funding at
https://bit.ly/wikiindaba2021scholarships. The deadline for scholarship
applications is October 7, 2021 at 11:59pm EAT.
This is the first time a fund of this nature has been offered at Wiki
Indaba. We deem it necessary as we transition from the familiarity of
in-person conferences of yesteryear to the possibilities we are about to
explore at the online Wiki Indaba 2021. We welcome your questions and
feedback. You can reach out to our team by email at wikiuganda(a)gmail.com
Geoffrey Kateregga, on behalf of the Wiki Indaba 2021 Local Organizing Team