Hello everyone,
Wikimedia is participating in the winter edition of this year's Outreachy <
https://www.outreachy.org/> [1] (December 2021–March 2022)! The deadline
to submit projects on the Outreachy website is *September 30th, 2021*.
If you would like to share an idea for a project that you would like to
mentor, or if you are not familiar with the program and want to learn more
about it, feel free to reply to this email or leave a note on <
https://phabricator.wikimedia.org/T289893> [2].
*About the Outreachy program*:
Outreachy offers three-month internships to work remotely with experienced
mentors on coding and non-coding projects in Free and Open Source Software
(FOSS). These internships run twice a year: from May to August and from
December to March. Interns are paid a stipend of USD 5,500 for the three
months of work. They also receive a USD 500 stipend to travel to
conferences and events. Interns often find employment after their
internship with Outreachy sponsors or in jobs that use the skills they
learned during their internship.
This program is open to both students and non-students. Outreachy expressly
invites the following people to apply:
- Women (both cis and trans), trans men, and genderqueer people.
- Anyone who faces under-representation, systematic bias, or
discrimination in the technology industry in their country of residence.
- Residents and nationals of the United States of any gender who are
Black/African American, Hispanic/Latinx, Native American/American Indian,
Alaska Native, Native Hawaiian, or Pacific Islander.
See a blog post highlighting the experiences and outcomes of interns who
participated in a previous round of Outreachy with Wikimedia: <
https://techblog.wikimedia.org/2021/06/02/outreachy-round-21-experiences-an…>
[3].
Some tips for mentors proposing projects:
- Follow this task description template when you propose a project in
Phabricator: <
https://phabricator.wikimedia.org/tag/outreach-programs-projects/> [4].
Add the #Outreachy (Round 23) tag to it.
- Remember, the project should take an experienced developer about 15 days
to complete and a newcomer about three months.
- Each project should have at least two mentors, and at least one of them
should have a technical background.
- When it comes to picking a project, you could propose one that is:
   - Relevant for your language community or brings impact to the
Wikimedia ecosystem in the future.
   - Welcoming and newcomer-friendly, with a moderate learning curve.
   - A new idea you are passionate about, with no deadlines attached to
it; something you always wanted to see happen but couldn't due to a lack
of resources.
   - About developing a standalone tool (possibly hosted on Wikimedia
Toolforge), with fewer dependencies on Wikimedia's core infrastructure;
it doesn't necessarily require a specific programming language.
See the roles and responsibilities of an Outreachy mentor: <
https://www.mediawiki.org/wiki/Outreachy/Mentors> [5].
We look forward to your participation!
Cheers,
Srishti
(On behalf of the organization team)
[1] https://www.outreachy.org/
[2] https://phabricator.wikimedia.org/T289893
[3]
https://techblog.wikimedia.org/2021/06/02/outreachy-round-21-experiences-an…
[4] https://phabricator.wikimedia.org/tag/outreach-programs-projects/
[5] https://www.mediawiki.org/wiki/Outreachy/Mentors
*Srishti Sethi*
Senior Developer Advocate
Wikimedia Foundation <https://wikimediafoundation.org/>
Unfortunately (but not surprisingly), this is not a very in-depth
interview, and it is US-centric.
https://www.nytimes.com/2021/09/23/technology/wikipedia-misinformation.html
*Give us a sense of your direction and vision for Wikimedia, especially in
such a fraught information landscape and in this polarized world.*
There are a few core principles of Wikimedia projects, including Wikipedia,
that I think are important starting points. It’s an online encyclopedia.
It’s not trying to be anything else. It’s certainly not trying to be a
traditional social media platform in any way. It has a structure that is
led by volunteer editors. And as you may know, the foundation has no
editorial control. This is very much a user-led community, which we support
and enable.
The lessons to learn from, not just with what we’re doing but how we
continue to iterate and improve, start with this idea of radical
transparency. Everything on Wikipedia is cited. It’s debated on our talk
pages. So even when people may have different points of view, those debates
are public and transparent, and in some cases really allow for the right
kind of back and forth. I think that’s the need in such a polarized society
— you have to make space for the back and forth. But how do you do that in
a way that’s transparent and ultimately leads to a better product and
better information?
And the last thing that I’ll say is, you know, this is a community of
extremely humble and honest people. As we look to the future, how do we
build on those attributes in terms of what this platform can continue to
offer society and provide free access to knowledge? How do we make sure
that we are reaching the full diversity of humanity in terms of who is
invited to participate, who is written about? How are we really making sure
that our collective efforts reflect more of the global south, reflect more
women and reflect the diversity of human knowledge, to be more reflective
of reality?
*What is your take on how Wikipedia fits into the widespread problem of
disinformation online?*
Many of the core attributes of this platform are very different than some
of the traditional social media platforms. If you take misinformation
around Covid, the Wikimedia Foundation entered into a partnership with the
World Health Organization. A group of volunteers came together around what
was called WikiProject Medicine, which is focused on medical content and
creating articles that then are very carefully monitored because these are
the kinds of topics that you want to be mindful around misinformation.
Another example is that the foundation put together a task force ahead of
the U.S. elections, again, trying to be very proactive. [The task force
supported 56,000 volunteer editors watching and monitoring key election
pages.] And the fact that there were only 33 reversions on the main U.S.
election page was an example of how to be very focused on key topics where
misinformation poses real risks.
Then another example that I just think is really cool is there’s a podcast
called “The World According to Wikipedia.” And on one of the episodes,
there’s a volunteer who is interviewed, and she really has made it her job
to be one of the main watchers of the climate change pages.
We have tech that alerts these editors when changes are made to any of the
pages so they can go see what the changes are. If there’s a risk that,
actually, misinformation may be creeping in, there’s an opportunity to
temporarily lock a page. Nobody wants to do that unless it’s absolutely
necessary. The climate change example is useful because the talk pages
behind that have massive debate. Our editor is saying: “Let’s have the
debate. But this is a page I’m watching and monitoring carefully.”
*One big debate that is currently happening on these social media platforms
is this issue of the censorship of information. There are people who claim
that biased views take precedence on these platforms and that more
conservative views are taken down. As you think about how to handle these
debates once you’re at the head of Wikipedia, how do you make judgment
calls with this happening in the background?*
For me, what’s been inspiring about this organization and these communities
is that there are core pillars that were established on Day 1 in setting up
Wikipedia. One of them is this idea of presenting information with a
neutral point of view, and that neutrality requires understanding all sides
and all perspectives.
It’s what I was saying earlier: Have the debates on talk pages on the side,
but then come to an informed, documented, verifiable citable kind of
conclusion on the articles. I think this is a core principle that, again,
could potentially offer something to others to learn from.
*Having come from a progressive organization fighting for women’s rights,
have you thought much about misinformers weaponizing your background to say
it may influence the calls you make about what is allowed on Wikipedia?*
I would say two things. I would say that the really relevant aspects of the
work that I’ve done in the past is volunteer-led movements, which is
probably a lot harder than others might think, and that I played a really
operational role in understanding how to build systems, build culture and
build processes that I think are going to be relevant for an organization
and a set of communities that are trying to increase their scale and reach.
The second thing that I would say is, again, I’ve been on my own learning
journey and invite you to be on a learning journey with me. How I choose to
be in the world is that we interact with others with an assumption of good
faith and that we engage in respectful and civilized ways. That doesn’t
mean other people are going to do that. But I think that we have to hold on
to that as an aspiration and as a way to, you know, be the change that we
want to see in the world as well.
*When I was in college, I would do a lot of my research on Wikipedia, and
some of my professors would say, ‘You know, that’s not a legitimate
source.’ But I still used it all the time. I wondered if you had any
thoughts about that!*
I think now most professors admit that they sneak onto Wikipedia as well to
look for things!
You know, we’re celebrating the 20th year of Wikipedia this year. On the
one hand, here was this thing that I think people mocked and said wouldn’t
go anywhere. And it’s now become legitimately the most referenced source in
all of human history. I can tell you just from my own conversations with
academics that the narrative around the sources on Wikipedia and using
Wikipedia has changed.
Dear all,
This summer we kicked off the second edition of the Wikimedia Accelerator
UNLOCK [1] with five project teams from across Europe (and one participant
from India) and intensively accompanied them with coaching, mentoring, and
plenty of input from experts within our movement and beyond, which helped
them bring their ideas to life.
In no more than three months they have developed prototypes that offer
concrete and scalable solutions for free knowledge and this year's
challenge of (re)building trust in the digital age, ranging from a
crowdsourced directory that will help you find every official governmental
online account around the globe, to a tool that exposes and educates about
technology supply chains, to an open toolkit for language archivists on how
to create permanent, accessible, and inclusive audiovisual archives [2].
You are invited to meet and learn from the teams as they showcase their
prototypes on *October 6th, 2021 at 4:00 - 5:30pm CEST*. Join us online
here: wikimedia.de/unlock
*What to expect on UNLOCK Demo Day?* All eyes on the project teams and the
prototypes! The first part will give you the chance to find out what the
participants have achieved in the last few months, as well as the ups and
downs they have faced. There will be room and time for exchange and for
exploring synergies (e.g. perhaps the projects are interesting for your
communities, or other ideas can be discussed for further development or
adaptation of the projects in your particular context). We'll conclude the
showcase with a discussion panel, "Glimpse into the Future", in which we
will discuss with the teams their next steps and exchange ideas.
We would be more than happy to see you at the Demo Day. If you still have
open questions regarding the Demo Day, head to our blog post for answers
[3] or feel free to contact me. We would also appreciate it if you could
forward this email and the attached invite to your network.
Looking forward to your participation!
Kannika
[1] https://www.wikimedia.de/unlock/
[2] https://www.wikimedia.de/unlock/projects/
[3] https://www.wikimedia.de/unlock-blog/demo-day-2021/
👉 Stay up to date as we count down to Demo Day: Follow along on Twitter
@unlock_acc <https://twitter.com/UNLOCK_Acc>
--
Kannika Thaimai (she/her)
Lead, UNLOCK Accelerator
UNLOCK Accelerator: We accelerate your ideas. Together we build the
future of Free Knowledge.
www.wikimedia.de/unlock
Keep up to date! Current news and exciting stories about Wikimedia,
Wikipedia and Free Knowledge in our newsletter (in German):
https://www.wikimedia.de/newsletter/
Wikimedia Deutschland e. V. | Tempelhofer Ufer 23-24 | 10963 Berlin
Tel. (030) 219 158 26-0
https://www.wikimedia.de/
Wikimedia,
I have a psephological and election-history observation that I would like to share with Wikimedia.
Low-brow, crass, and manipulative political advertising and marketing, along with various hot-button, third-rail, dog-whistle, and wedge issues, have been deployed by candidates, campaigns, and political actors and organizations during American election seasons. These tactics are very much a part of our elections, yet they appear to be omitted from encyclopedic (e.g., Wikipedia) and historical coverage of the elections (e.g., 2000–2020).
How low have election campaigns gone? Very. Yet, for some reason, American encyclopedists and historians appear to be almost complicit, glossing over these problematic election campaign tactics. Each historical election appears to be reduced to a single encyclopedia article or a small cluster of such articles; only some of those articles attempt to list election issues, and none mentions the campaign advertising and marketing themes and tactics deployed by campaigns, political actors, and organizations on radio, television, the Web, or social media.
I propose that encyclopedists, scholars, and scientists seek to attend to, remember, and record election campaign mass media tactics and manipulations lest we, the American people, be doomed to repeat them in future elections. Perhaps by remembering the election campaign advertising and marketing tactics utilized, including on social media, and listing them encyclopedically, a buoyant pressure can be created with which to elevate our American politics.
Thank you for your time and for considering these ideas with which to improve encyclopedic coverage of American elections.
Best regards,
Adam Sobieski
Greetings!
We hope that this email finds you well.
The Afrocine project core team is happy to inform you that the “Months of
African Cinema” Contest is happening again this year in October and
November. We invite Wikimedians all over the world to join in improving
content related to African cinema on Wikipedia!
Please list your username under the participants' section
<https://en.wikipedia.org/wiki/Wikipedia:WikiProject_AfroCine/Months_of_Afri…>
of the contest page to indicate your interest in participating in this
contest.[1] The term "African", in the context of this contest, includes
people of African descent from all over the world, including the diaspora
and the Caribbean.
The following prizes will be awarded at the end of the contest, as gift
cards:
- Overall winners
   - 1st Prize: $500
   - 2nd Prize: $200
   - 3rd Prize: $100
- Diversity Winner: $100
- Gender-gap Filler: $100
- Language Winners: up to $100*
For further information about the contest, the prizes, and how to
participate, please visit the contest page here [2]. For further inquiries,
please leave comments on the contest talk page. We look forward to your
participation!
Best regards,
Eben Mlay
Community Liaison, Afrocine project
[1]
https://en.wikipedia.org/wiki/Wikipedia:WikiProject_AfroCine/Months_of_Afri…
[2]
https://en.wikipedia.org/wiki/Wikipedia:WikiProject_AfroCine/Months_of_Afri…
(on-wiki: ; Google translated notice that there is a professional Chinese
translation of the email below - 中文翻譯見下文)
Hello, everyone.
I’m Maggie Dennis, the Wikimedia Foundation’s Vice President of Community
Resilience & Sustainability.[1] I’m reaching out to you today to talk about
a series of actions the Foundation has recently taken to protect
communities across the globe.
I apologize in advance for the length and the ambiguity in certain areas.
These are complicated issues, and I will try to summarize a lot of what may
be unfamiliar information to some of you succinctly. I will answer
questions to the best of my ability within safety parameters, and I will be
hosting an office hour in a few weeks where I can discuss these issues in
more depth. We're currently getting that set up with regard to the
availability of support staff and will announce it on Wikimedia-L and Meta
as soon as
that information is prepared.
Many of you are already aware of recent changes that the Foundation has
made to its NDA policy. These changes have been discussed on Meta, and I
won’t reiterate all of our disclosures there,[2] but I will briefly
summarize that due to credible information of threat, the Foundation has
modified its approach to accepting “non-disclosure agreements” from
individuals. The security risk relates to information about infiltration of
Wikimedia systems, including positions with access to personally
identifiable information and elected bodies of influence. We could not
pre-announce this action, even to our most trusted community partner groups
(like the stewards), without fear of triggering the risk to which we’d been
alerted. We restricted access to these tools immediately in the
jurisdictions of concern, while working with impacted users to determine if
the risk applied to them.
I want to pause to emphasize that we do not mean to accuse any specific
individual whose access was restricted by that policy change of bad intent.
Infiltration can occur through multiple mechanisms. What we have seen in
our own movement includes not only people deliberately seeking to
ingratiate themselves with their communities in order to obtain access and
advance an agenda contrary to open knowledge goals, but also individuals
who have become vulnerable to exploitation and harm by external groups
because they are already trusted insiders. This policy primarily served to
address the latter risk, to reduce the likelihood of recruitment or (worse)
extortion. We believe that some of the individuals impacted by this policy
change were also themselves in danger, not only the people whose personal
information they could have been forced to access.
Today, the Foundation has rolled out a second phase of addressing
infiltration concerns, which has resulted in sweeping actions in one of the
two currently affected jurisdictions. We have banned seven users and
desysopped a further 12 as a result of long and deep investigations into
activities around some members of the unrecognized group Wikimedians of
Mainland China.[3] We have also reached out to a number of other editors
with explanations around canvassing guidelines and doxing policies and
requests to modify their behaviors.
When it comes to office actions, the Wikimedia Foundation typically
defaults to little public communication, but this case is unprecedented in
scope and nature. While there remain limits to what we can reveal in order
to protect the safety and privacy of users in that country and in that
unrecognized group, I want to acknowledge that this action is a radical one
and that this decision was not easily made. We struggled with not wanting
to discourage and destroy the efforts of good faith users in China who have
worked so hard to fight for free and open knowledge, including some of
those involved in this group. We do not want them to fear that their
contributions are unwelcome. We also could not risk exposing them to danger
by doing nothing to protect them after we became aware of credible threats
to their safety.
While some time ago we limited the exposure of personal information to
users in mainland China, we know that the kind of infiltration we describe
above has occurred in the project. And we know that some users
have been physically harmed as a result. With this confirmed, we have no
choice but to act swiftly and appropriately in response.
I take it as both a triumph and a challenge that in the years of my own
involvement I have seen Wikimedia go from a suspect non-mainstream website
to a highly trusted and widely relied upon source across the world. When I
first started editing the projects in about 2007, I already believed
Wikimedia had the capacity to be one of the greatest achievements of the
world--collective knowledge, at your fingertips. What an amazing gesture of
goodwill on the part of all of its many editors. It didn’t take me long
after I started editing to realize how entrenched the battles could be over
how to present information and how that can be exploited to achieve
specific ends. I’m not trying to suggest that I was astonishingly
prescient; I think there were many who realized that risk long before I
stumbled naively on the scene. I do think that the risk is greater than
ever now, when Wikimedia projects are widely trusted, and when the stakes
are so high for organized efforts to control the information they share.
Community “capture” is a real and present threat. For years, the movement
has been widely aware of challenges in the Croatian Wikipedia, with
documentation going back nearly a decade. The Foundation recently set up a
disinformation team, which is still finding its footing and assessing the
problem, but which began by contracting an external researcher to review
that project and the challenges and help us understand potential causes and
solutions for such situations.[4] We have also recently staffed a human
rights team to deal with urgent threats to the human rights of communities
across the globe as a result of such organized efforts to control
information. The situation we are dealing with today has shown me how much
we need as a movement to grapple with the hard questions of how we remain
open to editing by anyone, anywhere, while ensuring that individuals who
take us up on that offer are not harmed by those who want to silence them.
With respect to the desysopping, we hope to connect with the international
Chinese language community in the near future to talk about approaches to
elections that avoid the risk of project capture and ensure that people are
and feel safe contributing to the Chinese language Wikipedia. We need to
make sure that the community can hold fair elections, without canvassing or
fraud. We hope that helping to establish such a fair approach to elections
will allow us to reinstate CheckUser rights in time.
I want to close this email by noting that I am personally deeply sorry to
those of you for whom this will be a shock. This will undoubtedly include
those who wonder if they should fear that their personal information has
been exposed (we do not believe so; we believe we acted in time to prevent
that) and also those who fear that further such bold action is in the works
which may disrupt them and their work and their communities (at this point,
with this action, we believe the identified risks have been contained in
the short to medium term). I am also truly sorry to those communities who
have been uneasy in the shadow of such threats for some time. The
Foundation continues to build our capacity to support every community that
wants or needs its support - and we are still learning how to do so well
when we do. One of the key areas where we seek improvement is in our
ability to understand our human rights impact and to address those
challenges. You have not had the service you’ve deserved. We can’t fix
things immediately, but we are working to improve, actively, intentionally,
and with focus.
To the 4,000 active Chinese-language Wikimedians distributed across the
world and serving readers on multiple continents,[5][6] I would like to
communicate my sorrow and regret. I want to assure you that we will do
better. The work you do in sharing knowledge with Chinese readers everywhere
has great meaning, and we are committed to supporting you in doing this
work into the future, with the tools you need to succeed in a safe, secure,
and productive environment.
Again, I will answer what questions I can, also relying on the support of
others in Legal and perhaps beyond. We’re setting up a page on Meta to
talk, and I will be hosting an office hour in coming weeks.
Best regards,
Maggie
[1] https://meta.wikimedia.org/wiki/Community_Resilience_and_Sustainability
[2]
https://meta.wikimedia.org/wiki/Talk:Access_to_nonpublic_personal_data_poli…
[3] https://meta.wikimedia.org/wiki/Wikimedians_of_Mainland_China
[4]
https://meta.wikimedia.org/wiki/Croatian_Wikipedia_Disinformation_Assessmen…
[5] https://stats.wikimedia.org/#/zh.wikipedia.org
[6]
https://stats.wikimedia.org/#/zh.wikipedia.org/reading/page-views-by-countr…
***
大家好
我是 Maggie Dennis, 维基媒体基金会社团及延续性的领导。[1] 今天我想和大家分享维基媒体基金会在全球保护社团采取的一系列办事行动。
我在这里先向大家说声对不起。这封信会比较长,有些方面也会比较歧义。这些事的确比较复杂,但我会尽量简化但明确的把这些资料和大家分享。我会在安全范围内尽我所能的回答问题,我也会在未来的几个星期主办
office hour 在和大家更详细的研讨。我们正在设置有关于人力资源上的问题并会在 Wikimedia-L 和 Meta 发布讯息。
相信大家已经知道基金会在几周前对 NDA 政策的改变。这些改变已经在 Meta
讨论过了,我也不必在这里重申,[2]但让我在这里简要地说明。基金会收到了有关各人威胁的可信消息并调整了接受各人“non-disclosure
agreements”的姿态。这个安全风险是有关于浸入及索取基金会的系统,也包括取数个人识别资料和选举管理机构的影响。我们不能预先宣布这新的策略即使是我们最信任的团体
(stewards), 为了不触发这些风险。我们在受影响的区域限制了使用权并且和受影响的使用者讨论风险对它们的影响。
我想在这里强调不是某个受影响的人藏了恶意而是浸入发生可以有很多种。我们知道在维基百科里有不良角色和社团迎合为了就是取数和推进反开放知识的目标这也包括某人受了不良角色的影响而屈服因为他们已经是认可的知情人。这策略改变的目的是为了减少后者的风险,招募或更严重的敲诈勒索。我们相信有些受影响的用户自己有以上的风险而不限制与有可能被逼盗用有个人资料使用者。
今天,维基媒体基金会在两个受影响的区域之一,推出了第二阶段寻址浸入风险的扫荡行动。经过了深入调查非附属团体 Wikimedians of
Mainland China 的活动, 我们禁止了七个用户和删除了十二个管理员权限。[3]
我们还联系了一些其他编辑,解释了有关拉票指南和人肉政策的解释,,并要求它们调整这些行为。
有关于办事行动维基媒体基金会通常不会向外公开但这个案件的范围和性质是前所未有的。在安全和隐私的范围内我们不能透露在非附属团体的这些用户,但我想承认这行动是激进的,而且做出这一决定并不容易。
我们努力地不想阻止和破坏中文真诚用户的努力,他们为自由和开放的知识而努力奋斗,包括参与该群体的一些人。我们也不想让真诚的用户觉得不实欢迎,当我们收到了对它们安全可信的威胁,我们也不能冒险采取任何措施保护他们,从而使他们面临危险。
一些时间前,
我们限制了在中国用户的个人资料暴露,也知道在中文维基百科有相似的浸入。我们也确认了有些用户为某些因故而受了身体伤害。我们别无选者必须快速做回应。
当我回顾我在维基媒体的这些年,从一个非主流的网站转变成一个全求都信任的线上百科全书,我把这个案件当作是一个挑战和胜利。
在2007年当我钢开始改编的时候我已经相信维基媒体会是全球最大成就之一, 那就是集体知识,在手指上。
不用多久时间,我就发现了许多编辑人员善意的姿态和那些用来呈现资料角度的战争。我不是在暗示我有预见性的警告,我觉得很多用户在我参与前就知道这事会发生。我不认为这个风险在这个时候比较高,当维基媒体的项目收到这么庞大的信任,还有组织的努力来控制我们分享的知识。
团体“占领”
是一个真是的风险。多年已来,基金会意识到的克罗地亚维基百科面临的挑战。我们也有进十年的文档。基金会在最近设立了虚假信息团队,但是我们还在评估克罗地亚维基百科的问题。这些问题是基金会较早前聘请的承包商来帮我们理解原因和解决办法。[4]
为了应付团体组织的资料控制,我们也设立了一组人权团队来应对紧急人权危机。我们惊天所免领的问题也让我看到了我们所需要的来应付这些困难问题,像是如何继续开放编辑给每个人,在每个地方但能够确保我们的用户在编辑中受到被封的威胁下感到安全。
在管理员权限,我们希望能够与国际华语群体链接来参与及讨论选举的方向为了避免团体占领也绕着不知让中文维基百科的用户感到安全也绝对是安全。
我们也必须确认中文项目的用户可以举办公平的选举,没有拉票或欺诈。 我们希望建设这些公平的法则来维持选举能够让我们在未来恢复 CheckUser 权利.
我想在结束这封电子邮件时指出,我个人对你们中的那些感到震惊的人深表歉意。这无疑将包括那些想知道他们是否应该担心他们的个人信息被暴露的人(我们不这么认为;我们相信我们及时采取了行动以防止这种情况发生)以及那些担心进一步采取这种大胆行动的人可能会扰乱他们及其工作和社区(此时,通过这一行动,我们相信已识别的风险已在中短期内得到控制)。我也对那些在这种威胁的阴影下一段时间感到不安的社区深表歉意。基金会继续建设我们的能力,以支持每个想要或需要其支持的社区我们仍在学习如何在我们这样做的时候做得很好。我们寻求改进的关键领域之一是了解我们的人权影响的能力以及我们应对这些挑战的能力。你没有得到你应得的服务。我们无法立即解决问题,但我们正在积极、有意识地、专注地努力改进。
向分布在世界各地、服务于多个大洲的读者的4000名活跃中文维基人,[5][6]我想传达我的悲伤和遗憾。我想向你保证,我们会做得更好。您为世界各地的中文读者分享知识所做的工作意义重大,我们致力于支持您在未来开展这项工作,并提供您在安全、可靠和高效的环境中取得成功所需的工具。
同样,我将回答我能回答的问题,也依赖于法律领域甚至其他领域的其他人的支持。我们正在 Meta 上建立一个页面来讨论,我将在未来几周内主办 office
hour 在和大家更详细的研讨。
此致,
Maggie
[1] https://meta.wikimedia.org/wiki/Community_Resilience_and_Sustainability
[2]
https://meta.wikimedia.org/wiki/Talk:Access_to_nonpublic_personal_data_poli…
[3] https://meta.wikimedia.org/wiki/Wikimedians_of_Mainland_China
[4]
https://meta.wikimedia.org/wiki/Croatian_Wikipedia_Disinformation_Assessmen…
[5] https://stats.wikimedia.org/#/zh.wikipedia.org
[6]
https://stats.wikimedia.org/#/zh.wikipedia.org/reading/page-views-by-countr…
--
Maggie Dennis
She/her/hers
Vice President, Community Resilience & Sustainability
Wikimedia Foundation, Inc.
Dear Community members,
Transitioning WikiIndaba online this year offers an unparalleled
opportunity to expand representation, but it also requires participants to
have access to a quality internet connection to meaningfully and fully
engage and contribute.
THE CONNECTIVITY SCHOLARSHIP
<https://meta.wikimedia.org/wiki/WikiIndaba_conference_2021/Scholarships>
For Wiki Indaba 2021, the first virtual edition of WikiIndaba, we are
launching a Connectivity Scholarship to provide direct financial support
for participants to connect and engage.
Session organizers, speakers, and participants can apply for two levels of
funding support (USD 50 or USD 100), which they can use for:
- Direct connecting: Purchasing internet data or contributing to the
cost of internet access and use;
- Support connecting: Travel to access computers and internet points;
- Secure connecting: Purchasing privacy screens or VPNs;
- Other support: Cost of childcare or other services that enable active
participation.
Apply Now
You can apply to receive funding at
https://bit.ly/wikiindaba2021scholarships. The deadline for scholarship
applications is October 7, 2021 at 11:59pm EAT.
This is the first time a fund of this nature has been offered at Wiki
Indaba. We deem it necessary as we transition from the familiarity of
in-person conferences of yesteryear to the possibilities we are about to
explore at the online Wiki Indaba 2021. We welcome your questions and
feedback. You can reach out to our team by email at wikiuganda@gmail.com.
Kind regards,
Geoffrey Kateregga, on behalf of the Wiki Indaba 2021 Local Organizing Team
Dear Everyone,
Our (WMHKG) annual report has just been approved at our annual meeting and thereafter published. The whole document is in PDF form and has been uploaded to Wikimedia Commons.
Please feel welcome to read it via this link: https://commons.wikimedia.org/wiki/File:WMHKG_Annual_Report_2020-2021.pdf
Cheers,
User:だ*ぜ (Dasze)
Director,
Wikimedia Community User Group Hong Kong (WMHKG)
Dear Community Members,
We are pleased to announce that registration for Wiki Indaba 2021
<https://meta.wikimedia.org/wiki/WikiIndaba_conference_2021> is now open.
This year's Wiki Indaba will be hosted by the Wikimedia Community User
Group Uganda and held virtually from 05 to 07 November 2021 under the
theme "Rethink + Reset: Visions of the Future".
Click on this link to register: http://bit.ly/wikiindaba2021
The call for submissions is open until September 24th, 2021. Please submit
your session proposal as a presentation, panel discussion, lightning talk,
or workshop by stating the title of your session and providing a brief
description via https://pretalx.com/wiki-indaba-2021/cfp.
Please reach out to wikiuganda@gmail.com if you have any questions.
Regards,
Geoffrey Kateregga, on behalf of the Wiki Indaba 2021 Local Organizing
Committee.