FYI, I asked the WMF Communications Team about any plans to use Mastodon in the future.
Here is their response: "The Digital Communications team has been researching Mastodon and considering our potential involvement with the platform in the future. At this time, we have no plans to create an account for the Foundation or Wikipedia. This is mainly because our observations show us that Mastodon is not yet reaching a large audience, which is one of the key objectives of our communications activity on social media. We will continue to monitor the situation and adjust our recommendations and practices to keep within our objectives."
Two months from now, the Queering Wikipedia 2023 Conference will be held
(in hybrid mode: online, with offline 'nodes') as trans-local events on
three days in May: Friday the 12th, Sunday the 14th, and Wednesday the
17th, known as the International Day Against Homophobia.
Abbreviated QW2023, the event follows efforts and events in 2020, 2021,
and 2022, in the hope that the future will bring more diverse incarnations
for queering not only Wikipedia, but also Wikimedia and
conference-making itself...
...in one week we are closing the second round of the Call for Content
Proposals for QW2023 and will start publishing the first fixed program
details. After the second round, QW2023 will only exceptionally consider
proposals in areas recognized as gaps that need urgent addressing.
For more details check out
...and consider joining us for live chats online on Mondays.
For the QW2023 organizing team, Z. Blace
Some months have gone by since I started this topic on this list, and we still can't know how much engagement we get at Wikipedia, because the data is not available. Twitter is now owned by Elon Musk, things are changing, and there are more accounts on Mastodon daily, but Twitter still matters. I have been looking at the Twitter activity of @Wikipedia over the last few days, and I'm still very worried about the (lack of) strategy followed here: a full team, with staff members, which only produces one tweet per day, a lonely message in the vastness of the ocean, and gets really poor engagement numbers.
A couple of weeks ago Pelé, one of the greatest football players of all time, died. The (English) Wikipedia Twitter account took 7 days to tweet about it, even though the article was updated within minutes of his death (https://twitter.com/Wikipedia/status/1611363972174778368). The tweet had 13,729 impressions (now we can know the number of impressions), 14 RTs and 129 likes. The Wikipedia account has nearly 644,000 followers. If we divide these two numbers, we get a rate of 2.13% impressions per follower.
The same day Pelé died, Basque Wikipedia tweeted about it. Not a week later, but while it was still news (https://twitter.com/euwikipedia/status/1608541274491211776). The tweet had 964 impressions, 3 RTs and 2 likes. The Basque Wikipedia account has 7,956 followers. This is a rate of 12.11% impressions per follower: 5.68 times larger, relatively, than the (English) Wikipedia Twitter account's.
The (English) Wikipedia Twitter account has nearly 81 times more followers than the Basque one. English Wikipedia is more visible, because it has a (now golden) verified account symbol, so its tweets are promoted more often. English has 1,500 million speakers around the world; Basque has fewer than one million. English Wikipedia should have around 1,000 times more followers than Basque Wikipedia. The English Wikipedia article about Pelé had 2.5 million pageviews in the two days after his death; the Basque one had 250 pageviews. That is 10,000 times more pageviews.
@Wikipedia has 644,000 followers, and @euwikipedia nearly 8,000. The audience of English Wikipedia is 10,000 times larger for the same event. Why isn't the Wikipedia account 10,000 times larger? Why doesn't it have 80 million followers? YouTube's Twitter account has 78 million followers. "By 2030, Wikimedia is to become the central infrastructure for Free Knowledge on the Internet." How could we be, if YouTube's account has 100x more followers than we have? How can we think that we are in good shape if our tweets are seen by fewer than 2% of our followers?
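For anyone who wants to reproduce the comparison, here is a minimal sketch of the impressions-per-follower arithmetic using the figures quoted above (the 12.11% above appears to be truncated rather than rounded):

```python
# Sketch: check the engagement-rate arithmetic from the thread.
# Figures are the ones quoted in the email (Pelé tweets, January 2023).

def impressions_per_follower(impressions: int, followers: int) -> float:
    """Impressions as a percentage of the account's follower count."""
    return 100 * impressions / followers

en_rate = impressions_per_follower(13_729, 644_000)  # @Wikipedia
eu_rate = impressions_per_follower(964, 7_956)       # @euwikipedia

print(f"@Wikipedia:   {en_rate:.2f}%")            # 2.13%
print(f"@euwikipedia: {eu_rate:.2f}%")            # 12.12% rounded
print(f"relative:     x{eu_rate / en_rate:.2f}")  # x5.68
```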
I hope that 2023 brings a change: a change to open up these accounts, to think about social media in a fresh way, and to build engagement, both by riding momentum, not losing opportunities, and by promoting good content.
From: Galder Gonzalez Larrañaga <galder158(a)hotmail.com>
Sent: Tuesday, August 16, 2022 3:21 PM
To: Wikimedia Mailing List <wikimedia-l(a)lists.wikimedia.org>
Subject: Re: [Wikimedia-l] Re: @Wikipedia losing opportunities in Twitter
Some weeks ago, we had a discussion here about the different approaches we have for the @wikipedia account on Twitter. We still don't know how many interactions that account gets, but as I said in the discussion, we try to find ways to measure our work at @euwikipedia. Today I want to share with you that this account was ranked last week as the most influential social-movements account in the Basque language (https://umap.eus/ranking/gizartea) and the 10th most influential account across all categories (https://umap.eus/ranking/orokorra). This is a good metric we use to know whether we are doing fine or not.
From: Andy Mabbett <andy(a)pigsonthewing.org.uk>
Sent: Friday, August 5, 2022 8:50 PM
To: Wikimedia Mailing List <wikimedia-l(a)lists.wikimedia.org>
Subject: [Wikimedia-l] Re: @Wikipedia losing opportunities in Twitter
On Mon, 18 Jul 2022 at 18:48, Lauren Dickinson <ldickinson(a)wikimedia.org> wrote:
> Also, Andy, we will follow up this week regarding your questions
> about the @WiktionaryUsers and @Wiktionary accounts.
Three working weeks have passed since the above was written; I've seen
no such follow-up. Have I missed something?
Wikimedia-l mailing list -- wikimedia-l(a)lists.wikimedia.org, guidelines at: https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and https://meta.wikimedia.org/wiki/Wikimedia-l
To unsubscribe send an email to wikimedia-l-leave(a)lists.wikimedia.org
The Community Affairs Committee of the Wikimedia Foundation Board of
Trustees would like to thank everyone who participated in the recently
concluded community vote on the Enforcement Guidelines for the Universal
Code of Conduct (UCoC).
The volunteer scrutinizing group has completed the review of the accuracy
of the vote and has reported the total number of votes received as 2,283.
Out of the 2,283 votes received, 1,338 (58.6%) community members voted for
the enforcement guidelines, and a total of 945 (41.4%) community members
voted against it. In addition, 658 participants left comments, with 77% of
the comments written in English.
We recognize and appreciate the passion and commitment that community
members have demonstrated in creating a safe and welcoming culture.
Wikimedia community culture stops hostile and toxic behavior, supports
people targeted by such behavior, and encourages good-faith people to be
productive on the Wikimedia projects.
Even at this incomplete stage, this is evident in the comments received. The
Enforcement Guidelines did reach a threshold of support necessary for the
Board to review. However, we encouraged voters, regardless of how they were
voting, to provide feedback on the elements of the enforcement guidelines.
We asked the voters to inform us what changes were needed and whether it
would be prudent to launch a further round of edits that would address
community concerns.
Foundation staff who have been reviewing comments have advised us of the
emerging themes. As a result, as Community Affairs Committee, we have
decided to ask the Foundation to reconvene the Drafting Committee. The
Drafting Committee will undertake another community engagement to refine
the enforcement guidelines based on the community feedback received from
the recently concluded vote.
For clarity, this feedback has been clustered into four sections as follows:
- To identify the type, purpose, and applicability of the UCoC training;
- To simplify the language for more accessible translation and
comprehension by non-experts;
- To explore the concept of affirmation, including its pros and cons;
- To review the conflicting roles of privacy/victim protection and the
right to be heard.
Other issues may emerge during conversations, particularly as the draft
Enforcement Guidelines evolve, but we see these as the primary areas of
concern for voters. Therefore, we are asking staff to facilitate a review
of these issues. Then, after the further engagement, the Foundation should
re-run the community vote to evaluate the redrafted Enforcement Guidelines
and see if the new document is ready for its official ratification.
Further, we are aware of the concerns with note 3.1 in the Universal Code
of Conduct Policy. Therefore, we are directing the Foundation to review
this part of the Code to ensure that the Policy meets its intended purposes
of supporting a safe and inclusive community without waiting for the
planned review of the entire Policy at the end of the year.
Again, we thank all who participated in the vote and discussion, thinking
about these complex challenges and contributing to better approaches to
working together well across the movement.
*Rosie Stephenson-Goodknight *(she/her)
Acting Chair, Community Affairs Committee
Wikimedia Foundation <https://wikimediafoundation.org/> Board of Trustees
Last year, as part of our annual planning process, the Wikimedia Foundation
shared a list of external trends
that we believed were likely to significantly impact the context in which
the Wikimedia movement operates. Our focus at the time was on the changing
nature of search, the astronomical rise in the global demand for content,
and rich media content in particular, and the concerning rise of
misinformation and disinformation. We heard from many in our movement about
additional trends that we didn’t include in that list, but that are
critical to how we operate, including the de-prioritization of
investigative journalism and the damage wrought on GLAM institutions by
the global pandemic.
As part of this year’s annual planning process, we set out to update that
list. In particular, we’ve been tracking recent advancements in artificial
intelligence (AI). In our recent Diff post on the topic, we noted some
risks as well as some potential opportunities for our movement as this
technology continues to evolve. Since there has been a great deal of
interest in and discussion about AI products like ChatGPT and what it means
for Wikimedia over the past few months (including several threads on the
topic on this mailing list), we’d love to explore this topic in more depth
with you and continue the conversation about its implications for us as a
free knowledge movement.
I’d like to invite you all to an open call on 23 March at 18:00 UTC (find
your local time here) where we can share reflections on the opportunities,
risks, and questions we see raised by new AI tools.
The call will be held on Zoom. If you’re interested in joining, email
answers(a)wikimedia.org and we will share the Zoom link with you via email.
We will work to coordinate interpretation for languages where there are 3
or more interested community members; please email answers(a)wikimedia.org
with interpretation requests as well.
For those who are unable to join the call, but interested in following and
contributing to the conversation, we plan to share notes on our External
Trends Meta page afterward so that you can add your thoughts.
Whether in person or on-wiki, I hope you’ll share your ideas so that we can
all get a broader understanding of the potential benefits and challenges of
this emergent technology. Looking forward to the discussion!
*Yael Weissburg* (she/her)
VP, Partnerships, Programs & Grantmaking
Wikimedia Foundation <https://wikimediafoundation.org/>
M: (+1) 415.513.6643
I work from San Francisco. My time zone is UTC -7/-8.
The WikiForHumanRights 2023 Campaign is collaborating with the Human Rights
Team at the Wikimedia Foundation to deliver an event safety session for
community organizers and participants.
*Are you organizing or participating in the WikiForHumanRights 2023
Campaign or even interested in learning how to increase your safety when
participating in Wikimedia Projects and engaging with the internet? *
*Join us for a beginner-friendly session on 5 April 2023 at 16:00 UTC to
learn the very basics of digital security: key concepts, best practices,
and the pros and cons of some popular safety tools.*
*Please register for the session in the link below:*
If you are interested in assessing your digital security risks before the
session, feel free to check out this fun, easy, and interactive course.
*Taking the course is not required to participate in the session, but you
may find it helpful to get acquainted with the topic beforehand. *
We are happy to answer any questions or comments you might have.
The Universal Code of Conduct project team has completed the analysis of
the comments accompanying the ratification vote on the Revised Universal
Code of Conduct Enforcement Guidelines.
All respondents had the opportunity to provide comments regarding the
contents of the Revised Enforcement Guideline draft document. A total of
369 participants left comments in 18 languages, compared to 657 commenters
in 27 languages in 2022. The Trust and Safety Policy team completed an
analysis of these results, categorizing comments to identify major themes
and areas of focus within the comments. The report is available in
translated versions on Meta-wiki here.
Please help translate into your language.
Again, we are thankful to all who participated in the vote and
discussions. More information about the Universal Code of Conduct and
Enforcement Guidelines can be found on Meta-wiki
On behalf of the Universal Code of Conduct project team,
Lead Trust & Safety Policy Manager
The Wikimedia Foundation Board of Trustees met in person in New York City
over March 8-11. The official Board of Trustees meeting was held on March
9. During the Board meeting, the Board approved the December meeting
minutes, and approved an update to the Talent and Culture Committee
composition, with Rosie Stephenson-Goodknight becoming the chair. The
Board also approved the Universal Code of Conduct Enforcement Guidelines,
as mentioned in Shani’s prior email to the list.
Also, the Product & Technology, Governance, and Community Affairs
Committees took advantage of the time together to conduct their own meetings.
Operational and committee updates were shared with the Board prior to the
meeting itself as a pre-read and did not require Board action; you can see
the board committee reports here. The Board also received updates on the
Foundation’s annual planning process for the 2023-2024 fiscal year, ahead
of a first draft of the plan that will be shared with communities.
Following the Board meeting, there was a two-day planning meeting where the
Board was brought together with the Wikimedia Endowment Board and eight
representatives from the Movement Charter Drafting Committee for the first
time. The goal of this meeting was to engage in longer-term planning about
the work of the Foundation across multiple years. There were discussions
around the Foundation’s approach to fundraising and how that can evolve in
coming years, a long-term view on product and technology strategy aligned
with the three buckets of work that CPTO Selena Deckelmann has already
outlined, and Foundation support for other priorities around the movement
using
education as an example. Importantly, the retreat enabled these groups to
engage with each other as partners.
The Board will meet again virtually in June and then in person at Wikimania
in Singapore, where I hope to see some of you as well. Until Wikimania, you
are all invited to meet the Board in the next Open Conversation with
Trustees, which will be hosted on May 18.
antanana / Nataliia Tymkiv
Chair, Wikimedia Foundation Board of Trustees
*NOTICE: You may have received this message outside of your normal working
hours/days, as I usually can work more as a volunteer during weekends. You
should not feel obligated to answer it during your days off. Thank you in
advance.*
My Fellow Wikimedians,
Thank you very much for your kind interest in submitting program
proposals. Program submissions officially closed on 29 March 2023 at 13:59.
We received 594 program submissions, and we will remove spam submissions
accordingly. The Program subcommittee *may* soon announce another round of
submissions for poster sessions (depending on the results of the first
batch of poster proposals) and details of the Hackathon at Wikimania 2023.
Please take the opportunity to refine and edit your program submissions on
Pretalx.com. The jurors will start screening applications in April. Don't
risk editing your submission during April, as you won't know whether it
has already been scored before your refinements/edits.
Please check the program submissions wiki page and FAQ for the qualities
we are looking for in program content at Wikimania 2023 and for the next
steps.
If you are interested in seeing the submitted program proposals, go to
Event lead, ESEAP Wikimania 2023 Core Organizing Team
Chair, Program Subcommittee
*For direct inquiries, please contact us organizers via our group mail:
wikimania(a)wikimedia.org <wikimania(a)wikimedia.org>. We discourage you from
clicking "reply all", as this is a large mailing list.*