Hi everyone,
The Affiliations Committee (AffCom) – the committee responsible for guiding
volunteers in establishing and sustaining Wikimedia chapters, thematic
organizations, and user groups – is seeking new members!
The main role of the AffCom is to guide groups of volunteers that are
interested in forming Wikimedia affiliates. We review applications from new
groups, answer questions and provide advice about the different Wikimedia
affiliation models and processes, review affiliate bylaws for compliance
with requirements and best practices, and update the Wikimedia Foundation
Board of Trustees as well as advise them on issues connected to chapters,
thematic organizations and Wikimedia user groups.
The committee consists of five to fifteen members, selected at least once
a year to serve two-year terms. Because the committee must hold mid-year
elections at this time to replenish its membership, those joining the
committee during the current process will serve a slightly extended term,
from July 2020 through December 2022.
AffCom continues to closely monitor the Wikimedia 2030 Strategy process
initiated in 2016. Affiliation models remain under discussion as part of
the broader strategy conversation, but since no decisions have yet been
made to change the current models, AffCom is working in the same manner in
2020 on affiliate recognitions and on intervention support for affiliates
with issues of non-compliance. AffCom continues to process applications
for user group and chapter/thematic organization creation while we await
the strategy's next steps and begin to prepare for a smooth transition of
the committee and the affiliates ecosystem to any changed movement
structures and systems in 2021.
Being a part of the AffCom requires communication with volunteers all over
the world, negotiating skills, cultural sensitivity, and the ability to
understand legal texts. We look for a mix of different skill sets in our
members.
==Responsibilities==
# Availability of up to 5-8 hours per month
# Participation in monthly one- to two-hour voice/video meetings
# Commitment to carry out assigned tasks in a given time
# Facilitating and supporting communications
# Affiliate support and growth
== Required and Recommended Abilities, Skills, Knowledge for Affiliations Committee Members ==
Strong interpersonal relationships, both among members of the committee
and with members of the Wikimedia community, are essential. Across all
committee members, we also look for additional skills, both required and
generally relevant, that support the committee's work and its
sustainability.
===Required===
* Fluency in English
* Availability of up to 5 hours per week, and the time to participate in
monthly one- to two-hour voice/video meetings.
* Willingness to use one's real name in committee activities (including
contacts with current and potential affiliates) when appropriate.
* Strong track record of effective collaboration
* International orientation
===Relevant for all members===
* Public Communications (English writing and speaking skills)
* Skills in other languages are a major plus.
* Understanding of the structure and work of affiliates and the Wikimedia
Foundation.
* Documentation practices
* Interviewing experience
* Experience with, or in, an active affiliate is a major plus.
* Teamwork: Project and people management skills to coordinate and
collaborate with different parties on a shared plan and see it through to
completion.
* Problem-Solving: Ability to evaluate various solutions, consider multiple
interests and points of view, revisit unresolved issues, seek compromise
and work and communicate across languages and cultures.
Given the expectations for maintaining the course in 2020 and preparing for
potential 2021 transitions, it is important that we are also clear about
two different skill sets critical to committee support at this time. The
first skillset is oriented to understanding affiliate dynamics and
organizational development patterns to successfully process affiliate
applications for recognition; the other is oriented to conflict prevention
and intervention support for affiliates in conflict.
=== Relevant to Affiliate Recognitions===
* Administration & Attention to detail
* Readiness to participate in political discussions on the role and future
of affiliates, models of affiliation, and similar topics.
* Awareness of the affiliates ecosystem and models and understanding of
community building, organizational development, and group dynamics
===Relevant to Conflict Prevention & Intervention===
* Communication skills for active listening, clear instruction and
turn-taking.
* Stress Management skills for maintaining patience and positivity
* Emotional intelligence to maintain awareness of one's own and others'
emotions, practicing empathy, impartiality, and mutual respect.
* Facilitation, negotiation, and mediation skills to guide diverse
individuals and groups toward cooperation.
* Ability to work within a team
Do you have relevant skills and an interest in supporting movement
affiliates? We are looking for people who are excited by the challenge of
empowering volunteers to get organized and form communities that further
our mission around the world. In return, selected committee members will
gain experience supporting colleagues worldwide as they develop their
communities, along with personal growth in guiding organizational
development, facilitating affiliate partnerships, and professional
communications.
==Selection process==
As a reflection of our commitment to openness, transparency, and bilateral
engagement with the Wikimedia community, the 2020 member selection process
will include a public review and comment period. To ease the review of
nominations, we are also introducing a self-assessment survey in which
candidates can describe their skills and experience. This self-assessment
information will help the committee identify the skill sets most relevant
to our affiliate support processes. We invite you to apply for membership
in the committee and join us in supporting the affiliates. The data shared
in the self-assessment will only be made available to the committee and
the relevant support staff. The nominations, candidate Q&A, and
endorsements will still be posted on Meta for public review at
https://meta.wikimedia.org/wiki/Affiliations_Committee/Candidates/June_2020.
There, the global community is welcome to provide comments and feedback
about each candidate.
Once the nomination window closes on June 30, 2020, the sitting members
who are not candidates for re-election in this cycle will deliberate and
then vote, taking into account all input from the Meta page, the
self-assessments, advisors, Wikimedia Foundation staff and board liaisons,
and committee member discussion.
A final decision will be made in late-July 2020, with new members to be
notified for onboarding in August.
Respectfully,
Rosie Stephenson-Goodknight (she/her)
Chair, Affiliations Committee
A while back I saw a joke that when reading a newspaper someone had
difficulty distinguishing between the business section and the crime
section. These days, the politics section could cause similar
confusion. Recently I have wondered about the extent to which WMF and
the affiliates take steps to prevent conflicts of interest in
financial decisions.
I am not aware of any evidence of recent financial conflicts of
interest, but I think that taking steps to prevent and detect any
problems would be prudent.
For example, is there any monitoring of the bank accounts of board
members and executives to ensure that they are not receiving kickbacks
from companies that have contracts with the organizations? Also, are
there "cooling off periods" which contractually require that
executives and board members of WMF and Wikimedia affiliates not
become employees of companies that have had financial relationships
with their organizations until at least a few years after their
employment or board membership with the Wikimedia organization?
Pine
( https://meta.wikimedia.org/wiki/User:Pine )
Hi Lodewijk,
This ecosystem you are describing is exactly what we are hoping for.
And we absolutely agree that what you called "education" is needed.
We referred to it as "training" and "capacity building" in this sentence in
the statement:
"To that end, the Board further directs the Foundation, in collaboration
with the communities, to make additional *investments in Trust & Safety
capacity*, including but not limited to: development of tools needed to
assist our volunteers and staff, research to support data-informed
decisions, development of clear metrics to measure success, *development of
training tools and materials* (*including building communities’ capacities
around harassment awareness and conflict resolution*), and consultations
with international experts on harassment, community health and children’s
rights, as well as additional hiring."
Best,
Shani.
>
> From: effe iets anders <effeietsanders(a)gmail.com>
> Date: Sat, May 23, 2020 at 4:26 AM
> Subject: Re: [Wikimedia-l] Trust and safety on Wikimedia projects
> To: Wikimedia Mailing List <wikimedia-l(a)lists.wikimedia.org>
>
>
> Thanks for this step - I wish that it wouldn't be necessary. I'm not sure
> of all the implications, but was mostly wondering: will this be primarily a
> stick, or is the foundation also going to invest more heavily in carrots
> and education?
>
> I get the impression that we have much progress to make in training,
> educating and exposing correct behavior (some chapters have made attempts
> at this). So much of our energy already goes into the bad behavior that it
> exhausts many community members. I'm confident that the Trust and Safety
> team lives through a more extreme version of that daily.
>
> I'd wish that we manage to build an ecosystem that encourages good
> behavior, diverts bad behavior at a very early stage, and removes the bad
> actors that cannot be corrected. Probably not as popular as punishing
> people, but hopefully more constructive for the community as a whole.
>
> Lodewijk
>
> On Fri, May 22, 2020 at 4:52 PM Nataliia Tymkiv <ntymkiv(a)wikimedia.org>
> wrote:
>
> > Hello, Dennis!
> >
> > Not at all. What it means is that this is not a process that goes into
> play
> > *before* a decision to act is made, but *after*. It should stand as an
> > option for those who want to ensure that actions taken are fair, as long
> as
> > the case does not relate to legal risks or other severe concerns.
> >
> > Best regards,
> > antanana / Nataliia Tymkiv
> >
> > NOTICE: You may have received this message outside of your normal working
> > hours/days, as I usually can work more as a volunteer during weekends. You
> > should not feel obligated to answer it during your days off. Thank you in
> > advance!
> >
> > On Sat, May 23, 2020, 01:58 Dennis During <dcduring(a)gmail.com> wrote:
> >
> > > "Work with community functionaries to create and refine a retroactive
> > > review process for cases brought by involved parties, excluding those
> > cases
> > > which pose legal or other severe risks "
> > >
> > > What does "retroactive review process" mean?
> > >
> > > I hope it doesn't mean applying standards that were not promulgated at
> > the
> > > time to past actions and applying severe sanctions to the alleged
> > > perpetrators.
> > >
> > > On Fri, May 22, 2020 at 5:59 PM María Sefidari <maria(a)wikimedia.org>
> > > wrote:
> > >
> > > > Hello everyone,
> > > >
> > > > Today, the Wikimedia Foundation Board of Trustees unanimously passed
> a
> > > > resolution and published a statement[1] regarding the urgent need to
> > make
> > > > our movement more safe and inclusive by addressing harassment and
> > > > incivility on Wikimedia projects. The statement builds on prior
> > > statements
> > > > from 2016 and 2019,[2][3] affirms the forthcoming introduction of a
> > > > universal code of conduct, and directs the Wikimedia Foundation to
> > > rapidly
> > > > and substantively address these challenges in complement with
> existing
> > > > community processes.
> > > >
> > > > This includes developing sustainable practices and tools that
> eliminate
> > > > harassment, toxicity, and incivility, promote inclusivity, cultivate
> > > > respectful discourse, reduce harms to participants, protect the
> > projects
> > > > from disinformation and bad actors, and promote trust in our
> projects.
> > > >
> > > > Over the past nearly twenty years, the movement has taken a number of
> > > > unique and sometimes extraordinary steps to create an environment
> > unlike
> > > > anything else online: a place to share knowledge, to learn, and to
> > > > collaborate together. In order for the movement to continue to thrive
> > and
> > > > make progress to our mission, it is essential to build a culture that
> > is
> > > > welcoming and inclusive.
> > > >
> > > > Research has consistently shown that members of our communities have
> > been
> > > > subject to hostility and toxic behavior in Wikimedia spaces.[4][5]
> The
> > > > Wikimedia 2030 movement strategy recommendations have also identified
> > the
> > > > safety of our Wikimedia spaces as a core issue to address if we are
> to
> > > > reach the 2030 goals, with concrete recommendations which include a
> > > > universal code of conduct, pathways for users to privately report
> > > > incidents, and a baseline of community responsibilities.[6]
> > > >
> > > > While the movement has made progress in addressing harassment and
> toxic
> > > > behavior, we recognize there is still much more to do. The Board’s
> > > > resolution and statement today is a step toward establishing clear,
> > > > consistent guidelines around acceptable behavior on our projects, and
> > > > guiding the Wikimedia Foundation in supporting the movement’s ability
> > to
> > > > ensure a healthy environment for those who participate in our
> projects.
> > > >
> > > > * Developing and introducing, in close consultation with volunteer
> > > > contributor communities, a universal code of conduct that will be a
> > > binding
> > > > minimum set of standards across all Wikimedia projects;
> > > >
> > > > * Taking actions to ban, sanction, or otherwise limit the access of
> > > > Wikimedia movement participants who do not comply with these policies
> > and
> > > > the Terms of Use;
> > > >
> > > > * Working with community functionaries to create and refine a
> > retroactive
> > > > review process for cases brought by involved parties, excluding those
> > > cases
> > > > which pose legal or other severe risks; and
> > > >
> > > > * Significantly increasing support for and collaboration with
> community
> > > > functionaries primarily enforcing such compliance in a way that
> > > prioritizes
> > > > the personal safety of these functionaries.
> > > >
> > > > Together, we have made our movement what it is today. In this same
> way,
> > > we
> > > > must all be responsible for building the positive community culture
> of
> > > the
> > > > future, and accountable for stopping harassment and toxic behavior on
> > our
> > > > sites.
> > > >
> > > > We have also made this statement available on Meta-Wiki for
> translation
> > > and
> > > > wider distribution.[1]
> > > >
> > > > On behalf of the Board,
> > > > María, Board Chair
> > > >
> > > > [1]
> > > >
> > > >
> > >
> >
> https://meta.wikimedia.org/wiki/Wikimedia_Foundation_Board_noticeboard/May_…
> > > >
> > > > [2]
> > > >
> > > >
> > >
> >
> https://meta.wikimedia.org/wiki/Wikimedia_Foundation_Board_noticeboard/Nove…
> > > >
> > > > [3]
> > > >
> > > >
> > >
> >
> https://meta.wikimedia.org/wiki/Wikimedia_Foundation_Board_noticeboard/Arch…'s_ban_of_Fram
> > > >
> > > > [4] https://meta.wikimedia.org/wiki/Research:Harassment_survey_2015
> > > >
> > > > [5]
> > > >
> > > >
> > >
> >
> https://meta.wikimedia.org/wiki/Community_Insights/2018_Report#Experience_o…
> > > >
> > > > [6]
> > > >
> > > >
> > >
> >
> https://meta.wikimedia.org/wiki/Strategy/Wikimedia_movement/2018-20/Recomme…
> > > >
> > > > == Statement on Healthy Community Culture, Inclusivity, and Safe
> Spaces
> > > ==
> > > >
> > > > Harassment, toxic behavior, and incivility in the Wikimedia movement
> > are
> > > > contrary to our shared values and detrimental to our vision and
> > mission.
> > > > They negatively impact our ability to collect, share, and disseminate
> > > free
> > > > knowledge, harm the immediate well-being of individual Wikimedians,
> and
> > > > threaten the long-term health and success of the Wikimedia projects.
> > The
> > > > Board does not believe we have made enough progress toward creating
> > > > welcoming, inclusive, harassment-free spaces in which people can
> > > contribute
> > > > productively and debate constructively.
> > > >
> > > > In recognition of the urgency of these issues, the Board is directing
> > the
> > > > Wikimedia Foundation to directly improve the situation in
> collaboration
> > > > with our communities. This should include developing sustainable
> > > practices
> > > > and tools that eliminate harassment, toxicity, and incivility,
> promote
> > > > inclusivity, cultivate respectful discourse, reduce harms to
> > > participants,
> > > > protect the projects from disinformation and bad actors, and promote
> > > trust
> > > > in our projects.
> > > >
> > > > Specifically, the Foundation shall:
> > > >
> > > > * Develop and introduce a universal code of conduct (UCoC) that will
> > be a
> > > > binding minimum set of standards across all Wikimedia projects.
> > > >
> > > > ** The first phase, covering policies for in-person and virtual
> events,
> > > > technical spaces, and all Wikimedia projects and wikis, and developed
> > in
> > > > collaboration with the international Wikimedia communities, will be
> > > > presented to the Board for ratification by August 30, 2020.
> > > >
> > > > ** The second phase, outlining clear enforcement pathways, and
> refined
> > > with
> > > > broad input from the Wikimedia communities, will be presented to the
> > > Board
> > > > for ratification by the end of 2020;
> > > >
> > > > * Take actions to ban, sanction, or otherwise limit the access of
> > > Wikimedia
> > > > movement participants who do not comply with these policies and the
> > Terms
> > > > of Use;
> > > >
> > > > * Work with community functionaries to create and refine a
> retroactive
> > > > review process for cases brought by involved parties, excluding those
> > > cases
> > > > which pose legal or other severe risks; and
> > > >
> > > > * Significantly increase support for and collaboration with community
> > > > functionaries primarily enforcing such compliance in a way that
> > > prioritizes
> > > > the personal safety of these functionaries.
> > > >
> > > > Until such directives are implemented, the Board instructs the
> > Foundation
> > > > to adopt and implement policies for reducing harassment and toxicity
> on
> > > our
> > > > projects and minimizing legal risks for the movement, in
> collaboration
> > > with
> > > > communities whenever practicable. Until these two phases of the UCoC
> > are
> > > > complete and operational an interim review process involving
> community
> > > > functionaries will be in effect. In this interim period, the Product
> > > > Committee of the Board of Trustees will also advise the Trust &
> Safety
> > > > team.
> > > >
> > > > To that end, the Board further directs the Foundation, in
> collaboration
> > > > with the communities, to make additional investments in Trust &
> Safety
> > > > capacity, including but not limited to: development of tools needed
> to
> > > > assist our volunteers and staff, research to support data-informed
> > > > decisions, development of clear metrics to measure success,
> development
> > > of
> > > > training tools and materials (including building communities’
> > capacities
> > > > around harassment awareness and conflict resolution), and
> consultations
> > > > with international experts on harassment, community health and
> > children’s
> > > > rights, as well as additional hiring.
> > > >
> > > > The above efforts will be undertaken in coordination and
> collaboration
> > > with
> > > > appropriate partners from across the movement, seek to increase
> > effective
> > > > community governance of conduct and behavioral standards, and reduce
> > the
> > > > long-term need of the Foundation to act. It is the shared goal of the
> > > Board
> > > > and Foundation that these efforts advance a sustainable Wikimedia
> > > movement
> > > > and support, rather than substitute, effective models of community
> > > > governance.
> > > >
> > > > We urge every member of the Wikimedia communities to collaborate in a
> > way
> > > > that models the Wikimedia values of openness and inclusivity, step
> > > forward
> > > > to do their part to create a safe and welcoming culture for all, stop
> > > > hostile and toxic behavior, support people who have been targeted by
> > such
> > > > behavior, assist good-faith people learning to contribute, and help
> set
> > > > clear expectations for all contributors.
> > > >
> > > > --
> > > >
> > > >
> > > > María Sefidari Huici
> > > >
> > > > Chair of the Board
> > > >
> > > > Wikimedia Foundation <https://wikimediafoundation.org/>
> > > > _______________________________________________
> > > > Wikimedia-l mailing list, guidelines at:
> > > > https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and
> > > > https://meta.wikimedia.org/wiki/Wikimedia-l
> > > > New messages to: Wikimedia-l(a)lists.wikimedia.org
> > > > Unsubscribe:
> https://lists.wikimedia.org/mailman/listinfo/wikimedia-l,
> > > > <mailto:wikimedia-l-request@lists.wikimedia.org?subject=unsubscribe>
> > >
> > >
> > >
> > > --
> > > Dennis C. During
>
Anecdotally, it seems people sometimes don't upload their photos to Commons because they don't realize that the scope of Commons is much broader than that of Wikipedia.
Has there been, or should there be, any research into this, or why people don't contribute more broadly?
~Benjamin
Dear all,
There is now a bill in the deliberation process of the Chinese People's
Political Consultative Conference ("CPPCC"). The bill is named the "Hong
Kong Special Administrative Region of the People's Republic of China
National Security Law" (the "Bill"), and will probably be handed to the
People's Congress ("PC") after that process.
If passed by the PC, it will be added to Annex III of the Basic Law; local
approval will not be needed, and the law will apply in Hong Kong directly.
This appears to be Beijing's response to the nearly year-long protests in
Hong Kong.
The Board of the Wikimedia Hong Kong User Group (the "Board") has made the
statement below regarding these developments, as the Bill may undermine
the ability to access Wikipedia without restriction in Hong Kong.
Below is the statement issued by the Board:
//
We are very aware that the Beijing authorities and local pro-Beijing party
members are pushing for a bill which may limit freedom of speech within
Hong Kong.
We are monitoring this issue closely, as it may affect the User Group's
capability to carry out its mission of ensuring uncensored and
unrestricted access to Wikimedia projects, including Wikipedia and
Wikinews.
- The Board, Wikimedia Community User Group Hong Kong
//
About the Protests: https://en.wikipedia.org/wiki/2019–20_Hong_Kong_protests
About the User Group:
https://meta.wikimedia.org/wiki/Wikimedia_Community_User_Group_Hong_Kong
Regards,
William Chan (User:1233)
Board Member, Wikimedia Community User Group Hong Kong
*issued on behalf of the board*
Hi there
I can only speak for Wikimedia UK, but as part of our statutory Annual
Report and Accounts all board members and senior managers have to declare
'Related Party Transactions' - of which since I joined the organisation (in
2015) there have been none.
Best wishes
Lucy
On Thu, 21 May 2020 at 13:01, <wikimedia-l-request(a)lists.wikimedia.org>
wrote:
> Send Wikimedia-l mailing list submissions to
> wikimedia-l(a)lists.wikimedia.org
>
> To subscribe or unsubscribe via the World Wide Web, visit
> https://lists.wikimedia.org/mailman/listinfo/wikimedia-l
> or, via email, send a message with subject or body 'help' to
> wikimedia-l-request(a)lists.wikimedia.org
>
> You can reach the person managing the list at
> wikimedia-l-owner(a)lists.wikimedia.org
>
> When replying, please edit your Subject line so it is more specific
> than "Re: Contents of Wikimedia-l digest..."
>
>
> Today's Topics:
>
> 1. Re: Preventing conflicts of interest in Wikimedia
> organizations' employment and financial relationships (Andy Mabbett)
> 2. WikiGap Nigeria Online Contest: Sharing our diversity.
> (Olushola Olaniyan)
>
>
> ----------------------------------------------------------------------
>
> Message: 1
> Date: Wed, 20 May 2020 23:14:59 +0100
> From: Andy Mabbett <andy(a)pigsonthewing.org.uk>
> To: Wikimedia Mailing List <wikimedia-l(a)lists.wikimedia.org>
> Subject: Re: [Wikimedia-l] Preventing conflicts of interest in
> Wikimedia organizations' employment and financial relationships
> Message-ID:
> <CABiXOE=
> s8xk5WrUpk0iRxs_bN9Kb1DH9qZk0HvtQmuBxpgxWkg(a)mail.gmail.com>
> Content-Type: text/plain; charset="UTF-8"
>
> On Wed, 20 May 2020 at 20:21, Pine W <wiki.pine(a)gmail.com> wrote:
>
> > For example, is there any monitoring of the bank accounts of board
> > members and executives
>
> I very much hope not. That would be an outrageous intrusion.
>
> --
> Andy Mabbett
> @pigsonthewing
> http://pigsonthewing.org.uk
>
>
>
> ------------------------------
>
> Message: 2
> Date: Thu, 21 May 2020 10:08:26 +0100
> From: Olushola Olaniyan <olaniyanshola15(a)gmail.com>
> To: "Carlos M. Colina" <wikimedia-l(a)lists.wikimedia.org>
> Cc: shola <shola(a)wikimedia.org.ng>
> Subject: [Wikimedia-l] WikiGap Nigeria Online Contest: Sharing our
> diversity.
> Message-ID:
> <CANr3wG3LB+fjF3NXDifvGdvKsJ+SzSYEdsJJ1X=
> N6WYzovL+zA(a)mail.gmail.com>
> Content-Type: text/plain; charset="UTF-8"
>
> Dear friends,
>
> This is to inform you that WikiGap Nigeria Online contest has entered week
> three and we are glad to share our progress report with you all.
>
> We have over 300 articles published in less than three weeks by over thirty
> (30) editors in over 16 Wikipedia projects/languages (enwiki, yowiki,
> igwiki, hawiki, dewiki, idwiki, eswiki, hewiki, itwiki, kowiki, ptwiki,
> ruwiki, niwiki, wikidata and commons)
>
> The event ends in less than seven days, and we would love to read these
> women's biographies in your language.
>
> Kindly join us by signing up here
> <
> https://meta.wikimedia.org/wiki/WikiGap_Nigeria_Online_Challenge/Participan…
> >
>
> Keep Safe and keep well.
>
> <
> https://www.avast.com/sig-email?utm_medium=email&utm_source=link&utm_campai…
> >
> Virus-free.
> www.avast.com
> <
> https://www.avast.com/sig-email?utm_medium=email&utm_source=link&utm_campai…
> >
> <#DAB4FAD8-2DD7-40BB-A1B8-4E2AA1F9FDF2>
>
>
> ------------------------------
>
>
> ------------------------------
>
> End of Wikimedia-l Digest, Vol 194, Issue 36
> ********************************************
>
--
Lucy Crompton-Reid
Chief Executive
Wikimedia UK
+44 (0) 203 372 0762
*Wikimedia UK* is the national chapter for the global Wikimedia open
knowledge movement, and a registered charity. We rely on donations from
individuals to support our work to make knowledge open for all. Have you
considered supporting Wikimedia? https://donate.wikimedia.org.uk
Company Limited by Guarantee registered in England and Wales, Registered
No. 6741827
Registered Charity No.1144513
Registered Office Ground Floor, Europoint, 5 - 11 Lavington Street, London
SE1 0NZ
The Wikimedia projects are run by the Wikimedia Foundation (who operate
Wikipedia, amongst other projects). Wikimedia UK is an independent
non-profit charity with no legal control over Wikipedia nor responsibility
for its contents.
Hello everyone
u:Alacoolwiki <https://meta.wikimedia.org/wiki/User:Alacoolwiki> and
myself worked on the portal page of the Gender Gap on meta.
Before:
https://meta.wikimedia.org/w/index.php?title=Gender_gap&oldid=19889900
New: https://meta.wikimedia.org/wiki/Gender_gap
Why? The former page was very outdated.
So the first task was to clean it up and remove outdated info.
The second task was to look for more recent data and do a bit of digging
around.
The third task was to reorganize the whole portal, along the same lines as
the *Wikimedia Resource Center
<https://meta.wikimedia.org/wiki/Special:MyLanguage/Wikimedia_Resource_Center>*.
How can you help?
Please note that the goal is not necessarily to have information regularly
updated over there (because we know that will not happen, right? :)), but
to serve as a hub linking to all resources, groups, and initiatives led
within the community (and beyond). The current pages are by no means fully
up to date. Feel free to jump in and add.
Please do NOT just drop links and be done with it. Take a _curated
approach_ to keep it useful AND practical.
In particular, I hope that gender-gap-oriented groups will make the effort
of adding themselves to the page listing them if they are missing:
https://meta.wikimedia.org/wiki/Gender_gap/Groups
And I hope that people leading initiatives will add them here :
https://meta.wikimedia.org/wiki/Gender_gap/Initiatives
I hope it can be useful and better reflect the diversity of our movement
and of the approaches we follow.
Lastly, I am aware that the current design does not permit translation. I
gave this much thought and, given the former state of the page (very
outdated) and of the translations (very weak...), decided to favor design
and usability :)
Cheers
Anthere
Hi all,
The next Research Showcase will be live-streamed on Wednesday, May 20, at
9:30 AM PDT/16:30 UTC.
This month we will learn about recent research on machine learning systems
that rely on human supervision for their learning and optimization -- a
research area commonly referred to as Human-in-the-Loop ML. In the first
talk, Jie Yang will present a computational framework that relies on
crowdsourcing to identify influencers in Social Networks (Twitter) by
selectively obtaining labeled data. In the second talk, Estelle Smith will
discuss the role of the community in maintaining ORES, a machine learning
system that predicts edit quality and is used in numerous Wikipedia
applications.
YouTube stream: https://www.youtube.com/watch?v=8nDiu2ebdOI
As usual, you can join the conversation on IRC at #wikimedia-research. You
can also watch our past research showcases here:
https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase
This month's presentations:
*OpenCrowd: A Human-AI Collaborative Approach for Finding Social
Influencers via Open-Ended Answers Aggregation*
By: Jie Yang, Amazon (current), Delft University of Technology (starting
soon)
Finding social influencers is a fundamental task in many online
applications ranging from brand marketing to opinion mining. Existing
methods heavily rely on the availability of expert labels, whose collection
is usually a laborious process even for domain experts. Using open-ended
questions, crowdsourcing provides a cost-effective way to find a large
number of social influencers in a short time. Individual crowd workers,
however, only possess fragmented knowledge that is often of low quality. To
tackle those issues, we present OpenCrowd, a unified Bayesian framework
that seamlessly incorporates machine learning and crowdsourcing for
effectively finding social influencers. To infer a set of influencers,
OpenCrowd bootstraps the learning process using a small number of expert
labels and then jointly learns a feature-based answer quality model and the
reliability of the workers. Model parameters and worker reliability are
updated iteratively, allowing their learning processes to benefit from each
other until an agreement on the quality of the answers is reached. We
derive a principled optimization algorithm based on variational inference
with efficient updating rules for learning OpenCrowd parameters.
Experimental results on finding social influencers in different domains
show that our approach substantially improves the state of the art by 11.5%
AUC. Moreover, we empirically show that our approach is particularly useful
in finding micro-influencers, who are very directly engaged with smaller
audiences.
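The iterative, mutually reinforcing update described in the abstract
(answer quality and worker reliability refined in turn until they agree)
can be illustrated with a deliberately simplified sketch. This is not the
OpenCrowd model itself, which is feature-based and trained with
variational inference; the `aggregate` function and its update rules below
are hypothetical, showing only the alternating-update loop:

```python
# Hypothetical, simplified sketch of the alternating-update idea from the
# abstract. NOT the actual OpenCrowd algorithm: real OpenCrowd learns a
# feature-based answer quality model with variational inference.

def aggregate(votes, n_iters=50):
    """votes: dict mapping worker id -> set of candidates they nominated.
    Returns (candidate scores in [0, 1], worker reliabilities)."""
    workers = list(votes)
    candidates = {c for named in votes.values() for c in named}
    reliability = {w: 0.5 for w in workers}  # start all workers equal
    score = {c: 0.0 for c in candidates}
    for _ in range(n_iters):
        # Update candidate scores: sum of reliabilities of the workers
        # who nominated each candidate, normalized by the current maximum.
        for c in candidates:
            score[c] = sum(reliability[w] for w in workers if c in votes[w])
        top = max(score.values()) or 1.0
        for c in candidates:
            score[c] /= top
        # Update worker reliability: mean score of the candidates a
        # worker nominated, so agreeing with the consensus raises it.
        for w in workers:
            named = votes[w]
            reliability[w] = (sum(score[c] for c in named) / len(named)
                              if named else 0.0)
    return score, reliability
```

With, say, three workers where two agree on one candidate, the consensus
candidate converges to the top score and the agreeing workers end up with
higher reliability than the outlier, which is the mutual-reinforcement
behavior the abstract describes.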
Paper: https://dl.acm.org/doi/fullHtml/10.1145/3366423.3380254
*Keeping Community in the Machine-Learning Loop*
By: C. Estelle Smith, MS, PhD Candidate, GroupLens Research Lab at the
University of Minnesota
On Wikipedia, sophisticated algorithmic tools are used to assess the
quality of edits and take corrective actions. However, algorithms can fail
to solve the problems they were designed for if they conflict with the
values of communities who use them. In this study, we take a
Value-Sensitive Algorithm Design approach to understanding a
community-created and -maintained machine learning-based algorithm called
the Objective Revision Evaluation System (ORES)—a quality prediction system
used in numerous Wikipedia applications and contexts. Five major values
converged across stakeholder groups that ORES (and its dependent
applications) should: (1) reduce the effort of community maintenance, (2)
maintain human judgement as the final authority, (3) support differing
peoples’ differing workflows, (4) encourage positive engagement with
diverse editor groups, and (5) establish trustworthiness of people and
algorithms within the community. We reveal tensions between these values
and discuss implications for future research to improve algorithms like
ORES.
Paper:
https://commons.wikimedia.org/wiki/File:Keeping_Community_in_the_Loop-_Unde…
--
Janna Layton (she, her)
Administrative Assistant - Product & Technology
Wikimedia Foundation <https://wikimediafoundation.org/>