Dear Friends,
it is my pleasure to extend the following invitation to you:
The Wikimedia movement and the Free Knowledge Advocacy Group EU are
pleased to invite you to a presentation by
*First Advocate General of the Court of Justice of the European Union*
*Maciej Szpunar*
titled:
*New Technologies and Fundamental Rights*
on *June 27th at 17:00*
*Townhall Europe, Square de Meeûs 5-6 in Brussels*
After the presentation and Q&A, we would love for you to stay for an apéro.
Due to capacity limits, please RSVP by June 15th.
In the Monsters of Law <https://monstersoflaw.brussels/> series we ask
academics and practitioners of law how the intersections of technology and
law shape our everyday life. They answer with examples and perspectives
that help us navigate the interesting times of technological
transformation, both practically and from the policy perspective.
Looking forward to seeing you at the event!
--
Anna Mazgal
Senior EU Policy Advisor
Wikimedia
http://wikimedia.brussels
@a2na
mobile: +32 487 222 945
12 Rue Belliard
BE-1040 Brussels
--
Wikimedia Belgium vzw
BE 0563.775.480
- RPR Brussel
Antwerpselaan 40
Boulevard d’Anvers 1000 Brussel/Bruxelles
www.wikimedia.be
info@wikimedia.be
Dear digital rights and free knowledge supporters,
A major event in the month of June is RightsCon...and Wikimedia is showing up big!
Wikipedians will be hosting and/or participating in a total of ten sessions at RightsCon this week. Members of our movement will be championing Wikimedia approaches on emerging challenges to a free and open internet, including privacy and surveillance, internet access, inclusion, and internet shutdowns and disruptions.
You can find details and brief descriptions for all of the sessions below. You can also learn more about how we're showing up at and supporting RightsCon'22 in this blog post: https://diff.wikimedia.org/2022/06/02/meet-the-wikimedians-promoting-free-k…
We hope to see you there!
Best,
Ziski & the Global Advocacy team
_____
1) How lawmakers in Southeast Asia can safeguard human rights while addressing online disinformation during elections
Date: Wednesday, June 8 at 12:30 AM EDT
Format: Panel
Presenters: Rachel Arinii Judhistari (Wikimedia Foundation), Kristina Gadaingan (ASEAN Parliamentarians for Human Rights), Members of Parliament from the Philippines and Thailand
Details: This interactive panel seeks to broaden the discussion about human rights safeguards within internet regulation regimes in Southeast Asia, especially the nuances surrounding online campaigning and the rising threat of disinformation, and how they influence political conversations and potentially undermine electoral processes. The panel will pose these questions to members of parliament, civil society, and platform hosts. It will allow participants to contribute to the free-flowing discussion, and to provide perspectives from their own experiences and contexts as well.
2) Fighting disinformation in Persian Wikipedia: The good, the bad, the AI
Date: Thursday, June 9 at 4:30 AM EDT
Format: Tech Demo
Presenter: Amir Sarabadani (Wikimedian)
Details: This tech demo covers two tools that members of Persian Wikipedia developed to combat government disinformation campaigns. These tools have made it possible to share and update information on Persian Wikipedia without the fear of persecution. As such, they have become crucial to fostering the resilience of Persian Wikipedia, and may inspire other groups to launch similar initiatives in their own communities.
3) #WikiforHumanRights: Creating and editing human rights content on Wikipedia
Date: Thursday, June 9 at 12:15 PM EDT
Format: Workshop
Presenters: Faisal Da Supremo (Wikimedia Ghana), Kolawole Oyewole (Wiki Fan Club, and Lagos State University), Iván Martínez (Wikimedia México), Luisina Ferrante (Wikimedia Argentina), Alex Stinson (Wikimedia Foundation)
Details: This workshop will introduce participants to the basic skills needed to create and edit human rights content on Wikipedia. Experienced Wikipedians will teach basic editing skills, share best practices around citing reputable sources, and answer participants’ questions during this interactive session. Participants are encouraged to identify articles on human rights concepts or content that are lacking or need to be bolstered in their linguistic communities before the session. The session will provide open editing time for participants to create or edit content on their selected topics with the assistance of experienced Wikipedians. It will conclude with a review of best practices, an update on the #WikiForHumanRights campaign, and a question and answer period.
4) Using Wikipedia to advance human rights and democracy, using constructive conflict to create quality articles
Date: Thursday, June 9 at 2:45 PM EDT
Format: Workshop
Presenters: Luisina Ferrante (Wikimedia Argentina), Spencer Graves (Wikimedian), Franziska Putz (Wikimedia Foundation)
Details: This workshop will demonstrate how controversy can be a productive force behind “the wisdom of crowds” that makes it possible for websites like Wikipedia to share freely accessible information online. Case studies on Spanish, Chinese, French, and English Wikipedia articles will demonstrate how their development was informed by social, economic, and political debates in each of the contexts they describe as well as by the different perspectives and approaches between volunteer editors. This session will expose participants to the experience of co-creating knowledge about human rights online.
5) No “right” approach, but many effective ones: Moderation approaches for online information about political processes
Date: Friday, June 10 at 8:00 AM EDT
Format: Workshop
Presenters: Patricia Díaz-Rubio (Wikimedia Chile), Kate Levan (Wikimedia Foundation), Nathan Forrester (Wikimedia Foundation)
Details: This immersive workshop brings together organizations with unique, community-led moderation approaches in order to present participants with case studies on disinformation around electoral processes. Panelists will engage participants in analyzing the issues at hand, discussing challenges to moderating specific content, and will then walk the audience through the moderation process employed in their context. The goal of the session is for the audience to experience how hard the job is, as well as the variety of effective approaches there are to content moderation, debunking the idea that there is a single, perfect process for moderating online spaces.
In addition to these Wikimedia Foundation-organized events, Wikimedians will be hosting and participating in the following sessions:
6) Empowering Community Content Moderation
Date: Tuesday, June 7 at 1:30 PM EDT
Format: Panel
Presenters: Jessica Ashooh (Reddit), Rose Coogan (Github), Allison Davenport (Wikimedia Foundation), Guillaume Rischard (OpenStreetMap Foundation)
Details: The panel will feature policy leadership from a variety of platforms with community content moderation, who will discuss best practices for fostering effective, scalable, and rights-based community content moderation online. Along with touching on the advantages of community moderation, the panel will also discuss challenges with the model, and how policies for digital communication can leave room for individuals to participate in effective self-regulation, collaboration, and good faith moderation of online content.
7) Building a digital rights initiative in the Caribbean
Date: Tuesday, June 7 at 4:15 PM EDT
Format: Social Hour
Presenters: Wikimedians of the Caribbean User Group, JAAKLAC initiative, AfroCrowd, Access Now
Details: Social hours are an informal space where participants with common interests can connect and expand a network or coalition. There is no participation limit, so come along!
8) The danger of neglecting “non-lucrative” languages
Date: Wednesday, June 8 at 10:00 AM EDT
Format: Lightning Talk
Presenters: Anass Sedrati (Wikimedia Morocco)
Details: Having access to information in your mother tongue is a basic human right. Although languages are not represented equally in the digital world, Wikimedia projects have helped to represent more languages online, since the only prerequisites for a Wikipedia in a particular language are an International Organization for Standardization (ISO) code and an active community. Wikimedia projects may be doing well compared to other actors, but even there the process is imperfect, and there is room to improve. This lightning talk explores the fraught manner in which languages are represented online, and puts forward the argument that more individuals need to be involved in enriching Wikimedia content, and in diversifying the languages that are represented on other platforms.
9) When you can’t see your city (online): Why you don’t want a country without Freedom of Panorama (FOP)
Date: Wednesday, June 8 at 10:30 PM EDT
Format: Lightning Talk
Presenters: Ramzy Muliawan (Wikimedia Indonesia)
Details: This lightning talk examines the freedom of panorama (FOP), and how the absence of this limitation on copyright threatens the implementation of Wikimedia’s 2030 strategy to “provide for safety and inclusion,” especially in countries where Wikimedia communities are emerging. The talk will review the existing freedom of panorama regulations (or lack thereof) in Indonesia, and propose to Wikimedia organizations and communities in Indonesia, as well as other emerging Wikimedia communities and like-minded partners, how to best navigate the muddy waters of the clash between the underdeveloped policy landscape and the ever-changing nature of online efforts to preserve and free knowledge.
10) Regulation for the few or many?
Date: Thursday, June 9 at 10:45 AM EDT
Format: Panel
Presenters: Caroline Greer (TikTok), Konstantinos Komaitis (The New York Times), Rebecca MacKinnon (Wikimedia Foundation), Jillian York (EFF), Eliška Pírková (AccessNow)
Details: This panel will discuss the risks associated with policymakers and legislators around the world crafting legislation with a small subset of large companies in mind. The panelists will discuss the theme using the latest policy development initiatives and practical examples. What is the impact on the broader tech ecosystem of one-size-fits-all laws? How can we ensure equitable policymaking that works for users as well? The session seeks to make recommendations on how the risks can be minimized, and how we can evolve to a more sophisticated model of tech policy- and lawmaking.
*Plenty of news from national processes this month: Romania and Lithuania
have passed their copyright reforms, Czechia is on its way. Denmark sort of
tried to regulate online platforms on top of what the EU is doing, but
pulled back. Italy is trying to make everyone pay for the use of public
domain works. Meanwhile in Brussels, the DSA is still being worked on and a
new proposal to tackle child sexual abuse material is raising some
uncomfortable questions. *
====================
DIGITAL SERVICES ACT
====================
Post-Deal Editing: Remember how the French Presidency of the Council wanted
to get a deal done so badly that it pushed the Parliament and Member States
to accept a “political deal” putting the main cornerstones in place? [1]
However, a “technical deal” is still being negotiated. As part of this, the
Council presented a series of changes to the Recitals of the Regulation.
[2] The Recitals are the opening paragraphs that lay down the intention of
the legislation.
—
Stay-Down Anger: One particular change was in a recital that indirectly
opens the door for so-called “stay-down provisions”, meaning a platform
would have to constantly re-scan its uploads to make sure certain content
doesn’t re-appear. The issue here is similar to that of “upload filters”:
it is impossible to do without monitoring everything. At the same time, the
courts have ruled against general monitoring obligations, and the DSA had
so far steered clear of such provisions. On the civil society side,
Wikimedia, EDRi, Access Now and EFF communicated to lawmakers that this is
problematic. Industry associations also raised the issue, while inside the
European Parliament the Liberal and Green groups pushed back. The result is
that the text will now be re-worked, so another round of negotiations lies
ahead.
======
CSAM
======
Sensitive Issue: The European Commission has published its proposal for a
“Regulation on Fighting child sexual abuse: detection, removal and
reporting of illegal content online.” [3] The debate around this is very
tricky and cannot be allowed to escalate to copyright levels. The issue
boils down to this: How much self-policing and monitoring of users’ content
should platforms be required to do? How far-reaching should the content
blocking and removal powers of authorities be?
—
Proposal Cornerstones: The regulation suggests that platforms should
perform a risk analysis every three years on how vulnerable they are to
such abuse material and grooming. Based on this, they should draw up a
mitigation plan. These analyses and mitigation measures are to be public.
Apart from that, a designated national authority of the Member State where
the service provider is represented may ask a court or independent
administrative body (depending on the country) to issue detection or
removal orders. Both orders would require scanning for child sexual abuse
content, normally provided by an EU Centre to be established in The Hague.
A designated national authority may also ask for an order requiring
internet service providers to block URLs. This last provision also seems
rather problematic for a number of reasons, including the method and
technology mentioned and the lack of safeguards on the national level.
—
Private Chats: The most controversial parts of the Regulation lie outside
what covers Wikimedia projects. Detection and removal orders would also
cover providers of interpersonal communications services, i.e. instant
messaging. This would mean that if a detection order is issued, such
providers would have to scan each private conversation for the hashes of
specific CSA material. Again, an issue with general monitoring, and the
main apple of discord.
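To illustrate why such detection orders raise the general monitoring concern, here is a purely illustrative toy sketch (not any real scanning system, and the names are invented): matching messages against a list of known hashes only produces results if every single message is processed.

```python
import hashlib

# Purely illustrative toy, not any real detection system: a provider
# holds a set of hashes of known illegal material and compares every
# message against it. Finding a match requires hashing and checking
# *all* messages, which is why this resembles a general monitoring
# obligation.
KNOWN_HASHES = {hashlib.sha256(b"known-bad-example").hexdigest()}

def is_flagged(message: bytes) -> bool:
    """Return True if the message's SHA-256 digest is on the list."""
    return hashlib.sha256(message).hexdigest() in KNOWN_HASHES

# Every message in a conversation has to pass through the check:
conversation = [b"hello", b"known-bad-example", b"bye"]
flags = [is_flagged(m) for m in conversation]
```

Real systems use perceptual rather than cryptographic hashes, but the structural point is the same: the check necessarily runs on everything.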
====================
OPEN DATA CONSULTATION
====================
The Open Data Directive (formerly known as the Public Sector Information
Directive) [4] allows the European Commission to lay down a list of
high-value datasets that must be opened up by authorities for re-use. The
Commission has now published the draft implementing act containing this
list, which includes geographic, climate and industrial statistics data,
for instance. Feedback is open until 21 June. [5] We have in the past
shared our position not only on the types of data to be opened, but also
on the quality requirements for such datasets. [6] It is good to see some
of this reflected in the proposal, and we are likely to reiterate some of
the points in the current consultation. If you are interested in working
on the submission with us, get in touch!
=========
ITALY
=========
Two weeks ago the Italian government published a draft national
digitisation plan and opened a public consultation running until June
15th. This is bad news for Wikimedia projects. The most worrisome elements
are:
1. The plan doesn’t recognise Creative Commons Zero as a relevant license
for GLAMs; it explicitly delegitimises this license.
2. The plan proposes to release all public domain content of cultural
institutions by default under a MIC BY NC license (MIC stands for the
Italian Ministry of Culture). The non-commercial restriction is apparently
not based on copyright, but on an administrative right that allows cultural
institutions to require a fee for commercial uses of the heritage they
manage, even if it is in the public domain.
3. The most important issue for Wikimedia is the specific reference to
Wikimedia Commons, which says that if you re-use Italian cultural heritage
content from there, you need to pay a fee to the Ministry of Culture.
—
Wikimedia Italia is in the process of gathering a coalition on this,
responding to the proposal, and kicking off a public campaign. The
Wikimedia Foundation and our team in Brussels are engaged and will try to
help. It might be useful to raise this issue in international media (even
if in specialised outlets), so if you have any contacts or platforms in
your country, please consider writing about it. Else, there should be more
information, including blog posts, about this soon.
=========
DENMARK
=========
In April the Danish government put forth a legislative proposal for the
regulation of online platforms independent of the DSA. The gist of the law
is that social media platforms, generally defined as platforms whose
purpose is creating a profile and browsing other profiles and
user-submitted content, with over 80,000 yearly users in Denmark, will be
obliged to take down illegal content within 24 hours of it being reported.
The proposal contains an almost verbatim copy of the encyclopedia carveout
of the DSM directive.
—
Wikimedia Denmark sent a letter [7] to the legislative committee handling
the proposal. It addressed four main points:
(1) the breadth of the online encyclopedia exemption;
(2) the definition of “illegal content”, which they argued is very broad
and could end in undesirable scenarios where platforms would have to e.g.
enforce libel law on behalf of private parties;
(3) the risk of over-removal, which had already been addressed by a civil
rights organisation; and
(4) the protection of citizen moderation. The letter is here in Danish:
<https://t.co/GMc2DEtQHI>
—
The parliamentary committee sent a question to the government based on
this letter. In the meantime, the European Commission told the government
that it wouldn’t approve of this type of legislation so close to the DSA
coming into force, so the proposal has been pulled back for now. It may or
may not resurface in the future; we will watch this space.
=============================
COPYRIGHT DIRECTIVE TRANSPOSITION
=============================
Both Romania and Lithuania have updated their copyright laws in the past
two months. You may take a look at the machine-translated English versions
of the texts for yourself [8][9]. You probably won’t, so here are two
highlights: Both countries have introduced full-fledged copyright
exceptions for parody and pastiche for all uses (not just uses on UGC
platforms as required by the directive). Also, both countries have adopted
the public domain safeguard. For a more in-depth (and fun) analysis, check
out Communia’s Eurovision DSM contest site [10] later today. We partnered
with ApTI (an EDRi member) in Romania and with the Baltic Audiovisual
Archival Council in Lithuania to follow the process and participate in
consultations.
—
Czech Republic: The amendment to the Copyright Act, which transposes the
Directive on Copyright in the Digital Single Market into Czech law, has
reached the Chamber of Deputies. It will now be discussed by the relevant
committees. Open Content CZ (also the local Creative Commons chapter) and
WMCZ are following the process.
====
END
====
[1]
https://lists.wikimedia.org/hyperkitty/list/publicpolicy@lists.wikimedia.or…
[2]
https://docs.google.com/document/d/1_DDJQ0pK756rWrmCI7qGZ5o24mJPeWs2/edit?u…
[3] https://ec.europa.eu/commission/presscorner/detail/en/IP_22_2976
[4]
https://en.wikipedia.org/wiki/Directive_on_the_re-use_of_public_sector_info…
[5]
https://ec.europa.eu/info/law/better-regulation/have-your-say/initiatives/1…
[6]
https://drive.google.com/file/d/1ZcaqQPOR083qBikCYSNfpVgUf-s29apL/view?usp=…
[7] https://ft.dk/samling/20211/lovforslag/L146/bilag/15/2567190.pdf
[8]
https://docs.google.com/document/d/1rmRyq5MPMoREmlbVB_Xle6tpwf3c88IZ4650Ibn…
[9]
https://docs.google.com/document/d/1SYWbt38YvDEsQCnBTF1QVT0QLE-I-MUkQX_snRq…
[10] https://eurovision.communia-association.org/
Hello everyone,
This is a friendly reminder that the Community Development team
<https://meta.wikimedia.org/wiki/Community_Calls> at the Wikimedia
Foundation is hosting our second community call
<https://meta.wikimedia.org/wiki/Community_Development/Community_Calls#How_t…>
on Wednesday, May 25th, 2022, from 15:00 to 16:30 UTC on Zoom
<https://wikimedia.zoom.us/j/81408843170?pwd=OVcwblBrQjR6WXpNVFNmZHJYWEFadz09>
[meeting link]
In this call, we will have two speakers from the community talk about
their work in capacity building. At the end of their presentations, we
will reserve time for questions and answers.
To join the community call:
- To attend the call, please find the link here
<https://wikimedia.zoom.us/j/81408843170?pwd=OVcwblBrQjR6WXpNVFNmZHJYWEFadz09>.
- Please ensure you have Zoom downloaded <https://zoom.us/download> on
your personal device prior to the call.
We are excited to see and hear from you in our community call! If you have
any questions, please feel free to email the Community Development team at
comdevteam@wikimedia.org.
Thank you,
The Community Development team
Cassie Casares
Program Support Associate
Community Development
Wikimedia Foundation
ccasares@wikimedia.org
Dear free knowledge supporting friends, pals, aficionados!
The WMF Global Advocacy team is delighted to share a new monthly update with you: our "Don't Blink" blog post series <https://diff.wikimedia.org/?s=don%27t+blink>. Think of Dimi's EU Policy Monitoring Report, but tailored to the work of the Foundation's Global Advocacy Team.
Every month is a busy one when it comes to legislative and regulatory developments around the world that shape people’s ability to participate in the free knowledge movement. In case you blinked, this monthly retrospective will catch you up. We review the most important developments that have preoccupied us and the actions we’ve taken to advance fundamental rights online. We’ll also highlight the team’s work to protect the public-interest Internet, and our vision of an online ecosystem in which everyone can freely produce, access, share and remix information.
Our April recap was just posted <https://diff.wikimedia.org/2022/05/03/dont-blink-public-policy-snapshot-for…>. Highlights include:
* Advocacy efforts that have protected access to knowledge for Russian people
* A major win for civil society in the Philippines
* The completion of trilogue negotiations over the Digital Services Act (DSA)...and what the outcomes mean for free knowledge online
* A new [not so] SMART Copyright Act in the USA
We hope you enjoy the new content!
Warmly,
Ziski & The Global Advocacy Team
Salut la liste ! (Hello, list!)
The big event of the month was definitely the political deal on the EU’s
new content moderation rulebook, the Digital Services Act. There are a few
new obligations in there for Wikimedia and we will take you through them.
Next month we are also organising a Wikicheese event in Brussels. You may
spread the word!
====================
DIGITAL SERVICES ACT
====================
Weird procedure: The French Presidency of the Council wanted to get a deal
done so badly that it pushed the Parliament and Member States to accept a
“political deal” putting the main cornerstones in place. However, a
“technical deal” is still being negotiated, which means that a lot of
details can still change, and details are important. This is a highly
unusual procedure for Brussels. We expect the technical side to take
another month and then a final DSA version to be voted on by Parliament in
July.
—
Modération citoyenne (citizen moderation): We welcome that during the
deliberations lawmakers began making a distinction between rules created
and imposed by the service provider and rules written and applied by
volunteer editing communities. It is a pity that citizen moderation,
something the internet needs more of, wasn’t recognised explicitly. But
the definitions and the articles make clear that the DSA is about service
provider activities and shall not interfere with community content
moderation.
—
Positive safeguards: Further positive safeguards for intellectual freedom
online include a ban on targeted advertising using sensitive information
and a ban on “dark patterns”.
—
Crisis mechanism: A provision allowing the European Commission to ask very
large platforms to tackle certain content in times of crisis came as a
last-minute addition and was not properly publicly deliberated. Its
safeguards remain vague, but together with civil society partners we
managed to include a few:
*A majority of Member States need to approve the mechanism;
*All requests sent to platforms must be made public immediately;
*A three-month sunset clause;
*Fundamental rights and proportionality language;
*The decision on how problematic content is tackled stays with the service
provider.
—
Further reading: You are welcome to check out our analysis of the DSA
result from a Wikimedia perspective
<https://wikimedia.brussels/dsa-political-deal-done/>. Else, you may also
check EDRi’s rundown for a more general digital rights perspective
<https://edri.org/our-work/eu-negotiators-approve-good-dsa-but-more-work-is-…>
.
=====================
WIKICHEESE BRUSSELS
=====================
Cheese: Together with Wikimédia France and the French Digital Ambassador,
we are organising a Wikicheese apéro in Brussels: finger food, drinks,
and of course we will be taking images of cheeses for Wikipedia. You can
still register <https://wikimedia.brussels/wikicheese-registration/> and
come! Also, you may spread the word!
=========
DENMARK
=========
Danish DSA?: The Danish government has put forth its own legislative
proposal for the regulation of social media independent of the DSA. The
gist of the law is that social media platforms, generally defined as
platforms whose purpose is creating a profile and browsing other profiles
and user-submitted content, with over 80,000 yearly users in Denmark, will
be obliged to take down illegal content within 24 hours of reporting, with
two exceptions (7 days if a more thorough investigation of the content is
required, and even longer in special circumstances). The law contains an
almost verbatim copy of the encyclopedia carveout of the Copyright in the
Digital Single Market directive. WMDK’s Matthias Smed Larsen
<https://twitter.com/MatthiasSmed> is working on this and looking closely
at four main issues:
*The scope of the law and the carveout;
*The definition of illegal content;
*The general issue of a 24-hour limit creating an incentive to remove
borderline content which is actually legal;
*How community content moderation fits into this.
====
END
====
The French presidential elections have certainly changed the dynamics in
Brussels. The French Presidency of the Council is willing to accept
compromises in order to wrap up reforms and show progress. Critics say
this leads to technically half-baked solutions. Here is our wrap-up of the
month of March, at warp speed!
====================
DIGITAL SERVICES ACT
====================
After its sister project, the Digital Markets Act, was agreed upon (read
below), all eyes are seemingly on this “content moderation law” now. The
Council, Commission and Parliament are meeting at both technical and
political levels every second week.
—
Actual Knowledge: All three institutions agree on improving the initial
language about when a notice sent to a service provider leads to actual
knowledge of illegal content. It is now clear to everyone that only some
notices constitute such knowledge, and the text makes very clear that
service providers have the freedom to make that call. A win we advocated
for.
—
Whose rules? The DSA defines “terms of service” as pretty much any rules
valid on a platform. We are asking the institutions to make it explicitly
clear that there is a difference between rules imposed by the service
provider and rules created and enforced by communities of platform users.
The Parliament has incorporated this change, but the Council is still
somewhat sceptical, mainly because it believes the original wording
already covers this implicitly. Negotiations continue on this point.
—
Who will regulate? In a late amendment, the Council took the position that
the European Commission should be responsible for regulating Very Large
Online Platforms (VLOPs), while national regulators will be responsible
for the rest. For Wikimedia, only Wikipedia would be a VLOP, meaning that
Wikipedia would fall under the European Commission’s competence, while
Wikimedia Commons and Wikidata would fall under the competence of the
national regulator of the country where the Wikimedia Foundation decides
to appoint a legal representative. To make things even messier, some of
the obligations that VLOPs must also comply with (e.g. trusted flaggers &
out-of-court dispute settlements) will be a shared competence between the
national regulator and the Commission. We’d prefer a clearer separation,
but at least for these the Commission’s decisions would overrule the
national ones.
—
Who will pay? If a user takes a platform to an out-of-court dispute
settlement body (over a content moderation decision), both the platform
and the user pay a fee, but if the user wins, the platform has to cover
both fees. On the other hand, if the user loses, the platform won’t be
able to pass its legal costs on to them. We still don’t know what the fees
will amount to.
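The cost allocation described above can be sketched as a small rule. This is a hypothetical illustration: the function and parameter names are invented, and the actual fee amounts are still unknown.

```python
# Hypothetical sketch of the described fee rule; names are invented,
# and the real fee amounts have not been set.
def settle_fees(user_fee: float, platform_fee: float, user_wins: bool):
    """Return (amount_paid_by_user, amount_paid_by_platform)."""
    if user_wins:
        # The platform covers both fees, reimbursing the user.
        return 0.0, user_fee + platform_fee
    # If the user loses, each side still bears only its own fee:
    # the platform cannot pass its legal costs on to the user.
    return user_fee, platform_fee

# Example: with equal fees of 50, a winning user pays nothing,
# while a losing user pays only their own fee.
win_case = settle_fees(50.0, 50.0, user_wins=True)
lose_case = settle_fees(50.0, 50.0, user_wins=False)
```

The asymmetry is the point: users risk only their own fee, never the platform's costs.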
—
Who will pay? #2: Somewhat surprisingly, the Commission is now taking the
position that if it is to be the regulator for VLOPs, it will need to
charge them a fee in order to cover the additional costs. Apparently this
principle already exists in the financial sector. Lawmakers involved in
the negotiations haven’t seen concrete wording yet, but we have
confirmation from three independent sources that if this gets accepted,
the DSA won’t mention any actual amounts, but will rather give the
Commission the power to set up a fee structure in a delegated act. We have
reached out to the Commission, who have assured us that they are well
aware of the different nature of platforms (including their purpose and
tax status) and will make this a factor in calculating the amounts. Still,
a lot of fog shrouds this point.
—
“A War Clause”: We kid you not, this is the name an 11th-hour suggestion
goes by in the corridors and chat windows. It is a proposal that would let
the Commission, in extraordinary circumstances, ask platform providers to
moderate certain dangerous content very quickly. Now, a similar provision
already exists in other regulations (e.g. anti-terrorism), so this is not
unheard of. But we worry a lot about this, together with many EDRi
members. At the very least we are asking for a much clearer definition of
what constitutes an extraordinary circumstance, who establishes it, and
for how long. Further, we demand safeguards against censorship and
overreach. That being said, rules allowing authorities and service
providers to act very quickly in case of a threat to life and limb already
exist and work well, so there is a way to handle this.
—
Targeted Ads: It looks like all EU institutions can agree that sensitive
data (e.g. religious and political preferences) and data of minors should
be prohibited for targeted advertising. Not the really big coup the
Parliament was envisioning, but a major step.
—
Nota Bene: We normally share plenty of links and sources in this
monitoring report. For confidentiality reasons, and to protect sources, we
are unable to do so in this case. If you would like additional insight,
please get in touch off-list.
====================
DIGITAL MARKETS ACT
====================
Stick a fork in it, it's done! The EU law on competition rules for online
platforms is coming into force next year. Several major wins for civil
society and competition there. Pre-installed apps that can’t be deleted
will become illegal. We will be able to send messages from one instant
messaging application to another. However, the interoperability win has some
defects. A humongous lobbying push by dominant platform providers has
convinced the lawmakers that things like group calls are extremely hard to
do across different services. The result is that such features will become
available at a much later stage, if at all.
Anna Mazgal has the deep dive for you:
https://wikimedia.brussels/dma-heated-trilogue-negotiations-concluded-with-…
Let’s start with the unpleasant: The war in Ukraine has of course made it
to the top of everyone’s agenda, as it should. Work on files still
continues, but the energy is different and we expect the pace to slow down.
At the same time, this war is being played out on online platforms and
certain aspects are being picked up by lawmakers when discussing their
regulation.
====================
DIGITAL SERVICES ACT
====================
The DSA is in trilogue, i.e. the three main EU bodies have adopted their
respective positions and are now trying to hammer out a common version. Our
main headache in the original proposal was the notion of automatic
assumption of “actual knowledge” of illegal content upon receipt of a user
notice. The fix to that seems uncontested. We are still waiting for news on
the definitions article, where the Parliament added a differentiation
between content moderation by the service provider and by users, something
we asked for and supported.
-
Elsewhere, the dark pattern prohibition proposed by the Parliament (designs
that nudge users to accept tracking) seems to be welcomed by Member States.
-
On a more general note, we are seeing the Russian invasion starting to play
a role in discussions about content moderation. Lawmakers were called out
<https://twitter.com/Gary_Machado_/status/1497668398519656450> for having
proposed a “media exemption” which would prohibit online platforms from
interfering with media content, something they are currently being asked to
do to stem disinformation about the war. There is also some interest in how
Wikipedia handles such fast-developing news and events. We plan to reshape a
great Twitter thread <https://twitter.com/sdkb42/status/1497407518968012807>
into a blogpost for lawmakers, in order to explain community content
moderation better.
====================
Data Act
====================
Last week the European Commission presented its Data Act
<https://ec.europa.eu/newsroom/dae/redirection/document/83521> proposal, a
Regulation mainly focused on business-to-business data sharing and
portability that also includes a few elements we care about.
—
It empowers users to access the data a service or device has produced
and to port it (Articles 4 and 5), which is welcome. It would also allow
governments to access business data in extraordinary circumstances, like a
global pandemic (Chapter V). Safeguards and limitations need to be
watertight here.
—
Most importantly, though, the Data Act also contains “a revision” of the sui
generis database right (SGDR), a copyright-like additional layer on
non-original databases that we would like to see abolished. In Chapter X
the Commission “clarifies” that these protections can’t apply to
machine-generated data. We think that doesn’t go far enough
<https://wikimedia.brussels/data-act-a-small-step-for-databasees-an-even-sma…>
and are drafting an amendment
<https://docs.google.com/document/d/1KvvYz06hUCs2Z2u4HLFXKIQKfWSFjb-LmYWPaww…>
to get rid of a much larger chunk of unwanted SGDR protection. Our
umbrella association Communia is also organising a Salon on the SGDR
<https://vimeo.com/webinars/events/065f5169-9a7a-40a8-8f87-9dcaa37ae7f2> on
2 March.
====================
The Digital Markets Act
====================
The conversation about imposing an interoperability obligation on
gatekeepers is stuck. The Commission is “looking for evidence” from
messaging services that interoperability is actually needed, while smaller
providers are uneasy about it because too many details are still unclear.
In the meantime, civil society (including us) circulated an open letter to
the Commission, the French Presidency and MEPs on involving users in
enforcement procedures
<https://www.politico.eu/wp-content/uploads/2022/02/21/BEUC-X-2022-023-open-…>
.
====================
Artificial Intelligence Act
====================
The AI Act that is now being mulled over in a number of parliamentary
committees
<https://oeil.secure.europarl.europa.eu/oeil/popups/ficheprocedure.do?refere…>
deals with three categories of AI use: prohibited, high-risk, and a third
that requires special transparency. That last category covers individuals
interacting with an AI-based bot, the use of emotion recognition or
biometric categorisation, and deep fakes. We
don’t think that any artificial intelligence tools Wikimedia editors and
staff currently use are covered by the new obligations, but as lawmakers
start editing the proposal we need to stay on top of changes. The
discussion around deep fakes is of particular interest to us, as Wikimedia
content can be used to create them, but such content could also become
a misleading source of information.
====================
Online Political Advertising
====================
Normally this regulation
<https://eur-lex.europa.eu/legal-content/EN/TXT/HTML/?uri=CELEX:52021PC0731>
should only do what its name says: set clear rules on political
advertising, especially during electoral campaigns. However, the definition
of political advertising has us somewhat worried, so we are running an
extra check on it:
*Article 2.2*
‘political advertising’ means the preparation, placement, promotion,
publication or dissemination, by any means, of a message:
(a) by, for or on behalf of a political actor, unless it is of a purely
private or a purely commercial nature;
or
(b) which is liable to influence the outcome of an election or referendum,
a legislative or regulatory process or voting behaviour.
====================
Finland Copyright Reform
====================
The copyright reform in Finland was going rather well. User rights were
being enshrined and balanced exceptions drafted. But over the past months
collective management organisations (CMOs) in Finland have pushed very hard,
and the lead civil servant was essentially replaced by a lobbyist hired as
chief consultant to the Ministry. The Ministry has now redrafted the
document but refuses to present the new version; instead, it just lets
bits of information slip out. It seems they are opting for a rewrite
that maximises CMO turnover and includes the bare minimum in terms of user
rights. For information on this Finnish drama we have only Finnish-language
sources (ask a Finn or use DeepL):
-
Electronic Frontier Finland complaint
<https://effi.org/lex-liedes-effi-kanteli-oikeuskanslerille/>
-
The presentation of the new version
<https://api.hankeikkuna.fi/asiakirjat/ea5be8e5-c718-4049-8836-5d7fa9589c18/…>
Hello everyone,
This is a friendly reminder that the Community Development team
<https://meta.wikimedia.org/wiki/Community_Calls> at the Wikimedia
Foundation is hosting our first-ever community call on Wednesday, February
23, 2022, from 15:00 to 16:00 UTC on Zoom
<https://wikimedia.zoom.us/j/84302371758?pwd=b1ZFNTMyaFdJRCtxbzIyaU13OWlmUT09>
[meeting link]
We want to thank those who have expressed interest in speaking during this
call; we had an overwhelming response. Unfortunately, we were only able to
choose three speakers from the community. We will be sending invitations to
those who have been selected to participate in the meeting shortly.
Sign-ups for speaking during the call are now closed.
To join the community call:
-
To attend the call, please find the link here
<https://wikimedia.zoom.us/j/84302371758?pwd=b1ZFNTMyaFdJRCtxbzIyaU13OWlmUT09>
.
-
Please ensure you have Zoom installed on your personal device prior to
the call.
We are excited to see and hear from you in our first ever Community
Development community call!
Thank you,
The Community Development team
Cassie Casares
Program Support Associate
Community Development
Wikimedia Foundation
ccasares(a)wikimedia.org
Hello everyone,
The Community Development team
<https://meta.wikimedia.org/wiki/Community_Development> at the Wikimedia
Foundation is hosting our first-ever community call on February 23, 2022,
from 15:00 - 16:00 UTC on Zoom <https://wikimedia.zoom.us/j/84302371758>.
The purpose of the community call is to provide a space for volunteers to
meet the Community Development team. The five-person Community Development
team is responsible for developing accessible opportunities for volunteers
to grow critical capacities and leadership skills for movement
sustainability and growth. The community call is also a shared space for
all volunteers to hear or speak about projects related to capacity building
& leadership development in the Wikimedia community.
Our first call, Meet the Community Development Team, will include a sign-up
sheet
<https://docs.google.com/forms/d/e/1FAIpQLSdIiyAlq7uKBp4eqmnbyVr9Zmi3LBjCQgZ…>
for those who would like to speak for 5-10 minutes about their
individual or community capacity building work.
If you would simply like to attend and listen, join via Zoom at the
scheduled call time; please find the link here
<https://wikimedia.zoom.us/j/84302371758>.
Although the primary language of the call is English, we are committed to
providing simultaneous translation to the best of our ability. Please use
the same sign up sheet
<https://docs.google.com/forms/d/e/1FAIpQLSdIiyAlq7uKBp4eqmnbyVr9Zmi3LBjCQgZ…>
to indicate your language preference so that we can best accommodate
translation support as needed.
Some may be asking, what is capacity building and why is the Community
Development team supporting this work?
What is capacity building? Capacity building is any activity aimed at
developing a skill or capability in others; it can take many forms, ranging
from formal training through online self-study courses to individual
mentorship.
The “Invest in Skill and Leadership Development”
<https://meta.wikimedia.org/wiki/Strategy/Wikimedia_movement/2018-20/Recomme…>
recommendation of the 2030 Wikimedia Movement Strategy
<https://meta.wikimedia.org/wiki/Strategy/Wikimedia_movement/2018-20/Recomme…>
encourages movement-wide capacity building as a necessity to achieve our
strategic commitment to Knowledge Equity. Capacity building can encourage
diversity, redistribute and share resources, welcome newcomers and grow
communities.
Community Development is launching this series to ensure we can hear
directly from volunteers who are interested in or who already are leading
capacity building work in their communities. This is one way we plan to
create a space for direct conversations with volunteers about the different
contexts, challenges, and opportunities in capacity building across the
movement.
How do our projects relate to capacity building and leadership development?
Our team's portfolio is a collection of online and offline learning programs
and resources that aim to enable skill development and resilience in the
movement.
Some of our recent capacity building projects include the Wikilearn Online
Learning Pilot
<https://meta.wikimedia.org/wiki/Community_Development/WikiLearn>, the Board
of Trustees Candidate toolkit
<https://meta.wikimedia.org/wiki/Wikimedia_Foundation_elections/Candidate_Re…>
and the 2021 Wikimania Speaker Series
<https://meta.wikimedia.org/wiki/Community_Development/What_we_do>.
In order for us to continue building relevant curricula and programming, we
need to hear from the Wikimedia movement. Please visit the Community
Development team page
<https://meta.wikimedia.org/wiki/Community_Development> to learn more about
our capacity building and leadership development projects.
Key Dates:
-
First community call: February 23, 2022 from 15:00 - 16:00 UTC on Zoom
<https://wikimedia.zoom.us/j/84302371758>
-
Sign up
<https://docs.google.com/forms/d/e/1FAIpQLSdIiyAlq7uKBp4eqmnbyVr9Zmi3LBjCQgZ…>
to speak at the first community call. The application closes on February
14th, 2022.
To participate in the community call:
-
To attend the call, please find the link here
<https://wikimedia.zoom.us/j/84302371758>.
-
Please ensure you have Zoom installed on your personal device.
-
Use the sign-up sheet here
<https://docs.google.com/forms/d/e/1FAIpQLSdIiyAlq7uKBp4eqmnbyVr9Zmi3LBjCQgZ…>
to confirm your interest in speaking about your capacity building work.
We are excited to see and hear from you in our call!
Thank you from the Community Development team
Cassie Casares
Program Support Associate
Community Development
Wikimedia Foundation
ccasares(a)wikimedia.org