Wow! What a month! The Terrorist Content Regulation passed without a final
vote, an Artificial Intelligence law was proposed unexpectedly quickly and
over 600 amendment proposals to the Data Governance Act were tabled. And,
and... we started a blog! A lot to unpack, so we will spare you the update
on the Digital Services Act this time around, as no big shifts occurred there
anyway.
Anna & Dimi
This and previous reports on Meta-Wiki:
https://meta.wikimedia.org/wiki/EU_policy/Monitor
======
TERREG
In an unexpected turn of events, the Terrorist Content Regulation has been
adopted without a final vote. This was possible due to a procedural
peculiarity [01] that defaults legislation “inherited” from the previous
legislative term to adoption without a vote. A vote is only held if a
political group or 71 MEPs table a motion to reject the text or to open it
up for amendments. But since nobody filed one within the deadline, the
adoption was simply announced at the plenary session, to the surprise of
many MEPs.
--
This way, the dangers of content filtering, over-policing of content by
state and private actors, and the cross-border prerogatives for governments
will become law 12 months from now without a final stamp from the elected
representatives of European citizens. As much as we didn’t expect the
miracle of a rejection of this hard-fought proposal [02], in a democracy it
is important to see where your representatives stand through a vote.
We can only hope that next time, the MEPs and staffers who fought hard to
make this text better won’t miss a deadline and will run a procedure check
as part of their preparations for an important vote.
======
AI Regulation
The European Commission proposed the world’s first AI law. Curiously, the
EU and US didn’t seem out of sync on this - the Federal Trade Commission
published its own set of guidance [03] with partially overlapping
requirements. But back to Europe: The proposal wants to ban some uses of AI
(real-time facial recognition in public places & social scoring) and to
impose obligations on “high-risk” uses (think credit scoring, self-driving
cars, social benefits). It requires high-quality data sets, testing for
discriminatory outcomes and a certain amount of transparency. The devil is,
as always, in the detail.
---
Bans: The proposed regulation outlines a list of banned artificial
intelligence applications that includes government-conducted social
scoring, real-time biometric recognition systems (e.g. facial recognition)
and practices that “manipulate persons through subliminal techniques beyond
their consciousness” or “exploit vulnerable groups such as children or
people with disabilities”. [04] As you might expect, these bans come with
numerous exceptions. Real-time facial recognition, for instance, shall be
allowed when looking for missing children or in the case of an imminent
terrorist threat. Expect long debates and wrestling over concrete wordings.
---
High-risk Uses: A further category of regulated AI applications is
“high-risk uses”. Of course, the details of the definition will be key
here. Expect some fluffy wording combined with a list of concrete examples
in an annex [05], which is supposed to be updated by the European
Commission over the years. The proposed Annex lists uses where AI will
always be “high risk”, such as transport (think self-driving cars),
education, employment, credit scoring or benefits applications, asylum and
border control management. This list will be a major lobbying battle.
When applying AI to high-risk uses, the operator, producer or distributor is
required to have a quality management system, undergo a conformity
assessment (through a national authority or self-assessment), keep
documentation & logs, notify a national authority, ensure human oversight,
take corrective actions when risks are recognised and apply the CE marking.
[06]
A lot to unpack here and, of course, the devil is in the details. Expect us
to look very closely at the education-related AI uses and what exactly will
be covered.
---
Transparency Obligations: There are even fluffier transparency obligations
for “certain AI systems”. In a very simplified translation from legalese,
the rule basically says that if an AI system interacts with natural
persons, the person must know that it is AI/ML and what it does (e.g.
whether it recognises emotions).
---
First reactions and legislative process: We think the proposal is filled
with good intentions that could end up as very sensible general rules for AI
development and deployment, or could terminate in a bureaucratic hell for
everyone. Not sure we mentioned this before, but it looks like the devil
will be in the details. The European Consumer Organisation (BEUC)
criticised that consumers aren’t given a straightforward way to enforce
their rights or to access redress and remedies. [07] EDRi and the European
Data Protection Supervisor call for adding predictive policing and all
forms of biometric surveillance in public places to the unacceptable uses
category. [08] Tech industry trade lobbies such as CCIA and DOT Europe were
quick to warn against unnecessary red tape, but also seemed to see some
sense in the approach. [09] We are now waiting for the European Parliament
committees to fight over and agree which one will be responsible - a
three-way race between the Internal Market, Legal Affairs and Civil Rights
committees.
======
Data Governance Act
---
We now have over 600 amendments tabled on the DGA. A lot to unpack, but we
will basically support the following types of changes:
1. Amendments that will ensure that general interest projects (such as
freely licensed knowledge resources) aren’t obliged to register with a
national authority (a requirement planned for some cross-industry
data-sharing clearinghouses). Currently the wording is unclear.
2. Amendments that will restrict the use of the sui generis database rights.
3. Amendments that will ensure that the DGA doesn’t interfere with the
GDPR.
The meetings of the MEPs to discuss their amendments and look for
compromises are scheduled for April and May, but will likely continue after
summer. All amendments: [10][11]
======
wikimedia.brussels
---
Now that stand-alone blogs aren’t cool and hip anymore, we have finally
gotten around to starting one :/ The idea behind it is to have a place to
write more regularly on legislative files and to establish it as a source
for EU policymakers. Here are some reads that are already online:
-
E-Evidence: trilogues kick off on safeguards vs. efficiency - Dimi lets
us in on the sensitivities around passing on user data for the purposes of
criminal investigations
-
https://wikimedia.brussels/e-evidence-trilogues-kick-off-on-safeguards-vs-e…
-
What happens in Geneva shouldn’t stay in Geneva: Wikimedia and
international copyright negotiations - Justus (WMDE) explains why the
transparency of international negotiations on intellectual property matters
should be increased
-
https://wikimedia.brussels/what-happens-in-geneva-shouldnt-stay-in-geneva-w…
-
Sanctioning the giants – will the internet be better with the Digital
Markets Act? - Anna weighs in on whether the hopes for a reform of the
platforms’ ecosystem have been fulfilled in the DMA
-
https://wikimedia.brussels/sanctioning-the-giants-will-the-internet-be-bett…
-
How the DSA can help Wikipedia – or at least not hurt it - because terms
and conditions and community moderation rules are different and they both
matter
-
https://wikimedia.brussels/how-dsa-can-help-wikipedia-or-at-least-not-break…
======
======
END
======
[01] https://www.europarl.europa.eu/doceo/document/RULES-9-2021-01-18-RULE-069_E…
[02] https://data.consilium.europa.eu/doc/document/ST-14308-2020-REV-1/en/pdf
[03] https://www.ftc.gov/news-events/blogs/business-blog/2021/04/aiming-truth-fa…
[04] https://digital-strategy.ec.europa.eu/en/library/proposal-regulation-laying…
[05] https://ec.europa.eu/newsroom/dae/document.cfm?doc_id=75789
[06] https://en.wikipedia.org/wiki/CE_marking
[07] https://www.beuc.eu/publications/eu-proposal-artificial-intelligence-law-we…
[08] https://twitter.com/edri/status/1386968653996888069
[09] https://techcrunch.com/2021/04/21/europe-lays-out-plan-for-risk-based-ai-ru…
[10] https://www.europarl.europa.eu/doceo/document/ITRE-AM-692584_EN.pdf
[11] https://www.europarl.europa.eu/doceo/document/ITRE-AM-691468_EN.pdf
Good afternoon everyone,
In 2018, the European Commission signed, with online platforms and
stakeholders, a voluntary Code of Practice on Disinformation. After two
years, the Commission has noticed some inconsistencies in its application.
So, an initiative was launched on 1 April to provide advice on how to
better apply the code:
https://ec.europa.eu/info/law/better-regulation/have-your-say/initiatives/1…
Dimi and I worked on a document, and we would be happy to have your
feedback before Tuesday night.
https://docs.google.com/document/d/17iyPSnf2qsS-1_JEW9gPPnA9yj6TODSZIs6LSlh…
If you have any questions, do not hesitate to ask.
Have a good day,
Take care,
--
*Naphsica Papanicolaou
*
*/Chargée de plaidoyer
/**/01 42 36 26 24/**/
/*
*/06 09 36 10 59/*
*/-----------------------------------/**/-------------------------/*
*WIKIMEDIA FRANCE
*Association pour le libre partage de la connaissance
*/www.wikimedia.fr <http://www.wikimedia.fr/> /*
/40 rue de Cléry, //75002 Paris/
<http://www.openstreetmap.org/node/691082430#map=19/48.86814/2.34683>
Hello list!
There is little real progress on the Digital Services Act, as the European
Parliament committees are bickering about competencies. On the other hand,
TERREG, the Data Governance Act and E-Evidence are moving ahead. Sorry
for the longer read, but there is plenty of nuance to unpack.
Anna & Dimi
This and previous reports on Meta-Wiki:
https://meta.wikimedia.org/wiki/EU_policy/Monitor
======
TERREG
We now have the final vote date for the long-debated terrorist content
regulation proposal. [01] The Plenary will take its decision in the last
week of April, most likely on April 28th. Despite many interinstitutional
clashes [02], the regulation is still too blurry and too broad, and it
infringes on the rights to express political views and to access information.
---
EDRi, together with Wikimedia Deutschland, Access Now and the Civil
Liberties Union for Europe, launched an open letter [03] to MEPs urging
them to vote “no” on this proposal. We cite the danger of content
filtering, the over-policing of content by state and private actors, and
the cross-border prerogatives as the main reasons why the proposal should
be rejected. Please spread the word and, if you can, contact your favourite
MEP to make sure they are on the right side of history with this one.
======
Digital Services Act
---
Our basic position hasn’t changed [11] and we are still making the rounds
with MEPs and Member States to map out where everyone stands.
---
Meanwhile, competence conflicts between the Internal Market Committee (IMCO)
and the Legal Affairs Committee (JURI), as well as the Industry, Research
and Energy Committee (ITRE), are keeping MEPs from actually delving into
the text. We expect a final decision to be reached during the coming two
weeks and to be adopted during the plenary session thereafter. [12]
---
In the Council, Germany worries about how the DSA will impact its national
NetzDG law. Denmark and Ireland want more requirements for smaller
platforms “which can host extremist groups”. France was asking about
illegal products on marketplaces. The Dutch seemed to complain to the
Commission about not including disinformation provisions. Malta is
concerned that by having to respect national law, platforms would be forced
to disable access in one country or another, which would lead to
fragmentation of the online environment. We know all this from written
Commission answers to the Member States, which are not public.
======
Data Governance Act
---
Both the Portuguese Presidency and the European Parliament rapporteur
Angelika Niebler (EPP, DE) published their drafts of changes to the Data
Governance Act. [13][14] Unfortunately, neither takes into account our
demand to explicitly state that “collaborative knowledge” projects are
“general interest” services, which could be important for the requirements
our projects fall under.
---
However, Ms. Niebler proposes to define “data intermediaries” as providers
of data sharing services “with the main objective of establishing a
business, a legal and potentially also technical relation between an
indefinite number of data holders, including data subjects, and an
indefinite number of potential data users and which assists both parties in
a transaction of data assets between the two”. The Presidency of the
Council, on the other hand, makes clear that “data sharing services” are
“commercial services”, which might be good enough.
---
Both drafts keep the limitation on the sui generis database rights in
place, which is welcome.
======
Digital Markets Act
This legislative proposal [21] is important in its own right, as it can
potentially rein in the so-called gatekeepers - the most influential
internet platforms - and push them towards more transparency and
accountability. We read it and lay out the basics [22] for you: what the
DMA, as proposed by the European Commission, regulates - and (equally
important) what it doesn’t regulate.
---
Also, the MEP rapporteur team in IMCO (Internal Market and Consumer
Protection Committee) has finally formed. If you want to tell them what you
think about the DMA, they are: Andreas Schwab (EPP, DE - at the helm);
Evelyne Gebhardt (S&D, DE); Andrus Ansip (RE, EE); Virginie Joron (ID, FR);
Martin Schirdewan (GUE, DE); Marcel Kolaja (Greens, CZ); and Adam Bielan
(ECR, PL).
======
E-Evidence
---
The Regulation on European production and preservation orders for
electronic evidence in criminal matters (E-Evidence) [31] aims to create
clear rules on how a prosecutor in one Member State can request electronic
evidence. One such use case would be requesting user data from a platform
in another EU country during an investigation. We wrote about our main
issues in the past. [32]
---
We have mainly been worried about a new category - “access data” - which
would allow prosecutors to demand, without judicial oversight, information
such as IP addresses, date and time of use, and the “interface” accessed by
the user. In the Wikipedia context this would mean being able to follow
what someone has read.
The second question we have is whether the hosting country’s authority will
have the right to intervene in some cases where fundamental rights of its
citizens are concerned.
---
After the European Parliament and the Council reached their negotiating
positions last year, the trilogue rounds (meetings between the two
co-legislators) have now begun. Last week saw the second trilogue. So far
the good news is that both co-legislators and the Commission are in favour
of dropping the “access data” category. However, in exceptional emergency
situations prosecutors will be allowed to request data needed to identify
persons quickly. The details will be important here.
---
On the second main point, we are seeing that both the European Parliament
and the Council are converging on some sort of “notification” regime for the
hosting country’s authorities. The European Commission didn’t have anything
on this in its proposal. The European Parliament wants to make sure citizens
are (doubly) protected against prosecutorial misuse, having in mind that not
all EU members have a great record on this. The Council mainly worries
about the simplicity and speed of procedures. Again, we will need to wait
for a possible compromise, as the details are key.
---
Furthermore, a Common EU Exchange System for such requests is being
discussed, through which platforms would receive production orders. This
would solve the issue of each platform having to know which authority is
competent in each Member State. There would also be an option for the
platform to request reimbursement from national authorities for the costs
incurred by such orders.
---
The next trilogue will be on 20 May, still under the Portuguese Presidency.
In the meantime there will be technical meetings (assistants, experts and
advisors). Participants expect the talks to continue through the Slovenian
and French Presidencies, so until the first half of 2022.
======
END
======
[01] https://ec.europa.eu/commission/sites/beta-political/files/soteu2018-preven…
[02] https://wikimedia.brussels/upside-down-is-all-content-terrorist-until-deter…
[03] https://wikimedia.brussels/wp-content/uploads/2021/03/MEP_TERREG_Letter_EN.…
[11] https://wikimediapolicy.medium.com/how-europes-proposed-digital-services-ac…
[12] https://oeil.secure.europarl.europa.eu/oeil/popups/ficheprocedure.do?refere…
[13] https://drive.google.com/file/d/1GSsmlIPF5BWwNuaw6_z0fWdRPCNpK9wQ/view?usp=…
[14] https://www.europarl.europa.eu/doceo/document/ITRE-PR-691139_EN.pdf
[21] https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX%3A52020PC0842
[22] https://wikimedia.brussels/sanctioning-the-giants-will-the-internet-be-bett…
[31] https://oeil.secure.europarl.europa.eu/oeil/popups/ficheprocedure.do?lang=&…
[32] https://wikimedia.brussels/e-evidence-lets-keep-reader-data-well-protected/