The EU institutions, like most of Europe, are shutting down for the next few weeks, so we are giving you an update on all relevant files before we do the same. It is much longer than usual, sorry! But we will use it ourselves as a cheat sheet to refresh our memories during la rentrée in September.


Dimi & Anna


====

DSA

====

The Digital Services Act has been agreed upon. The good news is that it shouldn’t disrupt community moderation processes, as it focuses on the service provider, i.e. the Wikimedia Foundation. It also provides a decent framework for notice-and-action procedures. Our analyses: [1a][1b]

Procedure: It needs to pass another Council vote in September. Then it will be published in the Official Journal of the EU and enter into force 20 days after publication. The DSA obligations will apply fifteen months after entry into force or from 1 January 2024, whichever comes later. However, the obligations on very large online platforms (VLOPs) will apply earlier: four months after a platform has been designated as such by the European Commission.

Much of the actual work on the DSA is still to be done. For one, the European Commission needs to set up a process for designating VLOPs. Depending on whether Wikipedia is declared one (possible, but not certain), a number of additional obligations will apply. One such obligation is to carry out annual risk assessments that analyse issues such as disinformation and election manipulation, cyber violence against women, or harms to minors online. Based on the results, the service provider might need to put mitigation measures in place. Many details still need to be figured out, for instance exactly which criteria the Commission will use to designate VLOPs and whether it will issue additional guidelines for the assessments and mitigation measures.

====

DMA

====

The Digital Markets Act was pushed through the trilogue negotiations even faster than the DSA, with the final agreement reached on 24 March. It will be adopted by the European Parliament during the plenary session in Strasbourg on 12-15 September. The main win, from our perspective, is the expansion of the interoperability measures.

Interoperability had been drafted narrowly in the European Commission’s proposal [2a], obligating gatekeepers to offer interoperability to so-called ancillary services, like payment or identification services, that wish to operate within their systems. In the final version [2b] it encompasses messaging services as well as voice and video calls, but the functionalities will be developed and added gradually over several years. For more details see: [2c]

After the September vote, the new law will be published in the Official Journal and will enter into force 20 days later. Two years after that, the DMA has to be reviewed and the European Commission will decide whether to expand the interoperability requirements to social networks.

==========

E-Evidence

==========

The so-called E-Evidence Regulation is trying to establish a straightforward process by which a prosecutor or judge from one EU Member State can, within criminal proceedings, request electronic evidence from a service provider based or represented in another Member State. The Wikimedia Foundation gathers very little information about users and editors on its projects. However, the Foundation’s servers do record the IP addresses of users who have accessed Wikipedia and the individual articles they have viewed. The proposed regulation would have placed the information about the individual articles viewed under an “access data” category, meaning a prosecutor or judge in an EU Member State could have requested it in the course of any criminal proceeding. [3]

The European Parliament had suggested making it much harder to access data about what a user read, while the Council wanted to keep the proposed processes. After two and a half years of stalemate, a partial compromise was reached last month. Within any criminal investigation, prosecutors and judges will be able to request “IP addresses [...] for the sole purpose of identifying the user”. On the other hand, “electronic communications metadata” and data identifying the user, which would give away which articles a user read, may only be requested “for criminal offences punishable in the issuing state by a custodial sentence of a maximum of at least 3 years”. In other words, if a user is investigated for a crime that carries a maximum prison sentence of at least three years, a judge or a prosecutor would be allowed to ask for this data. See the negotiation documents: [4]

While the new text is an improvement over the original proposal, it is a step down from what the European Parliament suggested. One major aspect still being debated is the involvement of the authorities of the country where the service provider is represented. Say a Bulgarian prosecutor orders user content from an e-mail provider in the Netherlands: should the Dutch authorities have a role in this procedure? There seems to be an appetite to set up a common exchange platform that would at least notify the host country’s authorities. The details are still being hammered out.

======

CSAM

======

The European Commission is trying to establish rules on tackling Child Sexual Abuse Material (CSAM) online. The proposed provisions cover our projects and would require us, upon judicial order or order by a relevant authority, to scan our projects for CSAM or grooming activities. That being said, as our projects’ content is open, anyone can already scan it, even without an order. [5a]

-

More worrisome is a provision that would require service providers to scan all private conversations (e.g. chats, direct messages) for CSAM. While this, too, would only happen upon judicial order or order by a relevant authority, it raises serious questions about basic principles. Upon receiving such an order, a provider would have to scan *all* private conversations on its platform for specific or similar content. In practice, such orders would trigger searches through the conversations of millions of people. This would also not be possible on systems with end-to-end encryption, which could lead to them being de facto outlawed. [5b]

========

AI Act

========

The Artificial Intelligence Act [6a] deals with three categories of AI use: prohibited uses, high-risk uses, and uses that require special transparency. That last category covers individuals interacting with an AI-based bot, cases where emotion recognition or biometric categorisation is used, and deep fakes. The discussion around deep fakes is of particular interest to us, as Wikimedia content can be used to create them, but such content could also become a misleading source of information.

The AI Act has been mulled over in a number of parliamentary committees [6b]. It seemed that JURI (the Legal Affairs Committee) was on track to vote on it even before the summer recess, but disagreements prevailed. If they are resolved, the next opportunity will be in September.

========

Data Act

========

The Data Act was supposed to be a central pillar of the EU’s data economy framework, but frankly it reads more like a medley of decent ideas. [7]

Perhaps most important for us (and Wikidata) is that the sui generis database right cannot be invoked when a user wants to share data generated by a device they use. A small step, but in the right direction. We will, of course, try to extend this. At the same time, we are asking the European Commission to include a full-blown revision of the Database Directive in its next work plan. [8]

-

A key element of the Data Act is the set of provisions mandating that all devices that generate data (think IoT) allow users to access this data by default. Users may also require that the data be shared with third parties, e.g. a competing producer or service. We welcome the approach here. Until now, legislators had the reflex to create new exclusive rights whenever they wanted to give someone control over data. Instead, the Commission is trying to boost user control and data sharing by creating access rights that trump property rights.

A somewhat worrisome provision is that during “public emergencies [...] a data holder shall make data available to a public sector body or to a Union institution, agency or body demonstrating an exceptional need to use the data requested”. This stems from the coronavirus pandemic and the thinking that the public sector could have reacted better and in a more targeted manner if it had had certain data at its disposal. However, the term “public emergency” is not well defined and could be invoked in many situations.

One more welcome provision targets “switching between data processing services”, i.e. cloud providers. Providers of cloud services will not be allowed to create commercial, technical, contractual or organisational obstacles when a customer decides to switch to another provider. Furthermore, users will be allowed to terminate contracts with a maximum notice period of 30 days. The vision is that changing cloud and processing providers should be as easy as changing a mobile phone subscription.

======

Disinfo

======

In June the European Commission presented a new Code of Practice on Disinformation. [9] It is essentially a beefed-up version of the 2018 Code of Practice. Back then Wikimedia participated in the drafting process but didn’t sign it, since it mostly focused on advertising and paid content. The same is true for the measures in the new code.

The code is voluntary, but signing and adhering to it will be considered by the Commission as fulfilment of some of the obligations under the Digital Services Act. That, we must admit, is a clever way to give it some legislative weight.

=============

Net Neutrality

=============

After the Court of Justice of the EU confirmed that zero-rating music streaming apps breaches the bloc’s net neutrality rules, the Body of European Regulators for Electronic Communications (BEREC) updated its guidelines. Offering zero tariffs or different speeds, for instance at night or during weekends, will still be possible, as long as they apply to all traffic and not just to specific apps.

Meanwhile, large European telecommunications companies are trying to convince the Commission to propose regulation that would make the largest online platforms pay for part of their infrastructure. We are expecting a public consultation after the break. [10]

=============

EMFA

=============

The European Media Freedom Act is still on the drawing board, and it is unclear what exactly it will contain. It will aim to protect media pluralism and to increase the safety of journalists. It might also expand the transparency and accountability mechanisms already laid down in the Audiovisual Media Services Directive, which is currently in force. [11]

One thing many broadcasters and publishers will push for is a “media exemption” in the EMFA. This was already attempted with the DSA, but failed. Many media houses want protections against their content being deleted from social media. Past suggestions boiled down to prohibiting online platforms from removing content posted by registered media, which is very troublesome and which we oppose. It is unclear whether Wikimedia projects would even fall within the scope of the definitions, so we will wait to see the actual proposal. A compromise between social networks and media houses that has been floated in the past is that registered media get a notice when their content is taken down and a fast track for contesting such decisions.

====

ePrivacy Regulation

====

The ePrivacy Regulation [12] could set a firm standard on how online tools can and cannot be used to contact, profile and surveil individuals. Currently, in several Member States, based on the ePrivacy Directive and subsequent national laws, nonprofits have the right to contact individuals they have been in touch with before, on an opt-out basis. Some Wikimedia chapters use this option for fundraising. In practice it means that a chapter may contact people with whom there was previous interaction, as long as it provides a possibility to refuse further communication. Naturally, our position is that we want to keep this possibility. See our ideas here: [13]

During the French Presidency of the Council (January-June 2022), negotiations were practically stalled, probably mainly because of disagreement over how to reshape cookie notices and behavioural advertising rules. We will see whether the Czech Presidency can unblock the talks.

====

END

====

[1a]https://wikimedia.brussels/dsa-political-deal-done/

[1b]https://medium.com/wikimedia-policy/the-eu-digital-services-act-whats-the-deal-with-the-deal-33ecc4a5432

[2a]https://eur-lex.europa.eu/legal-content/en/TXT/?qid=1608116887159&uri=COM%3A2020%3A842%3AFIN

[2b]https://www.europarl.europa.eu/RegData/docs_autres_institutions/commission_europeenne/com/2020/0842/COM_COM(2020)0842_EN.pdf

[2c]https://wikimedia.brussels/dma-heated-trilogue-negotiations-concluded-with-partial-interoperability-gains/

[3]https://wikimediapolicy.medium.com/new-e-evidence-rules-in-europe-lets-keep-reader-data-well-protected-76766987646f

[4]https://www.statewatch.org/news/2022/july/eu-e-evidence-an-agreement-has-been-reached-on-the-core-elements-of-the-instruments/

[5a]https://docs.google.com/document/d/1dD5AF8-uk2LFG7mu62AK7S80H4CrF1ftV7lheE6ZJBM/edit

[5b]https://edri.org/our-work/european-commissions-online-csam-proposal-fails-to-find-right-solutions-to-tackle-child-sexual-abuse/

[6a]https://ec.europa.eu/info/research-and-innovation/research-area/industrial-research-and-innovation/key-enabling-technologies/artificial-intelligence-ai_en

[6b]https://oeil.secure.europarl.europa.eu/oeil/popups/ficheprocedure.do?reference=2021/0106(COD)&l=en

[7]https://ec.europa.eu/commission/presscorner/detail/en/ip_22_1113

[8]https://wikimedia.brussels/data-act-a-small-step-for-databasees-an-even-smaller-step-for-the-eu/

[9]https://digital-strategy.ec.europa.eu/en/library/2022-strengthened-code-practice-disinformation

[10]https://wikimedia.brussels/update-on-net-neutrality-in-the-eu/

[11]https://www.europarl.europa.eu/legislative-train/theme-a-new-push-for-european-democracy/file-european-media-freedom-act

[12]https://digital-strategy.ec.europa.eu/en/policies/eprivacy-regulation

[13]https://wikimedia.brussels/e-privacy-our-quick-fix-to-help-nonprofits-and-protect-consent/