Hello everyone! 

Many things are happening in Brussels and across Europe, but first things first: we have a new colleague! Michele Failla has been working with the Brussels team since last week. His last gig was at the Conseil Supérieur de l’Audiovisuel (CSA), the French-language media regulator in Belgium. During his time there he worked on EU-level media regulation and participated in the European Regulators Group for Audiovisual Media Services. Before that he worked for an MEP on a diverse set of files in the Legal Affairs committee. Our paths had actually crossed in the past during the EU copyright reform process. Please welcome him warmly; he is on this list, but can also be reached at michele.failla@wikimedia-europe.eu.

=== Digital Services Act ===

WE ARE VLOP. Officially. [1] For those less acquainted with EU terminology: Wikipedia has been designated as a Very Large Online Platform by the European Commission, which means that the WMF will have to comply with the strictest obligations under the Digital Services Act, including regular assessments of systemic risks (covering things like public health, kids’ safety and freedom of expression), publishing mitigation measures based on them, and then undergoing an external audit. Wikipedia is the only not-for-profit service that has been designated as a VLOP; the other 18 are for-profit.

This is a chance for Wikimedia to demonstrate that compliance with such rules can be done in a manner that respects user rights and keeps communities - not the platform operator - in the driving seat. The Wikimedia Foundation (WMF) is working on compliance and dedicating significant resources to it. The challenges are serious too. If regulators cannot be convinced that Wikipedia is properly addressing “systemic risks”, like election manipulation, then the WMF and the community will be pushed to find additional responses. There’s also substantial “bureaucracy”: VLOP designation means that the WMF needs to appoint a legal representative in the EU. It needs to adjust internal processes so they are in line with the new “notice & action” framework, set up a process for conducting risk assessments and mitigation reports (annually, and before making significant changes to the service), and find an appropriate auditor who will grill it on all of these things.

VLOPs are also expected to contribute to the European Commission’s moderation decision database (though whether and how such a database can comply with EU privacy laws remains to be seen). Plus, there remains the not inconsequential task of ensuring that all of the other Wikimedia projects - such as Commons - comply with the DSA’s more general rules. That’s a lot of work - and so is convincing regulators not to forget our model when they write the guidance and implementing rules - and it is handled by very lean teams (as openly and collaboratively as they can manage - witness the ToU update). Bon courage!

=== CSAM ===

The Regulation to Prevent and Combat Child Sexual Abuse Material (CSAM) online is a proposal that targets providers of hosting services, providers of interpersonal communications services, and internet service providers - hence also Wikimedia projects. [2] Each EU Member State will designate a national authority that may ask a court or independent authority to issue orders to providers. These orders may be for detecting CSAM (based on a hash database provided by a new EU Centre in The Hague), for removing identified CSAM, or for reporting if the provider has become aware of CSAM. ISPs may also be asked to block CSAM based on URLs.

Encrypted communication services are causing the most controversy in this debate: it would be impossible to maintain end-to-end encryption and also comply with detection orders, since under the proposal messaging apps would need to scan through all conversations. In this regard the lead MEP, Javier Zarzalejos (EPP, ES), has published his draft report with several amendment proposals. On encrypted messaging services he suggests that apps like WhatsApp and Signal could only face orders to “use and process metadata”, but not the actual messages. [3] Meanwhile the Council negotiations seem to be skirting this central issue. [4]

While the Wikimedia Foundation is not a provider of interpersonal communications services, Wikimedia still has a position [5] and is following the file. We are worried about some onerous compliance timeframes. We want to make sure that age verification doesn’t require platforms to collect additional user data. And we are strongly concerned by the proposed scanning and algorithmic evaluation of interpersonal communications.

=== Liability on Free Software ===

The Cyber Resilience Act (CRA) sets out cybersecurity requirements for a range of hardware and software products placed on the EU market, including operating systems. Part of its obligations are aimed at ensuring maintenance and security updates; the instrument of choice is to impose liability on developers and deployers of software. [6]

Our main worry is what sorts of obligations would hit developers of free software. The CRA has a recital that seemingly excludes free software developed “outside the course of a commercial activity”, but the language chosen is too narrow and would fail to cover a large part of the free software that is actually deployed.

Together with the Free Software Foundation Europe and EDRi, we are asking lawmakers to improve the carve-out by replacing the concept of “commercial activity” with an approach that focuses on deployment and significant financial benefit. To achieve this, the regulation should target those deploying software on the market instead of covering developers. An exclusion for non-profit entities and microenterprises that publish or deploy free and open source software should also be introduced as an article (instead of a recital). [7]

=== Net Neutrality ===

There is an internal struggle within the European Commission over whether to propose a principle under which the largest generators of online traffic would have to pay an additional fee to telecom operators. ETNO, the telecommunications lobby group, is the central stakeholder asking for this, and French Commissioner Thierry Breton is pushing it. Wikimedia has a public position. [8]

The European Commission is running an “exploratory” public consultation on the topic, with a potential legislative proposal expected by year’s end. Opposition is also strong, ranging from parts of the Commission to French rural municipalities to conservative lawmakers (who argue that if Netflix pays more money to Orange, it will spend less on producing European films and series).

We plan on answering only some of the questions in this very biased consultation. If you want to have a look at our draft and edit it a little, have fun here:


=== Data Act ===

The trilogue negotiations on the Data Act have picked up quickly, with the ambition to get it wrapped up in weeks rather than months. 

The Data Act is a proposed regulation that wants to boost data sharing between businesses, and between businesses and governments. It also wants to ensure that users and businesses can seamlessly switch between cloud services. As such it touches a couple of thorny issues, such as data sharing rules and the sui generis database right (SGDR). [9]

Wikimedia shared its position with negotiators ahead of the trilogues. We support the Parliament’s position on Article 35, which would de facto abolish the SGDR on machine-generated data for the purpose of re-use by users. The Council supports this, but with stricter conditions. Another aspect we addressed is Chapter V, which gives governments the right to request data from businesses in emergency situations but is too vaguely framed.


Wikimedia Europe ivzw