Hello, list!
The big event of the month was definitely the political deal on the EU’s
new content moderation rulebook, the Digital Services Act. There are a few
new obligations in there for Wikimedia and we will take you through them.
Next month we are also organising a Wikicheese event in Brussels. Please
spread the word!
====================
DIGITAL SERVICES ACT
====================
Weird procedure: The French Presidency of the Council wanted to get a deal
done so badly, it pushed the Parliament and Member States to accept a
“political deal” putting the main cornerstones in place. However, a
“technical deal” is still being negotiated. Which means that a lot of
details can still change, and details are important. This is a highly
unusual procedure for Brussels. We expect the technical side to take
another month and then a final DSA version to be voted on by Parliament in
July.
—
Citizen moderation: We welcome that during the deliberations lawmakers
began making a distinction between rules created and imposed by the
service provider and rules written and applied by volunteer editing
communities. It is a pity that “citizen moderation”, something the internet
needs more of, wasn’t recognised explicitly. But the definitions and the
articles make clear that the DSA is about the service provider’s activities
and shall not interfere with community content moderation.
—
Positive safeguards: Further positive safeguards for intellectual freedom
online include a ban on targeted advertising using sensitive information
and a ban on “dark patterns”.
—
Crisis mechanism: A provision allowing the European Commission to ask very
large platforms to tackle certain content in times of crisis came as a
last-minute addition and was not properly deliberated in public. Its
safeguards remain vague, but together with civil society partners we
managed to include a few:
*A majority of Member States need to approve the mechanism;
*All requests sent to platforms must be immediately public;
*A three-month sunset clause;
*Fundamental rights and proportionality language;
*The decision on how problematic content is tackled stays with the service
provider.
—
Further reading: You are welcome to check out our analysis of the DSA
result from a Wikimedia perspective
<https://wikimedia.brussels/dsa-political-deal-done/>. You may also check
EDRi’s rundown for a more general digital rights perspective
<https://edri.org/our-work/eu-negotiators-approve-good-dsa-but-more-work-is-…>
.
=====================
WIKICHEESE BRUSSELS
=====================
Cheese: Together with Wikimédia France and the French Digital Ambassador we
are organising a Wikicheese apéro in Brussels. There will be finger food
and drinks, and of course we will be taking images of cheeses for
Wikipedia. You can still register
<https://wikimedia.brussels/wikicheese-registration/> and come! Please also
spread the word!
=========
DENMARK
=========
Danish DSA?: The Danish government has put forward its own legislative
proposal for regulating social media, independent of the DSA. The law
applies to social media platforms with over 80,000 yearly users in Denmark,
broadly defined as platforms whose purpose is creating a profile and
browsing other profiles and user-submitted content. The gist is that these
platforms will be obliged to take down illegal content within 24 hours of
it being reported, with two exceptions: 7 days if a more thorough
investigation of the content is required, and longer in special
circumstances. The law contains an almost verbatim copy of the
encyclopedia carveout of the Copyright in the Digital Single Market
Directive. WMDK’s Matthias Smed Larsen
<https://twitter.com/MatthiasSmed> is working on this and looking
closely at four main issues:
*The scope of the law and the carveout;
*The definition of illegal content;
*The general issue of a 24-hour limit creating an incentive to remove
borderline content which is actually legal;
*How community content moderation fits into this.
====
END
====
The French Presidential elections have certainly changed dynamics in
Brussels. The French Presidency of the Council is willing to accept
compromises in order to wrap up reforms and show progress. Critics say this
leads to technically half-baked solutions. Here is our wrap-up of the month
of March at warp speed!
====================
DIGITAL SERVICES ACT
====================
Now that its sister project, the Digital Markets Act, has been agreed upon
(read below), all eyes are seemingly on this “content moderation law”. The
Council, Commission and Parliament are now meeting at both technical and
political levels every second week.
—
Actual Knowledge: All three institutions agree to improve the initial
language on when a notice sent to a service provider creates actual
knowledge of illegal content. The text now makes clear that only some
notices constitute such knowledge, and that service providers have the
freedom to make that call themselves. A win we advocated for.
—
Whose rules? The DSA defines “terms of service” as pretty much any
rules valid on a platform. We are asking the institutions to make it
explicitly clear that there is a difference between rules imposed by the
service provider and rules created and enforced by communities of platform
users. The Parliament has incorporated this change, but the Council is
still somewhat sceptical, mainly because they believe the original wording
already covers this implicitly. Negotiations continue on this point.
—
Who will regulate? In a late amendment, the Council took the position that
the European Commission should be responsible for regulating Very Large
Online Platforms (VLOPs), while national regulators will be responsible for
the rest. For Wikimedia, only Wikipedia would be a VLOP, meaning that
Wikipedia would fall under the European Commission’s competence, while
Wikimedia Commons and Wikidata would fall under the national regulator of
the country where the Wikimedia Foundation decides to appoint a legal
representative. To make things even messier, some of the obligations that
VLOPs must also comply with (e.g. trusted flaggers & out-of-court dispute
settlements) will be a shared competence between the national regulator and
the Commission. We’d prefer a clearer separation, but at least for these
the Commission’s decisions would overrule the national ones.
—
Who will pay? If a user takes a platform to an out-of-court dispute
settlement body (over a content moderation decision), then the platform and
the user will pay a fee, but if the user wins, the platform will have to
cover both fees. On the other hand, if the user loses, the platform won’t
be able to push the legal cost on them. We still don’t know what the fees
will amount to.
—
Who will pay? #2 Somewhat surprisingly the Commission is now taking the
position that if it is to be the regulator for VLOPs it will need to charge
a fee to them in order to cover the additional costs. Apparently this is a
principle that already exists in the financial sector. Lawmakers involved
in the negotiations haven’t seen concrete wording yet, but from three
independent sources we have confirmation that, if this gets accepted, the
DSA won’t mention any actual amounts, but rather give the Commission the
power to set up a fee structure in a delegated act. We have reached out to
the Commission, who have assured us that they are well aware of the
different nature of platforms (including their purpose and tax status) and
will make this a factor in calculating the amounts. Still a lot of fog
shrouding this point.
—
“A War Clause”: We kid you not, this is the name an 11th-hour suggestion
goes by in the corridors and chat windows. It is a proposal that would let
the Commission, in extraordinary circumstances, ask platform providers to
moderate certain dangerous content very quickly. A similar provision
already exists in other regulations (e.g. anti-terrorism), so this is not
unheard of. But, together with many EDRi members, we worry a lot about it.
At the very least we are asking for a much clearer definition of what
constitutes an extraordinary circumstance, who establishes it and for how
long. We are also demanding safeguards against censorship and overreach.
That being said, rules allowing authorities and service providers to act
very quickly in case of a threat to life and limb already exist and work
well, so there is a way to handle this.
—
Targeted Ads: It looks like all EU institutions can agree that sensitive
data (e.g. religious and political preferences) and data of minors should
be prohibited for targeted advertising. Not the really big coup the
Parliament was envisioning, but a major step.
—
Nota Bene: We normally share plenty of links and sources in this monitoring
report. For confidentiality reasons and to protect sources we are unable to
do so in this case. If you would like additional insight, please get in touch
off-list.
====================
DIGITAL MARKETS ACT
====================
Stick a fork in it, it’s done! The EU’s new competition rules for online
platforms come into force next year. There are several major wins for civil
society and competition: pre-installed apps that can’t be deleted will
become illegal, and we will be able to send messages from one instant
messaging application to another. However, the interoperability win has
some defects. A humongous lobbying push by dominant platform providers
convinced lawmakers that features like group calls are extremely hard to
implement across different services. The result is that such features will
become available at a much later stage, if at all.
Anna Mazga has the deep dive for you:
https://wikimedia.brussels/dma-heated-trilogue-negotiations-concluded-with-…