The French Presidential elections have certainly changed the dynamics in
Brussels. The French Presidency of the Council is willing to accept
compromises in order to wrap up reforms and show progress. Critics say this
leads to technically half-baked solutions. Here is our wrap-up of the month
of March at warp speed!
DIGITAL SERVICES ACT
Now that its sister project, the Digital Markets Act, has been agreed upon
(read below), all eyes are seemingly on this “content moderation law”.
The Council, Commission and Parliament are meeting at both technical and
political levels every second week now.
Actual Knowledge: All three institutions agree on improving the initial
language about when a notice sent to a service provider leads to actual
knowledge of illegal content. The text now makes clear that only some
notices constitute such proof and that service providers retain the freedom
to make that call. A win we advocated for.
Whose rules? The DSA defines the term “terms of service” as pretty much any
rules valid on a platform. We are asking the institutions to make it
explicitly clear that there is a difference between rules imposed by the
service provider and rules created and enforced by communities of platform
users. The Parliament has incorporated this change, but the Council remains
somewhat sceptical, mainly because it believes the original wording already
covers this implicitly. Negotiations continue on this point.
Who will regulate? In a late amendment, the Council took the position that
the European Commission should be responsible for regulating Very Large
Online Platforms (VLOPs), while national regulators will be responsible for
the rest. For Wikimedia, only Wikipedia would qualify as a VLOP. This means
that Wikipedia would fall under the European Commission’s competence, while
Wikimedia Commons and Wikidata would fall under the national regulator of
the country where the Wikimedia Foundation decides to appoint a legal
representative. To make things even messier, some of the obligations that
also apply to VLOPs (e.g. trusted flaggers & out-of-court dispute
settlements) will be a shared competence between the national regulator and
the Commission. We’d prefer a clearer separation, but at least on these
points Commission decisions would overrule national ones.
Who will pay? If a user takes a platform to an out-of-court dispute
settlement body (over a content moderation decision), both the platform and
the user will pay a fee. If the user wins, the platform will have to cover
both fees; if the user loses, the platform still won’t be able to pass its
legal costs on to the user. We still don’t know what the fees will amount
to.
Who will pay? #2 Somewhat surprisingly, the Commission is now taking the
position that if it is to be the regulator for VLOPs, it will need to
charge them a fee in order to cover the additional costs. Apparently this
principle already exists in the financial sector. Lawmakers involved in the
negotiations haven’t seen concrete wording yet, but three independent
sources have confirmed to us that if this gets accepted, the DSA won’t
mention any actual amounts, but will instead give the Commission the power
to set up a fee structure in a delegated act. We have reached out to the
Commission, who have assured us that they are well aware of the different
nature of platforms (including their purpose and tax status) and will
factor this into calculating the amounts. Still, a lot of fog shrouds this
one.
“A War Clause”: We kid you not, this is the name an 11th-hour suggestion
goes by in the corridors and chat windows. It is a proposal that would let
the Commission, in extraordinary circumstances, ask platform providers to
moderate certain dangerous content very quickly. A similar provision
already exists in other regulations (e.g. on anti-terrorist content), so
this is not unheard of. But, together with many EDRi members, we worry a
lot about it. At the very least we are asking for a much clearer definition
of what constitutes an extraordinary circumstance, who establishes it, and
for how long. We have also demanded safeguards against censorship and
overreach. That being said, rules allowing authorities and service
providers to act very quickly in case of a threat to life and limb already
exist and work well, so there is a way to handle this.
Targeted Ads: It looks like all EU institutions can agree that sensitive
data (e.g. religious and political preferences) and data of minors may not
be used for targeted advertising. Not quite the big coup the Parliament
envisioned, but a major step.
Nota Bene: We normally share plenty of links and sources in this monitoring
report. For confidentiality reasons and to protect sources, we are unable
to do so in this case. If you would like additional insight, please get in
touch.
DIGITAL MARKETS ACT
Stick a fork in it, it's done! The EU law on competition rules for online
platforms comes into force next year. It contains several major wins for
civil society and for competition. Pre-installed apps that can’t be deleted
will become illegal. We will be able to send messages from one instant
messaging application to another. However, the interoperability win has
some defects: a humongous lobbying push by dominant platform providers has
convinced lawmakers that features like group calls are extremely hard to
implement across different services. The result is that such features will
become available at a much later stage, if at all.
Anna Mazga has the deep dive for you: