[apologies for cross-posting]
Dear colleagues,
I am reaching out because we are comparing notes with key experts and
stakeholders on the EU's new Guidance for the Code of Practice on
Disinformation, and we would love to hear from you. We know this is only
the Commission's proposal and it could be substantially watered down in the
negotiations with platforms, but having read the text in detail we feel
that, within the constraints of the Code, the Commission is going all-in
with one of the most advanced frameworks to fight disinformation.
What do we like about it?
- Considered in the framework of the DSA, the proposal has teeth: it
explicitly states that one of the goals of the Guidance is to evolve the
Code of Practice into a Code of Conduct as provisioned by the DSA. That
would turn the commitments assumed under the Code of Practice into
required risk mitigation measures at a minimum (art. 27), and potentially
make them directly enforceable under the DSA. Very large platforms cannot
cherry-pick what suits them, nor opt out of any of the commitments.
- It encourages the platforms to open up the algorithmic black box through
accountability and transparency, with a commitment to reduce the spread of
disinformation through "safe design" and by excluding false or misleading
content and repeated "disinformers". It sets specific KPIs requiring
disclosure of how many sources have been demoted and what their reach was
before and after demotion. It also requires platforms to be transparent
with users about why they are served specific content and to give them
algorithmic choice, that is, the possibility to customise their recommended
content, including by prioritising trustworthy sources. This is combined
with the requirement to set up a transparency centre where, for each
commitment, platforms must specify what policies they have implemented, in
which languages and countries those policies are available, and what the
related KPIs are.
- Its moderation proposals are based on the independent input of
fact-checkers, taking decisions on what to remove or label out of the
closed policy rooms of the platforms and finally allowing open scrutiny of
who has been silenced or corrected, an essential bulwark for freedom of
expression.
- It treats users with respect: it includes a commitment to inform all
users who have been exposed to disinformation, including by alerting them
through retroactive notifications.
- Above all, it is a detailed proposal, covering disinformation,
misinformation and influence operations, on how the signatories should
write a revised Code of Practice, including a rich set of "minimum"
service-level KPIs. Among those is a requirement to share the amount and
reach of fact-checked disinformation identified on the platform, which,
together with structural KPIs, could become a key metric to measure impact.
What are we more worried about?
- The absence of participation by the taskforce or extended signatories in
the drafting of the Code itself. We welcome the potential to set up a
permanent multi-stakeholder forum to discuss monitoring, implementation and
future adjustments, but we think these stakeholders should also have a role
in drafting the Code. Especially considering that the commitments will have
a big impact on users, consumers and fundamental rights, it is essential
that all voices are included in the drafting. We are encouraging all our
civil society partners to consider joining as signatories to ensure that
the Code is true to the Guidance.
- The interim monitoring, before the DSA is in place, is not well defined
and does not include soft penalties (e.g. increased reporting requirements
for signatories that are not compliant).
- The heavy reliance on EDMO, for both data sharing and the development of
structural KPIs. Considering that EDMO is a new body with no regulatory
experience, we are concerned about whether it will have enough authority to
deal with platforms. The structural KPIs are not yet outlined: their
quality will be crucial to measuring impact and holding platforms
accountable, and they should be defined through a process that allows for
external scrutiny.
- The lack of fact-checking sustainability commitments and metrics. The
prominence given to the fact-checking community is a fantastic tribute to
their journalism and places them at the heart of the system, but it is not
accompanied by any indication of how they will be sustained financially.
The sustainability of this independent sector should be addressed.
-
The heavy reliance on the DSA for sanctions when platforms do not put
risk mitigation measures in place means that our advocacy work is even more
important to ensure the DSA keeps and improves articles 26, 27 and 35 to
ensure the co-regulatory backstop and cover all disinformation harmful to
our societies.
This summarises our position, and now we would love to hear yours, either
in writing or on a quick call this week or next! We are organising a
webinar to discuss and highlight these issues on 24th June, with a keynote
speech from Commission Vice-President Vera Jourová. If you have any recent
research on subjects directly affected by the Guidance that you feel could
be highlighted as part of the presentations, please let us know as soon as
possible.
Thank you in advance,
*Nádia Guerlenda Cabral* *| Avaaz*
Based in Brussels (Belgium)
e: nadia(a)avaaz.org
m: +32 486 463879
s: nadia.gc