Hi everyone
Many of you will have already heard about the European Commission
publishing its proposal for a Digital Services Act
<https://ec.europa.eu/info/strategy/priorities-2019-2024/europe-fit-digital-age/digital-services-act-ensuring-safe-and-accountable-online-environment_en>
that will govern how internet hosts operate in Europe, and beyond. We
wanted to share our first impressions
<https://medium.com/@wikimediapolicy/how-europes-proposed-digital-services-act-can-preserve-wikimedia-or-let-it-get-constantly-trolled-11309e71dbe4>
of the proposed law. I am pasting them below.
Hope this finds you safe and well. Have a good break and a good start to
the new year!
Best,
Jan
==
Jan Gerlach
Lead Public Policy Manager
Wikimedia Foundation
1 Montgomery Street, Suite 1600
San Francisco, CA 94104
jgerlach@wikimedia.org
@pd_w <https://twitter.com/pd_w>
@wikimediapolicy <https://twitter.com/wikimediapolicy>
Early Impressions: How Europe’s Proposed Digital Services Act Can Preserve
Wikimedia, or Let it Get Constantly Trolled
The Wikimedia Foundation’s take on the DSA: Cautious optimism, but also
concerns about empowering bad-faith actors
The European Commission recently released
<https://ec.europa.eu/info/strategy/priorities-2019-2024/europe-fit-digital-age/digital-services-act-ensuring-safe-and-accountable-online-environment_en>
its proposal for the Digital Services Act (DSA)
<https://eur-lex.europa.eu/legal-content/en/TXT/?qid=1608117147218&uri=COM%3A2020%3A825%3AFIN>,
a law that will change the legal underpinnings of online life in Europe,
and, by extension, the world. One of the main components of the proposal
creates a framework of obligations for online hosts--a group which includes
the Wikimedia Foundation in its role as the host of Wikipedia.
The current law on the liability of hosts, governed by Article 14 of
the e-Commerce
Directive
<https://eur-lex.europa.eu/legal-content/EN/TXT/HTML/?uri=CELEX:32000L0031>,
says that online hosts aren’t liable for what their users post as long as
they don’t know about any illegal activity, and they act expeditiously to
remove or disable access to illegal content once they become aware of it.
Article 15, meanwhile, says that a host can’t be legally required to
monitor its services, hunting for potentially illegal activity by its
users.
There’s a lot to analyze and consider in the DSA proposal, but we wanted to
share a few early impressions. First of all, we’re glad to see that the DSA
preserves these provisions of the e-Commerce Directive, which ensure that
the Foundation can continue hosting the knowledge of countless editors and
contributors. Unique, user-driven platforms like Wikipedia thrive under
strong intermediary liability protections, and we are happy to see this
recognition from the Commission. The DSA also contains many new provisions
intended to encourage more effective and responsive content moderation.
Some of these improve transparency and promote fundamental rights, such as
making it easier for people to understand why they see a certain piece of
information. Others, however, if applied poorly, could actually make some
hostile aspects of the internet worse.
In particular, our initial concerns center on two things. The first is in
Article 12, which says that an online service has to disclose in its terms
of service the rules and tools it uses for content moderation. While we
agree that terms of service should be as clear and transparent as possible,
our concern lies with language like that in Article 12.2, which says that
online services must apply and enforce their terms of service “in a
diligent, objective, and proportionate manner.” That’s an ideal goal, but
we worry that “diligent, objective, and proportionate” can mean very
different things depending on who you ask, and that community-governed
platforms would be hurt by unclear standards and a lack of discretion.
Terms of use (like the Foundation’s Terms
<https://foundation.wikimedia.org/wiki/Terms_of_Use/en#4._Refraining_from_Certain_Activities>,
or even the Universal Code of Conduct
<https://meta.wikimedia.org/wiki/Universal_Code_of_Conduct/Draft_review>)
frequently include provisions prohibiting clearly harmful but often
hard-to-define and even platform-specific things like harassment,
disruptive behavior, or trolling
<https://en.wikipedia.org/wiki/Internet_troll>. At what point would a
regulator or a litigious user think that a certain volume of trolling meant
that a service wasn’t being “diligent” in enforcing its “don’t troll other
users” rule? Or what happens when someone whose posts are moderated, or who
thinks someone else’s behavior should be moderated, decides that the
moderators aren’t being “objective”? These situations certainly happen
often enough, but usually don’t give rise to legal disputes. Under the
proposed rule, we are concerned that uncertainty about what counts as
“diligent, objective, and proportionate” moderation would lead disgruntled
users to bring such costly cases while the world waits for more definitive
and uniform guidance from the European Court of Justice. Keep in mind that
this is taking place in the context of an information sphere currently
struggling with countless motivated and well-funded bad-faith arguments,
disinformation, and conspiracy theories. And the number of disgruntled
users is only going to grow.
The other concern we want to raise comes up in Article 14, which says that
an online provider will be presumed to know about illegal content--and thus
be liable for it--once it receives a notice from anyone that such illegal
content exists. There are a number of ways that ambiguities in this
section can create problems, including potentially contradicting the
prohibition on general monitoring obligations. For example, if the
Foundation received a notice from someone alleging they had been defamed in
one article, what would the Foundation be responsible for if the alleged
defamation was referenced in, or spread across, multiple articles or talk
pages that the user did not specify? This provision needs significantly
more clarity if it is to operate as intended and not impose an undue burden
on platforms.
Finally, we want to make sure that the particular structure, mission,
operation, and community self-governance of Wikimedia projects and other
participatory platforms are accounted for in this piece of legislation that
was likely designed with very different kinds of platforms in mind. We
still see some gaps and omissions in the Commission’s proposal. Together
with colleagues and members of the Wikimedia movement in Europe (with a
particular shout-out to the tireless work of the Free Knowledge Advocacy
Group EU <https://meta.wikimedia.org/wiki/EU_policy>), we look forward to
working with lawmakers to ensure that the law can support and foster the
kind of free, open, collaborative, and collegial space that is the best of
the Wikimedia movement.