Many of you will have already heard about the European Commission
publishing its proposal for a Digital Services Act
that will govern how internet hosts operate in Europe, and beyond. We
wanted to share our first impressions of
the proposed law. I am pasting them below.
Hope this finds you safe and well. Have a good break and a good start to
the new year!
Lead Public Policy Manager
1 Montgomery Street, Suite 1600
San Francisco, CA 94104
Early Impressions: How Europe’s Proposed Digital Services Act Can Preserve
Wikimedia, or Let it Get Constantly Trolled
The Wikimedia Foundation’s take on the DSA: Cautious optimism, but also
concerns about empowering bad-faith actors
The European Commission recently released
its proposal for the Digital Services Act (DSA),
a law that will change the legal underpinnings of online life in Europe,
and, by extension, the world. One of the main components of the proposal
creates a framework of obligations for online hosts--a group which includes
the Wikimedia Foundation in its role as the host of Wikipedia.
The current law on the liability of hosts, governed by Article 14 of
the e-Commerce Directive,
says that online hosts aren’t liable for what their users post if they
don’t know about any illegal activity, and if they act upon illegal
activity once they know about it. Article 15, meanwhile, says that a host
can’t be legally required to monitor its services, hunting for any
potentially illegal activity by its users.
There’s a lot to analyze and consider in the DSA proposal, but we wanted to
share a few early impressions. First of all, we’re glad to see that the DSA
preserves these provisions of the e-Commerce Directive, which ensure that
the Foundation can continue hosting the knowledge of countless editors and
contributors. Unique, user-driven platforms like Wikipedia thrive under
strong intermediary liability protections, and we are happy to see this
recognition from the Commission. There are also many new
provisions in the DSA, intended to encourage more effective and
responsive content moderation. While some of these improve transparency,
such as making it easier for people to understand why they see a certain
piece of information, and are intended to promote fundamental rights, there
are also others that, if applied poorly, could actually make some hostile
aspects of the internet worse.
In particular, our initial concerns center on two things. The first is in
Article 12, which says that an online service has to disclose in its terms
of service the rules and the tools of its content moderation. While we
agree that terms of service should be as clear and transparent as possible,
our concern lies with language like that in Article 12.2, which says that
online services must apply and enforce their terms of service “in a
diligent, objective, and proportionate manner.” That’s an ideal goal, but
we worry that “diligent, objective and proportionate” can mean very
different things depending on who you ask, and that community-governed
platforms would be hurt by unclear standards and a lack of discretion.
Community-created policies, or even the Universal Code of Conduct,
frequently include provisions prohibiting clearly harmful but often
hard-to-define and even platform-specific things like harassment,
disruptive behavior, or trolling
<https://en.wikipedia.org/wiki/Internet_troll>. At what point would a
regulator or a litigious user think that a certain volume of trolling meant
that a service wasn’t being “diligent” in enforcing its “don’t troll other
users” rule? Or what happens when someone whose posts are moderated, or who
thinks someone else’s behavior should be moderated, decides that the
moderators aren’t being “objective?” These situations certainly happen
often enough, but usually don't give rise to legal disputes. Under the
proposed rule, we are concerned that the resulting uncertainty about what
“diligent, objective, and proportionate” moderation should be would lead
disgruntled users to bring such costly cases while the world waits for more
definitive and uniform guidance from the European Court of Justice. Keep in
mind this is taking place in the context of an information sphere currently
struggling with countless motivated and well-funded bad-faith arguments,
disinformation, and conspiracy theories. And the number of disgruntled
users is only going to grow.
The other concern we want to raise comes up in Article 14, which says that
an online provider will be presumed to know about illegal content--and thus
be liable for it--once it gets a notice from anyone that such illegal
content exists. There are a number of different ways that ambiguities in this
section can create problems, including potentially contradicting the
prohibition on general monitoring obligations. For example, if the
Foundation got a notice from someone alleging they had been defamed on one
article, what would the Foundation be responsible for, if the alleged
defamation was referenced in or spread across multiple articles, or talk
pages, that the user may not have specified? There must be significantly
more clarity around this provision if it is going to operate as intended
and not pose an undue burden on platforms.
Finally, we want to make sure that the particular structure, mission,
operation, and community self-governance of Wikimedia projects and other
participatory platforms are accounted for in this piece of legislation that
was likely designed with very different kinds of platforms in mind. We
still see some gaps and omissions in the Commission’s proposal and look
forward to collaborating with colleagues and members of the Wikimedia
movement in Europe (with a particular shout-out to the tireless work
of the Free
Knowledge Advocacy Group EU <https://meta.wikimedia.org/wiki/EU_policy>) to
work with lawmakers to ensure that the law can support and foster the kind
of free, open, collaborative, and collegial space that is the best of the
internet.
SPECIAL! SPECIAL! The European Commission has published its long-anticipated
Digital Services Act and Digital Markets Act. A few very rough
first notes on them, and a few other key dossiers to keep you warm over the
holidays.
This and previous reports on Meta-Wiki:
Digital Services Act: It revamps the responsibility framework for internet
services. If you are a telecom or a hosting provider, not much will change.
If you are an online platform you might have to revamp your content
moderation practices on the service providers’ side at least. If you are a
very large service (45 million users within the EU) you will have to be
even more transparent than everyone else, to self-assess systemic risks
regularly (e.g.: How prone is your service to be abused by disinformation
networks during an election campaign?) and to conduct independent, external
audits of your content moderation practices.
We are still unpacking the text, but here are our first key reactions:
We very much like the idea of safeguarding fundamental rights and
freedom of speech even within online services.
Communities/community-driven moderation and platforms are not really
thought of anywhere, which leads to a few risks.
In Article 12-2 we are worried that people claiming rights in bad faith
might try to alter Wikipedia articles by abusing a legal process that is
supposed to force platform operators to defend fundamental rights. We will
need a better safeguard here.
In Articles 14-19 (basically the content moderation systems) we need
stronger safeguards and rights for communities and individual users
("counter-notices" within the Notice & Action system being one basic
example).
You can join the fun by crunching the text yourself here:
Digital Markets Act: This is basically a list of “dos and don’ts” for very
large platforms that have a so-called gatekeeper position on the internal
market. To be part of that club you need to have a turnover of over 6.5
billion euro, so Wikimedia is out. The regulation will impose restrictions
on bundling services, self-preferencing, and blocking others from accessing
online marketplaces. Another goodie: No pre-installed apps that you can’t
uninstall shall be allowed.
Here’s the proposal:
Data Governance Act: While we extensively covered this in past monitoring
reports, we have now “caught” a new potential glitch by the European
Commission in this proposal. They want to establish registered and
voluntary “data sharing services” (Articles 9-13) in order to foster
exchange of industrial data between companies. But the exception defining
who is covered is worded in a way that could cause issues for Wikidata and
Europeana. Article 14 reads:
“This Chapter shall not apply to not-for-profit entities whose activities
consist only in seeking to collect data for objectives of general interest,
made available by natural or legal persons on the basis of data altruism.”
See full text of the regulation:
On December 10th a compromise was adopted on terrorist content regulation.
It must have been a matter of saving face for the German Presidency, keen
not to pass this hot potato into 2021 unresolved, that it agreed to some
proposals that had not been seen as acceptable before. Here is a short
summary of the most important points.
1. Exception for journalists, artistic and educational purposes
The doubtful legitimacy check of what is journalism, artistic expression or
accepted research has been dropped. Article 1(2)(a) will exclude material
disseminated for educational, journalistic, artistic or research purposes
from the scope. Moreover, material disseminated for the purposes of
preventing or countering terrorism shall not be considered terrorist
content, including content which represents an expression of polemic or
controversial views in the course of public debate.
2. Definitions of hosting service providers (HSPs) and terrorist content
Hosting service providers that enable users to share content with the
public (a potentially unlimited number of persons) fall under the
scope. As for terrorist content, it is now tied more closely to the
definition of terrorist offences in the Directive on combating terrorism.
3. One hour rule
That one is still in. The rule has been softened a bit by introducing
"justifiable technical and operational reasons" a platform may invoke if it
cannot comply with the order on time.
4. Upload filters kind of out/in
We end up with a prohibition for the authorities to impose upload filters.
HSPs can still use them voluntarily.
5. Competent authorities
The competent authorities will not have to be judicial or independent. There is,
however, the following safeguard: "Competent authorities shall not seek or
take instructions from any other body in relation to the exercise of the
tasks assigned to them pursuant to [this Regulation]"
6. Cross-border removal orders
The desired outcome on that topic has not been reached.
Currently we are awaiting the final text and its formal adoption by the
Council of the EU. Next up - a vote in the European Parliament, possibly
already in January.
E-Evidence: The regulation on electronic evidence and preservation orders
was also covered extensively in past reports. The European Parliament has
now agreed on a position and is moving to negotiate the final text with
Council. From where we stand, we got one fix - prosecutors won’t be able to
get data about which IP address accessed which Wikipedia article without a
judge’s stamp. But we also have one fail - the hosting Member State’s agency
won’t have a veto right on production orders if fundamental rights are
violated by the issuing prosecutor. Now the Parliament will need to defend its
positive amendments to the end of the process.
The Community Development
<https://meta.wikimedia.org/wiki/Community_Development> team at the
Wikimedia Foundation is launching the WikiLearn (Online Learning Pilot)
capacity-building project. This project will deliver two courses over an
8-week period that will begin on February 1, 2021. Courses are free to take
part in and will be hosted on Moodle <https://moodle.com/>. During this
pilot project, each course will be taught, and assignments are to be
prepared, in English. At the end of the pilot, we will be making course
materials available for anyone interested in self-study.
The two courses we will be launching on February 1, 2021, are:
Identifying and Addressing Harassment
This course will focus on developing skills that will help volunteers
respond to on- and off-wiki harassment.
This course will combine scheduled sessions and independent study.
The weekly commitment of this course is 2-3 hours per week.
Target Audience: Volunteers with administrator or other advanced user
rights.
Partnership Building
This intensive course will provide an in-depth curriculum on how to develop
meaningful programmatic and organizational partnerships within the movement
and with external partners.
This course will consist of scheduled sessions and individual
assignments with a weekly total commitment of up to 6 hours.
Target Audience: Volunteers with intermediate-level experience
contributing to Wikimedia projects and beginner-level experience with
building partnerships with non-Wikimedia groups and organizations.
To ensure our small team can provide each learner individual support, we
are limiting each course in this pilot to 40 participants. We ask that all
interested applicants commit to attending weekly for the full course.
All efforts will be made to ensure that scheduled sessions are time-zone
friendly to the greatest number of enrolled participants. However, with the
wide distribution of volunteers, that may not always be possible.
Applications open Today (December 15, 2020) and will close on January
Successful applicants will be notified on January 18, 2021.
The first session for each course will be held on February 1, 2021.
To Take Part in the Online Learning Pilot:
To participate in the Partnership Building online course, fill out this
application form.
To participate in the Identifying and Addressing Harassment online course,
fill out this application form
For more details on the Online Learning Pilot, please see our Meta page.
Feel free to ask questions on the project Talk
page or contact us at comdev(a)wikimedia.org.
The Community Development Team
Might there be interest in organizing a webinar with Dean Baker? He
advocates citizen-directed subsidies for media, with the recipients
required to place all they produce in the public domain. Part of his
(2016) book "Rigged" summarizes research suggesting that US intellectual
property law, especially changes over the past 50 years, has violated
its purpose under the US Constitution: "To promote the progress of
science and the useful arts." He's a co-founder of the Center for
Economic and Policy Research in Washington, DC. He appeared Oct. 3 on a
Forum I organized on the "Local Journalism Sustainability Act",
currently before the US Congress.
If the Public Policy Group for Wikimedia has not already talked with
him, it might be smart to ask for his input. With luck, you might be
able to arrange a webinar and maybe even arrange for him to testify
before appropriate committee(s) in Brussels. His help could be
instrumental in limiting inappropriate grants of rights to major media
organizations. Alternatively, he might help get any such grants
conditioned on future research, which he could help design and manage.
Spencer Graves, PhD
Journalist, 90.1 FM, KKFI.org, Kansas City Community Radio
4550 Warwick Blvd 508
Kansas City, MO 64111
 pp. 120-127 (127-134 of 263 in the PDF) of Baker (2016) Rigged: How
Globalization and the Rules of the Modern Economy Were Structured to
Make the Rich Richer (Center for Economic and Policy Research),
available for free at "https://deanbaker.net/books/rigged.htm". I just
added a summary of that book to the Wikipedia article on him.
 If you like the idea of a webinar, I could see if we could arrange
co-sponsorship by organizations like FreePress.net and the Center for
Media and Democracy.
I'd like to share with you an event hosted by Europeana (one of our
partners in GLAM) about the DSM (the directive on copyright in the single
market).
Paul Keller (consultant on copyright policies, vice-chair of Kennisland
(Amsterdam), and active for Creative Commons international) will be
speaking this Thursday 16-17 (UTC+1).
Begin forwarded message:
*From: *Ariadna Matas <ariadna.matas(a)EUROPEANA.EU>
*Subject: **Webinar DSM Directive*
*Date: *1 December 2020 at 11:37:58 CET
*Reply to: *Ariadna Matas <ariadna.matas(a)EUROPEANA.EU>
I wanted to remind you of the webinar taking place on Thursday about "The
copyright directive: new approaches to the public domain and to out of
commerce works" with guest speaker Paul Keller
The webinar will take place on Thursday, 3 December 2020 at 16:00 – 17:00
CET and you can sign up and find out more via this link
The webinar is part of a longer webinar series primarily intended for
Europeana data partners. You can read more about it here
and also feel free to check out our webinars page
<https://pro.europeana.eu/page/webinars> for more!
Ariadna Matas | Policy Advisor |
E: ariadna.matas(a)europeana.eu | Skype: ariadna.matas | Twitter: @ariamatas
We continue to work from home until 1 January 2021. You can read more about
how we are doing that here
Be part of Europe's online cultural movement - join the Europeana Network
Association: Sign up for the Association
<https://pro.europeana.eu/network-association/sign-up> | #AllezCulture|
@Europeanaeu <https://twitter.com/Europeanaeu> | Europeana Pro website
We are all waiting for the Digital Services Act and the Digital Markets Act
- this legislative term’s cornerstone digital dossiers expected to
reshuffle the responsibilities of online platforms.
To pass the time, the Commission has published its proposal for a Data
Governance Act - a regulation that wants to open up European data for
business and research without relying on very large platforms collecting it.
Meanwhile, the Terrorist Content Regulation is seemingly stuck in the
trilogue negotiations.
This and previous reports on Meta-Wiki:
Data Governance Act
After getting shaky knees and postponing its publication three times, the
European Commission finally got around to sharing with the rest of the world
a proposal for a Regulation on European data governance, a.k.a. the Data
Governance Act. No radical changes as compared to the leak we covered
in last month’s monitoring report, but bear with us for a basic rundown.
The European Commission wants more European data (public, private and
personal) to be shared for the purposes of innovation, research and
business. It also wants to avoid a system where only a few large platforms
control all the data. It thus wants to create mechanisms and tools to get there:
Public Sector Data: It creates a mechanism for re-using protected (e.g.
because of privacy rules, statistical confidentiality or IP) public sector
data. Public sector bodies are to establish secure environments where data
can be mined within the institution. Anonymised data could be provided
outside of the body if the re-use can’t happen within its infrastructure.
In case the data can’t be anonymised and can’t be processed within the
public body, there needs to be a legal basis under the GDPR for its
transmission outside of the public body (e.g. explicit consent from
all subjects). To help both private entities looking for data and public
sector bodies who need to provide it, governments are to designate one or
more competent bodies.
Commercial Data: The European Commission wants Member States to create a
notification regime (de facto a public registry) for “data sharing
providers”. Such organisations are meant to boost B2B data sharing by
acting as neutral clearinghouses for the data several companies share. They
must be an entity that has no other purpose and is either registered in the
EU or has a legal representative in one of the Member States.
“Data Altruism Organisations”: The Commission wants to establish a
possibility for organisations engaging in data altruism to register as
‘Data Altruism Organisation recognised in the EU’. As a real-life example
you may imagine a project gathering activity tracker data to research
COVID-19 symptoms. The label will come with rules and strings attached:
being a legal entity constituted to meet objectives of general interest,
operating on a not-for-profit basis, and being independent from any
for-profit entity. The
Commission will create a “common European data altruism consent form” by
which data subjects may share their personal data with such organisations
for a general interest goal. Data Altruism Organisations will also have to
be either established in the EU or have a legal representative within the
Union.
Lastly, a formal expert group - the *European Data Innovation Board* - will
be created which shall facilitate the emergence of best practices and
advise the Commission on standardisation and guidelines.
While there has been no outright opposition to the proposal, some
organisations defending privacy (AccessNow, noyb.eu) have raised concerns
that parts of this proposed Regulation overlap with the GDPR, and that care
must be taken not to undermine its rules. The main point of debate
remains about the fact that the Commission wants to enshrine that personal
data can only be transferred out of the EU if adequate protection is
guaranteed. This comes very close to GDPR language (think of the struck
down EU-US agreements allowing data flows across the Atlantic). It is also
an open question as to how much the European legislator can restrict data
sharing without violating non-localisation principles written into trade
agreements.
Special mention: The European Commission wants public sector bodies to not
use the *sui generis database right*. The text reads: "The right of the
maker of a database [...] shall not be exercised by public sector bodies in
order to prevent the re-use of data or to restrict re-use beyond the limits
set by this Regulation." To my knowledge, this might become the first time a
European legislator states that an existing IP protection should not be used.
Terrorist Content Regulation
The German Presidency of the EU is on a roll with TERREG, aiming at
closing this debate before the end of 2020. A general proposal seems to
suggest that DE adopted a strategy of pushing the envelope further to make
EC proposals seem moderate in comparison. Notably, journalistic, artistic,
and research content is exempted, but only if a government or a platform
would recognise it as serving legitimate journalistic, artistic, or research
purposes (!). There are so many problems with introducing such mechanisms,
and we already have many examples of how the "terrorist content" narrative
harms initiatives such as the condemnation of claims made by terrorists, and
leads to the silencing of journalists. We have it covered for you in this
analysis. Specific measures are designed in a way that will coerce
platforms into using content filtering for terrorism, which is even more
difficult to execute than for copyright due to context.
There is still time to talk to your government and your MEPs about this:
ask them to push back and support the solutions that the European
Parliament proposed back in 2019, which it overwhelmingly supported.
Digital Services Act/Digital Markets Act
A huge dossier on rules and regulations covering online platforms that is
expected to span everything from content moderation to competition rules.
We have written about it in 10/10 monitoring reports this year. There is
little new in the books. We are waiting, along with everyone else, for the
Commission to publish its proposals on 9 December. In the meanwhile they
have cranked up their security to avoid leaks.
We promise to be back on 10 December with a DSA/DMA special edition and in
the meantime we’ll spare you another round of “who said what”, as it is
mostly clear where the various stakeholder groups stand.
Copyright Reform - Article 17 Hearing in CJEU
When the Copyright in the Digital Single Market Directive was passed in
Council, the Republic of Poland referred the case to the Court of Justice
of the EU, claiming Article 17 and its de facto provision for “upload
filters” violate EU law. A first hearing took place in Luxembourg and, as
these aren’t streamed publicly, we sent Communia’s Paul Keller to take notes.
Guidelines for Civil Servants - Open Source in the Public Sector
After its “Think Open” communication, whereby the Commission committed
itself to increasing its use of open source technology, there are now
guidelines for civil servants, project managers, and IT officials looking
to engage with open source in the public sector. These might come in
handy when we talk to public sector bodies about practicalities.
Austrian Hate Speech Law
A number of MEPs have asked the Commission to request that Austria postpone
its draft hate speech law, because it could lead to more fragmentation
in the single market. While there is nothing Brussels could do to force
Vienna to delay its plans, we have raised similar concerns with the
European Commission. It would be harmful to have different definitions of
platforms and rules that apply to each of them depending on the Member State.
The so-called NetzDG wants to make platforms more responsible for and more
proactive in fighting offenses such as hate speech, coercion or stalking on
their services. 
French Terrorism Law
A French law that would ban sharing images or video of police officers
“with the aim of harm” passed the lower chamber of the national parliament.
Civil society organisations, including Amnesty International and Reporters
Without Borders, criticised the law for being extremely vague in its
definitions. On a very practical level, the issue is that the person who
initially needs to assess whether content was filmed with malicious intent
and confiscate it is the very police officer being filmed.