Hi Spencer,

thanks for this, quick reactions inline in your email text. The text is still not final in the sense that the trilogues may change it, so don't take it as a definitive answer.

On 08.04.19 18:01, Spencer Graves wrote:
Hi, Anna: 


      Thanks for this. 


      How does this affect the financial incentives for commercial social media companies to sell xenophobic ads with no public traceability? 
It doesn't. See this thread for more background on the objectives of the regulation: https://twitter.com/a2na/status/1115231329849884672


      How does this legislation compare with the USA Patriot Act, which has been used to imprison people who tried to teach nonviolence to terrorists or people raising money for hospitals in Palestine? 
I don't know the Patriot Act in detail, so it would need a more thorough comparison, but potentially such a thing could happen on the grounds of the anti-terrorism directive, which is a separate act on combating terrorism in general (not only content, as this one is), depending on whether this would be considered contributing to terrorist offences or not.


      Could this legislation be used to make it difficult for people to question the wisdom of supporting the current government of Saudi Arabia, e.g., with transfers of arms and nuclear technology? 
Probably not; I think this would be a big stretch of the law even in its worst form, as proposed by the EC. But looking at the imaginative policies of Orban, for example, I tend not to underestimate dictators' drive to have things their way, so it depends on what they consider dissent, a public threat, etc. The trouble is that once removal orders are issued, the platforms must remove the content and have no option to contest the orders on the grounds of their being absurd.


      I ask for several reasons:  (a) Terrorism is minuscule as a cause of death.  More Americans drown in the average year in bathtubs, hot tubs and spas than succumb to terrorism;  similar silliness doubtless also applies to Europe.  (b) A 2008 RAND study showed that only 7% of terrorist groups that ended between 1968 and 2006 were defeated militarily.  83% ended with negotiations or law enforcement.  (c) The primary recruiter and supporter of Islamic terrorism since at least 1999, and continuing today, appears to be Saudi Arabia, followed closely by the US.  For detailed references, see the Wikiversity article on "Winning the War on Terror".[1] 
I agree, I wish they tackled global warming with such fierce attitude.


      Would this legislation force the Wikimedia Foundation to take down what I've already written in that article?[1]  I assume the answer to that question is, "no".  However, it seems like it could potentially make it more difficult for people to understand "Why do they hate us?" (as US President Bush asked on 2001-09-20). 

The legislation does not apply many direct measures to platforms, other than a duty of care (which of course is already something). If platforms know about criminal terrorist content, they need to inform the authorities (LIBE version); they also need to adjust their terms and conditions, and they may use automation for flagging but may not actively seek out terrorist content. Any measures forced on them cannot amount to general monitoring. But if a formally correct removal order arrives for this article, then we will have to comply.

I wouldn't be too worried about us in terms of normal activity and relationships with EU states. If a malicious government wanted to use this against us, however, things could get worse.



      Thanks,
      Spencer Graves, PhD
      Founder
      EffectiveDefense.org
      4550 Warwick Blvd 508
      Kansas City, MO 64111 USA


[1] https://en.wikiversity.org/wiki/Winning_the_War_on_Terror


On 2019-04-08 10:28, Anna Mazgal wrote:
Dear All,

The Civil Liberties Committee has just adopted MEP Dalton's compromises as the report on the Regulation on preventing the dissemination of terrorist content online. Some amendments were added that require further analysis, but the key changes are positive:

1. Improvements on definition of terrorist content with exclusions to artistic, journalistic, educational and research purposes
2. Specific measures referring to platforms receiving a substantial number of removal orders; the regulation applies to those that make content available to the public (so a positive limitation)
3. Referrals removed from the proposal with a caveat that Europol referrals should be taken as a priority by platforms when content is flagged (in recital)
4. Specific measures steer away from content filtering and forbid general monitoring.

Unfortunately, the one-hour deadline for removing content has been retained. Interestingly, Rapporteur Dalton called out the European Commission for not understanding the democratic parliamentary process, applying pressure for quick adoption of the file, and dismissing debate on controversial issues as unnecessary and contrary to the goals of the regulation. That is quite a serious accusation, as the EC is not supposed to pressure MEPs that way.

After the text is published, we will have a more in-depth analysis of the other changes and of how the whole proposal could affect Wikimedia. The vote in the plenary is planned for next week; after that, the trilogues.

Happy to take on any questions you may have regarding the report.

best wishes,
Anna

--
Anna Mazgal
EU Policy Advisor
Wikimedia
anna@wikimedia.be 
@a2na
mobile: +32 487 222 945
51 Rue du Trône
BE-1050 Brussels

_______________________________________________
Publicpolicy mailing list
Publicpolicy@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/publicpolicy

