User Content Moderation under the Digital Services Act – 10 key takeaways
On the one hand, the DSA defines the procedure for, and the consequences of, moderating or failing to moderate content; on the other hand, it lays down the rights of users whose content is moderated. Compliance with these requirements has direct implications in two respects: first, for the possible civil or criminal liability of intermediary service providers for distributing third-party content of an illegal or harmful nature; and second, for administrative liability under the Digital Services Act (e.g. a financial penalty imposed by the Digital Services Coordinator under Article 52(3) of the DSA).
Below are the ten most important points on content moderation under the Digital Services Act.
Author: Xawery Konarski, Attorney-at-law, Senior Partner, Co-Managing Partner
- The concept of content moderation under the DSA.
- Types of user content under the Digital Services Act:
  - illegal content (e.g. Article 2(g)),
  - content that is incompatible with the terms and conditions of services (e.g. Article 3(t)),
  - harmful content (e.g. recital 82).
- Examples of illegal content:
  - content inciting terrorism,
  - content depicting the sexual exploitation of children,
  - content inciting racism and xenophobia, and
  - content infringing intellectual property rights.