2022
DOI: 10.2139/ssrn.4210278
Copyright Content Moderation in the EU: An Interdisciplinary Mapping Analysis

Cited by 9 publications (2 citation statements)
References 19 publications
“…Platforms exercise their governance of cultural production, particularly through automated content moderation (Gorwa et al, 2020). With a view to copyright, we understand with Quintais et al (2022) copyright content moderation as the activities of platforms aimed at detecting, identifying and addressing content or information that is illegal under EU copyright law and is incompatible with their Terms of Service. These activities can include a broad repertoire of measures: Gorwa et al (2020, p. 4) distinguish with regard to consequences between “hard” measures such as blocking and removal, and “soft” measures, such as downranking and flagging content, and with regard to technological measures between “matching” approaches (seeking additional copies of known content) and “prediction” approaches (seeking to extrapolate features from known to unknown content).…”
Section: Copyright Content Moderation by Platforms (citation type: mentioning)
Confidence: 99%
“…Specifically, the scope of monitoring should be refined to the extent that the standards for determining illegality are distinct and practical to meet current available technology. 179 That is to say, the permissible monitoring must not require platforms to assess the legality of content, and should target online content that has been previously identified as illegal by national authorities, or is manifestly illegal for a reasonable person. 180 Considering the distinctive dual-track approach concerning monitoring obligations, private sphere should be excluded from the scope of public law monitoring, while public law monitoring obligations are applicable merely to public law issues, namely the illegal content listed in 'Eleven Boundaries.…”
Section: Legal Predictability of Monitoring Obligations (citation type: mentioning)
Confidence: 99%