2019
DOI: 10.14763/2019.4.1436

Unpacking the “European approach” to tackling challenges of disinformation and political manipulation

Abstract: The European Commission (EC) has recognised the exposure of citizens to online disinformation and the micro-targeting of voters based on the unlawful processing of personal data as one of the major challenges for European democracies. In response, the EC has put in place several measures creating a "European approach". This paper analyses this approach to identify the key principles upon which it is based and the extent to which it takes into account the complexities of the challenges identified. The i…

Cited by 20 publications (10 citation statements)
References 12 publications
“…5–10). Despite these shortcomings, self‐regulation is considered to be an important and necessary measure to increase the accountability of digital platforms and to counter false information because third‐party fact‐checking activities are insufficient to tackle the problem (Nenadic, 2019, p. 13). Black (1996) presents a taxonomy of self‐regulation that includes the following: mandated private regulation: a collective group is required or designated by the government to formulate and enforce norms within a framework defined by the government (Ayres & Braithwaite, 1992); sanctioned private regulation: a collective group itself formulates the regulation that is subjected to government approval (Ogus, 1994); coerced private regulation: a collective group itself formulates and imposes regulation in response to threats of statutory regulation (Marsden, 2011); and voluntary private regulation, where there is no direct or indirect active state involvement (Black, 1996).…”
Section: Governance Models of Social Media Platforms (mentioning)
confidence: 99%
“…Disruption of democratic political processes and distortion of the public sphere, caused by disinformation in some countries and spreading rapidly on social media, served as an alarm that prompted the European Commission to take action in this field. The first actions were taken in 2018 but, given that European media policy calls for caution so as not to undermine freedom of speech, it remains an emerging policy relying on soft instruments (Nenadić, 2019; Đukić, 2019).…”
Section: European Disinformation Policy (mentioning)
confidence: 99%
“…Since these services are available globally, their terms and standards do not necessarily reflect particular national legislative frameworks. For example, from the perspective of protecting human rights and freedom of expression, it would be very worrying if platforms were to accept and apply globally the standard contained in Singapore's 2019 law, which allows ministers to issue orders to platforms to remove content they consider false and threatens individual offenders with prison sentences of up to ten years (Nenadić, 2019). Although the problem of the spread of disinformation is globally recognised and there is some consensus on the need to respond to it with regulatory interventions, leading international organisations and experts working on the protection of freedom of expression warn that interventions which target primarily content, and which entrust platforms with even more power to decide on content removal within very short time frames and without public oversight, pose a risk to freedom of expression (Kaye, 2018, 2019).…”
Section: Degrading or Removing Harmful Content (unclassified)