Document details

A Treatment for Viral Deception? Automated Moderation of COVID-19 Disinformation

Universität Innsbruck (2022), 110 pp.

Contains bibliography, pp. 85-110

Series: Future Law Working Papers, 2022-1

CC BY-SA

"This paper examines responses to disinformation, in particular those involving automated tools, from a human rights perspective. It provides an introduction to current automated content moderation and curation practices, and to the interrelation between the digital information ecosystem and the phenomenon of disinformation. The paper concludes that an unwarranted use of automation to govern speech, in particular highly context-dependent disinformation, is neither in line with states’ positive obligation to protect nor with intermediaries’ responsibility to respect human rights. The paper also identifies required procedural and remedial human rights safeguards for content governance, such as transparency, user agency, accountability, and independent oversight. Though essential, such safeguards alone appear insufficient to tackle COVID-19 online disinformation, as highly personalized content and targeted advertising make individuals susceptible to manipulation and deception. Consequently, this paper demonstrates an underlying need to redefine advertising- and surveillance-based business models and to unbundle services provided by a few dominant internet intermediaries to sustainably address online disinformation." (Abstract)
Contents:
1 Introduction, 1
2 Digital information ecosystem, 3
3 Concept of disinformation, 20
4 International Human Rights Framework, 31
5 Responses to COVID-19 online disinformation, 40
6 Conclusion, 81