Document details

Working Group on Infodemics: Policy Framework

Forum on Information & Democracy (2020), 127 pp.

Contains bibliography, pp. 114-116

"The Forum on Information & Democracy proposes a number of policy steps to democratic governments and their supporters. Transparency and accountability need to be shored up and content moderation should be done according to democratic mandates and oversight. The impact of new platforms where disinformation can go viral, such as private messenger services, needs to be understood. Through a global democratic coalition, a meaningful alternative should be offered instead of the two dominant models of technology governance: the privatized and the authoritarian. Through the intergovernmental Partnership on Information & Democracy, democratic leaders recognize the information and communication space as a ‘public good’. Now they have to implement their commitments in policies on the national and international level. Our recommendations are designed to shape and support their policy agenda." (Foreword, page 13)
"PUBLIC REGULATION IS NEEDED TO IMPOSE TRANSPARENCY REQUIREMENTS ON ONLINE SERVICE PROVIDERS
1. Transparency requirements should relate to all platforms' core functions in the public information ecosystem: content moderation, content ranking, content targeting, and social influence building.
2. Regulators in charge of enforcing transparency requirements should have strong democratic oversight and audit processes.
3. Sanctions for non-compliance could include large fines, mandatory publicity in the form of banners, liability of the CEO, and administrative sanctions such as closing access to a country's market.
A NEW MODEL OF META-REGULATION WITH REGARD TO CONTENT MODERATION IS REQUIRED
4. Platforms should follow a set of Human Rights Principles for Content Moderation based on international human rights law: legality, necessity and proportionality, legitimacy, equality and non-discrimination.
5. Platforms should assume the same kinds of obligation in terms of pluralism that broadcasters have in the different jurisdictions where they operate. An example would be the voluntary fairness doctrine.
6. Platforms should expand the number of moderators and spend a minimal percentage of their income on improving the quality of content review, particularly in at-risk countries.
NEW APPROACHES TO THE DESIGN OF PLATFORMS HAVE TO BE INITIATED
7. Safety and quality standards of digital architecture and software engineering should be enforced by a Digital Standards Enforcement Agency. The Forum on Information and Democracy could launch a feasibility study on how such an agency would operate.
8. Conflicts of interest of platforms should be prohibited, in order to prevent the information and communication space from being governed or influenced by commercial, political or any other interests.
9. A co-regulatory framework for the promotion of public interest journalistic content should be defined, based on self-regulatory standards such as the Journalism Trust Initiative; friction to slow down the spread of potentially harmful viral content should be added.
SAFEGUARDS SHOULD BE ESTABLISHED IN CLOSED MESSAGING SERVICES WHEN THEY ENTER INTO A PUBLIC SPACE LOGIC
10. Measures that limit the virality of misleading content should be implemented: limitations of some functionalities, opt-in features to receive group messages, and measures to combat bulk messaging and automated behavior.
11. Online service providers should be required to better inform users regarding the origin of the messages they receive, especially by labelling those which have been forwarded.
12. Mechanisms for users to notify illegal content, and appeal mechanisms for users who were banned from services, should be reinforced." (12 main recommendations, pages 14-15)
Contents:
1. Transparency of Platforms, p. 17
2. Meta-Regulation of Content Moderation, p. 41
3. Platform Design and Reliability of Information, p. 61
4. Mixed Private and Public Spaces on Closed Messaging Services, p. 85
Next Steps, p. 112