Document details

Policy Brief: Ensuring Ethical AI Practices to Counter Disinformation

MediaFutures; Luiss Data Lab (2023), 15 pp.
"In particular we recommend to:
- strengthen collaboration (platforms should adopt a collaborative approach involving various stakeholders, including governments, civil society organisations, and fact-checkers, to counter the spread and impact of disinformation. This can include sharing information, best practices, and resources to develop effective strategies);
- enhance transparency (platforms should prioritise transparency by providing clear and comprehensive information on their policies, algorithms, and content moderation processes. Users should have a better understanding of how their data is used, and how algorithms work to prevent the amplification of false and misleading narratives);
- implement effective content moderation (platforms need to allocate sufficient resources to effectively monitor and moderate harmful content. This includes investing in advanced AI systems and human moderation teams to detect and remove disinformation in a timely manner. Transparent and consistent guidelines should be in place to ensure fairness and accountability in content moderation decisions);
- promote fact-based information (platforms should prioritise the promotion of fact-based information from reliable sources. This can be done by partnering with credible news organizations and fact-checkers to provide accurate information and combat false narratives. Advertising promoting climate change denial or other forms of misinformation should be prevented);
- improve the access to data for researchers (platforms should make efforts to provide access to data for independent researchers to evaluate the effectiveness of their policies and initiatives in countering disinformation. This will enable better analysis and understanding of the impact of disinformation and the effectiveness of countermeasures);
- comply with regulatory frameworks (platforms should fully comply with regulatory frameworks, such as the Digital Services Act (DSA) or other relevant International, EU and National laws and regulations, that provide for obligations on addressing disinformation and mitigating associated risks, and the Code of Practice on Disinformation, which aims to commit signatories to a range of actions to counter disinformation. These actions include providing transparency reports on political advertising, restricting advertising placements on disinformation websites, disrupting advertising revenue for purveyors of disinformation, and enabling user feedback and fact-checking mechanisms. In this framework, compliance should not be limited to large platforms but extended, with adjustments, to smaller platforms to ensure a comprehensive approach)." (Recommendations, page 6)