"The three countries [Bosnia and Herzegovina, Indonesia, and Kenya] provide evidence of online hate speech and disinformation affecting human rights offline. The evidence is not comprehensive yet clear enough to raise serious concerns. Online gender-based violence is also reported as critical in the
...
three countries. In the three countries, national legislation to address harmful content shows some degree of inconsistency in comparison to international standards, notably in relation to the protection of freedom of expression. The reasons for such inconsistency vary among countries. The effective enforcement of legal frameworks is uneven in all three countries. Social and cultural inequalities are often reproduced in government or judicial decisions, and vagueness in legislation opens space for discretionary decisions. Platform companies have offices in Indonesia and Kenya, but not in Bosnia and Herzegovina. In the three countries, there is a lack of transparency in how companies allocate the roles of moderation tasks, including the number of different language moderators and their trusted partners and sources. Companies do not process content moderation in some of the main local languages and community standards are not entirely or promptly available in local languages." (Executive summary)
"The proliferation of hate speech and disinformation on online platforms has serious implications for human rights, trust and safety as per international human rights law and standards. The mutually-reinforcing determinants of the problems are: ‘attention economics’; automated advertising system
...
s; external manipulators; company spending priorities; stakeholder knowledge deficits; and flaws in platforms’ policies and in their implementation. How platforms understand and identify harms is insufficiently mapped to human rights standards, and there is a gap in how generic policy elements should deal with local cases, different rights and business models when there are tensions. Enforcement by platforms of their own terms of service to date has grave shortfalls, while attempts to improve outcomes by automating moderation have their limitations. Inequalities in policy and practice abound in relation to different categories of people, countries and languages, while technology advances are raising even more challenges. Problems of ‘solo-regulation’ by individual platforms in content curation and moderation are paralleled by harms associated with unilateral state regulation. Many countries have laws governing content online, but their vagueness fuels arbitrary measures by both authorities and platforms. Hybrid regulatory arrangements can help by elaborating transparency requirements, and setting standards for mandatory human rights impact assessments." (Key messages)
"The spread of disinformation in recent years has caused the international community concerns, particularly around its impact on electoral and public health outcomes. When one considers how disinformation can be contained, one often looks to new laws imposing more accountability on prominent social
...
media platforms. While this narrative may be consistent with the fact that the problem of disinformation is exacerbated on social media platforms, it obscures the fact that individual users hold more power than is acknowledged and that shaping user norms should be accorded high priority in the fight against disinformation. In this article, I examine selected legislation implemented to regulate the spread of disinformation online. I also scrutinise two selected social media platforms – Twitter and Facebook – to anchor my discussion. In doing so, I consider what these platforms have done to self and co-regulate. Thereafter, I consider the limitations on regulation posed by certain behavioural norms of users. I argue that shaping user norms lie at the heart of the regulatory approaches discussed and is pivotal to regulating disinformation effectively." (Abstract)
"In particular we recommend to strengthen collaboration (platforms should adopt a collaborative approach involving various stakeholders, including governments, civil society organisations, and fact-checkers, to counter the spread and impact of disinformation. This can include sharing information, be
...
st practices, and resources to develop effective strategies); enhance transparency (platforms should prioritise transparency by providing clear and comprehensive information on their policies, algorithms, and content moderation processes. Users should have a better understanding of how their data is used, and how algorithms work to prevent the amplification of false and misleading narratives); implement effective content moderation (platforms need to allocate sufficient resources to effectively monitor and moderate harmful content. This includes investing in advanced AI systems and human moderation teams to detect and remove disinformation in a timely manner. Transparent and consistent guidelines should be in place to ensure fairness and accountability in content moderation decisions); promote fact-based information (platforms should prioritise the promotion of fact-based information from reliable sources. This can be done by partnering with credible news organizations and fact-checkers to provide accurate information and combat false narratives. Advertising promoting climate change denial or other forms of misinformation should be prevented); improve the access to data for researchers (platforms should make efforts to provide access to data for independent researchers to evaluate the effectiveness of their policies and initiatives in countering disinformation. This will enable better analysis and understanding of the impact of disinformation and the effectiveness of countermeasures); comply with regulatory frameworks (platforms should fully comply with regulatory frameworks, such as the Digital Services Act (DSA) or other relevant International, EU and National laws and regulations, that provide for obligations on addressing disinformation and mitigating associated risks, the Code of Practice on Disinformation that aims to commit signatories to a range of actions to counter disinformation. These actions include providing transparency reports on political advertising, restricting advertising placements on disinformation websites, disrupting advertising revenue for purveyors of disinformation, and enabling user feedback and fact-checking mechanisms. In this framework, compliance should not be limited to large platforms but extended, with adjustments, to smaller platforms to ensure a comprehensive approach)." (Recommendations, page 6)
"This Synthesis Report examines the effectiveness of countermeasures against misinformation on social media platforms, focusing on the two most examined remedies: content labeling and corrective information interventions. A meta-analysis is a research process for synthesizing and aggregating the fin
...
dings of many, independent studies, using statistical methods to calculate overall effects from multiple data sources. A meta-analysis of 43 studies from 18 peer-reviewed manuscripts was selected from a comprehensive database of 4,798 publications. First, there is an emerging scientific consensus that content labels and corrective information help people evaluate misinformation on social media platforms. Other mitigation strategies may be viable, but there is less consensus about their effectiveness. Second, understanding the global information environment requires more research: (i) from countries around the world, (ii) about user experiences in languages other than English, (iii) with genuine access to social media data from firms, (iv) that allows scientists to standardize measures and definitions for robustly reporting the results of independent research." (Synopsis)
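For readers unfamiliar with the aggregation step the quoted synopsis refers to, the short Python sketch below illustrates inverse-variance (fixed-effect) pooling, a common way a meta-analysis combines study-level effects into an overall estimate. All effect sizes and variances in the sketch are hypothetical placeholders, not figures from the report.

```python
# A minimal sketch of the inverse-variance (fixed-effect) pooling step behind
# a meta-analysis. All numbers are hypothetical placeholders, not data from
# the report summarised above.
import math

# Hypothetical per-study effect sizes (e.g., standardised mean differences)
# and their sampling variances.
effects = [0.42, 0.31, 0.55, 0.27]
variances = [0.010, 0.022, 0.015, 0.030]

# Weight each study by the inverse of its variance, so more precise studies
# contribute more to the pooled estimate.
weights = [1.0 / v for v in variances]
pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
se = math.sqrt(1.0 / sum(weights))

# Normal-approximation 95% confidence interval for the pooled effect.
print(f"Pooled effect: {pooled:.3f} "
      f"(95% CI {pooled - 1.96 * se:.3f} to {pooled + 1.96 * se:.3f})")
```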
"In many countries, censorship, blocking of internet access and internet content for political purposes are still part of everyday life. Will filtering, blocking, and hacking replace scissors and black ink? This book argues that only a broader understanding of censorship can effectively protect free
...
dom of expression. for centuries, church and state controlled the content available to the public through political, moral and religious censorship. As technology evolved, the legal and political tools were refined, but the classic censorship system continued until the end of the 20th century. However, the myth of total freedom of communication and a law-free space that had been expected with the advent of the internet was soon challenged. the new rulers of the digital world, tech companies, emerged and gained enormous power over free speech and content management. All this happened alongside cautious regulation attempts on the part of various states, either by granting platforms near-total immunity (US) or by setting up new rules that were not fully developed (EU). China has established the Great Firewall and the Golden Shield as a third way. in the book, particular attention is paid to developments since the 2010s, when Internet-related problems began to multiply. the state's solutions have mostly pointed in one direction: towards greater control of platforms and the content they host. Similarities can be found in the US debates, the Chinese and Russian positions on internet sovereignty, and the new European digital regulations (DSA-DMA). The book addresses them all." (Publisher description)
"[...] There are 70 million individuals who are administrators of Facebook groups. Many more coordinate groups across other platforms such as WhatsApp, Signal, and others, which are often run on mobile devices - with a global user base of 5.34 billion unique mobile users. Group administrators and mo
...
derators act as “community stewards.” They are individuals in charge of reviewing user-generated content to ensure users adhere to rules, regulations, and community standards of social media platforms. They hold tremendous influence over the experience of their users, but they often step into these roles without fully understanding the scope of responsibilities they are taking on. Many community stewards describe their role as a “labor of love,” representing a substantial opportunity to catalyze large-scale, positive social change across societies. Yet, they face many challenges. Stewards cite myriad challenges, risks, rewards, and opportunities they face in managing their online groups and pages.
This report focuses on the role of community stewards in promoting healthier relationships in their online and digital groups in conflict-affected and fragile countries. There is growing evidence of the societal impacts stewards are driving through their platforms, such as mobilizing aid during crises, raising awareness of important issues, and fostering solutions to community challenges. Given their unique reach, influence, and trust, community stewards hold great potential not only to mobilize positive social change, but also to foster connection and belonging in a way that positively transforms relationships and disrupts the toxic polarization that divides societies and fuels violence. The potential of community stewards is clear; it is less clear how civil society, the private sector, governments, and others can best support and scale up this potential. Understanding the needs and incentives of community stewards to proactively use their roles for building healthier online (and offline) communities can help build on what works. Understanding the barriers and challenges they face in doing so will serve as a critical entry point for mobilizing the right support to stewards. This report looks to uncover the barriers and opportunities that stewards face in their efforts to build healthier and safer online communities in conflict-affected and fragile places." (Introduction)
"This special edition honours the efforts of various state and non-state actors in the promotion of internet freedom in Africa. The report takes a deep dive into the dynamic landscape of internet freedom on the African continent and offers contextual information and evidence to inform ICT policymaki
...
ng and practice, creates awareness on internet freedom issues on the continent, and shapes conversations by digital rights actors across the continent. Through a series of essays, authors in this special issue of the report reflect on the past 10 years on the state of Internet freedom in Africa, exploring various thematic issues around digital rights, including surveillance, privacy, censorship, disinformation, infrastructure, access, advocacy, online safety, internet shutdowns, among others. Authors featured in the report include, Admire Mare, Amanda Manyame, Blaise Pascal Andzongo Menyeng, Rima Rouibi, Victor Kapiyo, Felicia Anthonio. Richard Ngamita, Nanjala Nyabola, Professor Bitange Ndemo, Paul Kimumwe, and Edrine Wanyama. The report maps the way ahead for digital rights in Africa and the role that different stakeholders need to play to realise the Digital Transformation Strategy for Africa and Declaration 15 of the 2030 Agenda for Sustainable Development on leveraging digital technologies to accelerate human progress, bridge the digital divide, and develop knowledge societies." (Publisher description)
"Das vorliegende Papier soll die Umsetzung des neuen Digital Services Act (DSA) konstruktiv aus einer kinderrechtlichen Perspektive begleiten. Dabei wird gezeigt, welche Potenziale sich aus dem DSA ergeben, um Kinderrechte im Digitalen zu stärken. Im Fokus stehen Anbietermaßnahmen sowie auch Präv
...
entions- und Befähigungsanliegen. Dabei wird über Points to Consider erarbeitet, was kinderrechtliche Good- und Best-Practice-Ansätze bei der Erfüllung der Anforderungen des DSA ausmacht." (Zusammenfassung)
"In November 2020, with support from Public Safety Canada, Tech Against Terrorism launched the Terrorist Content Analytics Platform (TCAP). The world’s largest database of verified terrorist content, collected in real time from verified terrorist channels on messaging platforms and apps, the TCAP
...
is a secure and transparent online tool to detect and verify terrorist content and notify technology companies of the presence of such content on their platforms. The TCAP is developed using a transparency-by-design approach. This is the first TCAP transparency report, which is one of several initiatives Tech Against Terrorism has taken in compliance with our core principles. The report provides a detailed breakdown of the core metrics for the reporting period between 1 December 2020 and 30 November 2021, and of key TCAP policies and processes." (Executive summary, page 2)
"Content moderation at scale is an extremely complicated issue, however by looking at specific examples such as the case studies and data highlighted in this study, the conversation can start to take into account more diverse experiences and context that is normally overlooked. Emerging from these e
...
xperiences are recommendations for reform and structural change reflected in focus group discussions and demands by activists in the region, some of which are reproduced below. 1. Over-reliance on automated systems should be revised in light of issues emerging from non-English speaking markets. The failure of these systems to adequately account for context should be reason enough to fundamentally revise systems and protocols underpinning them. 2. Dedicating more resources to human-based content moderation in non-Western contexts. The disparity of material resources between countries considered “key economies” and the “rest of the world” is startling and has resulted in enormous challenges for societies and political structures elsewhere [...] 3. Radical transparency by tech platforms regarding the ways in which content moderation policies are formulated and implemented should be high on the priority of digital platforms [...] 4. Content moderation decisions are often one-sided, with little recource for users who are aggrieved by the decisions, both for false positives or inaction by platforms. Meta's Oversight Board is a positive start but the model only impacts select cases. There needs to be a robust and time-responsive system for appeals that provides users with complete information regarding content moderation decisions and responsive action on appeals. 5. Content moderation decisions by tech platforms, and inaction in equal measure, have resulted in tangible real-world harms in the past and present." (Conclusion, page 23-24)
"This report outlines the findings from the initial scoping phase of a project supported by a grant from Omidyar Network and launched by the Institute for Strategic Dialogue (ISD) and CASM Technology to identify online spaces used by extremist, hate and disinformation actors and communities as they
...
increasingly move away from mainstream social media platforms. The report outlines the key barriers posed by these platforms to researching and mitigating harmful content and behaviours, and reviews existing research methodologies and tools to address these barriers. Finally, the report presents possible future scenarios for the evolving online ecosystem, and proposes a series of initial recommendations for policy-makers, platforms and the research community." (About this publication)
"Analisamos uma amostra do conteúdo denunciado ao TSE, a fim de verificar se, de fato, a Meta removeu ou indicou a presença de desinformação nestas publicações. Os resultados apontam que parcelas expressivas das publicações denunciadas e já diagnosticadas como nocivas por checadores de fato
...
s não foram removidas das redes da Meta e/ou não receberam o rótulo de desinformação. A Meta está permitindo a circulação de conteúdo nocivo à democracia brasileira no Facebook e Instagram, sem cumprir de forma efetiva com as suas políticas e a parceria com o TSE." (Apresentação)
"This report explores the social media habits of Iranian netizens and how the Islamic Republic is repressing the online space." (Publisher description)
"This report specifically looks at the situation of local actors who, while they are impacted by the circulation of harmful content on social media or the moderation thereof, often find themselves unable to take effective action to improve their situation in that respect." (Introduction)
"This research has shown that in the midst of the complex and diverse cultural context of Indonesia, growing use and misuse of social media in the country, and the complexity of ‘grey-area’ problematic content in the country, there has been a lack of meaningful and continuous dialogue between pl
...
atforms and leading and peripheral civil society groups. Civil society groups and lay users have been battling individually, instead of coordinating, against the content moderation decisions of platforms. Most of them do not know how to appeal against the platform’s decisions. Meanwhile, the leading civil society groups in their capacity as the official partners of platforms have often felt powerless in the negotiation process with platforms. Platforms usually hold the final decision-making power, while not displaying sufficient understanding of the complexity of the local context. Accordingly, there have been cases of over and under content moderation in the country, that either hurt freedom of expression or the safety of individuals and public. When we submitted the idea of a local Coalition on Freedom of Expression and Content Moderation to the interviewees, most of them responded positively. To be clear, there is already a number of multi-stakeholder groups and civil society alliances working on issues of Internet governance, freedom of expression, and social media ethics in the country, but only few have shown interest, resources, and commitment to develop work on the issue of the contribution of local actors to content moderation on social media." (Recommendations, page 57)
"This report explores the local-specific contextual concerns stemming from global, non-transparent, and profit-driven content moderation processes of social media. The report analyses what happens when certain local communities and countries are 'invisible' to social media platforms and illustrates
...
how cross-sectoral collaboration in the form of a coalition for freedom of expression and content moderation could help these communities engage with social media platforms and have a voice in content moderation cases that impact their society." (Executive summary)