"Content moderation at scale is an extremely complicated issue; however, by looking at specific examples such as the case studies and data highlighted in this study, the conversation can start to take into account more diverse experiences and contexts that are normally overlooked. Emerging from these experiences are recommendations for reform and structural change reflected in focus group discussions and demands by activists in the region, some of which are reproduced below.

1. Over-reliance on automated systems should be revised in light of issues emerging from non-English-speaking markets. The failure of these systems to adequately account for context should be reason enough to fundamentally revise the systems and the protocols underpinning them.

2. More resources should be dedicated to human-based content moderation in non-Western contexts. The disparity of material resources between countries considered “key economies” and the “rest of the world” is startling and has resulted in enormous challenges for societies and political structures elsewhere [...]

3. Radical transparency by tech platforms regarding the ways in which content moderation policies are formulated and implemented should be high on the priority list of digital platforms [...]

4. Content moderation decisions are often one-sided, with little recourse for users aggrieved by those decisions, whether for false positives or for inaction by platforms. Meta's Oversight Board is a positive start, but the model affects only select cases. There needs to be a robust and time-responsive appeals system that provides users with complete information regarding content moderation decisions and responsive action on appeals.

5. Content moderation decisions by tech platforms, and inaction in equal measure, have resulted in tangible real-world harms in the past and present." (Conclusion, pages 23-24)
Research Findings, 6
Limitations of Research, 8
Case Studies in Pashto, 10
Analysis, 18
Conclusion & Recommendations, 23