Online Harassment and Content Moderation: The Case of Blocklists

Online harassment is a multi-faceted problem with no easy solutions. Social media platforms are squeezed between charges of indifference to harassment and charges of suppressing free speech. CSMR faculty participated in research on the challenge of designing technical features and seeding social practices that promote constructive discussion and discourage abusive behavior.
Shagun Jhaver, Georgia Institute of Technology
Sucheta Ghoshal, Georgia Institute of Technology
Amy Bruckman, Georgia Institute of Technology
Eric Gilbert, John Derby Evans Associate Professor, University of Michigan School of Information

Online harassment is a complex and growing problem. On Twitter, one mechanism people use to avoid harassment is the blocklist, a list of accounts that are preemptively blocked from interacting with a subscriber. In this paper, we present a rich description of Twitter blocklists: why they are needed, how they work, and their strengths and weaknesses in practice. Next, we use blocklists to interrogate online harassment: the forms it takes, as well as tactics used by harassers. Specifically, we interviewed both people who use blocklists to protect themselves and people who are blocked by blocklists. We find that users are not adequately protected from harassment, and at the same time, many people feel they are blocked unnecessarily and unfairly. Moreover, we find that not all users agree on what constitutes harassment. Based on our findings, we propose design interventions for social network sites with the aim of protecting people from harassment while preserving freedom of speech.

CCS Concepts: • Human-centered computing → Empirical studies in collaborative and social computing; Ethnographic studies

Additional Key Words and Phrases: Online harassment, moderation, blocking mechanisms, GamerGate, blocklists
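To make the subscription mechanic the abstract describes concrete, here is a minimal sketch of how a shared blocklist might propagate blocks to its subscribers: subscribing preemptively blocks every account already on the list, and later additions by the curator cascade to all subscribers. All names here (Blocklist, User, gg_autoblocker) are hypothetical illustrations of that model, not Twitter's actual API.

```python
class Blocklist:
    """A shared, curator-maintained list of account IDs to block."""

    def __init__(self, name):
        self.name = name
        self.blocked_ids = set()
        self.subscribers = []

    def add(self, account_id):
        """Curator adds an account; every current subscriber blocks it too."""
        self.blocked_ids.add(account_id)
        for user in self.subscribers:
            user.blocked_ids.add(account_id)


class User:
    def __init__(self, user_id):
        self.user_id = user_id
        self.blocked_ids = set()

    def subscribe(self, blocklist):
        """Preemptively block everyone already on the list, then track updates."""
        self.blocked_ids |= blocklist.blocked_ids
        blocklist.subscribers.append(self)

    def can_interact_with(self, account_id):
        return account_id not in self.blocked_ids


# Usage: a subscriber inherits blocks they never made individually,
# including accounts added after they subscribed.
shared_list = Blocklist("gg_autoblocker")
shared_list.add("harasser_1")

alice = User("alice")
alice.subscribe(shared_list)
shared_list.add("harasser_2")

assert not alice.can_interact_with("harasser_1")
assert not alice.can_interact_with("harasser_2")
```

The wholesale inheritance shown here is exactly what gives blocklists both their strengths and their weaknesses in the paper's account: subscribers gain broad, low-effort protection, but they also adopt every curation error, which is how accounts come to feel blocked unnecessarily and unfairly.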
ACM Transactions on Computer-Human Interaction, Vol. 25, No. 2, Article 1. Publication date: March 2018. https://doi.org/10.1145/3185593