Classification and Its Consequences for Online Harassment: Design Insights from HeartMob

CSMR-affiliated faculty and students will present a paper at the CSCW conference in October 2018 about design insights from the HeartMob system. The findings may help platform companies, when designing classification systems for harmful content, account for secondary impacts on harassment targets as well as the primary impact on the availability of the content itself.
Lindsay Blackwell, University of Michigan School of Information
Jill Dimond, Sassafras Tech Collective
Sarita Schoenebeck, University of Michigan School of Information
Cliff Lampe, University of Michigan School of Information

Online harassment is a pervasive and pernicious problem. Techniques like natural language processing and machine learning are promising approaches for identifying abusive language, but they fail to address structural power imbalances perpetuated by automated labeling and classification. Similarly, platform policies and reporting tools are designed for a seemingly homogeneous user base and do not account for individual experiences and systems of social oppression. This paper describes the design and evaluation of HeartMob, a platform built by and for people who are disproportionately affected by the most severe forms of online harassment. We conducted interviews with 18 HeartMob users, both targets and supporters, about their harassment experiences and their use of the site. We examine systems of classification enacted by technical systems, platform policies, and users to demonstrate how: 1) labeling serves to validate (or invalidate) harassment experiences; 2) labeling motivates bystanders to provide support; and 3) labeling content as harassment is critical for surfacing community norms around appropriate user behavior. We discuss these results through the lens of Bowker and Star's classification theories and describe implications for labeling and classifying online abuse. Finally, informed by intersectional feminist theory, we argue that fully addressing online harassment requires the ongoing integration of vulnerable users' needs into the design and moderation of online platforms.
PACM on Human-Computer Interaction, Vol. 1, No. CSCW, Article 24. Publication date: November 2017. https://doi.org/10.1145/3134659