E-paper

Algorithmic misogynoir in content moderation practice

Free of charge

Existing content moderation practices, both algorithmically driven and people-determined, are rooted in white colonialist culture. Black women's opinions, experiences, and expertise are suppressed, and their online communication streams are removed abruptly, silently, and quickly. Studying online content moderation has unearthed layers of algorithmic misogynoir, or racist misogyny directed against Black women. Tech companies, legislators, and regulators in the U.S. have long ignored the continual mistreatment, misuse, and abuse of Black women online. This paper explores algorithmic misogynoir in content moderation and makes the case for regular examination of the impact of content moderation tactics on Black women and other minoritized communities.

See also the study "The state of content moderation for the LGBTIQA+ community and the role of the EU Digital Services Act" by Christina Dinar.

Product details
Publication date
June 2021
Published by
Heinrich-Böll-Stiftung European Union and Heinrich-Böll-Stiftung Washington, DC
Number of pages
17
Language of publication
English
Table of contents

1. Introduction

2. Current content moderation practices


2.1. The problem with generalization

2.2. Double standards

3. Proposals and suggestions

3.1. Clarifying the role of social media companies

3.2. Addressing structural inequalities

3.3. Balancing power asymmetries between originator and commenter

References
