CAREER: Toward Equitable Social Media Content Moderation for Marginalized Individuals and Communities
This research examines the causes and consequences of online platforms banning users and removing content that do not actually violate site policies, or that fall into gray areas with respect to site policies and community norms. It will achieve: (1) A characterization of the types and prevalence of reasons people are banned or have content removed from social media sites, how these reasons relate to systemic biases, and the implications of these practices. (2) An in-depth description of the processes and support structures marginalized online communities enact to address inequities resulting from moderation. (3) A combined qualitative and computational approach that more accurately distinguishes between social media content that should and should not be removed, according to community norms and shifting site policies. (4) Design recommendations for a context-aware content moderation system that balances community needs, norms, and values with platform requirements and can be applied across social media sites. (5) Theoretical insights into current inequities in online content moderation and into how moderation goals can be achieved more equitably by recognizing and respecting marginalized online communities' needs.
The following studies will be conducted: (1) Interviews with marginalized people whose content or account was removed from a social media site despite being in line with site policies or falling into a gray area; (2) A virtual ethnography in three marginalized online communities that are frequent targets of user bans and content takedowns; (3) Participatory design and focus group studies to determine how to build a context-aware content moderation system that can be applied across social media sites; (4) Implementation of this sociotechnical system online, and an evaluation of the extent to which users consider it equitable and how well it meets their needs. In addition, an online digital literacy resource will help educate people about content and account removals and how to regain site access, and the results of the research will educate social media platforms about the content moderation challenges marginalized populations face.