University of Michigan School of Information
New website provides information for marginalized communities after social media bans

Wednesday, 06/22/2022
University of Michigan School of Information (UMSI) scholars launched a new online resource for understanding and dealing with social media content bans. The Online Identity Help Center (OIHC) aims to help marginalized people navigate the gray areas in social media moderation.
Having content taken down from social media can be a jarring and upsetting experience. Being silenced can feel even more isolating if you are a person from a marginalized group. Those who are sexual, gender and/or racial minorities may already feel unwelcome online, and content censorship can narrow their sense of community even further.
Content removal is usually accompanied by a notification, but people may not fully understand why their content was deemed inappropriate. Often, it is unclear how to avoid censorship in the future. Even worse, people may be unsure how to reinstate an account after a ban.
To help marginalized communities navigate social media content takedowns, scholars from UMSI teamed up with the Harvard Cyberlaw Clinic and Salty to create the OIHC website and clear up the murky waters of social media content moderation.
OIHC aims to help people understand different social media platforms’ policies and rules about content takedowns. It also provides easy-to-read resources on different social media guidelines and what to do if your content is taken down.
Gray areas of content moderation
OIHC is the brainchild of assistant professor Oliver Haimson. Haimson was awarded an NSF CAREER grant in 2020 for his research on equitable social media content moderation, including the causes and consequences of online platforms banning users and content.
As part of the NSF award, grantees are asked to include an educational component in their research projects. While many researchers develop courses based on their research, Haimson wanted to do something more innovative. “Rather than just presenting research results about content moderation and the ways that social media sites are sometimes suppressing content for marginalized people, I wanted to put together an actual resource that is aimed at marginalized people,” he says.
Since receiving the CAREER award, Haimson has been researching which groups experience the most content takedowns. One finding is that content moderation is a tricky endeavor. For example, Haimson says that platforms trying to combat hate speech often turn to computational methods: using algorithms to sift out certain words or phrases. While these efforts are admirable and needed, he says, the filter isn’t always accurate.
“For instance, a person of color could post complaining about white people,” says Haimson. “A computational model will flag that as hate speech, when really it might be a statement about racial justice or someone's experiences with racism.” The poster could have their content removed, or even be banned, leaving them confused, frustrated and feeling even more marginalized.
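For readers curious what this kind of keyword matching looks like in practice, here is a minimal, hypothetical sketch in Python. It is not drawn from any platform's actual system; the term list and function name are illustrative only. It shows how a filter that matches words without context can flag a post about experiencing racism the same way it would flag hate speech.

    # Hypothetical keyword-based flagging, for illustration only.
    FLAGGED_TERMS = {"white people", "hate"}  # illustrative terms, not a real policy list

    def naive_flag(post: str) -> bool:
        """Flag a post if it contains any listed term, ignoring context or intent."""
        text = post.lower()
        return any(term in text for term in FLAGGED_TERMS)

    # A post describing someone's experience of racism is flagged
    # because the filter only sees the matching words.
    print(naive_flag("Tired of white people questioning my experiences of racism"))  # True

Because the check has no notion of who is speaking or why, it produces exactly the false positives Haimson describes.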
OIHC is designed to help people navigate these takedowns by providing education, advocacy, and engagement resources. “The site takes a very research-based approach and really tries to provide the things that are most helpful for people,” says Haimson. Using interviews with people who had experienced content takedowns, the team focused on topic areas that were most helpful: social media user rights, privacy and data collection, social media guidelines, and social media appeals.
Making sense of takedowns
“The website is a tool that helps people understand what's going on after content takedowns,” says Christian Paneda, a graduate of the MSI program and the site's designer. Early in the planning, Paneda says, it was clear the team needed legal experts to help them. Haimson reached out to the Harvard Cyberlaw Clinic and asked to collaborate with students. Jessica Fjeld, lecturer at Harvard Law School, joined the effort and brought in two students, Sammy Camy and Landon Harris.
Paneda says the Cyberlaw Clinic students, Camy and Harris, were a big help in explaining the intricacies of content moderation and the specific policies of each platform. “They went through the different terms and conditions of the social media we're looking at, so Facebook, Instagram, Twitter, TikTok, and Reddit,” he says. In particular, the law team focused on guidelines around nudity, graphic content and hate speech — topics that frequently arise in content bans for marginalized people.
The Harvard team also tackled scenarios that were most pertinent to marginalized communities. “They came up with a short answer covering what you need to know about this situation, based on a legal perspective on the community guidelines,” says Paneda. They also provided a longer version with more details on how to learn more about the topic.
“The Harvard team were responsible for putting together the content for some of the pages and making sure that it was all accurate and truthful,” says Haimson, adding that they knew the legal language also had to be translated for a lay audience. He asked Claire Fitzsimmons of Salty to collaborate. “Salty helped us with making sure it was more palatable to a general audience.”
In addition to providing the ins and outs of community guidelines, OIHC also has a spot where people who have experienced censorship can share their experiences. Seeing other people post their stories can help create community, and can point out areas where more research and effort is needed to support marginalized people.
Human-centered design to support humanity
The idea behind OIHC is to compile resources for the most widely used social media platforms, in an easy-to-understand format. Haimson notes that guidelines on social media sites aren’t very successful in getting clear information to users. “I think they do try, but you have to know where to look,” he says. “You have to go to a different place to find guidelines on Reddit versus TikTok.”
The OIHC website is a one-stop source of information that was carefully designed with a human-centered approach. “We did a lot of user testing in terms of making sure that people could actually find what they were looking for,” Haimson says.
“I just want people who are experiencing what can be really stressful situations to be able to find this resource and use it to help them in some way,” says Haimson. “We think of it kind of as a digital literacy resource, helping people to learn more about this digital world.”
“In an ideal world, I would love it if this could influence social media policy in some way to have social media policy managers better understand these challenges that people face.”
— Sarah Derouin, UMSI public relations specialist
OIHC was created by an interdisciplinary team: Sammy Camy (Harvard law student), Daniel Delmonaco (UMSI doctoral student), Claire Fitzsimmons (Salty), Jessica Fjeld (Harvard lecturer), Oliver Haimson (UMSI assistant professor), Landon Harris (Harvard law student), Shannon Li (BSI graduate now in UMSI’s accelerated MSI program), Samuel Mayworm (MSI graduate), Christian Paneda (MSI graduate), Hibby Thach (incoming UMSI doctoral student), and Andrea Wegner (BSI student).
Visit the Online Identity Help Center at oihc.org.