University of Michigan School of Information
‘The silliness rhetoric has come onboard’ the 2024 presidential election
Tuesday, 10/22/2024
By Noor Hindi
Taylor Swift has entered the chat.
And so have Charli XCX, brat memes, cats and dogs, offbeat humor, absurd AI-generated photos and firestorms of misinformation.
TL;DR? It’s been bizarre.
University of Michigan School of Information researchers — experts on social media, misinformation and disinformation — have closely watched the unfolding landscape this year. Now they’re weighing in on the deeper meaning behind the memes.
“Social media has played a bigger role, I think, than we saw in the last election,” says UMSI professor and associate dean for academic affairs Cliff Lampe, an expert on social media.
As Taylor Swift signs her endorsement of the Harris-Walz campaign “Childless Cat Lady,” a dig at JD Vance, and Donald Trump Jr. posts an AI-generated photo of Donald Trump riding a giant cat with the caption “Save our pets!!!!!,” the Harris-Walz campaign has attempted to capture the hearts of Gen Z Americans on their home turf: TikTok.
Could this be a good thing?
“Younger people have been really demotivated to come out and vote,” says UMSI assistant professor Chelsea Peterson-Salahuddin, whose research focuses on the culturally specific ways marginalized communities, most often Black women, femmes and queer folks, engage with mass and digital communications technologies to seek information and build community. “The Harris campaign has tapped into TikTok meme culture and attempted to capitalize on a younger vote.”
The 2024 election, like the 2020 and 2016 elections before it, has demonstrated how technology can both undermine and empower democracy. But UMSI associate professor David Jurgens, an expert in artificial intelligence and social media, notes that rhetoric, a bedrock of democratic discourse, seems stuck in perpetual decline.
“The degree of eloquence in making campaign arguments has dropped,” he says. “I think the 2016 and 2020 elections were effective in raising the anger rhetoric, but in 2024, the silliness rhetoric has come onboard, as well.”
Although the memes are sometimes funny, their ability to masquerade as official sources of truth has become a pressing concern.
UMSI experts are studying the impact of misinformation — false or misleading information shared without harmful intent — on democracy. Even more troubling is the spread of disinformation, which is deliberately false information shared with the intent to deceive.
“Social media allows for the rapid spread of disinformation,” says Lampe, emphasizing that the goal of disinformation isn’t just to deceive, but to create a sense of nihilism and a belief that truth is an illusion.
“If we can’t even agree on fairly straightforward things, and truth becomes entirely subjective to your identity affiliation, it can be super harmful for democracy overall,” Lampe says. “The 2020 claims of the stolen election, for example, were proven wrong multiple times. But it doesn’t matter, right? There’s definitely been a blow to democracy.”
In June 2024, U-M launched its Year of Democracy, Civic Empowerment, and Global Engagement. The university-wide initiative is aimed at developing programming and events around the protection of democratic values. Upcoming UMSI events in alignment with this initiative include an Oct. 23 Misinformation and Voter Information Workshop, led by Lampe and open to the public, and a Nov. 1 UMSI InfoSpeaks webinar, Voting While Misled: Social Media, Disinformation and the Election, featuring faculty from the School of Information and other U-M units.
As we watch misinformation campaigns disrupt Hurricane Helene disaster relief efforts and contribute to threats of violence against Ohio’s Haitian immigrant population, UMSI experts say it’s going to take a combination of media literacy, education and government regulation to combat the pervasive spread of falsehoods and restore trust in public discourse.
The widespread use of generative AI, in particular, has changed the sophistication and quality of the information people can produce and has raised the level of skepticism with which we engage with information online.
“What types of safeguards do we need to have as a democracy? Because this isn’t a fad,” says Peterson-Salahuddin. “I think the more innovation there is, the more people have access to it. AI can be used as a powerful learning tool and a research tool, but it can also be used to generate misinformation and create deep fakes.”
Though the impact AI has had on the current election is unclear, Jurgens says it’s the “university’s job to educate people on the reality of deep fakes moving forward,” as well as their risks. Beyond education, UMSI assistant professor Nazanin Andalibi says the United States needs greater regulation and social media companies’ incentive structures need to change to better protect against harmful speech.
“We can’t trust tech companies to be of any help here,” Andalibi says regarding companies self-regulating. “Their obligation is to make money. I would like to see a system that attends to a set of values and a social media governance model that has different incentive structures and motivations than currently exist.”
There is no quick fix for these issues, but election years have a way of highlighting what’s at stake when technology is unrestrained and moving faster than society is able to control it.
“We need to be thinking critically about what types of information infrastructures we need to build back the trust that has been lost,” says UMSI assistant professor Matt Bui, who studies data justice and activism. “We need infrastructures that help communities form but also foster a sense of cohesion and unity.”
RELATED
Learn more about upcoming events, talks and faculty research about democracy, misinformation and civic engagement by visiting umsi.info.