
University of Michigan School of Information



Misha Teplitskiy wins reproducibility challenge for data science research

Headshot of Misha Teplitskiy framed by an award ribbon. Text reads: "UMSI logo, Michigan Institute for Data Science, MIDAS Reproducibility Challenge, Misha Teplitskiy, Assistant Professor."

Friday, March 25, 2022

Misha Teplitskiy, assistant professor at the University of Michigan School of Information, has won an award in the Michigan Institute for Data Science (MIDAS) 2021 Reproducibility Challenge. He and his colleague James Evans, from the University of Chicago, presented their project, “How Firm is Sociological Knowledge: Reanalysis of GSS findings with alternative models and out-of-sample data, 1972-2012.”

Every year, MIDAS hosts a Reproducibility Challenge to recognize exemplary work by U-M researchers that makes data science projects more reproducible and research outcomes more robust. Winners receive a cash prize, and their projects are added to the MIDAS Reproducibility Hub.

Reproducibility is a significant challenge across scientific fields. “I think the incentives in scientific publishing tend to push people towards coming up with new claims rather than strengthening the credibility of existing ones,” says Teplitskiy. “A focus on reproducibility, and on developing the tools and practices for it, serves as a counterweight and helps make the published literature more credible.”

Teplitskiy and Evans proposed a simple, scalable method for measuring the replicability and reproducibility of social research that is based on social surveys. Replicability is the ability to generate the same results by applying the same procedures to different data, while reproducibility is the process of applying the same procedures to the same data sets and getting the same answer.

“At present, measuring how well studies replicate or reproduce is usually done ‘by hand,’ so it’s extremely time- and resource-intensive; it cannot really be done at scale,” explains Teplitskiy. “Our idea is that surveys that ask the same questions year after year are very convenient for measuring reproducibility and replication of research based on them. You repeat the analysis a study does on the same survey(s), compare results, and get a measure of the reproducibility of that study.”

“But maybe even more interestingly, you can run that analysis on the very next wave of the survey, the one that came out after the study was published, and it’s almost like the perfect replication, because you’re doing the same analysis on a different sample of people,” says Teplitskiy. “While this method can’t apply to all types of social science research, there are thousands of studies that use, for example, the General Social Survey (GSS), which asks some identical questions year-to-year.”

Teplitskiy notes that researchers can gain a couple of benefits from their approach. “First, instead of using the wave of the survey coming immediately after publication, you may be able to use the survey done 2 or 5 or 20 years later, and see how well the findings hold up amid social change,” he says. “Second, you can make some cosmetic changes to the analyses the original papers reported to see if you still get the same results, i.e., how robust the findings are.”
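The workflow described above can be sketched in a few lines of code. The example below is only an illustration, not the authors' actual pipeline: it uses synthetic data in place of real GSS waves, a plain least-squares fit in place of whatever model an original study reported, and a deliberately crude, hypothetical replication criterion (coefficients agree in sign and differ by less than a chosen tolerance).

```python
import numpy as np

def fit_ols(x, y):
    """Fit y = b0 + b1*x by ordinary least squares; return [b0, b1]."""
    design = np.column_stack([np.ones(len(x)), x])
    beta, *_ = np.linalg.lstsq(design, y, rcond=None)
    return beta

def replicates(b1, b2, tol=0.5):
    """Hypothetical criterion: same sign and absolute difference under tol."""
    return bool(np.sign(b1) == np.sign(b2) and abs(b1 - b2) < tol)

# Synthetic stand-ins for two survey waves (NOT real GSS data):
# same data-generating process, different samples of respondents.
rng = np.random.default_rng(0)
x_wave1 = rng.normal(size=500)
y_wave1 = 2.0 + 0.8 * x_wave1 + rng.normal(scale=0.5, size=500)
x_wave2 = rng.normal(size=500)
y_wave2 = 2.0 + 0.8 * x_wave2 + rng.normal(scale=0.5, size=500)

# "Reproduce": rerun the published analysis on the original wave.
beta_original = fit_ols(x_wave1, y_wave1)

# "Replicate": run the identical analysis on the next wave,
# i.e. the same questions asked of a fresh sample of people.
beta_next = fit_ols(x_wave2, y_wave2)

print(replicates(beta_original[1], beta_next[1]))
```

Running many published GSS analyses through a loop like this, against later and later waves, is what would make the measurement scale in the way the quote describes; real studies would of course need their actual model specifications rather than this toy regression.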

Reproducible work serves as a public good, helping other researchers more easily build on previous efforts, Teplitskiy says. “It's great to see that others find this approach to measuring robustness of social science literature compelling, and the financial support helps us continue this research!”

—Sarah Derouin, UMSI public relations specialist.

Learn more about assistant professor Misha Teplitskiy

Learn more about the MIDAS Reproducibility Challenge