Brooks and Schaub to study intersection of privacy and bias in learning analytics

Research assistant professor Christopher Brooks and assistant professor Florian Schaub at the University of Michigan School of Information (UMSI) have received a $50,000 grant from the Spencer Foundation to fund new research into predictive models of student success.  

In education, learning analytics draws on student data to build early warning systems that help institutions identify when a student might need extra support to succeed in school.


Schaub says there is a lot of promise in being able to predict success and identify when struggling students might need help, but using student data to do so raises significant privacy questions. Recent debate has centered on whether students should be given the opportunity to opt out of having their data shared.

This research will examine the population characteristics of students who might opt out of allowing their data to be used for predictive modeling, whether allowing opt-outs affects the accuracy of predictive models, and whether the process leads to algorithmic bias.

“Predictive modeling work in education assumes that you are getting either a representative sample or all of the data from students,” says Brooks. “Most of my work uses all of the data that we have on all students. But what happens if we know that data is incomplete because we’re allowing students to exercise their agency and control over that?”

Brooks says the researchers are eager to see which groups and demographics of students would be inclined to opt out.

“Are computer scientists more likely to opt out than art students are?” he asks. “What about first-generation students, or black or Hispanic students? You can build narratives for any of these groups. In this study, we actually want to measure it.”

The problem, Schaub says, is that if certain demographics are underrepresented in the data, the ability to predict outcomes for those subpopulations may suffer.
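To make that concern concrete, here is a minimal, purely illustrative sketch (not part of the funded study) that simulates two student groups, removes records for simulated opt-outs that fall disproportionately on one group, and compares per-group prediction accuracy before and after. The groups, features, and opt-out rates are invented assumptions for demonstration only.

```python
# Hypothetical sketch: measure how uneven opt-out rates could change
# per-group accuracy of a success-prediction model on synthetic data.
# Everything here (groups, feature, opt-out rates) is made up.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n = 5000

# Two synthetic student groups with slightly different feature/outcome patterns.
group = rng.integers(0, 2, n)                       # 0 = group A, 1 = group B
x = (group * 0.5 + rng.normal(size=n)).reshape(-1, 1)   # one engagement-like feature
p_success = 1 / (1 + np.exp(-(1.5 * x[:, 0] - 0.5 * group)))
y = (rng.random(n) < p_success).astype(int)          # 1 = "succeeded"

# Assume group B opts out far more often than group A (rates are invented).
opt_out_rate = np.where(group == 1, 0.6, 0.05)
kept = rng.random(n) > opt_out_rate                  # students who did NOT opt out

def per_group_accuracy(train_mask):
    """Train on the masked subset, then report accuracy separately per group."""
    model = LogisticRegression().fit(x[train_mask], y[train_mask])
    preds = model.predict(x)
    return {g: round(accuracy_score(y[group == g], preds[group == g]), 3)
            for g in (0, 1)}

print("trained on all students:", per_group_accuracy(np.ones(n, dtype=bool)))
print("trained after opt-outs: ", per_group_accuracy(kept))
```

Comparing the two printed dictionaries is one simple way to quantify whether opt-outs concentrated in one subpopulation shift the model's accuracy for that group.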

“One of the challenges of this kind of work is that it is necessarily interdisciplinary,” says Brooks. “We need to understand privacy, agency, student choice, predictive models and data science. It is hard to find a federal agency focused specifically on interdisciplinary work, so we are grateful that the Spencer Foundation has stepped up to explore something that has such large social implications.”

  - Jessica Webster, UMSI PR Specialist

Posted May 9, 2019