Complicating Consent: Computer Science Students Examine Predicaments of Privacy & Public Health

Two students pass one another on campus: one, whose face is not visible, is on their phone; the other is wearing a mask.

This post is part of a series following the progress of Ethics Lab’s collaboration with the Computer Science Department that began with the Mozilla-sponsored ResponsibleCS challenge.

More than 10 months into the COVID-19 pandemic, arguments about privacy have been integral to debates about public health. With this in mind, Dr. Ray Essick and Ethics Lab Postdoctoral Fellow Dr. Alicia Patterson built the final ResponsibleCS session of the semester in Essick’s Advanced Programming course around issues of privacy, helping students further examine the potential power of data and the ethics of its applications in the midst of a global pandemic.

Building on a previous version of this exercise, Essick and Patterson situated questions about collecting and sharing health data in two contexts: a university requiring its students to download a contact tracing app in order to return to campus, and an automotive factory requiring the same of its employees. Patterson cited Albion College’s rollout of just such an app, which tracks students’ locations with no way to opt out, as one example of the exercise’s roots in reality.

The scenarios helpfully illustrated how large institutions can complicate the notion of consent. “We usually think that meaningful consent requires having genuine choice,” Patterson noted. “If the cost of refusing is losing your job, does that count as genuine choice?”

After contemplating which types of data should be shared, who should have access to them, and for what purposes they should be collected, students interviewed peers who were not in the class.

“Engaging with someone else to find out their viewpoint is a deliberate focus of this activity,” Patterson said. She and Essick challenged the students to examine their intuitions about privacy more critically. “What is this value? How do other people conceptualize it? We wanted students to be in dialogue about those things, and to anticipate the ways in which our good intentions to solve a social problem with new technology might create other problems.”