CS Students Explore the Ethics of Contact Tracing Apps

During the final sessions of Professor Essick’s Advanced Programming course, the Responsible CS Team challenged students to reflect on the ethical considerations at stake in the design and implementation of a contact tracing application that might be part of a broader public health response to the COVID-19 pandemic.

These sessions gave students a final guided opportunity to cultivate their ability to recognize ethical values in the context of computer science and to discuss those values openly, critically, and confidently. They built on earlier engagements that gave students initial practice appreciating both that and how values pervade computer science. Professor Ray Essick, along with Ethics Lab’s Senior Ethicist Professor Elizabeth Edenberg and Postdoctoral Fellow Mark Hanin, led the sessions, which were conducted virtually due to social distancing guidelines.

In preparation for the sessions, students were asked to interview two people not enrolled in the course about their attitudes toward sharing personal information through a contact tracing application as part of a public health initiative against the spread of COVID-19. Getting students to talk to people outside the classroom about the ethical implications of their subject is a kind of assignment the Ethics Lab uses in many of its courses and engagements, since it cultivates a habit of raising ethical questions in real-world contexts. Here, the interviews also primed students for the virtual sessions, which took the form of both class-wide and smaller breakout group discussions in which they grappled with ethical considerations concerning data privacy, liberty, and public health.

A selection of interview responses demonstrates the variations in people’s attitudes towards data collection.

One key theme of the discussions concerned the important relation between accountability and trust. A contact tracing application might use personal data to help prevent the spread of the novel coronavirus. Multiple students reported being more comfortable sharing such data with the government than with a private corporation. As these students explained, this difference in comfort was due to their basic trust in regulatory mechanisms within the government and their worry that financial incentives might lead corporations to misuse the data that the application collects. This discussion prompted one student to distinguish between trust in the government generally and trust in a particular administration. Multiple students expressed skepticism that the current administration in the United States can be trusted to oversee a contact tracing application, citing worries about its failure to respond effectively during the early stages of the outbreak, its lack of responsibility in other contexts, and concerns about whether and how data collected by the application might be misused as propaganda.

Noting that the technical solutions of computer science are necessary but not sufficient for promoting trust, Professor Essick urged students to strive for what he called ‘provable responsibility’. With respect to computer code, this means “[having] documentation that you can point to that says ‘this is what makes your data safe’”. And with respect to the people who have access to sensitive data, Essick highlighted the value of strict and rigorously enforced policies specifying exactly when data may be accessed. “It’s not just enough to write software that executes correctly that behaves responsibly,” Professor Essick continued, “we want to make sure that the other programmers that interact with us and the users that interact with our software understand that we care about [responsible programming], why we care... how we’re doing it and [that] they believe it.”

Noting that the technical solutions of computer science are necessary but not sufficient for promoting trust, Professor Essick urged students to strive for what he called ‘provable responsibility’.

Another key theme of the discussion of the contact tracing application related to the potential tradeoff between personal liberties and public interests. One student reported that, from the perspective of one of his interviewees, there may be no tradeoff at all: his interviewee was so worried about contracting the disease that he preferred that the application both be mandatory for all citizens and collect a wide range of personal data. Relatedly, another student speculated that because the pandemic has made the world so chaotic, people may be willing to sacrifice their liberties more than they ordinarily would. Many students expressed a willingness to share proximity-based personal data for the benefit of public health, with one student noting his comfort with increasingly ubiquitous iBeacon technology, which relies on similar data to offer proximity-based marketing. Others expressed a preference for a clear understanding of the goals of the application and how their data will be used. Professor Edenberg and Postdoctoral Fellow Hanin noted that, in addition to providing clarity in these respects, the application designers might also add a sunset provision specifying when the application will cease to collect or use any data. Such design elements may help to ensure that the application promotes public health without unduly interfering with personal liberties.

The final sessions of the semester helped students in the Responsible CS curriculum hone their ability to recognize that ethical values pervade computer science and its applications, to identify what those values are, and to discuss them openly, critically, and confidently. Reinforcing the importance of these sessions, Professor Edenberg reminded students, “we raise these ethical questions as you’re learning particular skills in programming so that you can learn to recognize that there are ethical questions at stake… [and] to give you the confidence to raise these questions.” Given the rich discussions across the three sets of engagements, the students seemed to appreciate the vision. As one student put it, “that’s why we’re having these ethical discussions... you can’t uncouple what you’re actually coding and what you’re actually creating from what you’re doing in the ethical discussions.”