Uncovering Ethical Concerns in Programming

Two colleagues are reviewing and discussing lines of code visible on a screen.

This post is part of a series following the progress of Ethics Lab’s collaboration with the Computer Science Department that began with the Mozilla-sponsored ResponsibleCS challenge.

On day one of Dr. Ray Essick’s Advanced Programming class, 85% of students agreed or strongly agreed that it is a computer scientist’s duty to consider the social impact of their work. At the same time, however, 48% also agreed or strongly agreed that if their program works and is maintainable, then they have fulfilled their job. The gap between those two answers suggests an opportunity to deepen students’ understanding of ethical complexity and to instill a greater sense of moral responsibility.

For the first ResponsibleCS session of the semester, Ethics Lab worked with Essick to revise an exercise from the spring. Instead of framing the activity around the question of whether computer science is value neutral, the new iteration took for granted that it is not and focused on exploring how and why that is the case. Specifically, the activity examined how ethical dilemmas can materialize at any stage of software development. Examples included designing a Tesla touchscreen or a smart home device, repurposing old software for a new medical device, and reviewing code.

Essick noted that the goal of the engagement was to ensure that students were “able to ask questions about how [their] code works—not only is it technically correct, but is it responsible? Is it addressing the ends of your users? Is it recognizing all of its user base?”

For example, students discussing the repurposing of software for a new medical device emphasized concerns about bias and inequity, questioning whether the software had been tested on a wide range of patients or whether key demographics had been excluded.

While discussing the Tesla touchscreen, one group of students reflected on the potential for unresponsive buttons to make driving unnecessarily dangerous. They also connected this concern to police brutality, positing that the touchscreen might be mistaken for a mounted laptop or phone, which could lead to the driver being pulled over.

Dr. Alicia Patterson, the Ethics Lab Postdoctoral Fellow leading the collaboration with Essick, noted that “students were really connecting [the questions raised] to some of their personal experiences, whether at internships or elsewhere,” which was another aim of the exercise.