Exploring Values in Computer Science


This semester the Responsible CS team is partnering with Professor Ray Essick’s Advanced Programming course. The first session introduced students to how ethics intersects with computer science and previewed the values and concepts we will tackle throughout the semester as students engage with ethics in their CS curriculum. The session was framed around the question of whether computer science is value neutral. Ethics Lab’s Senior Ethicist Professor Elizabeth Edenberg and Postdoctoral Fellow Mark Hanin joined Professor Essick in leading the session.

The session began with a discussion of what makes values ethical, along with a survey of reasons one might think computer science is indeed value neutral. Students then collaborated in small groups to complete two worksheets designed to help them reassess those reasons. The worksheets challenged students to think carefully about values associated with natural events and simple artifacts before returning to computer science itself by analyzing the values implicated in the digital dashboard design of the Tesla Model 3.


Right away students recognized that values might be implicated in computer science, even if they could not identify precisely how at first. One student remarked that although the general public might believe computer science is value neutral because it is “just a tool” or “only ones and zeros,” computer scientists at Georgetown sense that values are an important part of the field. Professor Edenberg pressed students to clarify how ethical, as opposed to technical, values might be implicated. Some suggested that ethical values come into play because software design choices are the designer’s responsibility, or because those choices might affect others.

Postdoctoral Fellow Hanin added to these suggestions, proposing a role for virtues as well, asking, “What kind of person do you want to be? What kind of society or community do you want to foster?”

To prepare for considering values in a computer science context, students turned to a worksheet on values implicated in natural events and simple artifacts, voicing budding concerns about agency, expectations, and intent as relevant to whether something is value neutral. Some students suggested that because an avalanche is a natural event, it would make little sense to blame it for destroying a ski resort. As another student pointed out, however, since climate change may make avalanches more likely, humans may bear responsibility for their occurrence. Here Professor Edenberg emphasized the role that the capacity for agency plays in these assessments: unlike avalanches, humans have minds and some degree of control over their behaviors, which helps explain why they might be liable for those behaviors. Turning to artifacts like guns and wheelchairs, some students remarked that because these are designed with a purpose, their designers might be accountable if the intended purpose is harmful. Professor Essick highlighted how this shift in responsibility toward designers marks an interesting departure from the “just a tool” idea, which places responsibility entirely on the user.

Considering how values might be associated with natural events and simple artifacts primed students to think about values in the context of computer science. Regarding the digital dashboard of the Tesla Model 3, students highlighted concerns about safety and accessibility. They noted that since audio and temperature knobs have been replaced by in-menu options on a screen, it may now be harder to change these settings while attending to the road. Others observed that screens are likely to display additional information, like incoming texts and video, that may distract the driver further. Thus the choice to incorporate a fully digital display may directly affect the car’s safety and promote poor norms of driving behavior. It may also affect the car’s accessibility: students noted that some people may find it difficult to use or interact with the software depending on its default settings. Professor Essick echoed this observation, noting that even seemingly benign design choices, such as the default color scheme of map and menu displays, might greatly impair one’s ability to use the car. Students then sought to improve the design, suggesting modifications to promote safety and enhance accessibility.

In this session students explored how values may be implicated in computer science through the agency and responsibility of designers, expectations about how software and algorithms will be used, and the effects software and algorithms are likely to have on people. As Professor Edenberg stressed, ideally questions of values and responsibilities will arise prospectively and not just retrospectively.

Professor Essick agreed, noting that “we want to avoid stumbling into these problems… we want to think about these questions now before you write a million lines of code. That’s the best time... [so] we have time to take action and change.”