Students Focus on Ethical Challenges in Privacy

Dr. Elizabeth Edenberg speaks to the class.

In this semester’s final session of Advanced Programming, Georgetown’s Responsible CS Team focused on ethical challenges relating to privacy and consent that platform developers face when collecting user data. The session was co-led by Professor Elizabeth Edenberg, Senior Ethicist at Ethics Lab, and Professor Essick.

In advance of class, students were asked over Thanksgiving break to engage in thoughtful conversations with others about sharing personal data in various contexts. To facilitate these conversations, students were given a fictional data request form from a public-private partnership requesting a broad range of data points, from routine to highly intrusive. These forms were designed to prompt conversations about the different comfort levels people have with sharing various types of personal data and about which entities they see as trustworthy stewards of that data. The exercise served two purposes. First, it helped students practice speaking with people outside the classroom about key issues in digital ethics. Second, it encouraged students to uncover disagreements about technology and thereby see the value of exploring diverse perspectives before assuming that their own choices are universal.

The diverse perspectives students brought to bear generated a fruitful basis for class discussion. One interesting difference of opinion centered on differing degrees of trust in private companies versus the government when it comes to collecting personal data. As one student reported, their interlocutor thought that a private company poses greater privacy risks given its profit motive to commercialize and sell data. Another student countered that the coercive mechanisms at the government’s disposal make it the greater threat to privacy.

The topic of consent emerged as a prominent theme in relation to terms of service and fine print. Students grappled with the ethical questions raised by the ubiquitous practice of accepting terms of service without carefully reading or understanding them. Some students thought that companies can do little more than spell out exactly how data will be used and expect users who voluntarily opt in to read those terms before agreeing. Another student was quite skeptical that users will ever read terms closely unless more interventionist strategies, such as verification questions, are used. Professors Essick and Edenberg problematized the idea of opt-in as a proxy for consent. As Professor Edenberg put it, real consent entails an “ability to do otherwise that’s more than in name only.” Professor Essick noted that while access to leading social media platforms isn’t, strictly speaking, a right, practically speaking it may be hard to get by without access. “Even more challenging than the issue of social media,” Professor Edenberg added, “are digital platforms adopted by our schools and jobs that offer no real opportunity to opt out. Communicating and collaborating electronically is an essential part of many jobs. In this context, we might need to think more critically about current consent practices that assume that use of digital platforms is a voluntary choice.”

Encouraging computer scientists to have conversations with people who hold different views about data privacy underscored that design choices are value-laden, that one’s own views are not universally shared, and that seeking input from the plurality of voices affected by programming decisions is critical to responsible programming. To “keep software in sync with our norms and values,” as Professor Essick noted, ethical issues need to be flagged and raised early in the design process, “even if a programmer doesn’t know the right answer or final answer.”

This project is funded by a Mozilla Responsible Computer Science Challenge grant and is part of the university’s Initiative on Tech & Society, which is training the next generation of leaders to understand technology’s complex impact on society and develop innovative solutions at the intersection of ethics, policy and governance.
