Ethics Lab alum awarded grant to circumnavigate the globe researching privacy & voting rights
“There’s blind acceptance a lot of the time that tech at least won’t hurt—and that’s dangerous.”
We recently sat down with Ethics Lab alum Fiona Singer, SFS’20, to talk about her plans for the SFS Circumnavigators Grant* she was recently awarded.
[*Hot air balloon not included.]
In September of 2018, hackers gained access to the accounts of 50 million Facebook users. One of those users was Fiona Singer, an SFS junior who happened to be enrolled in a course on International/Comparative Privacy & Surveillance taught by Assistant Professor and Ethics Lab Faculty Affiliate Meg Jones, JD, PhD.
When she was prompted to reset her Facebook password following the breach, Fiona thought back to 2016 and remembered seeing a lot of political ads for the Brexit Leave campaign. She’s a UK citizen, and her Facebook page reveals that she has a lot of family in the UK. With all the news about data firms’ election interference, she wondered—had she been targeted? “It’s weird to connect the dots about how something so abstract happens in person.”
When news of the Facebook breach broke, Fiona was focused on de-blackboxing body worn video for her Privacy & Surveillance course project. Since then, she’s turned her attention to privacy as a lens for studying voting rights. As she explains, body worn video and internet tracking are both “examples of diffuse surveillance. Most people think it’s there to help them, not hurt them. But when you look at their use in larger systems, they can do harm. There’s blind acceptance a lot of the time that tech at least won’t hurt, and that’s dangerous. Tools like Facebook and Google can empower movements—and that’s important—but I do think there needs to be a level of critical thought about how technology is implemented in society. People need to begin to have nuanced conversations about it, especially people my age.”
And that’s exactly what Fiona plans to do. As a grantee of the Circumnavigators Club Foundation (which provides support for research projects that “explore an international problem or issue and generally contribute to our understanding of world conditions”), she will spend the summer of 2019 traveling the world to speak with NGOs, educators, press, and civic interest groups on the subject of privacy and government. She’ll be traveling to Belgium, the UK, South Africa, Vietnam, Singapore, and Argentina—all countries that have faced election manipulation since 2005. Her interviews will cover such topics as current legislation, disparities between legislation and implementation, impact, and popular understanding of privacy. As Fiona points out, there is very little literature on these topics in South Africa, for example, in spite of continual interference by data firms (including Cambridge Analytica’s parent company) since 1992. “Most people don’t know Cambridge Analytica has been active in South Africa and India for decades. It’s not just in the United States, and we shouldn’t just care about Brexit. It’s a far bigger issue.”
Follow Fiona’s journey this summer: @fiona.singer (Instagram) and at https://medium.com/@fionasinger
Explore her Privacy & Surveillance de-blackboxing project below:
“I’ve always been curious about surveillance technology, and wanted to explore the rapid and systemic integration of Body Camera Technology in the last few years. I began where many Americans do, with the basic premise that Body Worn Video (BWV) is a tool for good, as it holds law enforcement accountable and increases transparency. As I researched BWV, it became clear to me that police do waive privacy rights disproportionately for traditionally policed communities; in a world where everyone is watched, not everyone is watched equally. BWV is like any other technology: it inherently isn’t ‘good’ or ‘bad.’ Instead, its integration can be harmful without critical thought about who it will impact.”