The Gray Space: Complicating Control and Care

Against a background of diamonds, two identical muted green desk spaces, each with a PC, clock, and window. Between them is a different kind of workspace: a vibrant violet light of creativity shines through the window onto the desk.

The Fritz Family Fellows Program is a joint effort among Georgetown’s three campuses and nine schools to harness technology for the betterment of humanity. The fellowship program aims to cultivate the next generation of leaders with expertise in the social impacts of technology and to build a network of public interest technologists who learn from and support each other’s work.

Georgetown students Meera Kolluri (CCT ’22), Joyce Yang (SFS ’23), and Maddox Angerhofer (SFS ’23) were awarded Fritz Fellowships for the 2021–22 academic year to study the intersection of surveillance technologies, cyberlaw, and design justice, with the goal of supplementing theories of control with theories of care. Advised by CCT Professor and Ethics Lab Fellow Dr. Meg Leta Jones and Ethics Lab Assistant Director Jonathan Healey, the Fritz Fellows are focusing on biometric surveillance case studies in three sectors: policing (an alcohol-monitoring ankle bracelet), labor (a health and lifestyle program), and education (online test proctoring). In this interview, the Fritz Fellows discuss how diving into the case studies has complicated their early ideas of ‘control’ and ‘care’...


Q: What case studies are you looking at, and how do they contribute to your understanding of control and care?

Meera: I think [our research has] actually broadened control and care for me—I don’t think it’s given me a specific answer—and I think that’s important because it’s not just about the product or biometric surveillance. In policing, for example, the body matters more than the product or software and is more essential to the working of the system of surveillance. 

Maddox: The case for labor is called Go365, a workplace wellness program offered by insurance companies. The specific case we chose is Humana’s, a mobile app that awards points based on healthy lifestyle changes [...] The overall goal of it is to improve people’s health, but there are some interesting mechanisms at play, such as the game of buying health and changing the relationship that you have with your provider [...] to one in which the patient has access to more information and is taking more preventative steps towards their own care. It has certainly brought to light the complex dynamics between control and care, and that they’re not entirely separate and will sometimes feed into each other, as one of the goals of collecting this kind of information is to empower the user and give the insurance policyholder more control over their own health.

Meera: We have a constant conversation about whether care is good and control is bad, and that’s not necessarily the case and there’s a big gray space that’s in there, too. [...] It isn’t shocking that we’re being monetized in a system that has a business plan. So, tracking the ethics of it is something that needs to be fluid and match the upcoming legislation and policies with each case because everything’s moving simultaneously with lots of moving parts. 

Maddox: Even though we were selecting our case studies based on examples of technology that was more control-oriented and could move towards care, in the case of Go365 I actually found a few positive care examples [...] For example, the app was designed to be more open-ended and interoperable with a variety of different wearable technology devices [...] I think this is a good example of ways that designers can leave the opportunity for people to exercise their own choice, even while participating within the system they’ve designed. 

Joyce: When we all met as a group with our faculty fellows, we had three cases for each sector that we were choosing from. One point that really stuck out to me from Dr. Jones was her saying that we should choose one that isn’t the most obviously ‘bad’ example so that during discussion we could have a debate about whether it is control or care, and how we can shift it in a certain direction. That was really eye-opening for me because I was very much leaning towards thinking that surveillance is obviously scary since it goes into our personal information. But focusing on the gray areas and complicating things even more has been where I’ve learned the most and somehow gotten more concrete information from these case studies.