People of ACM - danah boyd

August 9, 2022

We read you initially considered becoming an astronaut. What made you decide to pursue a career in computer science and, more specifically, to research social media networks?

(laughs) Even when I wanted to become an astronaut, I wanted to be a payload specialist (the scientist who runs the experiments). I’ve always loved math in particular. I stopped pursuing my dream of becoming an astronaut when I broke my neck, but I still went to college to study math. On my first day of college at Brown University, I was assigned to a mentor—Andy van Dam—who, unsurprisingly, told me to add his Introduction to Computer Science to my list of courses. He explained that computer science was math. He also enticed me to pursue computer graphics by explaining that it was applied linear algebra. Toy Story had just come out, and I was intrigued.

My interest in math and graphics led me to visualization. Under the supervision of Judith Donath at MIT, I began visualizing Usenet and email by exploring the networks that form when people interact with one another. I was fascinated—and disturbed—by how much could be learned by the digital traces people left behind. This prompted me to start getting involved in privacy research. Shortly afterwards, someone told me about Friendster, an early social network site. Even before the first news article was written about the service, Jeff Heer and I were visualizing the graph of Friendster connections and showcasing just how much could be discerned from these digital traces.
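To make that concrete, here is a minimal, hypothetical sketch of the underlying idea: turning interaction traces into a graph whose structure alone reveals a great deal. It uses networkx with invented names and messages, and it is not the actual Usenet or Friendster visualization work.

```python
# Hypothetical sketch: infer social structure from interaction traces.
# Names, messages, and the analysis choices are invented for illustration.
import networkx as nx

# "Digital traces": (sender, recipient) pairs, e.g., from email headers.
messages = [
    ("alice", "bob"), ("alice", "carol"), ("bob", "carol"),
    ("dave", "erin"), ("erin", "alice"), ("dave", "alice"),
]

G = nx.Graph()
G.add_edges_from(messages)

# Without reading a single message body, the graph already exposes
# who sits at the center of the network and how it clusters.
centrality = nx.degree_centrality(G)
print(sorted(centrality.items(), key=lambda kv: -kv[1]))
print(list(nx.community.greedy_modularity_communities(G)))
```

Even this toy graph hints at why such traces are both fascinating and disturbing: structure leaks information that no one explicitly disclosed.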

Your book, It’s Complicated, was published eight years ago, and the landscape and impact of social media evolve very quickly. If this were still a core focus of your work, what research question would interest you now with respect to how social networks impact the lives of teens?

Funnily enough, when I started my PhD at Berkeley, I decided that I wanted to stop doing research on social media. Instead, under the mentorship of Peter Lyman, I decided to focus my attention on how teenagers use technology. Little did I know, in those days before MySpace and Facebook, that this would require me to continue my research into social media. But what really drove my research then was unpacking myths. People buy into a lot of myths about technology. And they have a lot of hopes and fears about youth. Combined, this produces some fantastical cultural logics that get boiled down to variously intoned versions of “kids these days!” In response to those anxieties, I worked with an amazing network of researchers under the guidance of Mimi Ito, where we collectively wrote Hanging Out, Messing Around, and Geeking Out. But the moral panics continued to grow, especially around bullying and predation. My response, with It’s Complicated, was just that: it’s complicated.

What tends to frustrate me the most are deterministic assumptions about technology. Both zealots and critics tend to believe technology is at the center of everything. So, opposing views emerge which hold that technology is either making the world better or destroying it. This is especially common when adults fret over how young people use technology. I approach technology as a tool and an intervention. Certain futures are made easier because of technology, and certain futures get more complicated. We can make bets about the probabilities of certain futures, but we cannot see the future. We can only introduce other interventions.

For this reason, I like to look at what people do with technology, to see how they leverage it to shape their futures, how they understand the world around them, and how they make choices given their realities. If I were to dive back into this arena, I would start there and see where it goes. But one of the main reasons that I won’t anytime soon is that I made myself a promise. I have three children. I don’t want to study them, or their peers, because I don’t want to relate to them as a researcher and I don’t trust myself to be able to codeswitch in a way that would be good for them or for me. Instead, I’m relishing the opportunity to study infrastructure, to pursue my curiosity about boring things and try to bring them alive for others to see.

Why did you decide to found Data & Society? Why is the mission of this organization especially important at this time?

I firmly believe that research can help us see in new ways by complicating our existing assumptions and inviting us to think with new frames. Before I founded Data & Society, individual researchers were doing amazing work at different institutions across different disciplines, but I wanted to knit this diverse network together in the hopes that doing so could have greater impact on the broader conversations about technology, data, and the society we are co-constructing. I wanted to help enable multi-disciplinary research on privacy and invite people who hold different epistemic vantage points to find points of agreement and clearly articulate the sources of their disagreements. I wanted to weave together ideas, empower decision-makers, and strengthen the networks of people critically engaging with sociotechnical systems.

I am no longer involved in day-to-day work at the organization; I decided a few years ago to shift back to being a researcher. But, as a board member, part of what is exciting is to see how so many others’ visions have contributed to what Data & Society has become. Evolving alongside the ecosystem, those at Data & Society continue to pursue important research questions, but they also engage with policy questions and work to amplify the important work of others. They’ve built an amazing network that helps bring research into broader conversations and generate new knowledge that can inform discussions about our sociotechnical world.

That work is so important now precisely because so many folks are waking up to the complexities of living in a world shaped by data-driven technologies. It gives me tremendous joy to watch an organization I initiated grow up and enable so many amazing people to connect, think, and take action.

Along with Jayshree Sarathy, you wrote a recent article about the US Census Bureau’s decision to use differential privacy to modernize its disclosure avoidance procedures for the 2020 Census. Will you explain the controversy and how this ties in with the work you are doing on social inequity now?

The Census Bureau has a critical responsibility in the US to produce data that is essential for democracy, including the data used to apportion political representation and allocate financial resources. To achieve this mission, the Census Bureau is given the awesome responsibility of collecting data from the public. This is data that the public is required by law to provide. To ensure that this data is of high quality, to uphold a moral imperative, and to comply with federal laws, the Census Bureau must keep this data confidential. But doing so in our modern world is extraordinarily complicated.

The Census Bureau enacted disclosure avoidance protocols decades ago, and it has injected noise into its canonical product since 1990. But it has always done this quietly. The Census Bureau also does a lot of data processing work to ensure that the data best reflect what is to be measured. This has long involved different kinds of repair work to deal with situations where forms caught on fire or where people refused to participate. The bureau knows that data are made, not found. But the public often holds onto an illusion that census data are simply an act of finding people, aggregating their responses, and publishing tabulations.
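As a rough illustration of what “injecting noise” means under differential privacy, here is a minimal sketch of the Laplace mechanism applied to a single count. This is a toy under stated assumptions, not the far more elaborate TopDown Algorithm the bureau actually deployed for 2020; the block population and epsilon values are invented.

```python
# Minimal sketch of the Laplace mechanism, the basic building block of
# differential privacy. The count and epsilon values are hypothetical.
import numpy as np

def noisy_count(true_count: int, epsilon: float, rng: np.random.Generator) -> float:
    # Adding or removing one person changes a count by at most 1
    # (sensitivity = 1), so Laplace noise with scale 1/epsilon gives
    # epsilon-differential privacy for this single query.
    return true_count + rng.laplace(loc=0.0, scale=1.0 / epsilon)

rng = np.random.default_rng(seed=2020)
true_block_population = 83  # invented census-block count
for eps in (0.1, 1.0, 10.0):
    released = noisy_count(true_block_population, eps, rng)
    print(f"epsilon={eps:>4}: released count = {released:.1f}")
```

The point of the sketch is the knob: a smaller epsilon means more noise and stronger confidentiality, while a larger epsilon means more accuracy. Making that trade-off explicit and public, rather than quiet, is much of what made the 2020 change so visible and contested.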

The controversy over differential privacy is an epistemic one. How do we know what we know? What are statistics anyway? In this paper, Jayshree and I try to unpack how this controversy ruptured a long-standing “statistical imaginary” when the bureau attempted to be forthcoming about its data-making work and engaged people about the process without grappling with divergent views on statistics.

I am deeply committed to an equity-first approach to data. In my mind, this requires grappling with the complexities that go into the making of data. If we want people to be represented in the data, we cannot presume that voluntary participation will suffice, and we know for certain that fear of data being misused will produce an undercount. For that reason, confidentiality is non-negotiable. At the same time, I’m a firm believer that we need to move past the dream of an exhaustive approach to enumeration and start reckoning more deeply with data’s weaknesses, biases, and uncertainties—and taking steps to advance our approaches to capturing and modeling data to ensure that they can inform decision-making. This also means that we need to stop treating census data as perfect and precise. That is a consequential illusion. Census data were never perfect, and they never will be. But that doesn’t mean that these data aren’t extremely valuable and informative. They are even more powerful when we engage with their complexity. Moreover, because the Census Bureau is tasked to count everyone, these data continue to be more inclusive than any data captured by commercial entities. And, even if those data are flawed, the pursuit of an inclusive count matters tremendously to me.

Advances in computing technologies can happen so fast that they impact people’s lives in profound ways that are often difficult to understand. In broad terms, how can society catch up with the pace of innovation to ensure these technologies serve the public in responsible, ethical, and humane ways?

First off, there’s no “catching up”—we must accept that life around us will always shapeshift, sometimes because of technological innovation, sometimes because of forces outside of our control (from global pandemics to personal tragedies). As individuals, we need to develop resilience and empathy to respond to all that might come our way. Organizations and societies are made of people. Resilient organizations and resilient societies are composed of healthy and empathetic people connected across networks. When we talk about the dangers of technology and the scale of change, the problem isn’t the technology itself. It’s the ruptures to our social fabric. People experiencing food insecurity are not healthy. Those who are greedily engaging in arbitrage are not empathetic. Organizations and societies don’t collapse because of technology. They collapse when technologies are leveraged to reconfigure the arrangement of power and people in oppressive ways. Our response to these shifts cannot—must not—focus on the technologies themselves. What matters is how society is structured, and how these technologies shape those structures.

The project of governance is one of ensuring a particular social configuration. Governments are created to govern people at scale. Governance can be a collective project aiming for equity, but it can also be an abusive project driven by other values, such as financial gain or a desire to control or oppress. When technologies operate at scale, they must be governed. A market-based approach to governing technologies is at odds with equity. Thus, it’s of little surprise that our current configuration is shaped by lecherous financialized logics that benefit a few at the expense of many. Those costs are borne through instability and uncertainty, as much as by financial and political oppression. Our nation-state approaches to governance are also faltering, in no small part because these technologies have upended the geopolitical arrangement.

All of this is to say that we’re in for a rocky road, but the path out will require collective will, new frameworks for governance and, most likely, new governance bodies. Unfortunately, at present, those most invested in leveraging this disruptive moment are doing so with deeply concerning values at their core, values rooted in inequity, oppression, and greed.

This is not to say that nothing can be done. Much can and should be done. For a professional organization like ACM—and for those computer scientists whose moral compass starts with equity—the first step is to gain clarity on values and build networks of people and practices grounded in those values. No technologies are neutral. People and organizations who cannot reflexively center on values are easily manipulated by those seeking power. Welcome to the banality of evil. Very few technologies were created with malevolent intentions. They become toxic because decision-making centers on the wrong values. My hope is that ACM and its members will leverage this moment in time to invest more deeply in conversations about values and societal commitments. From reflexivity, healthier infrastructures can grow.

danah boyd (who styles her name in lowercase) is a Partner Researcher at Microsoft Research. In 2013, she founded Data & Society, a non-profit research organization that studies the development and governance of new technologies. She is also a Distinguished Visiting Professor at Georgetown University. For over a decade, her research focused on how young people use social media as part of their everyday practices. More recently, she has conducted research to understand how contemporary social inequities are shaped by data and algorithmic technologies.

Her 2014 book, It’s Complicated: The Social Lives of Networked Teens, was well received and has been translated into seven languages. The book shed light on how teenagers use social media as part of their everyday lives. She is currently working on a new book about the 2020 US Census, based on four years of fieldwork into how data are made and become legitimate. In addition, she has written dozens of papers and hundreds of essays and has given countless talks. Her honors include receiving the Electronic Frontier Foundation’s EFF Pioneer/Barlow Award and being named one of Forbes’ “America’s Top 50 Women in Tech.” boyd was a participant on the “Balancing Trust and Risk” panel as part of ACM’s 75th Anniversary Celebration on June 10, 2022.