With Obfuscation, Finn Brunton and Helen Nissenbaum mean to start a revolution. They are calling us not to the barricades but to our computers, offering us ways to fight today’s pervasive digital surveillance—the collection of our data by governments, corporations, advertisers, and hackers. To the toolkit of privacy-protecting techniques and projects, they propose adding obfuscation: the deliberate use of ambiguous, confusing, or misleading information to interfere with surveillance and data collection. Brunton and Nissenbaum provide tools and a rationale for evasion, noncompliance, refusal, even sabotage—especially for average users, those of us not in a position to opt out of or exert control over data about ourselves. Obfuscation will teach users to push back, software developers to keep their users’ data safe, and policy makers to gather data without misusing it.
Brunton and Nissenbaum present a guide to the forms and formats that obfuscation has taken and explain how to craft its implementation to suit the goal and the adversary. They describe a series of historical and contemporary examples, including radar chaff deployed by World War II pilots, Twitter bots that hobbled the social media strategy of popular protest movements, and software that can camouflage users’ search queries and stymie online advertising. They go on to consider obfuscation in more general terms, discussing why obfuscation is necessary, whether it is justified, how it works, and how it can be integrated with other privacy practices and technologies.
About the Authors
Finn Brunton is Assistant Professor of Media, Culture, and Communication at New York University and the author of Spam: A Shadow History of the Internet (MIT Press).
Helen Nissenbaum is Professor of Media, Culture, and Communication and Computer Science at New York University, where she is Director of the Information Law Institute.
Endorsements
—danah boyd, author of It’s Complicated: The Social Lives of Networked Teens and founder of Data & Society
—Lorrie Faith Cranor, Director, CyLab Usable Privacy and Security Laboratory, Carnegie Mellon University
—Ross Anderson, Professor of Security Engineering, Computer Laboratory, University of Cambridge