BY Jessica Leigh Hester for The International Chronicles
Even in the middle of a major city, it’s possible to go off the grid. In 2016, the Atlantic profiled a family in Washington, D.C., that harvests all of its household energy from a single 1-kilowatt solar panel on a patch of cement in their backyard. Insulated, light-blocking blinds keep upstairs bedrooms cool at the peak of summer; in winter, the family gets by with low-tech solutions, like curling up with hot water bottles. “It’s a bit like camping,” one family member said.
If extricating yourself from the electrical grid is, to some degree, a test of moxie and patience, extracting yourself from the web of urban surveillance technology strains the limits of both. If you live in a dense urban environment, you are being watched, in all kinds of ways. A graphic released by the Future of Privacy Forum highlights just how many sensors, CCTV cameras, RFID readers, and other nodes of observation might be eyeing you as you maneuver around a city’s blocks. As cities race to outfit themselves with smart technologies, it’s nearly impossible to know precisely how much data they’re accumulating, how it’s being stored, or what they’ll do with it.
“By and large, right now, it’s the Wild West, and the sheriff is also the bad guy, or could be,” says Albert Gidari, the director of privacy at Stanford Law School’s Center for Internet and Society.
The various nodes where sensors and other tech could detect your movements through the city. Image from the Future of Privacy Forum.
Smart technologies can ease traffic, carve out safer pedestrian passages, and analyze environmental factors such as water quality and air pollution. But, as my colleague Linda Poon points out, their adoption is also stirring up a legal maelstrom. Surveillance fears have flared in Oakland, California; Seattle; and Chicago, and the application of laws protecting citizen privacy is murky. For instance: data that’s stored on a server indefinitely could potentially infringe on the “right to be forgotten” that’s protected in some European countries. But accountability and recourse can be slippery, because civilians can’t necessarily sue cities for violating privacy torts, explains Gidari.
What would it look like to leapfrog that murkiness by opting out entirely? Can a contemporary urbanite successfully skirt surveillance? I asked Gidari and Lee Tien, a senior staff attorney at the Electronic Frontier Foundation, to teach me how to disappear.
During the course of our conversations, Tien and Gidari each remind me, again and again, that this is a fool’s errand: You can’t truly hide from urban surveillance. In an email before our phone call, Tien points out that we’re not even aware of all the traces of ourselves that are out in the world. He likens our data trail—from parking meters, streetlight cameras, automatic license plate readers, and more—to a kind of binary DNA that we’re constantly sloughing. Trying to scrub these streams of data would be impossible.
Moreover, as the tools of surveillance have become more sophisticated, detecting them has become a harder task. “There was a time when you could spot cameras,” Tien says. Maybe a bodega would hang up a metal sign warning passersby that they were being recorded by a clunky, conspicuous device. “But now, they’re smaller, recessed, and don’t look like what you expect them to look like.”
Other cameras are in the sky. As BuzzFeed has reported, some federal surveillance technologies are mounted in sound-dampened planes and helicopters that cruise over cities, using augmented reality to overlay a grid that identifies targets at a granular level. “There are sensors everywhere,” Gidari says. “The public has no ability to even see where they are.”
The surest way to dodge surveillance is to not encounter it in the first place—but that’s not a simple ask. While various groups have tried to plot out routes that allow pedestrians to literally sidestep nodes of surveillance, they haven’t been especially successful. In 2013, two software developers released a beta version of an app called Surv, which aspired to be a crowdsourced guide to cameras mounted in cities around the world. The app would detect cameras within a 100-meter radius of the user’s phone, but it failed to meet its crowdfunding threshold on Kickstarter.
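To make the mechanism concrete, here is a minimal Python sketch of how a crowdsourced camera-avoidance tool in the spirit of Surv might check for nearby nodes. It assumes a user-reported position and a shared list of camera sightings; the coordinates, labels, and function names are hypothetical, not Surv’s actual code.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points on Earth, in meters."""
    r = 6371000  # mean Earth radius in meters
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical crowdsourced database of camera sightings: (label, lat, lon).
CAMERAS = [
    ("intersection cam", 40.7411, -73.9897),
    ("storefront cam",   40.7420, -73.9885),
    ("transit platform", 40.7502, -73.9935),
]

def cameras_nearby(user_lat, user_lon, radius_m=100):
    """Return every reported camera within radius_m of the user's position."""
    return [label for label, lat, lon in CAMERAS
            if haversine_m(user_lat, user_lon, lat, lon) <= radius_m]

print(cameras_nearby(40.7413, -73.9900))  # cameras reported within 100 meters
```

The hard part, of course, is not the distance check but the database: a tool like this lives or dies on volunteers reporting cameras that are increasingly designed not to be noticed.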
The most effective solutions are also the least practical ones. To defeat facial recognition software, “you would have to wear a mask or disguises,” Tien says. “That doesn’t really scale up for people.” Other strategies include makeup that screws with a camera’s ability to recognize the contours of a human face, or thwarting cameras by blinding them with infrared LED lights fastened to a hat or glasses, as researchers at Japan’s National Institute of Informatics attempted in 2012. Those techniques are hardly subtle, though—in trying to trick the technology, you would stick out to the naked eye. And as biometrics continue to advance, cameras will likely become less dupable, too. There are also legal hiccups to consider: Drivers who don’t want city officials to know where or when they parked, Gidari says, would have to outwit license plate recognition tools by obscuring their plates, such as with the noPhoto camera jammer, a new $399 device that fires a flash at red light cameras in an attempt to scramble a readable image. Obscuring license plates is already illegal in many cities and states, and others are weighing similar restrictions.
LED glasses might not trick biometric cameras—but they will definitely attract the attention of folks on the street. Photo from the National Institute of Informatics.
In their book Obfuscation: A User’s Guide for Privacy and Protest, Finn Brunton and Helen Nissenbaum, both professors at New York University, champion a strategy of “throwing some sand in the gears, kicking up dust and making some noise,” essentially relying on the melee of data jamming to “hide in a cloud of signals.” A number of apps, websites, and browser extensions attempt to aid users in this type of misdirection—say, for instance, by running in the background of your regular web activities, trying to cover your digital tracks by throwing surveillance off your scent.
For example: A site called Internet Noise searches for randomized phrases and opens five fresh tabs every ten seconds. (I left it running as I wrote this, and now my browser history includes pictures of badgers, an online mattress store, an NPR article about the Supreme Court, and a research paper about gene mutation in hamsters.) As a cloaking technique, it’s not a perfect veil, writes Emily Dreyfuss in Wired: “It’s actually too random. It doesn’t linger on sites very long, nor does it revisit them. In other words, it doesn’t really look human, and smart-enough tracking algorithms likely know that.” The site is more of a protest over Congress rolling back a not-yet-implemented FCC regulation that would have barred ISPs from selling users’ browsing history.
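For a sense of how simple this kind of noise generation is, and why it is easy for tracking algorithms to spot, here is a rough Python sketch of the same idea. It is not the site’s actual code; the word list, timing, and search URL are placeholders.

```python
import random
import time
import webbrowser

# Placeholder vocabulary; a real noise generator would draw from a far larger list.
WORDS = ["badger", "mattress", "supreme court", "gene mutation",
         "hamster", "weather balloon", "sourdough", "tuba repair"]

def make_noise(rounds=3, tabs_per_round=5, interval_s=10):
    """Open batches of random search queries, mimicking aimless browsing."""
    for _ in range(rounds):
        for _ in range(tabs_per_round):
            query = " ".join(random.sample(WORDS, 2))
            webbrowser.open_new_tab(
                "https://www.google.com/search?q=" + query.replace(" ", "+"))
        time.sleep(interval_s)

# make_noise()  # uncomment to try it; expect a pile of new browser tabs
```

The uniform timing and the lack of return visits are exactly the tells Dreyfuss describes: the traffic doesn’t look human.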
Still, Tien advocates a certain degree of self-protection. He views these measures as a kind of digital hygiene—the “equivalent of washing your hands when you go to the bathroom,” or getting a flu shot. But he stresses that they’re only a partial prophylactic: “Nothing that will make you immune from the problem.”
Other techniques include employing Tor—a network that tries to anonymize the source and destination of your web searches by routing traffic along a convoluted path—and Signal, which offers encrypted messaging and phone calls. The Electronic Frontier Foundation’s Surveillance Self-Defense toolkit also suggests particular tools and behaviors for specific scenarios. People participating in protests, the guide suggests, might consider stripping metadata from photos, to make it harder to match them with identities and locations. But this isn’t a perfect solution, either, Tien says, because you can only control what you post. “If I take a picture and scrub the metadata, that’s one thing,” Tien says. “If my friend takes a picture of me, I can’t do anything about that.” The Intercept produced a video with step-by-step instructions for phone security at a protest, from adding an access passcode to turning on encryption settings.
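Stripping that metadata can be as simple as re-saving the image without its embedded EXIF block. Below is a minimal Python sketch using the Pillow imaging library; it is one common approach rather than the EFF’s prescribed tooling, and the file names are placeholders.

```python
from PIL import Image  # Pillow imaging library

def strip_metadata(src_path, dst_path):
    """Copy only the pixel data into a new file, dropping EXIF fields
    such as GPS coordinates, timestamps, and camera identifiers."""
    with Image.open(src_path) as img:
        clean = Image.new(img.mode, img.size)
        clean.putdata(list(img.getdata()))
        clean.save(dst_path)

strip_metadata("protest_photo.jpg", "protest_photo_clean.jpg")
```

Re-encoding the pixels, rather than editing EXIF fields in place, sidesteps the risk of leaving stray metadata behind, at the cost of losing the original file’s compression settings.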
On a daily basis, Tien tells me, “I don’t think you or I can exercise much meaningful self-help against the kind of tracking we’ll be seeing in real-world physical space.” That’s fodder for a point he makes about a fundamental asymmetry in the information that’s available to the bodies that install the cameras and those who are surveilled by them. There are relatively few laws relating to the expectation of privacy in a public space. The officials and organizations that install sensors, cameras, and ever-more-sensitive devices, he says, “have much more money than you do, much more technology than you do, and they don’t have to tell you what they’re doing.”
Ultimately, Tien and Gidari both take a long view, arguing that the most payoff will come from pushing for more transparency about just what this technology is up to. Part and parcel of that, Tien says, is resisting the idea that data is inherently neutral. The whole messy, jumbled mass of it contains information that could have tangible consequences on people’s lives. Tien says citizens need to remind their elected officials what’s at stake with data—and in the process, maybe “dampen their enthusiasm” for the collection of it.
He points out that sanctuary cities could be a prime example. There, he says, some advocates of immigrant rights are realizing that data collected via municipal surveillance “might not be such a good thing when we’re interested in protecting immigrants and the federal government is interested in deporting them.”
The practical strategies for opting out—of becoming invisible to some of these modes of surveillance—are imperfect, to say the least. That’s not to say that data collection is inherently nefarious, Gidari says—as he wrote in a blog post for the CIS, “no one wants to live in a ‘dumb’ city.” But opting out, he argues, shouldn’t be a burden citizens have to shoulder at all: “I don’t think you should have been opted in in the first place.”