NSA Leaks and the Future of Surveillance
- Posted on July 29, 2013 at 4:42 pm by gabe.
Authored by Gabe McCoy and Kevin Hamilton, mostly before the most recent revelations (and also heavily revised in light of them).
After the Snowden leaks about PRISM, we still don’t know much about exactly how the government is spying on its citizens. All signs point to a level of surveillance that is taking place deep inside realms we traditionally define as “private.”
One thing we know, however, is at least one approach the NSA is taking to the law, and this approach does not bode well for the future. Their argument suggests a much more conservative approach to privacy than they seem to have actually adhered to, if XKeyscore is working as advertised. But even if they “retreat” from the current level of snooping, to a level that conforms to their expressed standard, problems will remain. At the heart of their approach – and much of the outrage – is an inadequate approach to the definitions of private and public.
Much of the press coverage so far has focused on the NSA’s contentious legal distinction between monitoring personal communication and monitoring metadata associated with that communication.
Here’s how the NSA documents discuss it:
“On the fourth amendment and metadata: This provision protects against the unreasonable search and seizure of the contents of a communication in which a person has a reasonable expectation of privacy.[…] We conclude that a person has no such expectation, however, in dialing, routing, addressing, or signaling information that does not concern the substance, purport, or meaning of communications. We reach this conclusion with respect to “metadata” associated with both telephone calls and electronic communications.”
Now, if the NSA is actually searching as broadly as XKeyscore seems to be, then either they aren’t holding to this approach, or they have decided to treat all internet communication as a process of “dialing, routing, or signaling information.”
But for the sake of argument, let’s imagine a scenario wherein our government, perhaps even Congress, decides that the NSA has overstepped its bounds. They rule that no agency should look at anything but the “address on an envelope” as it were, and not the envelope’s contents. This would not solve the problem.
“Dialing, routing, addressing or signaling information” makes up a significant share of internet communication. Emails and webpages carry content when they fly through networks, but they are preceded by routing information of many kinds – requests from machine to machine.
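To make the “envelope” concrete: even a bare page request carries addressing metadata before any content flows back. The sketch below is purely illustrative – the hostnames are hypothetical and nothing here reflects any actual NSA tooling – but it shows how the “envelope” fields of a raw HTTP request can be separated from its content:

```python
# A raw HTTP request. Before any page content is returned, the
# request itself already exposes "addressing" metadata: who is
# talking to whom, about which resource, and coming from where.
raw_request = (
    "GET /articles/surveillance HTTP/1.1\r\n"
    "Host: example.com\r\n"
    "User-Agent: Mozilla/5.0\r\n"
    "Referer: http://search.example/?q=privacy+law\r\n"
    "\r\n"
)

def extract_metadata(request: str) -> dict:
    """Pull the 'envelope' fields out of a raw HTTP request."""
    lines = request.split("\r\n")
    method, path, _version = lines[0].split(" ")
    headers = dict(
        line.split(": ", 1) for line in lines[1:] if ": " in line
    )
    return {"method": method, "path": path, **headers}

meta = extract_metadata(raw_request)
print(meta["Host"], meta["path"])  # example.com /articles/surveillance
```

Notice that even this “envelope” reveals the page requested and, via the Referer header, the search query that led there – a reminder of how much the addressing layer alone can say.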
The initial PRISM revelations, interpreted most generously, paint a picture of government requests to the providers of personal communication. “We just want to look at the envelope,” they seem to have said to Google, Facebook, and Microsoft.
But there are other companies we might have imagined on the initial PRISM lists and didn’t find. If the NSA asked Facebook, for example, to see the “envelopes” of messages, why weren’t they asking companies like Akamai to see the “envelopes” of webpage requests made to their servers?
Perhaps they did, and we just don’t know it yet. But the more recent revelations about XKeyscore seem to indicate that they didn’t need to. They are tapping directly into all traffic, indexing it, and saving it for as long as they can (three to five days).
In any case, let’s say (VERY hypothetically) that the NSA did actually stick to looking at “the envelope” but not the content of our web browsing, determining that a request to access a webpage is something like a public act. Even then, current technology allows for a great deal of inference about a person just based on an accumulation of such “public acts.”
If agencies like the NSA are forced to take a step back into the “public” realm, they will develop better “glasses” for seeing into the “private” realm. It is not inconceivable, for example, that an agent looking to track the web-browsing activity of a single household on your city block, without actually tapping that household’s internet connection, could mathematically infer the desired data from an anonymized list of all web activity for your block. It’s not the mythical “enhance” function common to facial recognition technologies in science fiction movies – but it’s close.
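A toy illustration of that kind of inference (not any actual NSA method – the site names and scoring are invented for this example): even when per-user labels are stripped from an activity log, a distinctive pattern of site visits can single one household out of the crowd.

```python
# Toy de-anonymization sketch: match a known browsing "fingerprint"
# against anonymized sessions by set overlap. All hostnames are
# hypothetical; this only illustrates the inference idea.
known_habits = {"newsblog.example", "rarehobby.example", "localclub.example"}

anonymized_sessions = {
    "session_a": {"video.example", "mail.example", "social.example"},
    "session_b": {"newsblog.example", "rarehobby.example",
                  "localclub.example", "mail.example"},
    "session_c": {"mail.example", "search.example"},
}

def best_match(habits: set, sessions: dict) -> str:
    """Score each anonymous session by Jaccard overlap with the known habits."""
    def jaccard(a, b):
        return len(a & b) / len(a | b)
    return max(sessions, key=lambda sid: jaccard(habits, sessions[sid]))

print(best_match(known_habits, anonymized_sessions))  # session_b
```

A handful of distinctive sites is often enough: the rarer the habit, the sharper the fingerprint, which is why “anonymized” aggregate data offers weaker protection than the word suggests.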
As Helen Nissenbaum argues at length in her book Privacy in Context, our reliance on a hard distinction between public and private has often hindered progress in these debates and discussions. Private and public, for Nissenbaum, aren’t only fuzzy categories. They also reinforce certain approaches to individual and society, intimacy and morality, that do not hold for all people in all social contexts. We need, she argues, a better understanding of how people need information to flow between spheres of influence and action.
How can we implement such a broader understanding of “contextual integrity” in information sharing? We may need new metaphors – space and vision sometimes don’t seem adequate to the task – and we certainly need robust implementation in our infrastructures. For example, many have pointed to the need for more built-in provenance information in our data-selves – databases should contain, they argue, not only containers for information, but containers for memory about who accessed the information, when and where.
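As a minimal sketch of those “containers for memory” – hypothetical class and field names, not any existing database feature – a store can log every read alongside the data itself, so that the record of who looked becomes part of the record:

```python
# Sketch of provenance-aware storage: each read is logged with
# who asked and when, so the data carries its own access history.
from datetime import datetime, timezone

class ProvenanceStore:
    def __init__(self):
        self._data = {}
        self._access_log = []  # (accessor, key, timestamp) per read

    def put(self, key, value):
        self._data[key] = value

    def get(self, key, accessor):
        self._access_log.append(
            (accessor, key, datetime.now(timezone.utc).isoformat())
        )
        return self._data[key]

    def history(self, key):
        """Who has looked at this record, and when."""
        return [(who, when) for who, k, when in self._access_log if k == key]

store = ProvenanceStore()
store.put("email:42", "message body")
store.get("email:42", accessor="agent_x")
print(store.history("email:42"))
```

The design choice worth noting is that `history` is queryable by the same interface as the data: accountability lives next to the information, rather than in a separate audit system that may or may not be consulted.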
The new metaphors may come from many places – perhaps even from within the history of art and architecture, where the social and political relationships of seen to unseen have occupied people for centuries. More on that to come.
- Learn more about Helen Nissenbaum’s work here: http://www.nyu.edu/projects/nissenbaum/