There has been a lot of alarming speculation in the media since February about the potential consequences of the FAA Modernization and Reform Act, which requires that the FAA prepare the national airspace for the introduction, in 2015, of privately owned and operated unmanned aerial vehicles (UAVs). The airspace already hosts UAVs flown by federal, state, and local governments; the Act makes it easier for such agencies to acquire them and permits private entities to get licenses to fly them too. It is designed to get as many drones as possible in the air as quickly as possible.
Much ink has been spilled speculating about the potential effects of a widespread drone presence in this country, mostly focusing on either their ramifications for privacy or on the potential for physical injury they represent. These observations fail to address what makes a sky full of drones so radically unsettling.
Drones are going to be used to gather data, and the data will be integrated into the marketing scan. All drones gather at least the data they need in order to function remotely, and some of them will be able to photograph in staggeringly high resolution, or track up to 65 separate people at once. They won’t all be doing this, obviously, but the FAA’s licensing process doesn’t require drone operators to go into detail about what their vehicles will carry or collect.
We also know that data about people’s movements and behavior is hugely valuable to marketers. It is already collected unobtrusively from us as we move around the virtual space of the Internet. In an important sense, that space is already patrolled by data-collecting bots much like the drones that will soon be operated in the national airspace by private entities. There is every reason to think that the data collected by airborne drones will be just as interesting to the purchasers of bot-collected online behavioral data.
Of course, much of our physical-space movement is monitored already, and it is possible to aggregate this information to create an eerily complete picture of a person’s movements, social circle, and preferences. Credit cards, license plate scanners, CCTV cameras, transit passes, and smartphones are all sources of this information.
Onto this web of information, drones will add a layer of photographic evidence. The marketing scan of the online drone will merge into the marketing scan of the physical-space drone, and the result will be that we are even more easily identified, tracked, tagged, and followed. Privacy advocates are justly concerned about the erosion of basic notions of privacy by ubiquitous monitoring.
This is a danger separate from safety hazards, because it undermines one of the most basic presumptions of freedom – the absence of arbitrary power. Conceptually, the danger potentially posed by the coming drone squadrons can be separated from privacy concerns, too. The concept of the panopticon (likely familiar to many of you) illustrates the loss of freedom that accompanies arbitrary power, and shows how distinct it is from the lack of contextual integrity that marks an absence of privacy.
The panopticon exemplifies the reality of arbitrary power. The English philosopher Jeremy Bentham invented the Panopticon: a prison in which guards can watch prisoners without the prisoners knowing whether they are being watched. The architectural design features a central guard tower from which a single guard can see into every cell in the prison. Bentham reasoned that this architecture would compel prisoners to behave well at minimal cost, since far fewer resources would need to be invested in guarding them.
Nearly two centuries later, the French philosopher Michel Foucault observed that the “panoptic mechanism” exists in the abstract, as a form of social control. A panoptic arrangement exists wherever there is ongoing subjection to a “field of visibility.” Drones create this field literally: they could be watching at any time, but it will be impossible for us to know at any given moment whether we are being observed. The constant subjection to this field, coupled with the capacity for this data to be used by the government to punish or by the marketing scan to determine what information we receive, means that our rational self-interest will lead us to self-censor. We are already seeing this play out socially; people have developed strategies like maintaining separate social network identities for personal and business use, or paying cash for transit passes to avoid being traceable via credit card.
Domestic drones taking photographs or video won’t significantly change this dynamic. They will push it further toward an extreme, in which it becomes harder and harder to extricate ourselves from the marketing scan, and in which the marketing scan and the eye of the State merge (because law enforcement will have ready access to privately owned and aggregated data).
My point in writing this is not to challenge anyone to come up with a “solution,” but rather to point out that the negative effects of drone presence are not exemplified by their security vulnerabilities or their tendency to drop out of the sky. Abstract as it might seem, the increased power and intensity of this “field of visibility” is what will affect our lives the most. It will shape the distribution of information through the marketing scan, and for that reason we will eventually become aware of it. And as the reality of our observed status sinks in, we will rationally self-monitor in case we’re being recorded. This state of being poses a radical threat to the way we think about freedom.
We’d like to invite you to check out the following theory-oriented articles in our Fall 2012 issue dedicated to Big Data:
- Jelani Nelson, “Sketching and streaming algorithms for processing massive data”
- Ronitt Rubinfeld, “Taming big probability distributions”
- Jeff Ullman, “Designing good MapReduce algorithms”
- Ashwin Machanavajjhala and Jerome P. Reiter, “Big Privacy”
This issue of XRDS was edited by a multidisciplinary team of students – Aditya Parameswaran of Stanford, Andrew Cron of Duke, and Huy L. Nguyen of Princeton. From the editors’ letter:
“It has been an interesting time for big data with innovations coming simultaneously from theorists, system builders, and scientists or application designers. We hope to provide readers with an idea of the interplay between developments in these three different communities … [that] together drive forward the development of big data analysis.”
Hope you enjoy reading this issue (feel free to let us know what you think in the comments)!