By Kevin Kelly
Image: Twentieth Century Fox & Dreamworks
I once worked with Steven Spielberg on the development of Minority Report, derived from the short story by Philip K. Dick featuring a future society that uses surveillance to arrest criminals before they commit a crime. I have to admit I thought Dick’s idea of “pre-crime” to be unrealistic back then. I don’t anymore.
Most likely, 50 years from now ubiquitous monitoring and surveillance will be the norm. The internet is a tracking machine. It is engineered to track. We will ceaselessly self-track and be tracked by the greater network, corporations, and governments. Everything that can be measured is already tracked, and all that was previously unmeasurable is becoming quantified, digitized, and trackable.
We’re expanding the data sphere to sci-fi levels and there’s no stopping it. Too many of the benefits we covet derive from it. So our central choice now is whether this surveillance is a secret, one-way panopticon — or a mutual, transparent kind of “coveillance” that involves watching the watchers. The first option is hell, the second redeemable.
We can see both scenarios beginning today. We have the trade-secret algorithms of Google and Facebook on one hand and the secret-obsessed NSA on the other. Networks require an immune system to remain healthy, and intense monitoring and occasional secrets are part of that hygiene to minimize the bad stuff. But in larger doses secrecy becomes toxic; more secrecy requires more secrets to manage and it sets up a debilitating auto-immune disease. This pathology is extremely difficult to stop, since by its own internal logic it must be stopped in secret.
The remedy for over-secrecy is to think in terms of coveillance, so that we make tracking and monitoring as symmetrical — and transparent — as possible. That way the monitoring can be regulated, mistakes appealed and corrected, specific boundaries set and enforced. A massively surveilled world is not a world I would design (or even desire), but massive surveillance is coming either way because that is the bias of digital technology, and we might as well surveil well and civilly.
In this version of surveillance — a transparent coveillance where everyone sees each other — a sense of entitlement can emerge: Every person has a human right to access, and benefit from, the data about themselves. The commercial giants running the networks have to spread the economic benefits of tracking people’s behavior to the people themselves, simply to keep going. They will pay you to track yourself. Citizens film the cops, while the cops film the citizens. The business of monitoring (including those who monitor other monitors) will be a big business. The flow of money, too, is made more visible even as it gets more complex.
Much of this scenario will be made possible by the algorithmic regulation of information as pioneered by open source projects. For instance, while a system like Bitcoin makes anonymous bank accounts possible, it does so by transparently logging every transaction in its economy, therefore making all financial transactions public. PGP encryption relies on code that anyone can inspect, and therefore trust and verify. It generates “public privacy”, so to speak.
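The mechanism behind this “public privacy” can be sketched in a few lines of code. The following is a minimal illustration, not Bitcoin’s actual protocol: accounts appear only as pseudonymous hashes (standing in for real addresses derived from public keys), yet every transaction is appended to a hash-chained log that anyone can audit. All names and structures here are invented for illustration.

```python
import hashlib
import json

def address(pseudonym: str) -> str:
    # A real Bitcoin address is derived from a public key;
    # here we simply hash a pseudonym to stand in for one.
    return hashlib.sha256(pseudonym.encode()).hexdigest()[:16]

class PublicLedger:
    """An append-only, fully public transaction log.

    Accounts are pseudonymous hashes, but every transfer
    is visible and the whole history is auditable by anyone.
    """

    def __init__(self):
        self.chain = []  # every transaction, visible to all

    def record(self, sender: str, receiver: str, amount: float) -> dict:
        prev_hash = self.chain[-1]["hash"] if self.chain else "0" * 16
        tx = {"from": address(sender), "to": address(receiver),
              "amount": amount, "prev": prev_hash}
        # Hash-chain the entries so history cannot be quietly rewritten.
        body = json.dumps(tx, sort_keys=True).encode()
        tx["hash"] = hashlib.sha256(body).hexdigest()[:16]
        self.chain.append(tx)
        return tx

    def verify(self) -> bool:
        # Anyone can audit the log: re-derive each hash and each link.
        prev = "0" * 16
        for tx in self.chain:
            body = {k: v for k, v in tx.items() if k != "hash"}
            digest = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()).hexdigest()[:16]
            if tx["prev"] != prev or tx["hash"] != digest:
                return False
            prev = tx["hash"]
        return True
```

Tampering with any recorded amount breaks the chain of hashes, so `verify()` fails — transparency itself is what makes the pseudonymous accounts trustworthy.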
Building systems in the open, visible to all eyes, makes gaming them for secret ends more difficult.
Every large system of governance — especially a digital society — is racked by an inherent tension between rigid fairness and flexible personalization. The cloud sees all: The cold justice of every tiny infraction by a citizen, whether deliberate or inadvertent, would be as inescapable as the logic of a software program. Yet we need the humanity of motive and context. One solution is to personalize justice to the context of that particular infraction. A symmetrically surveilled world needs a robust and flexible government — and transparency — to enforce adaptable fairness.
But if today’s social media has taught us anything about ourselves as a species it is that the human impulse to share trumps the human impulse for privacy. So far, at every juncture that offers a technological choice between privacy or sharing, we’ve tilted, on average, towards more sharing, more disclosure. We shouldn’t be surprised by this bias because transparency is truly ancient. For eons humans have lived in tribes and clans where every act was open and visible and there were no secrets. We evolved with constant co-monitoring. Contrary to our modern suspicions, there wouldn’t be a backlash against a circular world where we constantly spy on each other because we lived like this for a million years, and — if truly equitable and symmetrical — it can feel comfortable.
Yet cities have “civilized” us with modern habits such as privacy. It is no coincidence that the glories of progress in the past 300 years parallel the emergence of the private self and challenges to the authority of society. Civilization is a mechanism to nudge us out of old habits. There would be no modernity without a triumphant self.
So while a world of total surveillance seems inevitable, we don’t know if such a mode will nurture a strong sense of self, which is the engine of innovation and creativity — and thus all future progress. How would an individual maintain the boundaries of self when their every thought, utterance, and action is captured, archived, analyzed, and eventually anticipated by others?
The self forged by previous centuries will no longer suffice. We are now remaking the self with technology. We’ve broadened our circle of empathy, from clan to race, race to species, and soon beyond that. We’ve extended our bodies and minds with tools and hardware. We are now expanding our self by inhabiting virtual spaces, linking up to billions of other minds, and trillions of other mechanical intelligences. We are wider than we were, and as we offload our memories to infinite machines, deeper in some ways.
Amplified coveillance will make society even more social; more importantly, it will change how we define ourselves as humans.
Kevin Kelly is Senior Maverick at WIRED. He co-founded Wired in 1993, and served as its Executive Editor from its inception until 1999. Kelly is the author of What Technology Wants (2010), Cool Tools: A Catalog of Possibilities (2013), and other books. He was involved with the launch of the pioneering online community The WELL (1985) and also co-founded the ongoing Hackers’ Conference.