
Not Giving Up

About ten years ago, I found myself sitting on the floor of my San Francisco Bay Area apartment, hoping that the call I was on wasn’t going to drop yet again. At the other end of the line was a Seattle public radio station, hosting a live debate/conversation between me and computer scientist Bill Joy on the question of whether our technologies were going to kill us; at that point, my main concern was whether our technologies would even work. Joy had recently published his infamous “Why the Future Doesn’t Need Us” essay in Wired, and was still charged with the fiery nihilism of his argument that we are less than a generation away from nano-, bio-, and information technologies that would fundamentally transform — in a bad way — human society and the human species. Joy was convinced that these emerging technologies would cause our extinction, and that the only hope for humanity was to give up entirely on these innovations.

Joy was suffering from the same repeated disconnection problem I was wrestling with, but didn’t seem to appreciate the irony of the situation: here he was arguing that all-powerful technologies on the near horizon would inevitably destroy us, even while a ubiquitous and more-than-a-century-old technology remained stubbornly unreliable.

It’s a theme that would recur in countless arguments and debates I’d find myself in over the years. Usually, my sparring partner would claim (like Joy) that transformative technologies were about to sweep away human civilization, eliminating our humanity if they didn’t destroy us completely. The only weak hope we might have would be to get rid of them — call this the Rejectionist perspective. Occasionally, however, the claim would be that transformative technologies were about to sweep away human civilization and replace it (and eventually us) with something better. This future was being driven by forces beyond our understanding, let alone our control — call this one the Posthumanist argument. Each claim is a funhouse mirror of the other: we are on the verge of disaster or on the verge of transcendence, and the only way to hold on to our humanity in either case would be to disavow what we have made.

And they’re both wrong. More importantly, they’re both dangerous.

Our technologies are not going to rob us (or relieve us) of our humanity. Our technologies are part of what makes us human, and are the clear expression of our uniquely human minds. They both manifest and enable human culture; we co-evolve with them, and have done so for hundreds of thousands of years. The technologies of the future will make us neither inhuman nor posthuman, no matter how much they change our sense of place and identity.

The Rejectionist and Posthumanist arguments are dangerous because they aren’t just dueling abstractions. They have increasing cultural weight, and are becoming more pervasive than ever. And while they superficially take opposite views on technology and change, they both lead to the same result: they tell us to give up.

By positing these changes as massive forces beyond our control, these arguments tell us that we have no say in the future of the world — that we may not even have the right to a say in the future of the world. We have no agency; we are hapless victims of techno-destiny. We have no responsibility for outcomes, and no influence on the ethical choices embodied by these tools. The only choice we might be given is whether or not to slam on the brakes and put a halt to technological development — and there’s no guarantee that the brakes will work. There’s no possible future other than loss of control or stagnation.

Today, Rejectionists like writer Nicholas Carr and MIT social scientist Sherry Turkle argue passionately that a new wave of digital technologies is crippling our minds and breaking our social ties. Their solution is to (paraphrasing the words of William F. Buckley) “stand athwart history yelling Stop!” While their visions are less apocalyptic than Joy’s tirade, they’re more directly relevant for many people, and ultimately have the same ends.

The Posthumanist side is no less active. The godfather of the concept, technologist Ray Kurzweil, continues to churn out books and interviews telling us that the Singularity is near, a claim that seems to have special attraction for many tech-savvy young men. But like the Rejectionist perspective, Posthumanist arguments have mutated into new forms linked to current debates. Venture capitalist Peter Thiel (co-founder of PayPal and currently on the board of directors at Facebook), for example, insists that his investments in Singularity technologies will allow him to create a future devoid of politics — and argues, infamously, that true freedom is incompatible with democracy.

Technology is part of who we are. What both critics and cheerleaders of technological evolution miss is something both subtle and important: our technologies will, as they always have, make us who we are — make us human. The definition of Human is no more fixed by our ancestors’ first use of tools than it is by using a mouse to control a computer. What it means to be Human is flexible, and we change it every day by changing our technology. And it is this, more than the demands for abandonment or the invocations of a secular nirvana, that will give us enormous challenges in the years to come.

I'm looking forward to it.


All good points, but that "what it means to be human is flexible" at the end is a neat get-out clause, isn't it? Just make it flexible enough to stretch to some post-singularity state and post-humanism is ruled out, by definition... I think the situation may change when we have technologies which genuinely operate on our own constitution, or which play host to non-human intelligence, though neither is imminent.


