Bots, Bacteria, and Carbon talk at the University of Minnesota
(video) March 2013
Visions of a Sustainable Future interview
(text) March 2013
Talking about apocalypse gets dull...all apocalypses are the same, but all successful scenarios are different in their own way.
The Future and You! interview
(video) December 2012
Bad Futurism talk in San Francisco
(video) December 2012
Inc. magazine interview
(text) December 2012
Any real breakthrough in AI is going to come from gaming.
Singularity 1 on 1 interview
(video) November 2012
(text) September 2012
One hope for the future: That we get it right.
Doomsday talk in San Francisco
(video) June 2012
Polluting the Data Stream talk in San Francisco
(video) April 2012
Peak Humanity talk at BIL2012 in Long Beach
(video) February 2012
(text) January 2012
Our tools don't make us who we are. We make tools because of who we are.
Hacking the Earth talk in London
(video) November 2011
(text) May 2011
The fears over eugenics come from fears over the abuse of power. And we have seen, time and again, century after century, that such fears are well-placed.
Future of Facebook project interviews
(video) April 2011
Geoengineering and the Future interview for Hearsay Culture
(audio) March 2011
Los Angeles and the Green Future interview for VPRO Backlight
(video) November 2010
Surviving the Future excerpts on CBC
(video) October 2010
Future of Media interview for BNN
(video) September 2010
Hacking the Earth Without Voiding the Warranty talk at NEXT 2010
(video) September 2010
We++ talk at Guardian Activate 2010
(video) July 2010
Wired for Anticipation talk at Lift 10
(video) May 2010
Soylent Twitter talk at Social Business Edge 2010
(video) April 2010
Hacking the Earth without Voiding the Warranty talk at State of Green Business Forum 2010
(video) February 2010
Manipulating the Climate interview on "Living on Earth" (public radio)
(audio) January 2010
(video) January 2010
Homesteading the Uncanny Valley talk at the Biopolitics of Popular Culture conference
(audio) December 2009
Sixth Sense interview for NPR On the Media
(audio) November 2009
If I Can't Dance, I Don't Want to be Part of Your Singularity talk for New York Future Salon
(video) October 2009
Future of Money interview for /Message
(video) October 2009
Cognitive Drugs interview for "Q" on CBC radio
(audio) September 2009
How the World Could (Almost) End interview for Slate
(video) July 2009
Geoengineering interview for Kathleen Dunn Show, Wisconsin Public Radio
(audio) July 2009
Augmented Reality interview at Tactical Transparency podcast
(audio) July 2009
ReMaking Tomorrow talk at Amplify09
(video) June 2009
Mobile Intelligence talk for Mobile Monday
(video) June 2009
Amplify09 Pre-Event Interview for Amplify09 Podcast
(audio) May 2009
How to Prepare for the Unexpected Interview for New Hampshire Public Radio
(audio) April 2009
Cascio's Laws of Robotics presentation for Bay Area AI Meet-Up
(video) March 2009
How We Relate to Robots Interview for CBC "Spark"
(audio) March 2009
Looking Forward Interview for National Public Radio
(audio) March 2009
Future: To Go talk for Art Center Summit
(video) February 2009
Brains, Bots, Bodies, and Bugs Closing Keynote at Singularity Summit Emerging Technologies Workshop
(video) November 2008
Building Civilizational Resilience Talk at Global Catastrophic Risks conference
(video) November 2008
Future of Education Talk at Moodle Moot
(video) June 2008
(text) May 2008
"In the best scenario, the next ten years for green is the story of its disappearance."
A Greener Tomorrow talk at Bay Area Futures Salon
(video) April 2008
Geoengineering Offensive and Defensive interview, Changesurfer Radio
(audio) March 2008
(text) March 2008
"The road to hell is paved with short-term distractions."
The Future Is Now interview, "Ryan is Hungry"
(video) March 2008
G'Day World interview
(audio) March 2008
UK Education Drivers commentary
(video) February 2008
Futurism and its Discontents presentation at UC Berkeley School of Information
(audio) February 2008
Opportunity Green talk at Opportunity Green conference
(video) January 2008
Metaverse: Your Life, Live and in 3D talk
(video) December 2007
Singularity Summit Talk
(audio) September 2007
Political Relationships and Technological Futures interview
(video) September 2007
(audio) September 2007
"Science Fiction is a really nice way of uncovering the tacit desires for tomorrow...."
G'Day World interview
(audio) June 2007
(audio) June 2007
Take-Away Festival talk
(video) May 2007
(audio) May 2007
Changesurfer Radio interview
(audio) April 2007
(audio) July 2006
FutureGrinder: Participatory Panopticon interview
(audio) March 2006
TED 2006 talk
(video) February 2006
Commonwealth Club roundtable on blogging
(audio) February 2006
Personal Memory Assistants Accelerating Change 2005 talk
(audio) October 2005
Participatory Panopticon MeshForum 2005 talk
(audio) May 2005
Andrew Leonard has a short, sharp piece in Salon entitled "Silicon Valley dreams of secession," about a recent talk by tech entrepreneur Balaji Srinivasan calling for the Valley to secede from the US on a wave of 3D printers, drones, and bitcoins. Here's Leonard's capsule of the talk, along with Srinivasan's money quote:
Virtual secession, argues Srinivasan, is just natural evolution. Once upon a time, people seeking better lives left their broken states to immigrate to the U.S. Now, it is time for their descendants to emigrate further, except this time they don’t need to go anywhere physically, except into the cloud.
“Exit,” according to Srinivasan, “means giving people tools to reduce the influence of bad policies over their lives without getting involved in politics… It basically means build an opt in society, run by technology, outside the U.S.”
Long time readers will have guessed what part of Srinivasan's quote bothered me the most: "without getting involved in politics."
In 2009, I wrote a piece entitled "The End-of-Politics Delusion," about a broadly parallel set of arguments emerging from the bowels of Silicon Valley. Democracy is bad, and what we really need is a technology-enabled society to get rid of politics, or so the true believers would have us think. I reacted with this:
Politics is part of a healthy society -- it's what happens when you have a group of people with differential goals and a persistent relationship. It's not about partisanship, it's about power. And while even small groups have politics (think: supporting or opposing decisions, differing levels of power to achieve goals, deciding how to use limited resources), the more people involved, the more complex the politics. Factions, parties, ideologies and the like are simply ways of organizing politics in a complex social space -- they're symptoms of politics, not causes.
Calls to get rid of politics can therefore mean one of two things: getting rid of persistent relationships with other people; or getting rid of differential goals. Since I don't see too many of the folks who talk about escaping politics also talking about becoming lone isolationists, the only reasonable presumption is that they're really talking about eliminating disagreements.
It's the latest version of the notion that "a perfect world is one where everyone agrees with me."
Anyone calling for an end to politics, whether via secession or technocracy or singularity, either has no understanding of how human societies work (the generous interpretation) or has an authoritarian streak itching to show itself (the less-generous version). Srinivasan's version is even worse due to its dependence upon a thoroughly unreliable, opaque, and politically-biased substrate, "the cloud."
Here's what I mean: technologies fail, sometimes briefly, sometimes disastrously, whether because of physical damage, bad code, or intentional attack; telecommunication systems, in particular the commercial telecom carriers in the US, are notoriously unwilling to divulge operational details and abide by network neutrality; and all of these technologies embed norms and choices that are inherently biased [just as one example, the vast majority of home internet connections in the US are asymmetric, with much faster download (consumption) speeds than upload (creation) speeds -- that's a choice, not an inherent fact of the technology]. Using this as the basis of a political system seems... unwise.
It's the big question for the century: how do we manage human society across the planet as the planet itself becomes increasingly hostile? Geophysical systems are complex, slow (in human terms), and deeply interconnected -- exactly the combination of conditions that our existing systems of governance can't handle. Simply continuing to operate as if nothing is changing at a point when everything is changing is a recipe for disaster.
Jake Dunagan has proposed a panel for South by Southwest Interactive 2014 to explore this very question, and asked me to be on it. Here's the tricky part: the selection of the panel depends (in part) on community support. In other words, if you think this is a useful or important idea, you need to vote for the panel.
Governance in the Anthropocene
Welcome to the Anthropocene, a geological epoch defined by humans and human activities. Humans are a global geological and evolutionary force, yet our economies, social formations, consumption patterns, and governments operate with intentional blindness to this enormous power and responsibility. The institutions that support human civilization, many of which have caused the global challenges we face today, do not appear capable of adapting successfully to 21st century realities. What is needed is a global movement to re-think and re-design governance for the Anthropocene epoch.
One of the first rules one is taught as a futurist-in-training is to avoid "normative scenarios" -- forecasts that describe what you want to see, even when the signals and evidence at hand make the scenario highly unlikely. This is much more of a challenge than non-futurists may think, as a good scenarist can usually come up with a plausible set of early indicators and distant early warnings to support just about any forecast. If one's work focuses on issues that have a strong ethical component (around human rights, for example, or the global environment) the problem is further multiplied.
One of the reasons I've been running silent over the past month or so has been the explosion of news around government (and corporate) surveillance of the Internet. Not that I'm especially worried about my own stuff -- I have a fairly public life, and have few secrets worth knowing. But the implications for the futures of privacy, security, commerce, communications, big data, and so forth are so enormous that I'm still trying to wrap my mind around where this is all going. And the desire to imagine normative scenarios about the potential outcomes is almost overwhelming.
Reality has a bad habit of undermining desired futures. Here's a non-privacy example: If you have a moral stance that says that individual access to guns should be strictly controlled or prohibited in the U.S., you may wish to imagine future outcomes where such restrictions are possible and widely accepted. But the evolution of 3D printers has made that kind of future highly implausible, as designs for 3D-printer-friendly firearms have now emerged and spread. As long as 3D printers are available, it will be extremely difficult to eliminate or control access to firearms, and as 3D printers become more capable, we'll see increasingly diverse and powerful printable weapons. Any discussions of "gun control" that don't acknowledge this are doomed to imminent irrelevance.
So when we think about the future of privacy, surveillance, and related concepts, one of the first questions we need to ask is "what real-world conditions constrain our possible futures?" What are the technical aspects of privacy and surveillance, and what kinds of changes would have to happen to shift the balance between the two? What are the political barriers? For example, if a leader took positive steps to reduce government surveillance, and subsequently a major terrorist attack happened, how likely is it that the public (and certainly the political opponents of said leader) would link the two? If the technological standards underlying the present-day Internet make full privacy essentially impossible -- not just vis-a-vis government snoops but also corporate "big data" behavior analysis -- who would actually have the capability to construct a more secure alternative?
I'm still thinking.
I've been asked to serve as guest editor for an upcoming edition of the Journal of Evolution and Technology, a peer-reviewed electronic journal published by the Institute for Ethics and Emerging Technologies. (Full disclosure: I've been a senior fellow at IEET for seven years.) The topic of the edition is, as the title of this post suggests, the ethics of geoengineering. Link to the full call for papers.
Here's a bit about what we're looking for:
For this issue of JET we would like to solicit papers exploring both the proposed geoengineering methods, and ethical, social and political questions that must be considered before they are explored and undertaken. Which methods make sense to explore? How can we keep the pressure on to shift to renewable and sustainable forms of energy, agriculture and manufacturing if we avail ourselves of this techno-fix? What agencies should be empowered to research and undertake these initiatives? What risks and benefits should be considered? What kinds of evidence and modeling should be required before they are undertaken, and at what point should they be deployed?
And the relevant info:
Submission deadline: Nov 1, 2013
Notification of acceptance/rejection: Feb 1, 2014
Final revision deadline: March 1, 2014
Publication: Spring/Summer 2014
Length and Style
We anticipate that this issue will contain around 10 papers and, as a working guide, the papers should be between 4000 and 12,000 words in length. Instructions on format and style are here: http://jetpress.org/authors.html
Manuscripts must be submitted electronically in Microsoft Word to email@example.com
Each submission will ideally receive two reviews. Completed reviews will be forwarded to the corresponding authors. Please suggest up to three external reviewers to facilitate the review process.
Here's what I'll be looking for: arguments and discussions that directly address the underlying dilemma driving the consideration of geoengineering, namely, the growing possibility that dire effects from climate disruption will happen faster than any carbon emission cuts could stop. Papers that merely assert that geoengineering is bad and we should feel bad for talking about it, or that geoengineering is great because it means we won't have to waste money on cutting carbon, will very likely find themselves stuck in a spam filter.
I've written quite a bit about the politics and ethics of geoengineering, but I know that I'm (a) not the only one thinking about it, and (b) not in possession of a monopoly on good ideas. I'd really love to see submissions of pieces that change my mind.
No, not really.
But that's the conclusion people are getting from a story published initially at Mother Jones, and picked up around the web. The CIA -- along with NASA, NOAA, and the National Academies of Sciences (NAS) -- is funding a $630,000 NAS project to study geoengineering. Not how to do it, but what the broader ramifications are for global politics. The announcement at the NAS gives the details:
An ad hoc committee will conduct a technical evaluation of a limited number of proposed geoengineering techniques, including examples of both solar radiation management (SRM) and carbon dioxide removal (CDR) techniques, and comment generally on the potential impacts of deploying these technologies, including possible environmental, economic, and national security concerns. The study will:
1. Evaluate what is currently known about the science of several (3-4) selected example techniques, including potential risks and consequences (both intended and unintended), such as impacts, or lack thereof, on ocean acidification,
2. Describe what is known about the viability for implementation of the proposed techniques including technological and cost considerations,
3. Briefly explain other geoengineering technologies that have been proposed (beyond the selected examples), and
4. Identify future research needed to provide a credible scientific underpinning for future discussions.
The study will also discuss historical examples of related technologies (e.g., cloud seeding and other weather modification) for lessons that might be learned about societal reactions, examine what international agreements exist which may be relevant to the experimental testing or deployment of geoengineering technologies, and briefly explore potential societal and ethical considerations related to geoengineering. This study is intended to provide a careful, clear scientific foundation that informs ethical, legal, and political discussions surrounding geoengineering.
This is an entirely appropriate use of a very small amount of money (in government terms) and the resources of the intelligence services. As I've gone into multiple times, the global political risks around geoengineering are massive, likely greater than the environmental risks. A better understanding of an emerging complex geopolitical issue is precisely what I'd want an intelligence service to be doing. It's a lot better than operating killer drones and reading our email. People who don't want to see geoengineering happen should be glad that the US government is taking this seriously as a potential issue.
I had an opportunity a couple of years ago to participate in a "wargame" project run by the CIA Center on Climate Change and National Security -- see souvenir above -- and watched as a climate-focused exercise turned into a geoengineering-focused game. As far as I could tell, this was not the designer's intent, but an organic result of player actions. And over the course of the game it came very close to leading to armed conflict between the US and China, a conflict over uncertain (and unsettling) consequences of a geoengineering effort. [The CIA Center on CCNS is now closed, in part due to climate issues being integrated across the spectrum of CIA research, and in part because House Republicans cut funding. Can't have the government studying climate change as if it were a real thing, you know.]
There's no question that geoengineering needs to be thought of as a potential global political risk. I'm glad to see a project like this. And I hope that the intelligence and strategic risk analysis services of other governments around the world are doing the exact same thing.
Apropos of griefing Glass, my most recent piece for Co.Exist is now up: "To [Forecast] The Future Of Technology, Figure Out How People Will Use It Illegally". (The actual title uses "predict" not "forecast," of course.) It's a quick look at stuff I've talked about before, why illicit and unexpected uses of new systems give a better vision of the future than do such systems' intended purposes.
New technologies don’t exist in a vacuum: they interact with both technological and non-technological systems as well as a variety of human wants and needs. This allows for the emergence of surprising combinations of goals and uses, many of which may be completely outside of the expectations of the designers. In short, as the patron saint of futurism William Gibson once said, “the street finds its own uses for things.”
As a futurist, I try to think beyond the designers' notes when it comes to the impacts of emerging technologies. I find that it’s often useful to imagine the unintended, seedy, improper, or illicit uses of new tools and systems.
Not a new argument from me, but a concise articulation of it.
This last is the biggest risk factor for abuse. Saying "OK Glass Google [something shocking]" when someone is using Google Glass in near proximity will make it search Google for whatever startling content you've given it.
Not that I would suggest any such thing. Nosiree.
On Tuesday, June 18 (tomorrow, if you're reading this the day I post it), I'll be leading a roundtable discussion on "reinventing climate management" over at ReInventors.net. It will convene via Google Plus Hangouts, and will be accessible via the above link. Here's a taste of the promo text:
Managing the climate in the face of global warming is a wicked problem with almost no precedent. Given the global nature of the problem, no one nation can solve it without getting virtually all other nations involved. Even if Americans stopped driving cars and eating meat en masse tomorrow, it would not make much of a dent if the Chinese kept burning coal at their mad pace. India, Japan, even Canada all play outsized roles.
[...] During this roundtable we will face up to this extremely difficult problem and talk about how to Reinvent Climate Management. What would a system of global governance look like that’s up to the true challenges ahead? What kind of authority would it need? If actors like rogue nations or geoengineering tech titans broke the rules, what could be done? We’ll look at a range of possibilities, including those that don’t involve big government. Is there a bottom-up way forward? One led by corporations?
I honestly have no idea how far we'll get with this discussion. We know that the current model for dealing with global climate issues is broken, but most of the alternatives -- such as solar radiation management geoengineering or militarization of environmental management -- seem more like knee-jerk reactions from desperation rather than a smart, long-term system for stability. My hope is that a roundtable of smart, interesting people will be able to envision something usefully transcendent (or, failing that, at least useful).
The Earth's Environment
"Some of the most thoughtful work on the topic of climate change..."
-- The Futurist (July/Aug 2009)
What do we do if our best efforts to limit the emission of greenhouse gases into the atmosphere fall short? According to a growing number of environmental scientists, we may be forced to try an experiment in global climate management: geoengineering.
Geoengineering would be risky, likely to provoke international tension, and certain to have unexpected consequences. It may also be inevitable.
Environmental futurist Jamais Cascio explores the implications of geoengineering in this collection of thought-provoking essays. Is our civilization ready to take on the task of re-engineering the planet?