On the Horizon (03/24/06): Nature on the Future of Computing

If Wired or Technology Review were to do a cover story on "computing in 2020," you know what you'd get: computer-generated mock-ups of what the laptop/wearable/ambient Computer of Tomorrow will look like, interviews with people working on bleeding-edge technologies, and lots of discussion of how future computers will work. When Nature does a cover story on "computing in 2020," you get something quite different: only one of the eight feature articles talks about how future computers might operate; the rest look more at the evolution of how we use computers, a much more worldchanging topic.

Unsurprisingly, most of the articles look at the science of the particular issue, either in the underlying theory or the actual applications; Nature is the world's premier science journal, after all. But that doesn't mean they're inaccessible to non-scientist readers by any means. You may have to skip past some jargon here and there, but the core ideas -- the interplay of computation and sensor networks, the question of how we deal with massive amounts of incoming data, the parallels between biology and information -- remain relevant across many of the subjects we discuss here. Best of all, as indicated yesterday, all of the articles in the feature section can be read for free (in both HTML and PDF format); Microsoft's 2020 Science project made this possible, so it's worth noting that none of the articles discusses what Microsoft itself is doing.

I Sense Something...: Of all of the articles in the special section, the one that's likely to feel the most familiar is "Everything, everywhere," written by WorldChanging ally Declan Butler. It's a look at the emergence of "smart dust," "motes" and the various other manifestations of wireless sensor technologies, and the role these systems will play in future scientific computation. The important message is that the growing use of abundant sensing technology will change how scientific research works:

Gaetano Borriello, a computer scientist at the University of Washington in Seattle, argues that such widely distributed computing power will trigger a paradigm shift as great as that brought about by the development of experimental science itself. "We will be getting real-time data from the physical world for the first time on a large scale."
Instead of painstakingly collecting their own data, researchers will be able to mine up-to-the-minute databases on every aspect of the environment — the understanding of diseases, and the efficacy of treatments will be dissected by ceaselessly monitoring huge clinical populations. "It will be a very different way of thinking, sifting through the data to find patterns," says Borriello, who works on integrating medical sensors — such as continuous monitors of heart rate and blood oxygen — with their surroundings. "There will be a much more rapid cycle of hypothesis generation and testing than we have now."

The faster you can experiment, analyze, and iterate, the greater the number of possibilities you can explore.
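
To make that concrete, here's a minimal sketch, in Python, of the kind of continuous mine-and-test loop Borriello describes; the sensor feed, the thresholds and the heart-rate scenario are all invented for illustration, not drawn from the article:

    import random
    import statistics

    # Hypothetical stand-in for a live feed from wearable heart-rate
    # monitors; a real system would read from a sensor network.
    def sensor_stream(n=1000):
        for _ in range(n):
            yield random.gauss(72, 8)  # beats per minute

    window, alerts = [], 0
    for bpm in sensor_stream():
        window.append(bpm)
        if len(window) > 100:
            window.pop(0)
        if len(window) == 100:
            mean = statistics.mean(window)
            spread = statistics.stdev(window)
            # The "hypothesis": readings more than three standard
            # deviations from the rolling mean merit a closer look.
            # Each pass through the loop is one tiny generate-and-test
            # cycle, run as fast as the data arrives.
            if abs(bpm - mean) > 3 * spread:
                alerts += 1

    print(alerts, "candidate anomalies flagged for follow-up")

The point isn't the statistics, which are deliberately crude; it's that the loop never stops, so hypothesis-testing happens at the speed of the data rather than the speed of a lab schedule.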

If Absolute Power Corrupts Absolutely, Does Sustainable Power Corrupt Sustainably?: Butler has a second piece in the section, a box on power for sensor hardware. It covers the issues admirably, looking more at the challenges than at particular solutions, although he does focus for a moment on ultrawideband as a technology for sending signals more reliably with less power. Sustainable power gets its due not just with solar energy, but with the use of vibrational energy, "... such as that given off by the traffic on a nearby road."

Waste is just a resource we haven't yet tapped.

Virtual Science: Also striking a theme familiar to Worldchanging readers, Vernor Vinge (a computer scientist best known as a science fiction author and credited with the most widely accepted articulation of the concept of the "singularity") writes in "The creativity machine" of the use of virtual environments and open collaborative tools as novel mechanisms for scientific research. For Vinge, the Internet is the world's most important scientific tool, and it will only become more so:

All this points to ways that science might exploit the Internet in the near future. Beyond that, we know that hardware will continue to improve. In 15 years, we are likely to have processing power that is 1,000 times greater than today, and an even larger increase in the number of network-connected devices (such as tiny sensors and effectors). Among other things, these improvements will add a layer of networking beneath what we have today, to create a world come alive with trillions of tiny devices that know what they are, where they are and how to communicate with their near neighbours, and thus, with anything in the world. Much of the planetary sensing that is part of the scientific enterprise will be implicit in this new digital Gaia. The Internet will have leaked out, to become coincident with Earth.

In this one sentence:

The Internet will have leaked out, to become coincident with Earth.

...Vinge manages to capture a key aspect of the Bright Green future.

Hyperinteractive: Ian Foster's "A two-way street to science's future" gives the broad view of the ongoing relationship between science and computation. It's not just that modern biological, physical and environmental sciences couldn't exist without abundant computational tools; these very sciences are also changing how we understand the nature of computation and information.

...science is becoming less reductionist and more integrative, as researchers attempt to study the collective behaviour of larger systems. To quote Richard Dawkins: "If you want to understand life, don't think about vibrant, throbbing gels and oozes, think about information technology."
Such system-level approaches are emerging in fields as diverse as biology, climate and seismology. A frequent goal is to develop high-fidelity computer simulations as tools for studying system-level behaviour. Computer science, as the 'science of complexity', has much to say about how such simulations — which can be considered a new class of experimental apparatus — should be constructed, and how their output should be analysed and compared with experiment. Similarly, information theory provides formidable insight into how biological systems encode, transform and transmit information.

In Foster's view, not only has computation become integral to science, but computer scientists will soon be necessary members of all kinds of research teams. This is already the case in many of the environmental sciences; there are few climatology groups that don't employ specialists in simulation and computer modeling.

Vibrant, Throbbing Gels and Oozes: Roger Brent and Jehoshua Bruck focus on a particular manifestation of this interplay between science and computing theory in "Can computers help to explain biology?" Their article makes the case that there are sufficient parallels between the function of biological systems and the function of information systems to warrant cross-study (especially for bioscientists). This piece is probably the least accessible of the collection, as Brent and Bruck of necessity combine biology and information theories in a manner that emphasizes philosophy as much as it does applied study:

Happily, there is considerable interest in wanting to build one element of biological semantics — the passage of time — into information theory. Formalizations of information processing that embodied this and other semantic concepts relevant to biology might help biologists to go beyond quantifying reaction rates and molecular species of biological systems to understand their dynamic behaviour. They might also help to suggest new experiments — perhaps on synthetic biological systems engineered to have a crisper division between process and output, which could then be evolved by artificial selection. This approach might bring a deeper understanding of function at its most fundamental level of fitness and selection.

Rather than focusing on 2020 and using this theoretical combination as an example of what the future might hold, the essay focuses on the theory itself and tosses out 2020 as a likely point at which the work will come to fruition. It's both the most challenging piece of the feature and very nearly the least satisfying.

Data Deluge: "Science in an exponential world," by Alexander Szalay and Jim Gray, looks at the coming data crisis, in which the abundance of useful information and the lack of cross-database format standards have a real potential to hamper research. Among the potential results: an increase in the use of open access publishing systems as a means of standardizing data methods, and a serious shift away from large-scale mega-science projects in favor of small, rapid collaborative efforts, especially those that take advantage of distributed sensing and computational systems. Given that the article's conclusion is that, yup, scientists of 2020 will still be dealing with exponential data growth, this one ranks as the feature's least provocative piece overall.
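
As a toy illustration of why missing standards hurt (the archives, field names and units here are all invented), consider two databases describing the same measurement in incompatible shapes; every translation between them is code somebody has to write, test and maintain:

    # Two hypothetical archives recording the same observation.
    archive_a = {"ra_deg": 187.70, "dec_deg": 12.39, "flux_mjy": 4.2}
    archive_b = {"position": "12h30m48s +12d23m", "brightness_jy": 0.0042}

    def normalize_a(rec):
        """Map archive A's record onto an agreed shared schema."""
        return {"ra": rec["ra_deg"], "dec": rec["dec_deg"],
                "flux_jy": rec["flux_mjy"] / 1000.0}

    # Archive B needs its own converter (sexagesimal parsing, unit
    # conversion), and so does every other source. With a shared target
    # schema, N archives need N converters; without one, cross-matching
    # tends toward N*(N-1) pairwise translations.
    print(normalize_a(archive_a))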

Robots in White Coats: At the other end of the spectrum is Stephen Muggleton's "Exceeding human limits," which takes us into a future of robotic scientists handling the rote aspects of data collection and analysis, leaving the humans to tackle the fun stuff. This is already starting to happen, with machine-learning techniques evolving into rudimentary methods of hypothesis generation; the approach has already shown results in molecular bioscience by identifying key structures of cancer-causing agents. More advanced versions would combine sophisticated analytic hardware with programs able to identify fruitful avenues of research.

Today's generation of microfluidic machines is designed to carry out a specific series of chemical reactions, but further flexibility could be added to this tool kit by developing what one might call a 'chemical Turing machine'. [...] Just as Turing's original machine later formed the theoretical basis of modern computation, so the programmability of a chemical Turing machine would allow a degree of flexibility far beyond the present robot-scientist experiments, including complex iterative behaviour.

Muggleton warns that an over-reliance on these tools could lead to a situation where human scientists no longer fully comprehend the hypotheses generated by their robotic counterparts. This may not be the most likely challenge, however; as long as the robotic scientists are programmed to "show their work" as they build their hypotheses, appropriately trained human scientists should be able to follow along. A bigger problem would be over-reliance on the breadth of automated research, with human scientists neglecting to pursue pathways the robots didn't initially explore. Programming remains a human endeavor with fallible results, and researchers should never assume that the robotic scientists have explored all possible options.
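
Here's a minimal sketch of the "show your work" idea, with a rule format and toy data invented purely for illustration: the generator keeps the evidence behind every candidate hypothesis, so a human can audit the chain instead of trusting a bare answer.

    # Toy observations: chemical features and whether the compound
    # tested as active. Entirely invented data.
    observations = [
        ({"ring": True,  "amine": True},  True),
        ({"ring": True,  "amine": False}, True),
        ({"ring": False, "amine": True},  False),
        ({"ring": False, "amine": False}, False),
    ]

    def generate_hypotheses(data):
        """Propose single-feature rules, keeping the evidence trail."""
        for feat in data[0][0]:
            support = [(x, y) for x, y in data if x[feat] == y]
            yield {
                "rule": "active iff " + feat,
                "accuracy": len(support) / len(data),
                "evidence": support,  # the auditable "work shown"
            }

    for h in sorted(generate_hypotheses(observations),
                    key=lambda h: -h["accuracy"]):
        print(h["rule"], h["accuracy"], "-",
              len(h["evidence"]), "supporting cases")

A real robot scientist would generate far richer hypotheses than single-feature rules, but the principle scales: record the derivation, not just the conclusion.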

Qubit's Rube: The one article that is expressly about computational hardware is "Champing at the bits," by Philip Ball. An examination of the increasing capabilities of quantum computing systems, the piece illustrates the current uncertainty about the real-world applications of the method. On one hand, some researchers say that a functional quantum computer by 2020 is highly likely -- one even muses about a quantum version of Grand Theft Auto; on the other, some note that, despite years of work on quantum computing theory, there are still only two tasks the machines seem clearly good for: factoring large numbers and rapid database searches. Admittedly, both are useful tasks, but they suggest a future more likely to include quantum co-processors than general-purpose quantum computers.
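
The search advantage, for the record, is quadratic rather than exponential, which is part of why "co-processor" feels like the right frame. A back-of-envelope comparison in Python, using idealized query counts and ignoring error correction and every other overhead:

    import math

    # Unstructured search over N items: a classical scan needs about
    # N/2 lookups on average, while Grover's algorithm needs about
    # (pi/4) * sqrt(N) oracle calls.
    for n in (10**6, 10**9, 10**12):
        classical = n / 2
        grover = (math.pi / 4) * math.sqrt(n)
        print("N=%.0e  classical ~%.1e  Grover ~%.1e"
              % (n, classical, grover))

Impressive at large N, but nothing like the superpolynomial speedup factoring enjoys over the best known classical methods -- hence a quantum chip bolted onto a conventional machine seems more plausible than a quantum desktop.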

As a discussion of the current state of the art in quantum computing, however, the piece is pretty useful, although it presupposes a basic understanding of how quantum computing works. Wikipedia, of course, has the necessary introduction.

TrackBack

Listed below are links to weblogs that reference On the Horizon (03/24/06): Nature on the Future of Computing:

» Future Tense from reBang weblog
WorldChanging has an entry (Link) that touches on the kinds of things I’ve wrapped up into the “kirkyan” concept about which I’ve posted (reLink) and which has gotten the attention of a whole lot of people who read SterlingR... [Read More]

Comments (3)

You should probably note: Declan Butler works for Nature; we've corresponded on electronic publishing issues in the past. (I work for a science journal publisher in regular life too...)

Arthur

Daniel Hazelton Waters:

The quantum computer's potential is abundant, and by 2020 we'll have at least 1,000-qubit systems connected together! Nanotech follows the rise in quantum computing; it's what enables a lot of the potential power...

"The faster you can experiment, analyze, and iterate, the greater the number of possibilities you can explore."

And from virtual possibilities not restricted by time, we can develop physical instantiations of solutions automatically, so they can be implemented, tested, sensed. The code itself doesn't have to be at a high level for this feedback loop to become an evolutionary system.

There is of course a danger here. Isn't there always?
