Here's a very early version of an augmented reality system for the iPhone from ARToolworks.
(Soundtrack Warning: The 1990s wants its rave music back.)
Posted by Jamais Cascio in Participatory Panopticon on August 13, 2008 1:18 PM
Comments (2)
This approaches the kind of thing I've been imagining is possible with OpenMoko's Neo Freerunner.
This example uses only the camera and 3D rendering, and tries to recognize what the camera sees in order to do the overlaying. What I want to see is the combination of GPS (for rough position) and accelerometer (for direction finding) to render overlays that are intuitively recognizable just by pointing the phone. Perhaps image recognition could smooth over the rough edges.
I don't know how well it would work in practice, but it's worth a shot.
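The GPS-plus-heading approach could be sketched roughly like this: compute the bearing from the phone's position to a point of interest, compare it against the direction the phone is pointing, and map the difference onto the screen. A minimal illustration in Python, with made-up coordinates and a hypothetical 60° field of view:

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, in degrees (0 = north)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return math.degrees(math.atan2(y, x)) % 360

def overlay_x(phone_heading, poi_bearing, fov_deg=60, screen_w=320):
    """Horizontal screen position for the overlay, or None if the POI
    is outside the camera's (assumed) field of view."""
    # Signed angle between where the phone points and where the POI lies, -180..180.
    offset = (poi_bearing - phone_heading + 180) % 360 - 180
    if abs(offset) > fov_deg / 2:
        return None
    return int(screen_w * (offset / fov_deg + 0.5))

# Phone at illustrative San Francisco coordinates, pointing due north;
# point of interest a short distance to the north-northeast.
b = bearing_deg(37.7749, -122.4194, 37.7849, -122.4144)
x = overlay_x(phone_heading=0.0, poi_bearing=b)
```

In a real system the heading would come from a compass or sensor fusion rather than the accelerometer alone, and the GPS fix would need smoothing, but the geometry above is the core of the "point the phone at it" idea.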
Nato Welch | August 13, 2008 4:36 PM
I have a couple of friends who have been playing around with this for about a year. It's still a bit slow, but very, very cool.
Howard Berkey | August 16, 2008 7:55 PM