
On the Record

Whenever I talk about the participatory panopticon, one issue grabs an audience more often than anything else -- privacy. But the more I dig into the subject, the more it becomes clear that the real target of the panopticon technologies isn't privacy, but deception. We're starting to see the emergence of a variety of technologies that allow the user to determine, with some degree of accuracy, whether or not a subject is lying. The most promising of these use functional magnetic resonance imaging -- handy if you're conducting a police interview, perhaps, but not likely to be built into a cell phone any time soon. But it turns out there's another emerging approach to detecting deception, one that's not just potentially portable, but that also offers the tantalizing possibility of determining whether someone lied long after the fact.

Ron Brinkmann is a visual technology expert, author of The Art and Science of Digital Compositing, and an occasional Open the Future reader. He recently blogged about a set of emerging, very experimental lie-detection technologies that rely on images. One takes advantage of so-called "microexpressions," a real phenomenon in which split-second changes in our facial expressions correlate with our feelings about what we're saying. The other tracks changes in skin temperature around the eyes, watching for a brief flare-up of heat that correlates with stress. Rather than reiterate Ron's post, I suggest you go read it.
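Neither of these systems is something you can buy off the shelf, but the core idea behind spotting microexpressions is less exotic than it sounds: watch the face in ordinary video and flag the moments where it changes sharply from one frame to the next. Here's a deliberately toy sketch of that idea in Python -- it assumes the OpenCV library, a hypothetical recording called interview.mp4, and a purely arbitrary threshold, and it only stands in for the real research systems, which rely on facial landmarks and trained classifiers rather than raw pixel differences.

import cv2  # OpenCV; available via "pip install opencv-python"

cap = cv2.VideoCapture("interview.mp4")   # hypothetical recording
face_finder = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

fps = cap.get(cv2.CAP_PROP_FPS) or 30.0
prev_face = None
frame_idx = 0

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_finder.detectMultiScale(gray, 1.3, 5)
    if len(faces) > 0:
        x, y, w, h = faces[0]
        face = cv2.resize(gray[y:y + h, x:x + w], (128, 128))
        if prev_face is not None:
            # Average per-pixel change between consecutive face crops.
            change = cv2.absdiff(face, prev_face).mean()
            if change > 12:  # arbitrary, illustrative threshold
                print(f"candidate expression shift at ~{frame_idx / fps:.2f}s")
        prev_face = face
    frame_idx += 1

cap.release()

Nothing in that loop knows anything about deception, of course -- it just surfaces timestamps worth a closer look, which is roughly where the serious versions of this work begin.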

I want to call particular attention to an observation he makes late in the piece, however, because I think it's worth careful consideration:

But enough about the future. Let’s talk about now. Because those last few video/audio analysis techniques I mentioned raise a particularly interesting scenario: Even though we may not have the technology yet to accurately and consistently detect when someone is lying, we will eventually be able to look back at the video/audio that is being captured today and determine, after the fact, whether or not the speaker was being truthful. In other words, even though we may not be able to accurately analyze the data immediately, we can definitely start collecting it. Infrared cameras are readily available, and microexpressions (which may occur over a span of less than 1/25th of a second) should be something that even standard video (at 30fps) would be able to catch. And today’s cameras should have plenty of resolution to grab the details needed, particularly if you zoom in on the subject [...].

Which brings us to the real point of this post. Is it possible that we’ve gotten to the point where certain peoples - I’m thinking specifically of politicians both foreign and domestic - should be made aware that anything they say in public will eventually be subject to retroactive truth-checking… Because it seems to me that someone needs to start recording all the Presidential debates NOW with a nice array of infrared and high-definition cameras. And they need to do it in a public fashion so that every one of these candidates is very aware of it and of why it is being done.

(emphasis in original)
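Ron's frame-rate point is easy to sanity-check with nothing more than the numbers he quotes: standard video grabs a frame roughly every 33 milliseconds, while a microexpression even at the short end of the range lasts about 40 milliseconds -- long enough that at least one frame is guaranteed to land inside it.

# Back-of-the-envelope check of the frame-rate argument above, using only
# the figures Ron cites (1/25th-of-a-second microexpressions, 30fps video).
frame_gap = 1 / 30     # seconds between successive frames (~33 ms)
expression = 1 / 25    # short end of the microexpression range (~40 ms)

print(f"frame gap: {frame_gap * 1000:.0f} ms, expression: {expression * 1000:.0f} ms")
print("caught on at least one frame:", expression > frame_gap)   # True

Higher frame rates and higher resolution only make the archive richer, which is exactly the argument for recording now and analyzing later.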

There's no question in my mind that, once these lie-detection systems come to be seen as good enough (which does not mean 100% accurate, of course), people will start running them over old video recordings, looking for microexpressions. Politicians offer an obvious set of initial subjects, but I suspect our attention would shift quickly to celebrities. I wouldn't be surprised to see the technologies adopted by activists, especially if we're entering an era of going after environmental and economic criminals. Finally, once the systems have come down in price and increased in portability, we'll start pointing them at friends and lovers.

What then? It's hard to believe that cheap, easy-to-use lie-detection systems that can be applied after the fact won't be snapped up. But do we really want to know that sometimes, when spouses or parents say "I love you," their microexpressions and facial heat say "...but not right now..."? Imagine the market for facial analysis apps as add-ons to video conferencing systems for businesses or the home. Video iChat, now with iTruth!

Arguably, the only thing worse than this kind of technology getting into everybody's hands would be if it only got into the hands of people already in power.

Information is power, but so is misinformation. People who lie to achieve some outcome have very real power over the people they've lied to. The capacity to identify those lies, even after the fact, can undermine that power. This won't be an easy transition; the technological rebalancing of the political system is already underway (as blogs, YouTube, and the like have shown). Any efforts to pull back from this shift will be met with resistance, anger, and worse. And they will undoubtedly be on the record, like it or not.

Comments

There's probably money to be made by scrutinizing the CEOs of large corporations.

But then the politicians in power will record their talks, CGI them to morph lie micro-expressions into truth micro-expressions, and only then broadcast them. Spy vs. spy...

I think in the long run - to quote a song by a Canadian rock band inspired by Señor Kurzweil's vision - the future is wonderful.

The political upheaval that deception-reading technologies could drive is one side of this. But in terms of interpersonal relations, I would say that people who regularly lie to their loved ones or friends aren't really living authentic, self-actualized lives. People who are already intimate with one another can usually tell when the other is lying. We shouldn't be afraid of a future without deception. That might be a wonderful future.
