Alpha-Testing the Participatory Panopticon
It looks like the first draft version of the participatory panopticon -- the set of technologies allowing individuals to record everything that happens around them, for later playback, analysis, and archiving -- will come not from mobile phones on steroids, but from efforts by the US Defense Advanced Research Projects Agency (DARPA) to increase the information-recall capacity of soldiers in the field.
The Defense Advanced Research Projects Agency (DARPA) is exploring the use of soldier-worn sensors and recorders to augment a soldier's recall and reporting capability. The National Institute of Standards and Technology (NIST) is acting as an independent evaluator for the "Advanced Soldier Sensor Information System and Technology" (ASSIST) project. NIST researchers are designing tests to measure the technical capability of such information gathering devices.
[...] The sensors are expected to capture, classify and store such data as the sound of acceleration and deceleration of vehicles, images of people (including suspicious movements that might not be seen by the soldiers), speech and specific types of weapon fire.
A capacity to give GPS locations, an ability to translate Arabic signs and text into English, as well as on-command video recording also are being demonstrated in Aberdeen. Sensor system software is expected to extract keywords and create an indexed multimedia representation of information collected by different soldiers. For comparison purposes, the soldiers wearing the sensors will make an after-action report based on memory and then supplement that after-action report with information learned from the sensor data.
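The "extract keywords and create an indexed multimedia representation" step is easy to picture in miniature. Here is a deliberately naive sketch, assuming nothing about ASSIST's actual software: hypothetical event records (clip id plus a speech transcript) are indexed by keyword so that material collected by different soldiers can be pulled up together. All names here are illustrative, not anything from the project.

```python
# Hypothetical sketch of keyword indexing over sensor events.
# SensorEvent format and names are assumptions, not ASSIST's design.
from collections import defaultdict

STOPWORDS = {"the", "a", "of", "near", "at"}

def extract_keywords(transcript):
    """Naive keyword extraction: lowercase, drop a few stopwords."""
    return [w for w in transcript.lower().split() if w not in STOPWORDS]

def build_index(events):
    """Map each keyword to the clip ids whose transcripts mention it."""
    index = defaultdict(list)
    for clip_id, transcript in events:
        for kw in extract_keywords(transcript):
            index[kw].append(clip_id)
    return index

# Events as if gathered by two different soldiers' sensor packages.
events = [
    ("clip-001", "vehicle acceleration near checkpoint"),
    ("clip-002", "weapon fire at checkpoint"),
]
index = build_index(events)
# Both clips mentioning "checkpoint" are now retrievable together,
# which is roughly what an after-action report would want.
```

The real system would be doing this over raw audio, imagery, and sensor streams rather than clean transcripts, which is exactly why the software side is the hard part.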
Let's see... recording of images and sounds the wearer may not have noticed, but later prove useful? Check. Integration with location-based systems for greater situational awareness? Check. Depiction of the system as a memory assistant? Check.
The original DARPA proposal goes into more detail about what ASSIST will be trying to accomplish, and it's appropriately ambitious. They clearly recognize that the challenge isn't the hardware -- as the illustration shows, you can cobble together something right now with off-the-shelf cameras and recorders -- but the software that makes sense of the recorded data. Many of the goals described in the DARPA item (check the section starting with "Task 2: Advanced Technology Research") parallel the issues being confronted by Microsoft in its MyLifeBits project and Nokia with its Lifeblog project: interpretation of images; assignment of metadata; ontologies for location, objects and activities; and interfaces for access to and editing of recorded material.
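To make the metadata/ontology problem concrete, here is one minimal way a recorded clip might carry location, object, and activity annotations. This is purely an illustrative sketch in the spirit of what the DARPA, MyLifeBits, and Lifeblog efforts describe; the field names and query method are my assumptions, not any project's actual schema.

```python
# Illustrative sketch only: attaching location/object/activity metadata
# to a recorded clip. Field names are assumptions, not a real schema.
from dataclasses import dataclass, field

@dataclass
class ClipMetadata:
    clip_id: str
    gps: tuple                                      # (latitude, longitude) at capture
    objects: list = field(default_factory=list)     # things recognized in frame
    activities: list = field(default_factory=list)  # classified behaviors

    def matches(self, term):
        """Simple query: does this clip's metadata mention the term?"""
        return term in self.objects or term in self.activities

# A clip tagged the way an automated annotator might tag it.
clip = ClipMetadata("clip-017", (39.47, -76.16),
                    objects=["vehicle", "sign"],
                    activities=["deceleration"])
```

The hard research questions are upstream of a structure like this: getting from raw pixels and audio to those `objects` and `activities` labels reliably, and agreeing on a shared vocabulary so that one soldier's "checkpoint" matches another's.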
I wonder if Microsoft is working with DARPA on this; they certainly could be of use to each other.
It strikes me that we'll probably see this kind of technology emerge first in the military and (perhaps more likely) among first responders. Many police vehicles already have automatic recorders; insisting that officers wear recorders as well isn't a big leap. Firefighters and other emergency-response personnel could wear them for after-action analysis and investigation, as well as for liability reasons (proof that a responder behaved professionally, or that s/he violated protocol).
If ASSIST works well enough to warrant a "real" version, I wonder how many soldiers returning from duty will want something like it for their regular lives?