
Tuesday Topsight, July 3, 2007


Woz, Not Woz: The transcript of the R.U. Sirius show with Steve Wozniak is now up, if you're more inclined to read than to listen. Unsurprisingly, this transcript has already been Slashdotted.

JAMAIS CASCIO: So what do you think are the rules for being an ethical prankster?

STEVE WOZNIAK: Ethical prankster? It's tough. I don't think there's 100% ethical. In theory, you have agreements with society not to do things that are going to be disruptive — to not do things that are gonna be different. And yet, practically, all of us have to do things that are a little bit different. And there's always some weird little laws that are written to catch you just for being different.

Ethical hacking today is largely finding flaws in major computer systems, or possibly the phone systems. And to be ethical, you don't use it to harm anyone. And generally, that means you don't want to keep it secret forever.

Mech Me: Hugh Herr, director of the biomechatronics group at the Massachusetts Institute of Technology's Media Lab, came up with a new generation of prosthetic legs for himself, and is using that expertise to develop an exoskeleton for the currently-abled. New Scientist (naturally) has the write-up, and the patent application is also available.

The novel aspect of the prosthetic legs (and, presumably, the exoskeleton): the legs are extensible for the full-on "Machine Man" effect. (Via Medgadget)

I, For One, Welcome Our New Taser-Wielding Roomba Overlords: That's right -- iRobot, the company behind the Roomba robo-vacuum, has now built a tactical bot for the military that's armed with a taser.

For iRobot, its Taser-equipped system will be the first robot capable of using force to disable a person, rather than a bomb. The 17-year-old company is best known for its mobile robots for the consumer market, including the disc-shaped, carpet cleaning Roomba.

But home robots account for only 60 percent of the company's revenue. The rest comes from government and industrial customers, including the military and police.

Versions of iRobot's PackBot have disarmed roadside bombs and searched caves and buildings in Iraq and Afghanistan. Some scout dangerous areas before soldiers or emergency responders go in.

It's interesting how attached the soldiers get to these little 'bots -- Joel Garreau's "Bots on the Ground" article in the Washington Post provided some details in May. What makes this particularly interesting is that, from a techno-futurist perspective, these aren't robots at all, they're remote-operated devices. Unlike a "real" robot, they're not autonomous or even (as with the Mars rovers) semi-autonomous -- they're not much more than high-quality RC cars.

But it makes me really curious as to how people are going to respond when real, autonomous robots enter the mix, devices that receive high-level commands ("check for traps") and figure out how to do it, rather than needing a human operator. If soldiers get emotionally attached to RC bots, how will they respond when something that seems to have its own identity gets damaged or destroyed?

Will the inability of human soldiers to cope with the destruction of robotic devices end up as the primary roadblock to the greater use of autonomous bots on the battlefield?


Interesting to think that losing a combat bot would evoke the same trauma as losing a fellow soldier -- or, say, a limb.

My primary concern with combat bots is that reducing casualties also reduces the evidence of, and sentiment against, illegal or immoral wars, giving leaders and the public less reason to avoid them. Dead Bots Tell No Tales.

If the sentiments people have toward bots become the same as those for human soldiers, then that has interesting consequences for that concern.

OTOH, operator trauma can be reduced by regular backups, Cylon style.

Which underscores your primary concern.

Asimov is rolling in his grave.


Creative Commons License
This weblog is licensed under a Creative Commons License.