Kemo D. (kemo_d7) wrote,

Virtual Horizon

Today, virtual games; tomorrow, virtual worlds...

Computers today barely connect with people. The human body evolved as a whole to sense and interact with the world, but computers sense us only at our fingertips. Even the fingertips aren’t allowed to do all they can; a computer that was designed to interact with us holistically would feel different from moment to moment in order to convey information. For more than two decades, scientists have been working on the grand project of virtual reality (VR) to bring the whole body into computing.


You can think of an ideal virtual reality setup as a sensory-motor mirror opposite of the human body. Wherever the body has a sensor, like an eye or an ear, a VR system must present a stimulus to that body part to create a corresponding illusory world. The eye needs a visual display, for instance, and the ear needs an audio speaker. That’s the easy part. Most of the body is covered in skin, so the ideal VR setup would be largely devoted to touch and feel. The term of art for this side of VR is "haptics."


Virtual reality technology needs a lot of computing power. Back in the early 1980s, scientists spent a lot of time exploring how the human body could best interact with the virtual world. The iconic VR device that emerged from that work was the DataGlove, a sensor-filled glove that projected a real-time virtual model of your hand into a computer-generated world.
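
As a rough illustration of what the DataGlove was doing, here is a minimal Python sketch: per-finger flex-sensor readings are calibrated and converted into joint angles for a virtual hand, once per frame. The sensor names, value ranges, and calibration numbers are invented for the example and are not taken from the actual device.

```python
# Hypothetical sketch of the DataGlove idea: convert raw flex-sensor readings
# into joint angles for a virtual hand model, frame by frame. The sensor
# layout, value ranges, and calibration numbers are illustrative assumptions.

FINGERS = ["thumb", "index", "middle", "ring", "pinky"]

def make_converter(raw_open, raw_closed):
    """Return a function mapping a raw sensor reading to a 0-90 degree bend."""
    span = raw_closed - raw_open
    def to_angle(raw):
        t = (raw - raw_open) / span if span else 0.0
        return max(0.0, min(1.0, t)) * 90.0   # clamp to a plausible joint range
    return to_angle

def update_virtual_hand(frame_readings, converters):
    """Turn one frame of glove readings into per-finger joint angles."""
    return {f: converters[f](frame_readings[f]) for f in FINGERS}

if __name__ == "__main__":
    # Pretend calibration: ~100 with the finger straight, ~900 fully bent.
    converters = {f: make_converter(100, 900) for f in FINGERS}
    frame = {"thumb": 300, "index": 850, "middle": 500, "ring": 480, "pinky": 120}
    print(update_virtual_hand(frame, converters))
```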


Perhaps the most intractable problem with the DataGlove was arm fatigue. Try holding your arm out without any support for a few minutes. You’ll soon start to notice little tremors in your arm muscles and before long you’ll wonder where all your strength went. We’re used to resting our hands at least a bit on the objects we’re manipulating, and that is one service a virtual object of the era just couldn’t offer.


The difficulties actually led to some happy discoveries, including a new approach to physical therapy. You could toss virtual balls in VR using a first-generation DataGlove, but only very slowly; it helped if the balls moved in slow motion as well. This was an advantage for klutzy jugglers: make the balls slow enough and anyone can juggle. We realized that people could learn to juggle real balls by gradually speeding up the virtual ones, which made for a great training system. The concept is now commonplace in advanced rehab. For instance, there are therapy systems that use slowed-down VR to help people with injured limbs get back up to speed.
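
Here is a small, purely hypothetical Python sketch of that training loop: each ball is stepped through simplified physics at a fraction of real speed, and the time scale is nudged toward real time after each successful round. The physics, the starting speed, and the speed-up schedule are assumptions made up for illustration, not details of any real rehab or juggling system.

```python
# Illustrative sketch of the "slow the virtual balls, then speed them up" idea.
# All constants and the progression schedule are assumptions for demonstration.

GRAVITY = -9.81  # m/s^2

def step_ball(pos, vel, dt, time_scale):
    """Advance one ball by dt of wall-clock time, with simulated time slowed by time_scale."""
    sim_dt = dt * time_scale              # time_scale < 1.0 means slow motion
    new_vel = vel + GRAVITY * sim_dt
    new_pos = pos + vel * sim_dt
    return new_pos, new_vel

def training_session(catches_needed=5, speedup=0.1):
    """Start in slow motion and nudge toward real time after each practice round."""
    time_scale = 0.3                      # begin at 30% of real speed
    while time_scale < 1.0:
        print(f"Practice round at {time_scale:.0%} of real speed "
              f"({catches_needed} catches to advance)")
        # ... the juggling round itself would run here, stepping each ball
        # with step_ball and checking for catches ...
        time_scale = min(1.0, time_scale + speedup)
    print("Real-time juggling reached.")

if __name__ == "__main__":
    training_session()
```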


This all raises the question: What is the right way to manipulate virtual objects? When we can build the haptic interface of our dreams, what will it be like?


Some scientists are betting on a strategy called the “octopus butler robot.” Here’s how it would work: Imagine you’re in a virtual world and you want to slam your hand down on a virtual countertop. Now suppose there’s an attentive robot placed nearby. (You don’t see the robot, of course, because you are looking only at the computer-generated virtual world.) The robot has an arm that is holding a tray, like a butler. As you start to slam your hand down, the robot calculates where your hand is about to hit the virtual countertop. The robot swoops in just in time, bringing the physical tray into alignment with the virtual countertop and creating the illusion that the countertop was there all along.
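
To make the timing problem concrete, here is a toy Python sketch of the prediction step, under the simplifying assumption that the hand keeps moving at roughly constant velocity: estimate where and when the palm will reach the counter's height, then check whether a tray moving at some top speed can get there first. The numbers and the constant-velocity assumption are invented for illustration and describe no real system.

```python
# Toy sketch of the "octopus butler" timing problem: predict the hand's contact
# point with a virtual countertop and test whether the tray can intercept it.
# Constant-velocity motion and all numeric values are illustrative assumptions.

import math

def predict_contact(hand_pos, hand_vel, counter_height):
    """Predict time and point of contact, assuming constant hand velocity."""
    dz = hand_pos[2] - counter_height
    if hand_vel[2] >= 0 or dz <= 0:
        return None                       # hand is not descending toward the surface
    t = dz / -hand_vel[2]                 # seconds until the palm reaches counter height
    contact = (hand_pos[0] + hand_vel[0] * t,
               hand_pos[1] + hand_vel[1] * t,
               counter_height)
    return t, contact

def tray_can_intercept(tray_pos, contact, t, tray_speed=2.0):
    """Can a tray moving at tray_speed (m/s) reach the contact point in time?"""
    return math.dist(tray_pos, contact) / tray_speed <= t

if __name__ == "__main__":
    hand = (0.2, 0.1, 1.3)       # palm position in meters
    vel = (0.0, 0.0, -1.5)       # slamming downward at 1.5 m/s
    result = predict_contact(hand, vel, counter_height=0.9)
    if result:
        t, point = result
        print(f"Contact in {t:.2f}s at {point}; "
              f"intercept possible: {tray_can_intercept((0.5, 0.5, 0.9), point, t)}")
```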


This setup would need to work for any virtual object regardless of shape, so instead of a tray, the end of the robot’s arm would be a morphing machine, inspired perhaps by the physiology of an octopus. The “octopus” would take on whatever form your hand is trying to touch or grab, and it would be attached to the floor so you could rest on it and not suffer from arm fatigue. 

Can it be done? Scientists predict the octopus butler robot will appear in a lab by 2015 and in the home around 2025. 

Kemo D. (a.k.a. no.7) www.beyondgenes.com 

Tags: science