This Glove Makes VR Objects Feel Real

Pneumatic “muscles” on the glove simulate the feel of real objects

PhD students experiment with the glove in Professor Tolley's lab. (University of California, San Diego)
smithsonian.com

With a virtual reality headset, you can see and hear other worlds, but so far, you can’t touch them. That may change with a new prototype kinesthetic glove built by researchers at the University of California, San Diego.

Kinesthetic feedback is the kind you get by pushing against something and feeling it push back. It's a subset of the field of haptics, which aims to help people understand the world through the sense of touch.

A glove like this, which uses techniques borrowed from soft robotics to push back on a user’s fingers and simulate the sense of touch, could be important in future exploration of virtual space, adding more sensation, and thus more reality, to virtual reality. Its creators say it could grow into a new controller for virtual games or even medical devices.

“When people think haptics, they typically think about a rumble pad, or vibrating controller, like when your phone vibrates, which can give you tactile feedback in a very simple way…there’s no directional component to it,” says Jurgen Schulze, an adjunct professor of computer science at UCSD who specializes in virtual reality and who helped develop the prototype. “With the glove what you can do, in theory, is make objects that you grab, and carry in your hand, make them feel like they are there. They are still weightless, but they at least have volume … It’s a step above, and quite a big one, above just having vibration feedback.”

The UCSD team adorned the glove with pneumatic “muscles,” like the ones found in soft robots. Spread out over the back of the hand, the air-filled sacs inflate or deflate to provide directional pressure on the fingers. The air reservoirs are covered with braided fibers, and a pump controls the level of inflation. The apparatus is strapped to a flexible silicone exoskeleton that can be worn over the back of the hand. A tracking device follows the motion of the user’s hand, and the pressure feedback is based on its reading of the hand’s position.

Put on the glove, along with a pair of goggles and a set of headphones, and you'll be presented with a virtual piano that you can feel as you touch the keys. When you press against a key, the air sacs inflate, pushing back against your finger and simulating that touch. Test users called the result "mesmerizing," though they did note a slight lag in the response.
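The loop the researchers describe — track the hand, detect when a fingertip crosses a virtual key's surface, and inflate the matching air sac in proportion — can be sketched roughly in code. This is an illustrative simplification, not the UCSD team's software; every name and constant here is made up for the example:

```python
# Hypothetical sketch of a kinesthetic feedback loop: a tracker reports each
# fingertip's height, and when a fingertip sinks below the virtual key's
# surface, the corresponding air sac is commanded to a pressure proportional
# to how far the finger has pressed in. All constants are illustrative.

KEY_SURFACE_Y = 0.0   # height of the virtual key surface (meters)
MAX_PRESSURE = 1.0    # normalized maximum actuator pressure
GAIN = 50.0           # pressure per meter of penetration (made up)

def target_pressure(fingertip_y: float) -> float:
    """Map fingertip depth below the key surface to an actuator pressure."""
    penetration = KEY_SURFACE_Y - fingertip_y
    if penetration <= 0:
        return 0.0  # finger above the key surface: deflate the sac
    return min(MAX_PRESSURE, GAIN * penetration)

def feedback_step(fingertip_heights: list[float]) -> list[float]:
    """One control-loop tick: compute a pressure command per finger."""
    return [target_pressure(y) for y in fingertip_heights]
```

For example, `feedback_step([0.01, -0.005, -0.05])` returns `[0.0, 0.25, 1.0]`: the first finger hovers above the key and feels nothing, the second is pressing lightly, and the third has pressed deep enough that its actuator saturates. The lag test users noticed would show up in a real system as the time between the tracker reading and the pump reaching the commanded pressure.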

Currently, the work is a prototype: the glove only works with the piano application, and only when the tracking device can "see" both hands. Future versions, says Mike Tolley, an engineering professor who teaches a graduate class at UCSD on designing systems with soft robotics, could use sensors integrated into the glove itself to report hand position, which would increase accuracy and solve occlusion problems, such as when one hand blocks the tracker's view of the other.

Tolley and Schulze envision applications in games and virtual training, but also see potential in robotic surgery. One of the challenges of robot-assisted surgery is feedback. The most widely used system, the da Vinci, offers only visual feedback; the surgeon drives it via two joystick-like controllers, relying on visual cues to tell when to push forward or ease off on pressure.

“If you’ve played with DaVinci, you know that the feedback you get there is visual, you get stereovision. And it’s quite good, people have done a lot with that, even without the force feedback,” says Peter Kazanzides, a computer science professor and a robotic surgery expert at Johns Hopkins University, who was not affiliated with the UCSD project. “Experienced surgeons learn how to essentially estimate the amount of force they’re applying by looking at how taut the suture is, or how much the tissue is stretching.”

That’s not to say haptic feedback couldn’t improve such a system. But Kazanzides points out another problem that would have to be solved first: the DaVinci doesn’t have a way to sense forces.

A robot that presents force feedback to its users must first be able to sense the pressure it's putting on a surface (or a body), and such sensors are typically too big, too expensive, or not medical grade. So while it's difficult to say exactly what form force feedback might take in robot-assisted surgery, Kazanzides acknowledges it could still be beneficial.

For Tolley, Schulze, and the UCSD group, the more immediate future is the device’s potential in virtual reality exploration and gaming, like the old Nintendo Power Glove, but with feedback. Their focus is on getting a realistic response from the virtual keyboard. “The challenge with virtual reality, especially for a mechanical engineer, is it’s all about getting the right feel,” says Tolley.

About Nathan Hurst

Nathan Hurst blends a love of storytelling with a passion for science and the outdoors, covering technology, the environment, and much more. His work has appeared in a variety of publications, including Wired, Outside, Make: and Smithsonian.
