‘NormalTouch’ and ‘TextureTouch’ are in their early stages, but they could make tactility in VR more believable.
Virtual reality is one of the most immersive technologies available today. That is, until the illusion shatters when you instinctively reach out to touch something and are met with a one-size-fits-all haptic response, or no feedback at all. Microsoft Research (PDF) might have a solution. Rather than the air-based haptics we’ve seen before, “NormalTouch” and “TextureTouch” use handheld devices to simulate touching things while in VR, no bodysuit required.
The handheld devices use mechanical actuators and a 4×4 grid of pins to convey what it feels like to touch different objects, even accounting for differences in surface hardness. In the video embedded below, you can even see someone flicking a ball and a cube around with the admittedly hacky-looking gizmos. The team admits that, despite the successes presented here, it still isn’t sure how much haptic feedback is needed to be convincing:
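To get a feel for the idea, here is a purely illustrative sketch of how a 4×4 pin display might be driven: sample a height field under the fingertip and clamp each pin to its travel limit. All names and parameters below are hypothetical; the research paper describes the hardware, not this code.

```python
# Illustrative only: drive a hypothetical 4x4 pin array from a
# height field, roughly how a device like TextureTouch conveys
# local surface shape under the fingertip.
def pin_heights(surface, finger_x, finger_y, pin_pitch=1.0, max_travel=2.0):
    """Sample a 4x4 patch of surface(x, y) -> height centered on the
    fingertip, clamping each pin's extension to [0, max_travel]."""
    heights = []
    for row in range(4):
        row_vals = []
        for col in range(4):
            # Pins sit at offsets of -1.5..+1.5 pitches from the center.
            x = finger_x + (col - 1.5) * pin_pitch
            y = finger_y + (row - 1.5) * pin_pitch
            h = surface(x, y)
            row_vals.append(max(0.0, min(max_travel, h)))
        heights.append(row_vals)
    return heights

# Example: a gentle ramp rising along x under the fingertip.
ramp = lambda x, y: 0.5 * x
grid = pin_heights(ramp, finger_x=2.0, finger_y=0.0)
```

In a real device the sampled heights would be sent to the actuators each frame; surface hardness could be simulated by how stiffly each pin resists being pressed back down.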
“On several occasions we observed people trying out our devices when they were not well calibrated (e.g., NormalTouch would render a surface normal in a drastically different direction than it was supposed to). To our surprise, people often claimed that the device accurately rendered the surface when in fact it was obviously incorrect. While anecdotal, this points to the need to further evaluate whether or not it is important to precisely match the haptic rendering in order for it to be considered realistic and high fidelity.”