As immersive as virtual reality (VR) tech is visually, there is a bit of a disconnect between what you see and what you can “touch” in a VR environment. We’ve seen companies come up with various ways to let users feel objects in VR, so it doesn’t come as a surprise that researchers at Cornell University have also come up with their own solution.
This comes in the form of a stretchable, synthetic skin fitted with fiber-optic sensors. Using a stretchable material opens up a variety of applications, and not just for humans: it could also be applied to robots, where the ability to feel and recognize objects could increase their capabilities and functionality.
According to lead researcher Rob Shepherd, an associate professor of mechanical and aerospace engineering in the College of Engineering, “Right now, sensing is done mostly by vision. We hardly ever measure touch in real life. This skin is a way to allow ourselves and machines to measure tactile interactions in a way that we now currently use the cameras in our phones. It’s using vision to measure touch. This is the most convenient and practical way to do it in a scalable way.”
He continues, “VR and AR immersion is based on motion capture. Touch is barely there at all. Let’s say you want to have an augmented reality simulation that teaches you how to fix your car or change a tire. If you had a glove or something that could measure pressure, as well as motion, that augmented reality visualization could say, ‘Turn and then stop, so you don’t overtighten your lug nuts.’ There’s nothing out there that does that right now, but this is an avenue to do it.”
While it might be a while before we see a commercial-grade application of Cornell’s technology, it does present possible options for how companies approach “touch” in VR or AR systems in the future.
Read more: ubergizmo.com