Touching the future: making technology more intuitive through haptic feedback

Haptic feedback allows us to experience virtual and remote environments through touch. The technology has gained traction in teleoperation, surgery and gaming, yet it has still to reach widespread commercial use.

In today’s complex technological world, modern user interfaces allow us to interact with technology to achieve things that previous generations could only dream of. Take smartphones, for instance: we make phone calls, take pictures, check the weather, manage our calendars, read the news, track our health, and update our social media status, all within a matter of seconds, simply by tapping a few times on a glass screen that fits in the palm of our hands. But what if these interactions could be delivered in a better way? How would this impact our overall experience?

We continuously obtain rich information from the world through a variety of sensory sources (vision, sound, smell, temperature, touch, etc.) and we subconsciously combine this information to make informed decisions about what actions to take next. In contrast, the overwhelming majority of modern technologies rely exclusively on vision or sound to deliver information to the user. But what if we could stimulate more of our senses during our interactions with technology? One area that has gained some traction in the last few decades is haptic feedback technology, which enables users to interact with environments through touch.

Haptic devices (and the algorithms that are used to control them) can be used in a variety of ways, from generating forces that are perceived when interacting with simulated objects (virtual reality), to ‘touching’ real objects from a distance (teleoperation).
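To make the virtual-reality case concrete, here is a minimal sketch of the classic penalty-based approach to haptic rendering: when the device cursor penetrates a simulated surface, the controller commands a spring force proportional to the penetration depth (Hooke’s law). The function name, values and one-dimensional setup are illustrative assumptions, not a description of any particular device’s API.

```python
def render_contact_force(device_pos, wall_pos, stiffness=500.0):
    """Penalty-based force for a 1-D virtual wall (illustrative sketch).

    If the haptic cursor moves past the wall, push back with a spring
    force proportional to the penetration depth (Hooke's law).
    Positions in metres, stiffness in N/m, force in newtons.
    """
    penetration = device_pos - wall_pos  # positive once inside the wall
    if penetration > 0:
        return stiffness * penetration   # restoring force felt by the user
    return 0.0                           # free space: no force

# Example: cursor 2 mm inside a wall rendered at 500 N/m
force = render_contact_force(device_pos=0.052, wall_pos=0.050)
print(force)  # 1.0 (newtons)
```

In a real device this calculation runs inside a high-rate control loop (typically around 1 kHz) so that the rendered surface feels crisp rather than spongy.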

At ReSolve, we recently developed a haptic master-slave tissue indentation system, called VIRI (the Visuohaptic Illusions Robotic Interface). The operator uses a haptic device to control the position of a robotic arm (Denso) to indent a silicone sample – a simulation of human tissue. A camera captures the slave device’s movements and displays them on-screen, and forces are generated through the haptic device when the sample is indented. However, there is a twist: what you see isn’t necessarily what you feel. Samples may feel stiffer or softer than they really are. By decoupling visual and haptic feedback in this way, our research colleagues at The University of Leeds are using this novel system to investigate how humans combine sensory information, studying the individual contributions of vision and touch to our overall perception of compliance. This research has potential implications for areas such as robotically-assisted surgery, where augmenting the forces and/or visuals presented to the surgeon could aid their perception during a procedure. Find out more about VIRI here.

The visuohaptic illusions robotic interface (VIRI): a master-slave haptic teleoperation system for assessing the relationship between vision and touch in our perception of compliance.
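The “twist” described above amounts to applying independent gains to the haptic and visual channels of the teleoperation loop. The sketch below shows that idea in its simplest form; the function and parameter names are hypothetical and the real VIRI controller is certainly more involved.

```python
def decouple_feedback(measured_force, slave_displacement,
                      haptic_gain=1.0, visual_gain=1.0):
    """Illustrative sketch of visuohaptic decoupling in teleoperation.

    The force fed back to the operator's haptic device and the motion
    shown on-screen are scaled independently. A haptic_gain above 1.0
    makes the sample feel stiffer than it looks; below 1.0, softer.
    """
    displayed_force = haptic_gain * measured_force        # felt by operator
    displayed_motion = visual_gain * slave_displacement   # shown on-screen
    return displayed_force, displayed_motion

# Example: the slave measures 2.0 N at 10 mm indentation, but the
# operator feels 3.0 N while seeing the true motion - the sample
# appears stiffer than it really is.
felt, seen = decouple_feedback(2.0, 0.010, haptic_gain=1.5)
print(felt, seen)  # 3.0 0.01
```

Sweeping such gains across trials while recording participants’ compliance judgements is one way to tease apart how much each sense contributes to the combined percept.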

If you would like to find out more about haptics, watch this space – we will soon be publishing a longer article on haptics and some exciting trends that are changing the way we ‘feel’ the world. And if you have a potential application that could benefit from haptic technology, we would love to hear from you! Please post a comment below, or get in touch on