A novel method was needed to investigate the relative effects of augmented visual and haptic (touch) feedback when a surgeon manually assesses an object's stiffness.
We developed a bespoke software architecture in LabVIEW that precisely mimics hand movement by relaying the position of the master haptic device's end-effector to a slave robot. The slave robot's movements and the indented sample were recorded by a webcam and displayed on-screen, while force feedback was generated in real time through the haptic device from modelled data.
Compliance perception is vital in areas such as surgery, and augmented reality is one exciting prospect in robotic surgery for improving the quality of the perceptual information available to the surgeon.
To investigate the optimum form of augmented feedback, ReSolve developed a master-slave robotic system that decouples visual and haptic information during indirect interactions with samples of varying stiffness. A master haptic device (omega.7, Force Dimension) measured hand position, and two robotic arms (DENSO) were used to 1) select a sample and 2) mimic hand movements to indent the selected sample. The slave device's movements and the sample's visual response were recorded and displayed in real time. Modelled data were used to generate force feedback irrespective of the sample's actual stiffness, thus decoupling the visual and haptic channels. A fully configurable experimental protocol with built-in data analysis was integrated into a bespoke software package, giving researchers complete control over the running of human trials.
VIRI has been used successfully by the Surgical Technologies research group at the University of Leeds to conduct a variety of psychomotor studies that have enabled breakthrough medical research, informing the future of robotic surgery.