
Commanding Grasping Robot through Virtual Reality and Simulated Wrenches

Nowadays, the need to manipulate objects in hostile environments through natural movements of the human body has led to the development of several teleoperation interfaces.
Nevertheless, when the operator views the environment and the robot directly through a camera, teleoperation suffers from non-ergonomic user postures and occlusion problems.
In this work, a novel teleoperation method combining grasping theory and virtual reality is proposed.
To overcome the occlusion problems, a new experimental teleoperation setup is designed with a continuous exchange of information between the human hand, the virtual kinematic hand model, and the robotic manipulator. An interactive virtual grasping system is proposed in which the gathered data are forwarded to the manipulator in the real world to replicate the grasping of the objects. The user perceives force stimuli coming from the virtual environment through wearable haptic devices.
As future work, the robot will be visually masked so that the user no longer identifies with the robot; instead, the user's reference will be a virtual hand in the real world, which causes less occlusion and avoids the user's non-ergonomic behaviors.
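
The control loop below is a minimal sketch of this continuous exchange, not the project's actual implementation: the five interface objects (hand tracker, virtual hand, scene, manipulator, haptic devices) and their methods are hypothetical placeholders, and the 100 Hz rate is an assumed value.

```python
import time

def teleoperation_loop(read_hand_state, virtual_hand, scene, robot, haptics,
                       rate_hz=100.0):
    """One iteration per tick: tracked human hand -> virtual kinematic hand
    -> simulated grasp wrenches -> real manipulator + wearable haptics.
    All five arguments are placeholders for the actual interfaces."""
    period = 1.0 / rate_hz
    while True:
        t0 = time.perf_counter()

        # 1. Map the tracked human hand onto the virtual kinematic hand model.
        state = read_hand_state()
        virtual_hand.update(state)

        # 2. Simulate the grasp in the virtual scene; get contacts and wrench.
        contacts, wrench = scene.simulate_grasp(virtual_hand)

        # 3. Forward the hand configuration to the manipulator in the real world.
        robot.replicate_grasp(virtual_hand.configuration())

        # 4. Render the simulated forces through the wearable haptic devices.
        haptics.render(wrench, contacts)

        time.sleep(max(0.0, period - (time.perf_counter() - t0)))
```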

[Figure: Grasping tasks]

Virtual environment.

Grasping virtual objects is one of the most complex tasks in virtual reality (e.g., due to the object's shape and discrepancies between virtual and real physics). As a working hypothesis, the use of a sphere with the same mass as the object is proposed, enclosing the target object and touching its outermost points.
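
A minimal sketch of this enclosing-sphere proxy follows, assuming the object is available as an (N, 3) array of mesh vertices; the function name and the choice of centring the sphere on the bounding-box centre are illustrative, not the project's exact implementation.

```python
import numpy as np

def enclosing_sphere(vertices: np.ndarray, mass: float):
    """Proxy sphere for grasping: centred on the object's bounding-box
    centre, with the radius reaching the outermost vertex, and carrying
    the same mass as the object so the simulated dynamics stay consistent."""
    center = (vertices.min(axis=0) + vertices.max(axis=0)) / 2.0
    radius = float(np.linalg.norm(vertices - center, axis=1).max())
    return center, radius, mass

# Example: a unit cube is enclosed by a sphere of radius sqrt(3)/2 ~= 0.866.
cube = np.array([[x, y, z] for x in (0, 1) for y in (0, 1) for z in (0, 1)],
                dtype=float)
print(enclosing_sphere(cube, mass=0.5))
```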

OptiTrack Motive system to track the user's hand. 

Tracking is based on infrared light that, emitted by the cameras, is reflected by the passive markers and detected by the cameras' sensors.
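
Motive reconstructs the marker positions and rigid-body poses internally; purely to illustrate the principle, the sketch below recovers the hand's rotation and translation from reconstructed marker positions with the standard Kabsch (least-squares rigid alignment) method. Function and variable names are illustrative.

```python
import numpy as np

def rigid_pose_from_markers(model_pts: np.ndarray, observed_pts: np.ndarray):
    """Find the rotation R and translation t mapping the marker layout in
    the hand's local frame (`model_pts`, (N, 3)) onto the marker positions
    reconstructed by the cameras (`observed_pts`, (N, 3))."""
    mc, oc = model_pts.mean(axis=0), observed_pts.mean(axis=0)
    H = (model_pts - mc).T @ (observed_pts - oc)   # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))         # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = oc - R @ mc
    return R, t
```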

Manipulation task.

To track the user's hand, a Leap Motion Controller (palm position), an IMU (palm orientation), and a CyberGlove III (finger joint angles) are used.
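
A sketch of how the three streams might be merged into a single hand state per control tick is given below; the `HandState` container and the placeholder callables wrapping the device drivers are assumptions for illustration, not the actual SDK calls.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class HandState:
    palm_position: np.ndarray     # metres, from the Leap Motion Controller
    palm_orientation: np.ndarray  # unit quaternion (w, x, y, z), from the IMU
    finger_joints: np.ndarray     # finger joint angles in rad, CyberGlove III

def read_hand_state(leap_palm_position, imu_palm_quaternion, glove_joint_angles):
    """Merge the three sensor streams into one hand state. The arguments are
    placeholder callables wrapping the Leap Motion, IMU, and CyberGlove
    driver calls, which are abstracted away in this sketch."""
    return HandState(
        palm_position=np.asarray(leap_palm_position(), dtype=float),
        palm_orientation=np.asarray(imu_palm_quaternion(), dtype=float),
        finger_joints=np.asarray(glove_joint_angles(), dtype=float),
    )
```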
