Commanding Grasping Robot through Virtual Reality and Simulated Wrenches

Nowadays, the need to manipulate objects in hostile environments through natural movements of the human body has led to the development of several teleoperation interfaces.
Nevertheless, during robot teleoperation, direct vision of the environment and of the robot through a camera leads to non-ergonomic user behavior and to occlusion problems.
A novel teleoperation method combining grasping theory and virtual reality is proposed.
To overcome the occlusion problems, a new experimental teleoperation setup is designed in which information is continuously exchanged between the human hand, a virtual kinematic hand model, and the robotic manipulator. An interactive virtual grasping system is proposed, and the gathered data are forwarded to the manipulator in the real world to replicate the grasping of the objects. The user perceives force stimuli coming from the virtual environment through wearable haptic devices.
Finally, the robot is masked so that the user no longer identifies with the robot; instead, the user's reference becomes the virtual hand rendered in the real world, which causes less occlusion and overcomes the user's non-ergonomic behaviors.
A review of previous investigations that serve as a foundation for this work is reported, together with the current achievements in controlling the robot with motion-capture (MoCap) systems and the planned future work.

[Figure: Grasping tasks]

Virtual environment.

Grasping virtual objects is one of the most complex tasks in virtual reality (e.g., due to the object's shape and discrepancies in the physics). As a hypothesis, the use of a sphere with the same mass as the object, enclosing the target object and touching its outermost points, is proposed.
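The enclosing sphere described above can be sketched as follows; this is a minimal approximation (not the paper's implementation) that centers the sphere at the centroid of the object's vertices and takes the radius to the farthest vertex, so every point of the object is enclosed. The cube example is hypothetical.

```python
import numpy as np

def enclosing_sphere(points):
    """Approximate sphere enclosing all vertices of the target object:
    center at the centroid, radius = distance to the farthest vertex."""
    pts = np.asarray(points, dtype=float)
    center = pts.mean(axis=0)
    radius = np.linalg.norm(pts - center, axis=1).max()
    return center, radius

# hypothetical target object: the vertices of a unit cube
cube = [(x, y, z) for x in (0, 1) for y in (0, 1) for z in (0, 1)]
c, r = enclosing_sphere(cube)
# c is the cube's center (0.5, 0.5, 0.5); r is half the space diagonal, sqrt(3)/2
```

In the simulation, the sphere would then be assigned the same mass as the original object, as stated above.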

The OptiTrack Motive system is used to track the user's hand.

Tracking is based on infrared (IR) light that, emitted by the cameras, is reflected by the passive markers and detected by the cameras' sensors.
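From two (or more) calibrated cameras that each see a reflected marker, its 3D position can be recovered by triangulation. A minimal sketch, assuming each camera contributes a ray (origin and direction) toward the marker, uses the midpoint of the shortest segment between the two rays; this is a standard textbook method, not the one used internally by Motive.

```python
import numpy as np

def triangulate_midpoint(o1, d1, o2, d2):
    """Estimate a marker's 3D position as the midpoint of the shortest
    segment between two camera rays (origin o, direction d)."""
    d1 = np.asarray(d1, float); d1 /= np.linalg.norm(d1)
    d2 = np.asarray(d2, float); d2 /= np.linalg.norm(d2)
    w0 = np.asarray(o1, float) - np.asarray(o2, float)
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w0, d2 @ w0
    denom = a * c - b * b          # zero only for parallel rays
    t1 = (b * e - c * d) / denom   # parameter along ray 1
    t2 = (a * e - b * d) / denom   # parameter along ray 2
    p1 = np.asarray(o1, float) + t1 * d1
    p2 = np.asarray(o2, float) + t2 * d2
    return (p1 + p2) / 2

# hypothetical setup: marker at (0, 0, 1), seen by two cameras
p = triangulate_midpoint([0, 0, 0], [0, 0, 1],
                         [1, 0, 0], [-1, 0, 1])
# p is approximately (0, 0, 1)
```

With more than two cameras, the same idea generalizes to a least-squares intersection of all rays.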

Manipulation task.

To track the user's hand, the Leap Motion Controller (palm position), an IMU (palm orientation), and a CyberGlove III (finger joint angles) are used.
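These three streams can be fused into fingertip positions in the world frame: the glove's joint angles drive a forward-kinematics chain in the palm frame, which is then transformed by the palm pose from the Leap Motion Controller and the IMU. The sketch below is a simplified planar finger model under hypothetical conventions (x forward from the knuckle, z up, flexion curling downward); link lengths and offsets are illustrative, not measured values.

```python
import numpy as np

def fingertip_position(palm_pos, palm_R, base_offset, joint_angles, link_lengths):
    """Fingertip in the world frame from:
    palm_pos  - palm position (e.g. from the Leap Motion Controller)
    palm_R    - 3x3 palm orientation matrix (e.g. from the IMU)
    joint_angles - flexion angles in rad (e.g. from the CyberGlove III)."""
    # planar kinematic chain in the palm frame (hypothetical convention)
    x, z, phi = 0.0, 0.0, 0.0
    for theta, length in zip(joint_angles, link_lengths):
        phi += theta                 # accumulated flexion
        x += length * np.cos(phi)
        z -= length * np.sin(phi)    # flexion curls the finger downward
    tip_local = np.asarray(base_offset, float) + np.array([x, 0.0, z])
    return np.asarray(palm_pos, float) + np.asarray(palm_R, float) @ tip_local

# hypothetical index finger, fully extended (all joint angles zero)
tip = fingertip_position(palm_pos=[0, 0, 0], palm_R=np.eye(3),
                         base_offset=[0, 0, 0],
                         joint_angles=[0.0, 0.0, 0.0],
                         link_lengths=[0.04, 0.025, 0.02])
# tip lies 0.085 m straight ahead of the knuckle: (0.085, 0, 0)
```

The resulting fingertip poses are what the virtual kinematic hand model needs in order to detect contacts with the virtual object and to drive the wearable haptic feedback.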