Shape-Based Grasping Techniques with Eye Tracking Device
The loss of a hand can significantly impact an individual's autonomy and ability to perform activities of daily living, work, and social interaction. Current prosthetic solutions address these challenges inadequately because of limitations in their control interfaces and the absence of force or tactile feedback, which restricts hand grasping capabilities.
To tackle this issue, a system has been developed for use with prosthetic hands, featuring an algorithm that enables adaptive grasping based on object shape. The system first extracts the user's intent to grasp an object from electromyographic (EMG) signals acquired with a Myo Armband. It then incorporates visual information from an eye-tracking device combined with YOLO, a real-time object recognition algorithm, to identify the object the user is looking at. Finally, the Robot Operating System (ROS) serves as the framework for integrating the EMG signals and visual information, ensuring effective grasping.
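The decision logic described above can be sketched as follows. This is a minimal, hypothetical illustration, not the actual implementation: the activation threshold, the class-to-grasp table, and all function names are assumptions. It shows the core idea of gating a shape-based grasp selection (driven by the YOLO label of the gazed-at object) on an EMG signal indicating intent to grasp.

```python
from dataclasses import dataclass
from typing import Optional

# Illustrative mapping from recognized object classes (e.g. YOLO labels)
# to grasp types suited to the object's shape. The real table would be
# tuned to the prosthetic hand's available grasp primitives.
SHAPE_TO_GRASP = {
    "bottle": "cylindrical",
    "cup": "cylindrical",
    "ball": "spherical",
    "key": "lateral",
    "card": "pinch",
}

# Assumed normalized EMG activation level signaling intent to grasp.
EMG_THRESHOLD = 0.6


@dataclass
class GraspCommand:
    grasp_type: str
    target: str


def select_grasp(emg_activation: float, gazed_object: str) -> Optional[GraspCommand]:
    """Return a grasp command only when the EMG signal indicates intent
    and the gazed-at object is recognized; otherwise return None."""
    if emg_activation < EMG_THRESHOLD:
        return None  # no grasp intent detected from EMG
    grasp = SHAPE_TO_GRASP.get(gazed_object)
    if grasp is None:
        return None  # unrecognized object shape; do not actuate
    return GraspCommand(grasp_type=grasp, target=gazed_object)
```

In the full system this selection would run inside a ROS node, subscribing to EMG and gaze/recognition topics and publishing the resulting grasp command to the hand controller.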