2025-CME-219

Multimodal Sensor Fusion of Peripheral Sensing and Vision for Continuous Grasping Control in a Hand Rehabilitation Exoskeleton

Oscar Vazquez, Josh Mehlman, Milton Tinoco

School of Engineering

Faculty Supervisor: David Quintero

Stroke is the second leading cause of death and a major contributor to adult disability worldwide. Stroke survivors often experience muscle paralysis that makes everyday tasks at home difficult to perform, particularly grasping objects by hand. Robot-assisted rehabilitation devices, such as hand exoskeletons, offer significant benefits to stroke survivors by enabling at-home therapy and assisting with activities of daily living. These devices promote motor recovery by providing controlled finger flexion and extension, helping users regain hand function through repetitive exercise and movement assistance. The challenge is to develop an intelligent exoskeleton control strategy that assists with daily grasping tasks, such as picking up utensils or holding a bottle, empowering individuals to regain independence and improve their overall quality of life. The proposed work develops a multimodal sensor fusion controller that combines peripheral nerve signals, vision, and sensorless fingertip force prediction to produce continuous grasping motion with a custom-designed hand exoskeleton. This comprehensive sensor fusion allows the hand exoskeleton to assist the stroke survivor in grasping the variety of objects found around the home. Results will demonstrate that the hand exoskeleton achieves different grasp taxonomies depending on the object, in a manner that is autonomous and natural for the user. The project delivers an active assistive device for at-home activities in the immediate post-stroke period, with the potential to reduce healthcare costs and recovery time for the broader stroke population.
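To illustrate how the three sensing modalities might interact in such a controller, the sketch below shows a minimal fusion loop in Python. All function names, grasp gains, and force thresholds are hypothetical placeholders standing in for the project's actual EMG, vision, and force-estimation pipelines; this is an illustrative sketch of the fusion concept under those assumptions, not the proposed implementation.

```python
# Illustrative sketch only: the sensor-reading functions below are stubs
# standing in for real EMG, camera, and force-estimation pipelines, and
# the gains/limits are assumed values, not the project's parameters.

GRASP_GAINS = {
    "cylindrical": [1.0, 1.0, 1.0, 1.0, 1.0],  # e.g., holding a bottle
    "pinch":       [1.0, 1.0, 0.0, 0.0, 0.0],  # e.g., gripping a utensil
}
FORCE_LIMIT_N = 8.0  # assumed fingertip force safety cap (newtons)


def read_emg_envelope() -> float:
    """Stub for the peripheral-sensing channel: normalized effort, 0..1."""
    return 0.6


def classify_grasp_from_vision() -> str:
    """Stub for the vision channel: maps the detected object to a grasp type."""
    return "cylindrical"


def predict_fingertip_forces() -> list[float]:
    """Stub for sensorless force prediction: one estimate per finger (N)."""
    return [2.1, 1.8, 1.5, 1.2, 0.9]


def fuse_and_command(emg: float, grasp: str, forces: list[float]) -> list[float]:
    """Fuse the three modalities into per-finger closing commands (0..1)."""
    commands = []
    for gain, force in zip(GRASP_GAINS[grasp], forces):
        # EMG effort scales the grasp continuously; vision selects which
        # fingers participate; force prediction caps grip strength.
        commands.append(0.0 if force > FORCE_LIMIT_N else gain * emg)
    return commands


if __name__ == "__main__":
    # One pass of the control loop (would run at, e.g., 100 Hz on hardware).
    cmd = fuse_and_command(read_emg_envelope(),
                           classify_grasp_from_vision(),
                           predict_fingertip_forces())
    print(cmd)  # -> [0.6, 0.6, 0.6, 0.6, 0.6]
```

In this arrangement, vision selects the grasp taxonomy, the peripheral signal drives motion continuously rather than triggering a fixed open/close action, and the sensorless force estimate provides a safety limit; how these roles are actually assigned and tuned is part of the proposed work.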