Project Details

Description

Humans can control their limbs to perform a variety of daily tasks with great precision and remarkable adaptability in unpredictable environments, thanks to human cognitive capacity and physical characteristics. People with disabilities could rely on assistive robots with functional capabilities similar to those of biological limbs, yet many find these robots difficult to use on a daily basis, partly because the interfaces are unnatural and unintuitive. The objective of this project is to understand the neural and cognitive processes brought to bear during daily tasks, such as reaching and grasping, and to establish natural and nature-inspired approaches that allow the user and the machine (the artificial limb) to communicate. The research outcomes will reduce motor disability and improve quality of life for individuals with physical disabilities. The developed approaches can also enable intuitive control of assistive robots in medical, industrial, and military applications. Summer projects and outreach events incorporating the proposed techniques will be offered to undergraduate students at minority-serving universities and to local K-12 students, specifically targeting underrepresented students. The research team will organize workshops at national conferences to disseminate research findings and facilitate broader collaborations. Certificate and credential programs will be offered through online learning platforms. Research outcomes will also be presented to local and regional patient support groups and at national clinically oriented conferences to bring state-of-the-art research developments to end users.

The goal of this project is to develop and evaluate a biomimetic, human-centric neural-machine interface system that incorporates outward (efferent) and inward (afferent) directed signals for the control of assistive robots. The system will allow individuals with disabilities to interact with their assistive robots as they would use their biological limbs. If successful, it will provide a robust and effective model for intuitive human-machine interaction applicable to a broad range of health and industrial settings, and ultimately overcome the problem of intuitive control of assistive devices for individuals with disability. The research team will strategically integrate four research threads that address critical barriers to human-robot integration:

  • Thread 1 will develop implantable and wearable electrode platforms for neural recording and neural stimulation.
  • Thread 2 will uncover fundamental principles of neural encoding of artificial sensation and establish biomimetic sensory encoding strategies.
  • Thread 3 will develop an integrated shared-control framework for dexterous control of robotic hands.
  • Thread 4 will address the functional integration of closed-loop robotic systems for perceptual-motor control.

The research team will integrate the proposed techniques, closing the loop between the artificial sensing and actuation of the robot and the perception and control authority of the human, and will examine the adaptability and robustness of the closed-loop human-machine systems.
Collectively, the research project can generate transformative outcomes that blur the boundary between humans and assistive robots, allow end users to fully leverage the functionality of advanced robots, and promote the development of next-generation neural-machine interfaces and assistive robots.

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.
Status: Active
Effective start/end date: 9/15/23 to 8/31/28

Funding

  • National Science Foundation: $3,999,570.00
