Motion in action: Integrating multisensory inputs for posture stabilization and complex action acquisition

Project: Research project

Project Details

Description

Daily movements – from catching a ball to staying balanced while opening a heavy door – showcase the brain's remarkable ability to control movement while maintaining stability. Although these actions appear simple, they involve sophisticated interactions among different sensory systems, including vision, proprioception (muscle sense), and balance. This research project aims to understand how the brain coordinates these sensory systems to enable goal-oriented movement and stability. Using innovative virtual reality and robotic technology, the research team studies how humans control their movements and balance when interacting with moving objects. The team will develop computer models simulating how different sensory systems work together during these interactions. This knowledge is crucial for improving human-robot interactions in environments that require physical collaboration. The project includes educational outreach through summer camps teaching middle school students about the brain and movement, emphasizing how neurological conditions affect balance and movement coordination. Camps for high school students include exploration of educational and career opportunities at the intersection of neuroscience, movement science, and robotics.

While the individual sensory processing pathways are well characterized, the mechanisms by which the brain integrates multiple sensory signals to produce complex actions, such as intercepting a moving ball, remain poorly understood. The project aims to elucidate how the nervous system processes visual motion signals to modulate anticipatory and compensatory postural adjustments during interactions with moving objects.
The research tests two theoretical frameworks: the feedback error learning model, which proposes that skill acquisition occurs through iterative updating of internal models, and the hierarchical sensory predictive control model, which posits that internal models update intersensory mappings (between vision, proprioception, and vestibular sensation) to regulate motor responses. The experimental paradigm employs a novel virtual reality and robotic system that allows precise control of object motion and contact forces. Participants interact with moving virtual objects while researchers measure smooth pursuit eye movements, muscle activation patterns, and limb dynamics. The experimental design systematically manipulates visual tracking conditions and object motion parameters to investigate how different sensory inputs contribute to motor learning. Data analysis combines traditional motor control measures with computational modeling approaches. This research advances our fundamental understanding of how the brain controls movement while generating insights relevant to human-robot interaction and rehabilitation medicine. This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.
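The feedback error learning framework mentioned above can be illustrated with a minimal numerical sketch. In this scheme, a feedforward internal (inverse) model is iteratively updated using the feedback controller's corrective command as the teaching signal, so that over repeated trials the feedback correction shrinks toward zero. All quantities below (the scalar plant gain `w_true`, learning rate, and feedback gain) are illustrative assumptions, not parameters from the project itself.

```python
# Minimal sketch of feedback error learning on a scalar plant y = w_true * u.
# The inverse-model weight w_inv is updated using the feedback command as the
# error signal; it should converge toward 1 / w_true. Parameters are illustrative.

def feedback_error_learning(w_true=2.0, target=1.0, lr=0.1, kp=0.5, trials=200):
    w_inv = 0.0  # naive internal (inverse) model at the start of practice
    for _ in range(trials):
        u_ff = w_inv * target          # feedforward command from the internal model
        y = w_true * u_ff              # plant response to the feedforward command
        u_fb = kp * (target - y)       # feedback controller corrects the residual error
        # Key idea: the feedback command itself drives the inverse-model update
        w_inv += lr * u_fb * target
    return w_inv

w_learned = feedback_error_learning()
# after learning, w_learned ≈ 1 / w_true = 0.5
```

With these illustrative values, the update contracts toward the fixed point `w_inv = 1 / w_true`, at which the feedforward command alone achieves the target and the feedback correction vanishes, mirroring the claim that skill acquisition occurs through iterative updating of internal models.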
Status: Active
Effective start/end date: 8/1/25 – 7/31/28

Funding

  • National Science Foundation: $489,062.00
