Neural activity associated with motor commands changes depending on context
You’re standing at a crosswalk when the signal changes from “don’t walk” to “walk.” You might step out into the street straight away, or you might look both ways before you cross.
Credit: University of Pittsburgh
In either scenario, you see the light change and you cross the street. But the context is different: in one case, you didn’t think twice. In the other, you waited, looked to the left and right, saw the coast was clear, then stepped into the street.
Researchers have long known that certain brain activity when you see the light change, and certain brain activity when you step out into the street, is the same no matter the context: there is a known “pathway” that a neuron’s activity travels.
Neeraj Gandhi, a bioengineering professor in the Swanson School of Engineering, and his team wanted to know: does anything happen along that pathway between the time you see the light change (the stimulus) and the moment you step into the street (the action)? Or does the pathway for “crossing the street” look the same, no matter the context?
“If there are two different contexts, even though you’re making exactly the same movement, the neural activity in the brain is different,” Gandhi said. “In addition to the motor/action command, there is other activity there that tells you something about what’s going on cognitively in a given structure.”
The findings were published September 29 in the Proceedings of the National Academy of Sciences (PNAS).
From an engineering standpoint, Gandhi said, the finding may have implications for algorithm design. For instance, it could serve as a framework for an autonomous vehicle controller that accelerates when a light turns green but delays that action if it senses something in the crosswalk. The system could analyze the object and, once the coast is clear, begin to drive.
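To make that idea concrete, here is a minimal Python sketch of such a context-gated decision loop. Everything in it, including the Perception fields, decide_action, and the action labels, is a hypothetical illustration of the concept, not code from the study or from any real driving system.

```python
from dataclasses import dataclass


@dataclass
class Perception:
    """Hypothetical sensor snapshot for one control tick."""
    light_is_green: bool
    object_in_crosswalk: bool


def decide_action(p: Perception, obstacle_recently_seen: bool) -> str:
    """Map the same stimulus (a green light) to different actions
    depending on context, mirroring the crosswalk analogy."""
    if not p.light_is_green:
        return "hold"                 # stimulus absent: stay put
    if p.object_in_crosswalk:
        return "analyze"              # stimulus present, action deferred
    if obstacle_recently_seen:
        return "proceed_cautiously"   # same movement, different context
    return "proceed"                  # clear context: go immediately


if __name__ == "__main__":
    print(decide_action(Perception(True, False), False))  # proceed
    print(decide_action(Perception(True, True), False))   # analyze
    print(decide_action(Perception(True, False), True))   # proceed_cautiously
```

The sketch’s point is that an identical stimulus can yield the same eventual action through different internal states, just as the same movement carried different neural signatures in the study.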
Lead author Eve Ayar, a PhD student at Carnegie Mellon University and a member of Gandhi’s lab, said the results may have implications for better understanding the mechanisms underlying executive function, and the ways in which it can be impaired.
“There are a lot of disorders out there where people are unable to take in that sensory stimulus in your environment and make some kind of movement or action in response to that,” Ayar said. Soon, researchers may be able to build models that help them better understand how these systems work and the ways in which they can be disrupted.
“I think this is valuable not only for better understanding this structure of the brain, but potentially it will help us understand how other regions in the brain are operating as well,” Ayar said. “And help us be able to differentiate different signals underlying different behaviors.”
Journal: Proceedings of the National Academy of Sciences
DOI: 10.1073/pnas.2303523120
Method of Research: Experimental study
Subject of Research: Animals
Article Title: Distinct context- and content-dependent population codes in superior colliculus during sensation and action
Article Publication Date: 29-Sep-2023