Researchers have created a suite of digital tools that break complex mouse behaviors into discrete parts and link them to the animals’ brain activity.
The techniques, which are collectively called BehaveNet, can predict a mouse’s movements purely from its neural data. They may also help researchers map behaviors to specific neural circuits.
“People for a long time have been developing quantitative methods for understanding neural activity,” says Matthew Whiteway, a postdoctoral researcher in Liam Paninski’s lab at Columbia University in New York. “To finally bring behavior into the fold too is very important.”
Whiteway presented the unpublished work today at the 2019 Society for Neuroscience annual meeting in Chicago, Illinois.
The researchers studied mice that had been genetically modified so that the excitatory neurons in their cortex fluoresce when the cells are active. The team applied a substance to the animals’ skulls to make them transparent. And they used a head-mounted microscope to monitor the mice’s brain activity as the animals completed a simple decision-making task.
The team used machine-learning algorithms to process videos of the mice performing the task, and to break down each movement into discrete gestures or ‘behavioral syllables,’ such as licking, raising a paw or pushing a lever.
“The basic idea is you have very flexible, complex behaviors built out of these simpler building blocks,” Whiteway says.
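The segmentation idea can be illustrated with a toy sketch. BehaveNet itself pairs autoencoders with an autoregressive hidden Markov model; the plain k-means clustering below is only a stand-in to show how continuous per-frame pose features can be discretized into syllable labels. All data and parameters here are invented for illustration.

```python
import numpy as np

# Hypothetical sketch: cluster per-frame pose features into discrete
# "behavioral syllables". BehaveNet's real pipeline uses autoencoders
# plus an autoregressive hidden Markov model; basic k-means stands in
# here only to illustrate discretizing continuous behavior.

rng = np.random.default_rng(0)

# Synthetic pose features: 300 video frames drawn from three gestures
# (e.g. lick, paw raise, lever push), each a cluster in feature space.
centers = np.array([[0.0, 0.0], [5.0, 5.0], [-5.0, 5.0]])
frames = np.vstack([rng.normal(c, 0.5, size=(100, 2)) for c in centers])

def kmeans_labels(x, k, n_iter=20, seed=0):
    """Assign each frame to one of k syllables with basic k-means."""
    r = np.random.default_rng(seed)
    centroids = x[r.choice(len(x), size=k, replace=False)].copy()
    for _ in range(n_iter):
        dists = np.linalg.norm(x[:, None] - centroids[None], axis=2)
        labels = dists.argmin(axis=1)
        for j in range(k):
            if (labels == j).any():
                centroids[j] = x[labels == j].mean(axis=0)
    return labels

syllables = kmeans_labels(frames, k=3)
print(len(syllables), sorted(set(syllables.tolist())))
```

Each frame ends up with a discrete label, so a long video becomes a sequence of syllables rather than raw pixels.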
The researchers also used machine learning to identify patterns of brain activity associated with each behavioral syllable. Their software could accurately predict many of these syllables, including licking and certain kinds of jaw and paw movements, by analyzing the animals’ neural activity.
“You can start to understand how behaviors are differently encoded by different populations of neurons,” Whiteway says.
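The decoding step can likewise be sketched in miniature. The talk does not specify the decoder, so a nearest-centroid classifier on synthetic neural activity is used here purely as an assumed stand-in: each syllable evokes a distinct mean activity pattern, and a trial is decoded by finding the closest pattern.

```python
import numpy as np

# Hypothetical sketch of decoding: predict a behavioral syllable from
# a vector of neural activity. A nearest-centroid classifier stands in
# for whatever decoder the researchers actually used (not specified).

rng = np.random.default_rng(1)

n_neurons, n_trials = 20, 50
# Synthetic data: each of three syllables evokes a distinct mean
# activity pattern across the recorded neurons.
patterns = rng.normal(0, 1, size=(3, n_neurons))
X = np.vstack([rng.normal(p, 0.3, size=(n_trials, n_neurons))
               for p in patterns])
y = np.repeat([0, 1, 2], n_trials)

# "Train": average activity per syllable.  "Decode": nearest centroid.
# (A real analysis would evaluate on held-out trials, not training data.)
centroids = np.array([X[y == k].mean(axis=0) for k in range(3)])
pred = np.linalg.norm(X[:, None] - centroids[None], axis=2).argmin(axis=1)
accuracy = (pred == y).mean()
print(accuracy)
```

Because the simulated patterns are well separated, the toy decoder recovers nearly all syllables; real neural data are far noisier, which is why only some syllables are predicted accurately.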
The team hopes to use the techniques to track how animals’ behavior, and accompanying brain activity, change over time.