The quest to understand what’s happening inside the minds and brains of animals has taken neuroscientists down many surprising paths: from peering directly into living brains, to controlling neurons with bursts of light, to building intricate contraptions and virtual reality environments.
In 2013, it took the neurobiologist Bob Datta and his colleagues at Harvard Medical School to a Best Buy down the street from their lab.
At the electronics store, they found what they needed: an Xbox Kinect, a gaming device that senses a player’s motions. The scientists wanted to monitor in exhaustive detail the body movements of the mice they were studying, but none of the usual laboratory techniques seemed up to the task. So Datta’s group turned to the toy, using it to collect three-dimensional motor information from the animals as they explored their environment. The device essentially rendered them as clouds of points in space, and the team then analyzed the rhythmic movement of those points.
Datta’s solution might have been unorthodox at the time, but it’s now emblematic of a wave of automated approaches that are transforming the science of behavior. By studying animals’ behaviors more rigorously and quantitatively, researchers are hoping for deeper insights into the unobservable “drives,” or internal states, responsible for them. “We don’t know the possible states an animal can even be in,” wrote Adam Calhoun, a postdoctoral fellow who studies animal behavior at Princeton University.
Tracing those internal states back to specific activity in the brain’s complex neural circuitry presents a further hurdle. Although sophisticated tools can record from thousands of neurons at once, “we don’t understand the output of the brain,” Datta said. “Making sense of these dense neural codes is going to require access to a richer understanding of behavior.”
That richer understanding may not remain out of reach much longer. Capitalizing on advances in machine learning, scientists are building algorithms that automatically track animals’ movements, down to tiny changes in the angle of a fly’s wing or the arch of a mouse’s back. They’re also creating pattern-finding tools that automatically analyze and classify this data for clues about animals’ internal states.
A key advantage of these methods is that they can pick up on patterns that humans can’t see.