How do brains learn? It’s a mystery, one that applies both to the spongy organs in our skulls and to their digital counterparts in our machines. Even though artificial neural networks (ANNs) are built from elaborate webs of artificial neurons, ostensibly mimicking the way our brains process information, we don’t know if they process input in similar ways.
“There’s been a long-standing debate as to whether neural networks learn in the same way that humans do,” said Vsevolod Kapatsinski, a linguist at the University of Oregon.
Now, a study published last month suggests that natural and artificial networks learn in similar ways, at least when it comes to language. The researchers — led by Gašper Beguš, a computational linguist at the University of California, Berkeley — compared the brain waves of humans listening to a simple sound to the signal produced by a neural network analyzing the same sound. The results were uncannily alike. “To our knowledge,” Beguš and his colleagues wrote, the observed responses to the same stimulus “are the most similar brain and ANN signals reported thus far.”
Most significantly, the researchers tested networks made up of general-purpose neurons that are suitable for a variety of tasks. “They show that even very, very general networks, which don’t have any evolved biases for speech or any other sounds, nevertheless show a correspondence to human neural coding,” said Gary Lupyan, a psychologist at the University of Wisconsin, Madison, who was not involved in the work. The results not only help demystify how ANNs learn, but also suggest that human brains may not come already equipped with hardware and software specially designed for language.