
Decoding Canine Cognition

Machine learning gives glimpse of how a dog's brain represents what it sees
September 14, 2022

Bhubo, shown with his owner Ashwin Sakhardande, prepares for his video-watching session in an fMRI scanner. The dog's ears are taped to hold in ear plugs that muffle the noise of the fMRI scanner. Image credit: Emory Canine Cognitive Neuroscience Lab

Scientists have decoded visual images from a dog’s brain, offering a first look at how the canine mind reconstructs what it sees. The Journal of Visualized Experiments published the research done at Emory University. 

The results suggest that dogs are more attuned to actions in their environment than to who or what is performing those actions.

The researchers recorded fMRI neural data from two awake, unrestrained dogs as they watched videos in three 30-minute sessions, for a total of 90 minutes. They then used a machine-learning algorithm to analyze the patterns in the neural data.
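The core idea behind this kind of decoding is to treat each fMRI volume as a vector of voxel activations and train a classifier to map those patterns back to what was on screen. The sketch below is a minimal illustration only, not the study's actual pipeline: the voxel counts, the action labels ("walking", "eating", "sniffing"), and the simulated data are all hypothetical, and a simple nearest-centroid classifier stands in for whatever algorithm the researchers used.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: each fMRI volume is a vector of voxel activations,
# labeled by the action shown in the video at that moment. The labels and
# voxel count here are invented for illustration.
n_voxels = 200
labels = ["walking", "eating", "sniffing"]

# Simulate training data: assume each action evokes a distinct mean
# activation pattern, observed through noise across repeated viewings.
prototypes = {lab: rng.normal(0.0, 1.0, n_voxels) for lab in labels}
X_train, y_train = [], []
for lab in labels:
    for _ in range(30):
        X_train.append(prototypes[lab] + rng.normal(0.0, 0.5, n_voxels))
        y_train.append(lab)
X_train = np.array(X_train)

# Nearest-centroid decoder: average the training volumes per label, then
# assign a new volume to the label whose centroid it is closest to.
centroids = {
    lab: X_train[[y == lab for y in y_train]].mean(axis=0) for lab in labels
}

def decode(volume):
    """Return the label whose centroid is nearest to this brain volume."""
    return min(centroids, key=lambda lab: np.linalg.norm(volume - centroids[lab]))

# Decode a held-out volume simulated from the "eating" pattern.
test_volume = prototypes["eating"] + rng.normal(0.0, 0.5, n_voxels)
print(decode(test_volume))  # → eating
```

With well-separated activation patterns the decoder recovers the correct label easily; the hard part in real fMRI work is that genuine neural responses are far noisier and higher-dimensional, which is why the reconstruction the researchers describe is only possible "to a limited degree."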

“We showed that we can monitor the activity in a dog’s brain while it is watching a video and, to at least a limited degree, reconstruct what it is looking at,” says Gregory Berns, Emory professor of psychology and corresponding author of the paper. “The fact that we are able to do that is remarkable.”