Translating Fido's Bark

January 15, 2018
Amanda Carrozza

Amanda Carrozza is a freelance writer and editor in New Jersey.

Progress is being made on a program to decode and translate dogs’ sounds and behaviors into English.

There is growing interest in and increasing research on the use of artificial intelligence to decipher animal communication. Developing and perfecting this technology would be so much more than a neat party trick. As work in some species has already shown, advancing knowledge in this area offers a greater understanding of how animals within a particular species communicate with one another. Looking to the future, the ability to translate dogs’ vocalizations and behaviors would provide veterinarians around the world with insight into canine anxiety, pain, and illness.

Leading the charge in translating Fido’s bark is Con Slobodchikoff, PhD, a professor emeritus of biology at Northern Arizona University. After developing technology that deciphered the calls of prairie dogs, Dr. Slobodchikoff has now set his sights on dogs. His plan? Gather thousands of videos of dogs barking and moving, then use those images and sounds to create an algorithm that deciphers what the animals are communicating.

In the Amazon-commissioned report, Shop the Future, experts predict that this technology will be widely available by 2022. One of the report’s authors, behavioral futurist William Higham, specifically pointed to Dr. Slobodchikoff’s work as some of the leading progress toward creating an Amazon Echo that can translate barks and meows into English.

Past research, including work by Dr. Slobodchikoff, has demonstrated the power of human analysis and technology to translate the communications of several species.

Prairie Dogs

Dr. Slobodchikoff spent more than 3 decades decoding the high-pitched calls of prairie dogs. Through sound analysis, he discovered that prairie dogs alter their calls to relay specific details, like the size of an approaching predator or even the color of a human’s shirt. He even went so far as to team up with a computer scientist to develop a complex algorithm to translate the vocalizations into English.

Sheep

Veterinarians have developed various facial scales to detect pain levels in animals. But because this method relies on subjective inference rather than hard data, computer scientists at the University of Cambridge in the United Kingdom set out to automate the process. They created an algorithm that estimates sheep pain levels based on facial expressions. The researchers concluded that additional elements could be added to the algorithm to improve accuracy, and they noted that the pain assessment approach is not specific to sheep and could be generalized for use with other animals.

Marmosets

In November 2017, researchers unveiled software that turns the sounds marmosets make into conversations scientists can read. Developed at the Massachusetts Institute of Technology, the program detects a monkey’s call and converts its frequency patterns into black-and-white images. After collecting enough samples and fine-tuning the program’s accuracy, the developers were able to build a vocabulary of call patterns that decodes conversations between marmosets.
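The step described above, turning a call’s frequency patterns into a black-and-white image, is essentially a spectrogram. The sketch below is not the MIT team’s actual software; it is a minimal illustration in Python using a synthetic chirp as a stand-in for a recorded call, with the sample rate, duration, and frequency range chosen purely for demonstration:

```python
import numpy as np
from scipy.signal import chirp, spectrogram

# Synthetic stand-in for a recorded marmoset call: a rising 3-5 kHz chirp.
# (Sample rate, duration, and frequencies are illustrative assumptions.)
fs = 22050                                   # samples per second
t = np.linspace(0, 1.0, fs, endpoint=False)  # 1 second of audio
call = chirp(t, f0=3000, t1=1.0, f1=5000)

# Turn the call's frequency content over time into a 2-D intensity array --
# the "black-and-white image" a program can then learn call patterns from.
freqs, times, power = spectrogram(call, fs=fs, nperseg=256)

# Normalize power to 0-255 grayscale pixel values.
span = power.max() - power.min()
image = (255 * (power - power.min()) / span).astype(np.uint8)
print(image.shape)  # rows = frequency bins, columns = time frames
```

Stacking many such images from many recordings is what lets a pattern-recognition system start matching calls to meanings.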