Is the use of machine learning to understand what animals say a green signal toward Google translation of animal languages in the near future?

Machine-learning systems use algorithms to detect patterns in large collections of data. Their ability to process human language has led to voice assistants that can recognize speech, transcription software that can transform speech into text, and digital tools that can translate between human languages.

Researchers have now begun applying these techniques to decode animal communication, using machine-learning algorithms to figure out when stressed-out mice squeak or why fruit bats shout.

Researchers at Germany’s Max Planck Institute for Brain Research recently used machine-learning algorithms to analyze 36,000 soft chirps recorded in seven mole-rat colonies. Naked mole rats, underground-dwelling rodents, make these soft chirping sounds when they meet in a tunnel.

According to the study, each mole rat not only had a distinct vocal signature, but each colony also had a distinctive dialect that was passed down culturally over many generations.

These cohesive dialects fell apart during periods of social instability, like the weeks after a colony’s queen was forcibly removed. A new dialect seemed to emerge as a new queen took the throne.

As the New York Times reports, Alison Barker, a neuroscientist at the Max Planck Institute for Brain Research in Germany, said, “The greeting call, which I thought was going to be pretty basic, turned out to be incredibly complicated.”

Although this kind of study still has a long way to go, this work has already revealed that animal communication is far more complex than it sounds to the human ear, and the chatter is providing a richer view of the world beyond our own species.

Tom Mustill, a wildlife and science filmmaker and the author of a forthcoming book, finds it intriguing that machines might help us feel closer to animate life: “… artificial intelligence might help us to notice biological bits of intelligence.”

Scientists have shown that these programs can tell apart the voices of individual animals, distinguish between sounds that animals make in different circumstances, and break their vocalizations down into smaller parts, a crucial step in deciphering meaning.
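To make the first of those steps concrete, here is a minimal, illustrative sketch of telling two call types apart. Everything in it is hypothetical: the "greeting" and "alarm" calls are synthetic sine waves, the pitch feature is a crude zero-crossing count, and the classifier is a simple nearest-centroid rule — real systems use far richer acoustic features and models.

```python
import math
import random

random.seed(0)

SR = 1000  # hypothetical sampling rate (samples per second)

def synth_call(freq_hz, n=1000):
    """Generate a noisy sine wave standing in for a recorded vocalization."""
    return [math.sin(2 * math.pi * freq_hz * t / SR) + random.gauss(0, 0.1)
            for t in range(n)]

def dominant_freq(signal):
    """Estimate pitch by counting positive-going zero crossings."""
    crossings = sum(1 for a, b in zip(signal, signal[1:]) if a < 0 <= b)
    return crossings * SR / len(signal)

# Two hypothetical call types: low-pitched "greetings" vs. high-pitched "alarms".
train = [(synth_call(100), "greeting") for _ in range(20)] + \
        [(synth_call(300), "alarm") for _ in range(20)]

# Nearest-centroid classifier over the single pitch feature.
centroids = {}
for label in ("greeting", "alarm"):
    feats = [dominant_freq(s) for s, lab in train if lab == label]
    centroids[label] = sum(feats) / len(feats)

def classify(signal):
    f = dominant_freq(signal)
    return min(centroids, key=lambda label: abs(centroids[label] - f))

print(classify(synth_call(110)))  # expected: greeting
print(classify(synth_call(290)))  # expected: alarm
```

The same pattern — extract acoustic features, then group or label them — underlies the more sophisticated tools described below; the hard part in practice is learning features that capture what actually matters to the animals.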

Nearly three years ago, researchers at the University of Washington used machine learning to develop software, called DeepSqueak, that can automatically detect, analyze and categorize the ultrasonic vocalizations of rodents.

DeepSqueak has been repurposed for other species, including lemurs and whales, while other teams have developed their own systems for automatically detecting when clucking chickens or squealing pigs are in distress.
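The detection step such tools automate can be sketched very simply. The example below is not DeepSqueak’s actual method — it is a toy energy-threshold detector on a synthetic recording, with every value (sampling rate, window size, threshold) chosen for illustration only.

```python
import math
import random

random.seed(1)

SR = 1000     # hypothetical sampling rate (samples per second)
WINDOW = 50   # analysis window length in samples

def rms(window):
    """Root-mean-square energy of one analysis window."""
    return math.sqrt(sum(x * x for x in window) / len(window))

# Build a 3-second synthetic recording: low background noise with one
# loud "call" (a 200 Hz tone) between t = 1.0 s and t = 1.5 s.
signal = []
for t in range(3 * SR):
    noise = random.gauss(0, 0.05)
    call = math.sin(2 * math.pi * 200 * t / SR) if SR <= t < int(1.5 * SR) else 0.0
    signal.append(noise + call)

# Flag windows whose energy clearly exceeds the background level.
threshold = 0.2
detections = []
for start in range(0, len(signal), WINDOW):
    if rms(signal[start:start + WINDOW]) > threshold:
        detections.append(start / SR)  # window start time in seconds

print(f"call detected from {detections[0]:.2f}s to {detections[-1] + WINDOW / SR:.2f}s")
# expected: call detected from 1.00s to 1.50s
```

Once candidate calls are isolated like this, they can be passed to a classifier — which is where the distress-versus-normal distinction for chickens and pigs would be made.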

Google Translation of Animal Languages

What next? We may well see a Google Translate for animal languages in the near future.

Diana Reiss, an expert on dolphin cognition and communication at Hunter College and co-founder of Interspecies Internet, a think tank devoted to facilitating cross-species communication, has said, “Let’s try to find a Google Translate for animals.”

If achieved, machine translation of animal vocalizations could lead to serious breakthroughs in the study of the animal mind.

It’s an area that could lend itself to a lot of interesting new research. If you can tell what a dolphin is saying, you might be able to understand its emotional state.

So, as such translation of animal voices advances toward reality, biologists will have a powerful tool for examining the conversations of animals whose brains are very different from our own.

Mustill also noted that such experiments might raise ethical issues: “If you find patterns in animals that allow you to understand their communication, that opens the door to manipulating their communications.”

On the other hand, some experts suggest that the technology could also be deployed for the benefit of animals, helping experts monitor the welfare of both wild and domestic fauna. According to them, this research might prompt a broader societal shift by providing new insight into animal lives.

It is no longer a far-fetched idea that, in the future, we might be able to speak the same language as birds, dolphins, and other animals. Isn’t this machine learning at its best?
