Machine learning’s next level? “Google Translation of Animal Languages”

Is the use of machine learning to understand what animals say a green signal toward a "Google Translate for animal languages" in the near future?

Machine-learning systems use algorithms to detect patterns in large collections of data. Their potential to recognize human language has resulted in voice assistants that can recognize speech, transcription software that can transform speech into text, and digital tools that can translate between human languages.
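As a loose illustration of what "detecting patterns in data" means, here is a toy sketch (not taken from any study mentioned in this article) that groups unlabeled measurements, say, the pitches of recorded calls, into two clusters with a few iterations of k-means. All the numbers and names here are made up for illustration:

```python
import numpy as np

def two_means(values, iterations=10):
    """Cluster 1-D values into two groups by iteratively refining two centers."""
    centers = np.array([values.min(), values.max()], dtype=float)
    for _ in range(iterations):
        # Assign each value to its nearest center, then move each center
        # to the mean of the values assigned to it.
        labels = np.abs(values[:, None] - centers[None, :]).argmin(axis=1)
        for k in (0, 1):
            if np.any(labels == k):
                centers[k] = values[labels == k].mean()
    return centers, labels

# Hypothetical call pitches (kHz) with two obvious groups.
pitches = np.array([2.1, 1.9, 2.0, 8.2, 7.9, 8.1])
centers, labels = two_means(pitches)
print(centers)  # two cluster centers, near 2 and near 8
```

No labels were provided; the structure in the data alone separates the two call types, which is the basic idea behind the pattern-finding systems described above.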

Researchers have now begun applying this field to animal communication, using machine-learning algorithms to figure out when stressed mice squeak or why fruit bats shout.

Naked mole rats, underground-dwelling rodents, make soft chirping sounds when they meet in a tunnel. Researchers at Germany's Max Planck Institute for Brain Research recently used machine-learning algorithms to analyze 36,000 of these chirps recorded in seven mole rat colonies.

According to the study, each mole rat not only had a distinct vocal signature, but each colony also had a distinctive dialect that was passed down culturally over many generations.

These cohesive dialects fell apart during periods of social instability, such as the weeks after a colony's queen was forcibly removed. A new dialect seemed to emerge once a new queen took the throne.



The New York Times quotes Alison Barker, a neuroscientist at the Max Planck Institute for Brain Research in Germany, as saying, "The greeting call, which I thought was going to be pretty basic, turned out to be incredibly complicated."

Although this kind of study still has a long way to go, this work has already revealed that animal communication is far more complex than it sounds to the human ear, and the chatter is providing a richer view of the world beyond our own species.

Tom Mustill, a wildlife and science filmmaker and the author of a forthcoming book, finds it intriguing that machines might help us feel closer to animate life: "… artificial intelligences might help us to notice biological intelligences."

Scientists have shown that these programs can tell apart the voices of individual animals, distinguish between sounds that animals make in different circumstances and break their vocalizations down into smaller parts, a crucial step in deciphering meaning.

Nearly three years ago, researchers at the University of Washington used machine learning to develop software, called DeepSqueak, that can automatically detect, analyze and categorize the ultrasonic vocalizations of rodents.

DeepSqueak has been repurposed for other species, including lemurs and whales, while other teams have developed their own systems for automatically detecting when clucking chickens or squealing pigs are in distress.
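To give a flavor of how automatic call detection might work, here is a minimal sketch, emphatically not DeepSqueak's actual method, that labels a toy recording as "ultrasonic" or "audible" by its dominant frequency. The sample rate, threshold and synthetic signals are all illustrative assumptions:

```python
import numpy as np

def dominant_frequency(signal, sample_rate):
    """Return the frequency (Hz) carrying the most energy in the signal."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    return freqs[np.argmax(spectrum)]

def classify_call(signal, sample_rate, threshold_hz=40_000):
    """Label a call 'ultrasonic' or 'audible' by its dominant frequency."""
    if dominant_frequency(signal, sample_rate) > threshold_hz:
        return "ultrasonic"
    return "audible"

# Synthesize two toy calls: a 50 kHz mouse-like squeak and a 2 kHz chirp.
rate = 250_000  # samples per second, high enough to capture ultrasonic content
t = np.linspace(0, 0.01, int(rate * 0.01), endpoint=False)
squeak = np.sin(2 * np.pi * 50_000 * t)
chirp = np.sin(2 * np.pi * 2_000 * t)

print(classify_call(squeak, rate))  # ultrasonic
print(classify_call(chirp, rate))   # audible
```

Real systems go much further, classifying spectrogram shapes rather than a single frequency, but the pipeline of record, transform, classify is the same basic idea.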

Google Translation of Animal Languages

What next? Some researchers hope that something like a Google Translate for animal languages could follow.

Diana Reiss, an expert on dolphin cognition and communication at Hunter College and co-founder of Interspecies Internet, a think tank devoted to facilitating cross-species communication, is quoted as saying, "Let's try to find a Google Translate for animals."

If it proves possible, such translation of animal voices could lead to serious breakthroughs in studying the animal mind.

It's an area that could lend itself to a lot of interesting new research. If you could tell what a dolphin is saying, you might be able to understand its emotional state.

So, if translation of animal voices becomes reality, biologists will have a powerful tool for examining the conversations of animals whose brains are very unlike our own.



Mustill also warned that such experiments could raise ethical issues: "If you find patterns in animals that allow you to understand their communication, that opens the door to manipulating their communications."

On the other hand, some experts suggest that the technology could also be deployed for the benefit of animals, helping researchers monitor the welfare of both wild and domestic fauna. According to them, this research might prompt a broader societal shift by providing new insight into animal lives.

Let's contemplate: it is not a far-fetched idea that, in the future, we might be able to speak the same language as birds, dolphins and other animals. Isn't this machine learning's next level?
