WatchMojo

VOICE OVER: Peter DeGiglio WRITTEN BY: Ajay Manuel
Can we FINALLY talk to animals FOR REAL?? Join us... and find out!

In this video, Unveiled takes a closer look at a recent breakthrough that could finally enable human beings to talk to other animal species!

<h4>How Scientists are Using AI to Talk to Animals</h4>

Understanding animals is tricky. Sure, we can teach our pets to obey commands, but do we ever truly talk with them? Our ability to talk in the way that we do, among ourselves, is something that separates us from other animals… and so, expecting animals to hold a conversation at the same level that we do would be surreal. Nevertheless, AI might soon be delivering exactly that.

This is Unveiled, and today we’re exploring how scientists are using AI to talk to animals.

Humans are great communicators. It’s one of the main reasons why we’ve survived for so long. We’re able to hold long and complex conversations on just about anything, and we can do so in nearly 6,500 languages all told - bridging countries and cultures all across the planet. The more recent introduction of AI tools and software has made translating and learning new languages easier than ever. The advent of voice assistants like Siri and Alexa has broadened the possibilities even further.

Scientists have long asked if the same could be done with the animal kingdom. Could we ever bridge the gap between us and all of the other creatures on Earth? Could we use AI to figure out the mysteries behind animal communication… and maybe even to talk to the animals, ourselves?

Perhaps the most successful attempt so far at interspecies communication was with the gorilla Koko, who lived at various sites in California, from 1971 until 2018. She managed to use a version of sign language, to seemingly talk with her keepers. But Koko was an extremely rare case, so how could we broaden out? The first obstacle would be to understand what animal communication looks like. We inherently understand that human language is made up of verbal and physical components… and the same could be said for animals, as well. Dogs wag their tails and cats purr, for example. Dolphins click, some creatures change color, others seemingly dance.

With more than 8.7 million species of plants and animals on our planet, however, each with its own communication system, it is incredibly difficult for humans to replicate or even understand these complex structures en masse. Consider that many animals - like most bats - communicate at ultrasonic frequencies beyond the range of human hearing. Some, like the mantis shrimp, literally see the world in a different light - across a wider part of the electromagnetic spectrum - and so will likely incorporate that into how they communicate, too.

There is a growing school of thought that the task isn’t impossible, however; that AI, combined with bioacoustics research, can help identify language-like structures in animals, converting them into something that humans can tap into, as well. Dr. Karen Bakker, in particular, a professor at the University of British Columbia, has written extensively about these ongoing efforts and the plans for the future. Numerous trials are already underway. With digital bioacoustics, scientists can follow some species with tiny sound recorders, capturing auditory signals in even the most remote ecosystems. It’s almost entirely non-invasive, and, because the animals are left undisturbed, the recordings reflect natural behavior.

But, understandably, these studies capture (and have already captured) a lot of data. So much data, in fact, that it would take an individual many years to analyze. It could take a single organization many lifetimes to fully comprehend. But AI can solve this problem by applying the same kinds of algorithms found in tools like Google Translate, but this time to detect patterns in animal communication. Scientists can use deep learning AI to build and build our understanding; to combine all of what humans can and can’t pick up, and provide us with an ever more sophisticated “how to” for really talking with the animals.
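The pattern-detection step described above can be imagined as clustering: grouping recorded calls by acoustic similarity, without knowing in advance what any call means. The sketch below does this with a plain k-means routine over two invented features (peak frequency and duration) standing in for real spectrogram measurements; actual systems feed far richer data into deep networks, but the grouping idea is the same.

```python
# A minimal sketch of unsupervised call clustering - the kind of grouping a
# deep-learning bioacoustics pipeline automates at scale. The "calls" here
# are hypothetical (peak frequency in kHz, duration in ms), not real data.

def kmeans(points, k, iters=20):
    """Plain k-means: assign each point to its nearest centroid, then
    recompute centroids, repeating for a fixed number of iterations."""
    centroids = points[:k]  # naive init: first k points
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            nearest = min(
                range(k),
                key=lambda i: sum((a - b) ** 2 for a, b in zip(p, centroids[i])),
            )
            groups[nearest].append(p)
        centroids = [
            tuple(sum(c) / len(g) for c in zip(*g)) if g else centroids[i]
            for i, g in enumerate(groups)
        ]
    return centroids, groups

# Hypothetical calls forming two loose clusters:
calls = [(30.0, 5.0), (32.0, 6.0), (31.0, 5.5),   # short, high-pitched
         (8.0, 40.0), (9.0, 42.0), (7.5, 38.0)]   # long, low-pitched

centroids, groups = kmeans(calls, k=2)
```

On real recordings, clusters like these become candidate "call types" for researchers to interpret.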

In a widely cited study at Tel Aviv University, Dr. Yossi Yovel and his team monitored a group of Egyptian fruit bats for around ten weeks, recording the sounds they made. Once the data was collected, the team used a voice-recognition program to analyze more than 15,000 of those sounds, and the AI helped to correlate them with specific actions that were also captured on video. They essentially mapped the bat language, finding that bats, much like humans, argue over food, distinguish between individuals, and send targeted messages. They demonstrated not only that there is a bat language, but that it’s also very layered.
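The correlation step in a study like Yovel's can be sketched as a simple tally: each call type (however it was labeled) is matched against the behavior seen on video at the same moment, and the most frequent pairing suggests the call's likely context. The call and behavior names below are invented purely for illustration.

```python
# Toy sketch of call-to-behavior correlation. In reality the call labels
# come from a recognition model and the behaviors from video annotation;
# here both are hypothetical, hand-written pairs.
from collections import Counter, defaultdict

# (call_type, observed_behavior) pairs, as if aligned by timestamp.
observations = [
    ("call_A", "food_dispute"), ("call_A", "food_dispute"),
    ("call_A", "perch_dispute"),
    ("call_B", "mating"), ("call_B", "mating"),
]

by_call = defaultdict(Counter)
for call, behavior in observations:
    by_call[call][behavior] += 1

# For each call type, the behavior it most often co-occurs with.
likely_context = {
    call: counts.most_common(1)[0][0] for call, counts in by_call.items()
}
```

A real analysis would also test whether each pairing occurs more often than chance, rather than just taking the most frequent match.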

Similar efforts are underway at other organizations, such as the Earth Species Project, a nonprofit working to decode non-human communication in animals like aquatic mammals and primates. In a 2021 paper, published in the “Nature” journal “Scientific Reports”, the ESP claims to have solved the so-called cocktail party problem - the issue of identifying a single characteristic sound among countless others. While usually considered through the lens of human life, it’s a challenge that’s very applicable to deciphering animal communication - as, in the natural world, you are almost always listening to multiple animal sounds at once. Elsewhere, software called DeepSqueak has been used to decipher rodent behavior based on their ultrasonic calls. There are also many well-established efforts to unravel whale song, which could lead to a firmer understanding of how, why and where species travel. There are increasing studies dedicated to birds; to how they communicate, and how that affects their migrations. Even among trees and plantlife, there are theories that information is passed in multiple ways. This is, then, about more than simply talking with animals… it’s about understanding animal society and the environment.
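The cocktail party problem can be illustrated in miniature. ESP's actual approach relies on learned source separation, but when two sound sources happen to occupy different frequency bands, even a simple Fourier-domain filter can pull one "voice" out of the mix. This toy version uses arbitrary tones standing in for two overlapping species:

```python
# Toy source separation by frequency masking. The two sine tones are
# stand-ins for two species recorded by one microphone; the frequencies
# and sample rate are arbitrary choices for illustration.
import numpy as np

rate = 8000                          # samples per second
t = np.arange(rate) / rate           # one second of "audio"
low = np.sin(2 * np.pi * 200 * t)    # "species A" at 200 Hz
high = np.sin(2 * np.pi * 1500 * t)  # "species B" at 1500 Hz
mixture = low + high                 # both arrive mixed together

# Zero out everything above 800 Hz in the Fourier domain, then invert.
spectrum = np.fft.rfft(mixture)
freqs = np.fft.rfftfreq(len(mixture), d=1 / rate)
spectrum[freqs > 800] = 0
recovered = np.fft.irfft(spectrum, n=len(mixture))

# The recovered signal should be dominated by the 200 Hz component.
peak_hz = freqs[np.argmax(np.abs(np.fft.rfft(recovered)))]
```

Real animal choruses overlap in frequency and time, which is exactly why fixed filters fail and learned separation models are needed - but the goal is the same: one clean voice out of many.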

In light of these developments, and more, scientists now increasingly believe that humans will eventually decode a first non-human language, ushering in an era of interspecies communication. There are some predictions that it will happen within the next decade. At this still early stage, the focus has largely been on highly vocal species, such as birds, pigs, and dolphins. Other groups, such as primates and whales, are also relatively simple to study. By tuning into animals like these, we’re not just learning how they communicate, but also about their emotional intelligence, learning abilities, and self-recognition - whether they have moods, an understanding of the future, et cetera. But there’s really no upper limit to where we could go. And, as the data fills out, it should become more and more possible to decipher the languages of more and more species.

The work is still only just beginning, however. Digital bioacoustics has set the stage for countless future breakthroughs, building a bridge between us and potentially every other species on our planet. If we’re successful, it will be like opening the door to a whole new library of effectively infinite size, containing untold pages of new information. These efforts will not only help mend the divide between humans and other species, then, but also enhance our relationship with the planet as a whole.

How do you think the world would change if we could communicate with animals? Let us know your thoughts in the comments!

For now, we really might be on the brink of a watershed moment in history. Even slightly improved human-to-animal links would have significant implications for the use of animals for sports, entertainment, research, and perhaps most significantly as our food supply. In general, it would impact how humans treat and view animals - in the wild, in captivity, and at home. No doubt it would be a change that would also enable humans to learn more about ourselves, as a result.

Equally, while the thought of interspecies communication is exciting, the prospect also raises some major ethical and philosophical questions surrounding the risks of engaging animals in AI-mediated conversations. There are in-built biases within AI systems to consider; there are animal rights issues to debate. And, even beyond these discussions, some remain skeptical of the approach in general. Can we really hope to reduce animal communication to data sifted through by AI? Should we even want to? And, even if we can listen to animal language, how long before we can really talk back? There’s an ever-growing list of big questions to grapple with.

Nevertheless, if this new reality does become possible, then what would you do first? Which animal would you most want to speak with, and what would you say to them? What was once thought probably impossible is now deemed potentially achievable, and soon. Because that’s how scientists are using AI to talk to animals.
