Can artificial intelligence actually help us to communicate with animals?
Humans have long studied animal vocalisations. The warning calls of some primates vary depending on the predator, dolphins communicate with distinctive whistles, and some songbirds can rearrange the components of their calls to convey different meanings. Most experts refrain from calling this language, because no animal communication satisfies all the requirements. Until recently, decoding has typically relied on meticulous observation. Now there has been a surge in the use of machine learning to handle the massive volumes of data that can be gathered by contemporary animal-borne sensors, The Guardian reported.
Aza Raskin, by contrast, has set out to use machine learning to decipher non-human communication, with the goal of strengthening human bonds with other living species and promoting their protection. He is the co-founder and president of the Earth Species Project (ESP). In one experiment Raskin describes, a dolphin handler makes the ‘together’ and ‘create’ signals with her hands. The two trained dolphins exchange sounds before emerging, flipping around and lifting their tails: they have come up with a new trick of their own and executed it together. Raskin says this does not establish that a language is there, but that if the dolphins did have access to a rich, language-like way of communicating, the task would have been much simpler.
However, ESP says its strategy differs from other efforts in that it focuses on decoding the communication of all species rather than any particular one. "We’re species agnostic," says Raskin. "The tools we develop… can work across all of biology, from worms to whales," The Guardian reported.
Elodie Briefer, an associate professor who studies vocal communication in mammals and birds, said that people are now using machine learning to understand animal communication. Briefer co-developed an algorithm that analyses pig grunts to determine whether the animal is feeling a positive or negative emotion.
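Briefer's published method is not detailed here, but the general task, classifying the emotional valence of a call from its acoustic properties, can be sketched in Python. Everything below (the feature choices, the synthetic clips, the labels) is an illustrative assumption, not her actual algorithm:

```python
# Illustrative sketch only -- NOT Briefer's actual algorithm.
# Assumes pig calls have already been segmented into short clips;
# classifies each clip's valence from a few crude spectral features.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def spectral_features(waveform, sr=16000):
    """Duration, spectral centroid and mean energy of one clip."""
    spectrum = np.abs(np.fft.rfft(waveform))
    freqs = np.fft.rfftfreq(len(waveform), 1 / sr)
    centroid = np.sum(freqs * spectrum) / (np.sum(spectrum) + 1e-9)
    energy = np.mean(waveform ** 2)
    return np.array([len(waveform) / sr, centroid, energy])

# Synthetic stand-in clips; real work would use labelled grunt recordings.
rng = np.random.default_rng(0)
clips = [rng.normal(scale=s, size=8000) for s in [0.1] * 20 + [1.0] * 20]
labels = [0] * 20 + [1] * 20  # 0 = positive, 1 = negative (hypothetical)

X = np.stack([spectral_features(c) for c in clips])
clf = make_pipeline(StandardScaler(), LogisticRegression()).fit(X, labels)
print(clf.score(X, labels))
```

The point of the sketch is the pipeline shape — segment calls, extract acoustic features, fit a supervised classifier on labelled examples — rather than any particular feature set.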
Another tool, DeepSqueak, analyses rats’ ultrasonic vocalisations to determine whether the animals are under stress.
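DeepSqueak's actual deep-learning pipeline is more sophisticated, but the first step of any such tool — finding candidate ultrasonic calls in a long recording — can be sketched as a simple band-energy detector. The sample rate, frequency band and threshold below are illustrative assumptions:

```python
# Illustrative sketch only -- NOT DeepSqueak's actual pipeline.
# Flags windows whose energy in an ultrasonic band stands well above
# the recording's typical (median) background energy.
import numpy as np

def detect_calls(waveform, sr=250000, win=1024, band=(20000, 100000), factor=10.0):
    """Return start samples of windows with strong in-band energy."""
    energies = []
    for start in range(0, len(waveform) - win + 1, win):
        seg = waveform[start:start + win]
        spectrum = np.abs(np.fft.rfft(seg))
        freqs = np.fft.rfftfreq(win, 1 / sr)
        mask = (freqs >= band[0]) & (freqs <= band[1])
        energies.append(np.sum(spectrum[mask] ** 2))
    energies = np.array(energies)
    thresh = factor * np.median(energies)  # background = typical window
    return [i * win for i, e in enumerate(energies) if e > thresh]

# Usage: faint background noise with one 50 kHz burst inserted.
rng = np.random.default_rng(1)
sr = 250000
sig = rng.normal(scale=0.01, size=sr // 10)      # 100 ms of noise
t = np.arange(2048) / sr
sig[10000:12048] += np.sin(2 * np.pi * 50000 * t)  # the "call"
print(detect_calls(sig, sr=sr))
```

Real detectors (DeepSqueak included) go further, classifying each detected call, but thresholding against background energy is the standard first cut.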
Project CETI (Cetacean Translation Initiative) is another effort to translate sperm whale communication using machine learning.
Another effort aims to automatically decipher the functional meaning of vocalisations. It is under way in the lab of Professor Ari Friedlaender at the University of California, Santa Cruz, which specialises in ocean sciences. One of the programme's biggest aims is to analyse how wild marine mammals behave underwater, something that is difficult to witness directly. The animals are fitted with small electronic logging devices that record their location and type of motion, along with video cameras that capture what they see.
(With inputs from agencies)