The Race to Translate Animal Sounds Into Human Language
With big cash prizes at stake—and AI supercharging research—interspecies translation is closer than ever. But what, if anything, would animals want to tell us?
In 2025 we will see AI and machine learning leveraged to make real progress in understanding animal communication, answering a question that has puzzled humans for as long as we have existed: “What are animals saying to each other?” The recent Coller-Dolittle Prize, offering cash prizes of up to half a million dollars for scientists who “crack the code,” is an indication of a bullish confidence that recent technological developments in machine learning and large language models (LLMs) are placing this goal within our grasp.

Massive datasets are now coming online, as recorders can be left in the field, listening to the calls of gibbons in the jungle or birds in the forest around the clock over long stretches of time. New automatic detection algorithms based on convolutional neural networks can then race through thousands of hours of recordings, picking out the animal sounds and clustering them into different types according to their natural acoustic characteristics.
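The detect-then-cluster pipeline described above can be sketched in miniature. This is only an illustrative toy, not the method of any prize entrant: real systems run convolutional neural networks over spectrograms, while this stdlib-only Python sketch uses synthetic tones, two hand-crafted acoustic features, and a minimal k-means to group "calls" by pitch. All function names here (`make_clip`, `features`, `kmeans`) are invented for the example.

```python
import math
import random

def make_clip(freq, n=800, rate=8000.0):
    """Synthesize a pure-tone 'call' at the given frequency (a stand-in for a field recording)."""
    return [math.sin(2 * math.pi * freq * t / rate) for t in range(n)]

def features(clip):
    """Two simple acoustic features: zero-crossing rate (a pitch proxy) and mean energy."""
    zcr = sum(1 for a, b in zip(clip, clip[1:]) if a * b < 0) / len(clip)
    energy = sum(x * x for x in clip) / len(clip)
    return (zcr, energy)

def kmeans(points, k=2, iters=20, seed=0):
    """Minimal k-means over 2-D feature points; returns one cluster label per point."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    labels = [0] * len(points)
    for _ in range(iters):
        # Assign each point to its nearest center.
        for i, p in enumerate(points):
            labels[i] = min(range(k), key=lambda c: (p[0] - centers[c][0]) ** 2
                                                    + (p[1] - centers[c][1]) ** 2)
        # Move each center to the mean of its members.
        for c in range(k):
            members = [p for p, l in zip(points, labels) if l == c]
            if members:
                centers[c] = (sum(p[0] for p in members) / len(members),
                              sum(p[1] for p in members) / len(members))
    return labels

# Two hypothetical call types: low-pitched (300 Hz) and high-pitched (1500 Hz).
clips = [make_clip(300) for _ in range(5)] + [make_clip(1500) for _ in range(5)]
labels = kmeans([features(c) for c in clips])
```

The clustering is unsupervised, mirroring the point in the text: the algorithm sorts calls into types by acoustic similarity without knowing, or needing to know, what any call means.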