Researchers use AI to track whale songs in a sea of noise
Machine learning is helping researchers even as the whales' songs evolve.
This story was originally published on November 16, 2018.
The system uses machine learning to identify the specific calls of humpback whales, which can be difficult to distinguish in the recordings from the calls of other whales, as well as from other sounds, like boats passing on the ocean's surface.
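Google's actual system is a trained neural network, but the core idea of picking whale-like sound out of ocean noise can be illustrated with a toy detector. The sketch below is an assumption-laden simplification, not the real model: the sample rate, frequency band, and threshold are all made up for illustration, and it simply checks how much of a clip's spectral energy falls in a rough humpback vocalization band.

```python
import numpy as np

# Toy illustration only: the real system is a deep neural network.
# The sample rate, band edges, and threshold below are all assumptions.
FS = 10_000          # sample rate in Hz (assumed)
BAND = (100, 2000)   # rough humpback vocalization band in Hz (assumed)

def band_energy_ratio(signal: np.ndarray, fs: int = FS) -> float:
    """Fraction of the clip's spectral energy that falls inside BAND."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    in_band = (freqs >= BAND[0]) & (freqs <= BAND[1])
    total = spectrum.sum()
    return float(spectrum[in_band].sum() / total) if total > 0 else 0.0

def looks_like_whale(signal: np.ndarray, threshold: float = 0.5) -> bool:
    """Crude classifier: flag clips whose energy concentrates in BAND."""
    return band_energy_ratio(signal) > threshold

# A pure 300 Hz tone concentrates its energy inside the band,
# while white noise spreads energy evenly across all frequencies.
t = np.arange(FS) / FS
tone = np.sin(2 * np.pi * 300 * t)
noise = np.random.default_rng(0).normal(size=FS)
```

A real detector has to cope with calls that overlap, drift in pitch, and change from year to year, which is why a learned model outperforms any fixed frequency rule like this one.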
A recording of a humpback whale song.
"It lets me go back to what I'm good at," Ann Allen, a research oceanographer in the Cetacean Research Program at the Pacific Islands Fisheries Science Center in Hawaii, told Spark host Nora Young.
"We're biologists, but I've been learning a lot of computer programming over the years in order to address this problem. But it's not what I'm trained for. So it means that if [Google] can create a tool that can address this for us, it means that I can go back to answering the questions that these data address."
To gather the audio, the researchers use something called a HARP, or high-frequency acoustic recording package, which is left on the ocean floor for months at a time. In total, they have collected 170,000 hours of recordings. That adds up to about 19 years of audio.
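The 19-year figure follows directly from the total hours. A quick check of the arithmetic:

```python
# 170,000 hours of recordings, converted to years of continuous audio.
hours = 170_000
hours_per_year = 24 * 365
years = hours / hours_per_year  # roughly 19.4 years
```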
While Allen helped put the system together, she said the idea wasn't her own.
"I was talking to my father, describing my new job and all this data that we have, and after telling him about it, his response was, 'Why don't you just get those Shazam or Google people to do it? They're really good at recognizing songs.'
"So I reached out to Google to see if they were interested."