Exploring The Beauty And Mystery Of Whale Songs, Nature’s Greatest Hits

Whale songs have been a topic of discussion for many years, dating back to when sailors first began navigating the high seas. According to folklore, the haunting sounds from the deep were linked to mythical elements – often thought to be the ghosts of drowned sailors calling out to fellow seamen.
However, science, as always, proves otherwise. In 1967, American biologist and environmentalist Roger Payne and fellow researcher Scott McVay identified the sounds recorded by a US Navy hydrophone in 1952 as a humpback whale song.
It was the spark that set off the exploration of whale songs – a field of study that has now been ongoing for over 50 years.
What Exactly Are Whale Songs?
The idea of whales “singing” is derived from the regular patterns of sound that some species, specifically the humpback, make. For many years, it remained one of nature’s most intriguing mysteries, especially as scientists and marine biologists began recording these unique sounds in an effort to identify what the low-frequency roars and groans actually mean.
Over the years, scientists have made headway, identifying these whale vocalisation patterns and linking them to migratory, mating and feeding behaviour. As such, acoustics has remained a major area of study for whale researchers, but it is a subject that continues to be extensive and complex, given that some humpback whales can sing for up to 30 minutes at a time.
However, a new collaborative partnership between Google AI and the National Oceanic and Atmospheric Administration (NOAA) Pacific Islands Fisheries Science Center has given rise to Pattern Radio: Whale Songs, an online tool that allows anyone to explore thousands of hours of underwater ocean recordings and discover whale songs.
Machine Learning And Bioacoustics
NOAA has been recording underwater audio using high-frequency acoustic recording packages at a dozen sites across the Pacific Ocean since 2005. Today, it has amassed over 170,000 hours of audio recordings – equivalent to about 19 years of continuous audio data.
Seeing an opportunity to apply machine learning to this vast collection of underwater recordings, Google set out to train an artificial intelligence model that could help detect whale song and visualise the audio at scale.
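To get a rough feel for the task – sweeping a detector across a very long recording, window by window, and flagging the stretches that look like song – here is a minimal Python sketch. It is illustrative only: the file name, the 10-second window and the score_window() detector are hypothetical stand-ins, not the model that Google AI and NOAA actually trained.

    # Illustrative sketch only: scan a long hydrophone recording in
    # fixed-length windows and flag likely whale-song segments.
    # The file name, window length and score_window() are hypothetical
    # stand-ins, not the actual Google AI / NOAA model.
    import numpy as np
    from scipy.io import wavfile
    from scipy.signal import spectrogram

    def score_window(samples, fs):
        # Hypothetical detector: the fraction of spectral energy in a
        # 100-2,000 Hz band, where much humpback song energy sits.
        freqs, _, sxx = spectrogram(samples, fs=fs, nperseg=1024)
        band = (freqs >= 100) & (freqs <= 2000)
        return sxx[band].sum() / (sxx.sum() + 1e-12)

    fs, audio = wavfile.read("hydrophone_clip.wav")  # placeholder file
    audio = audio.astype(np.float32)
    if audio.ndim > 1:                               # mix stereo to mono
        audio = audio.mean(axis=1)

    window = 10 * fs                                 # 10-second windows
    for start in range(0, len(audio) - window, window):
        score = score_window(audio[start:start + window], fs)
        if score > 0.5:                              # arbitrary threshold
            print(f"possible song at {start / fs:.0f}s (score {score:.2f})")

At the scale of 170,000 hours, the point of such a detector is simply to tell researchers – and now the public – where in the archive to look first.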
“The project really at first started as a fascination,” explains Jonas Jongejan, creative technologist at Google Creative Lab.
“We heard about the Bioacoustics work, which is a collaboration between NOAA and Google AI, revolving around creating machine learning models that could detect humpback whale songs in this massive archive of underwater recordings.
“We received a couple of audio files and instantly got fascinated by the sound of the whales. We then sat down and brainstormed on how we could make that experience more widely accessible.”
According to Jongejan, NOAA was very receptive to making its archives more accessible to the general public. To see the vision through, the team also worked with machine learning expert and artist Kyle McDonald to push their understanding of what was possible in analysing and presenting the recordings.
“I believe all of us were caught by surprise by how complex whale songs are, and how much nuance there is in them that is hard to detect. For me personally, the machine learning techniques we applied helped me understand the complexity of the songs, as well as other underwater sounds, by giving me the tools to categorise and bucket different recordings.
“Ultimately, it let me really understand the breadth of the songs, resulting in a bigger appreciation for the whale songs.”
In total, the project took about six months to come to fruition, before Pattern Radio: Whale Songs was ready for the public eye (and ears).
The website acts as a tool to visualise these audio recordings at a large scale through spectrograms, and now, thanks to AI, anyone can easily explore the archive and make their own discoveries about whale songs.
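A spectrogram is an image of how a recording’s energy is spread across frequencies over time, which is what lets long stretches of song be scanned by eye. As a rough illustration of the idea (the clip.wav file is a placeholder, not Pattern Radio’s own data), such an image can be produced with standard Python tooling:

    # Minimal spectrogram sketch: turn an audio clip into the kind of
    # time-frequency image Pattern Radio uses to make songs browsable.
    # "clip.wav" is a placeholder, not data from the project.
    import numpy as np
    import matplotlib.pyplot as plt
    from scipy.io import wavfile
    from scipy.signal import spectrogram

    fs, audio = wavfile.read("clip.wav")
    if audio.ndim > 1:                   # mix stereo down to mono
        audio = audio.mean(axis=1)

    freqs, times, sxx = spectrogram(audio.astype(np.float32), fs=fs, nperseg=2048)

    plt.pcolormesh(times, freqs, 10 * np.log10(sxx + 1e-12), shading="auto")
    plt.xlabel("Time (s)")
    plt.ylabel("Frequency (Hz)")
    plt.colorbar(label="Power (dB)")
    plt.title("Spectrogram of an underwater recording")
    plt.show()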
Beyond The Data
The most striking aspect of the website is not just the data and recordings it presents, but how useful and interactive the experience of exploring them has become.
Alex Chen, creative director at Google Creative Lab, explains that he has already witnessed Pattern Radio: Whale Songs being utilised in classrooms.
“It has allowed students to learn about humpback whale songs, the science of sound, and machine learning, all combined together in hands-on ways,” he adds.
“Additionally, we’ve also collaborated with musicians Annie Lewandowski and David Rothenberg who have been investigating the structure of humpback whale songs. And they have created tours that walk you through the structure of humpback whale songs on the website.”
Chen explains that the tours presented by Lewandowski and Rothenberg are equally fascinating, highlighting the songs’ structure and repetition as well as their unexpected changes.
“Even though researchers have been exploring the structure of their songs for many years, it’s neat that the tool lets anyone explore it, first-hand, on a really wide scale,” he adds.
According to Chen, the data presented on the website will help scientists better understand whales’ behavioural and migratory patterns, ultimately leading to better protection for the species.
As Pattern Radio: Whale Songs also fits into Google’s AI For Social Good programme, which applies the latest in machine learning to the world’s biggest humanitarian and environmental challenges, it raises the question of where else AI can be applied in other areas of study.
“The project has helped spark interesting conversations with other organisations,” Chen reveals.
“There are so many research projects out there that rely on large audio datasets and machine learning, and we’ve started thinking about how this kind of hands-on, visual interface can aid their research.”
Chen elaborates that Google is currently working with the Indian government to build a flood forecasting model that sends highly accurate alerts to individuals during a natural disaster.
“We’re also using retinal imagery to predict people’s risk of cardiac events like heart attacks, and we launched the Google AI Impact Challenge, which gave US$25mil in grants to organisations around the world to tackle pressing societal issues with the application of artificial intelligence.”
He adds: “It’s interesting how much can be done through sound, like NOAA has done, such as this organisation fighting illegal deforestation (also an Impact Challenge grantee).
“I’ve always had a personal interest and fascination around sound, so the potential around acoustic monitoring for conservation is especially exciting to me.”
For now though, Pattern Radio: Whale Songs has proven to be a success, not only presenting whale recordings in an efficient and easy-to-understand manner, but also making the information accessible to the masses so that virtually anyone can chart their own underwater discoveries.
Ultimately, though, it has managed to shine more light on the mysteries of whale songs while firmly spotlighting the beautiful creatures that make them. That alone makes the whale songs worthy of a listen.
Cover Credit: Kerstin Meyer via Getty Images
Writer | Richard Augustin
Two decades in journalism but Richard believes he has barely scraped the surface in the field. He loves the scent of a good story and the art of storytelling, two elements that constantly fuel his passion for writing.