All species of whales use sound to communicate, and some vocalizations are distinctive enough to identify the species, or even the local population, that produced them. While many whales can be hard to spot visually, their calls can travel for many miles underwater.
To monitor movement and examine changes in populations over time, scientists use hydrophones (underwater microphones) to record whale sounds. This field of study, called passive acoustic monitoring, has advanced in recent years, allowing recordings to span extremely long periods of time and large areas of the ocean. These advances have produced a dramatic increase in the volume of data collected, far more than scientists have the hours to analyze by hand.
A recent paper in Frontiers in Marine Science documents a study that successfully applied deep learning, an artificial intelligence technique, to analyze a large passive acoustic dataset and identify highly variable humpback whale sounds across broad spatial and temporal scales.
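Pipelines like the one described typically convert raw audio into spectrograms (time-frequency images) before a classifier scores each clip for whale calls. The sketch below illustrates that general shape only, not the paper's actual model: the function names, the frequency band, and the threshold are all hypothetical, and a hand-built band-energy score stands in for a trained deep neural network.

```python
import numpy as np

def spectrogram(signal, frame_len=256, hop=128):
    """Magnitude spectrogram via a short-time Fourier transform:
    slide a Hanning-windowed frame along the signal and take the FFT
    of each frame."""
    window = np.hanning(frame_len)
    frames = [signal[i:i + frame_len] * window
              for i in range(0, len(signal) - frame_len + 1, hop)]
    return np.abs(np.fft.rfft(np.array(frames), axis=1))

def detect_call(signal, band=(20, 60), threshold=1.0):
    """Toy detector: flag a clip if mean spectrogram energy inside a
    chosen frequency-bin band exceeds a threshold. In a real monitoring
    pipeline a trained deep network would replace this hand-built score."""
    spec = spectrogram(signal)
    return spec[:, band[0]:band[1]].mean() > threshold

# Synthetic example: a tone that falls inside the band vs. faint noise.
rng = np.random.default_rng(0)
t = np.arange(8000) / 8000.0                  # one second at 8 kHz
call = np.sin(2 * np.pi * 1200 * t)           # tone lands in bins 20-60
noise = 0.01 * rng.standard_normal(8000)
print(detect_call(call + noise))              # True: energy in band
print(detect_call(noise))                     # False: below threshold
```

The key design point this mirrors is that spectrograms turn an audio-detection problem into an image-classification problem, which is what makes standard deep-learning vision architectures applicable to acoustic data.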