It looks like the sonograms are full of harmonics. The conventional musical notation for a note with rich harmonic content, such as a single pluck of a guitar string, is not a vertical line on the staff with notes at every harmonic; instead, you just indicate the pitch of the fundamental. (Even if the fundamental itself is mostly missing, like in the low notes on an upright piano, that's where you put the note.) Then, notes with different harmonic content (because they are played on different instruments) are plotted on different staffs, although this might be counterproductive for visualizing whale songs. Colors are probably better for that.
It would be interesting to see if a second-order Markov model of the whale song unit sequence finds information that is not captured in a first-order model. More interesting still would be if a stochastic context-free or pushdown model were able to predict whale songs better than a similarly-complex Markov model, as it would indicate that the whale song has a recursive structure, like human language.
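A minimal way to run that first- vs. second-order comparison, sketched in Python on a made-up toy "song" (a real test would fit on some songs and score held-out ones): fit add-alpha-smoothed Markov models of each order to the unit sequence and compare average per-symbol log-likelihood.

```python
from collections import Counter
import math

def markov_avg_loglik(seq, order, alpha=1.0):
    """Average per-symbol log-likelihood of seq under an order-k Markov
    model fit to seq itself, with add-alpha smoothing. (A proper
    comparison would score held-out songs, not the training data.)"""
    vocab_size = len(set(seq))
    trans = Counter()    # (context, symbol) -> count
    ctx_tot = Counter()  # context -> count
    for i in range(order, len(seq)):
        ctx = tuple(seq[i - order:i])
        trans[(ctx, seq[i])] += 1
        ctx_tot[ctx] += 1
    total = 0.0
    for i in range(order, len(seq)):
        ctx = tuple(seq[i - order:i])
        p = (trans[(ctx, seq[i])] + alpha) / (ctx_tot[ctx] + alpha * vocab_size)
        total += math.log(p)
    return total / (len(seq) - order)

# Toy "song" whose next unit depends on the unit two steps back, so an
# order-1 model is systematically surprised and an order-2 model isn't.
song = list("abac" * 100)
ll1 = markov_avg_loglik(song, order=1)
ll2 = markov_avg_loglik(song, order=2)
print(f"order-1: {ll1:.3f}  order-2: {ll2:.3f}")
```

If the order-2 score is clearly higher on held-out songs even after penalizing the extra parameters (e.g. with BIC), the sequence carries structure a first-order model misses; the same scaffold extends to comparing against a stochastic context-free grammar.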
It makes some sense that you would use a long, highly-redundant transmission of a sequence of discrete symbols, which you would then repeat after hearing, to distribute information of general interest around the ocean, where travel is slow and the latency-bandwidth product is high. The researchers speculate (largely on the basis of sexual dimorphism) that the information communicated is merely fashion — but surely there is some generally-useful, temporally-changing information of interest to humpback whale survival and fecundity.
How can we attempt to pull patterns out of the song?
What if the songs actually contain the whale equivalent of GPS coordinates? How would we detect it?
I'm sure a few people have spent many hours trying to do so, but I wonder if machine learning could help. It would be a challenge: we'd need factors to correlate to, like the whale's position or information about their environment (location of boats, pollution, or prey).
Perhaps a start would be triangulating the whale's position during each song and looking for elements that somehow vary with location. I imagine someone has looked for this. Location might not actually be a good thing to look for - whales can presumably determine each other's location from the sound's direction and loudness alone, much as a human can judge the direction and distance of a shouting person. What else might they be communicating?
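For the triangulation step, time-difference-of-arrival localization from a few hydrophones is the standard trick. A toy sketch in Python, with invented hydrophone positions and a nominal 1500 m/s sound speed (real systems solve the hyperbolic multilateration equations rather than grid-searching):

```python
import math

SOUND_SPEED = 1500.0  # m/s, rough speed of sound in seawater (an assumption)

# Hypothetical hydrophone array and source position, in metres.
hydrophones = [(0.0, 0.0), (1000.0, 0.0), (0.0, 1000.0)]
true_source = (620.0, 410.0)

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

# Simulated time-differences-of-arrival relative to the first hydrophone.
t0 = dist(true_source, hydrophones[0]) / SOUND_SPEED
tdoas = [dist(true_source, h) / SOUND_SPEED - t0 for h in hydrophones]

def locate(tdoas, step=10.0, extent=1000.0):
    """Brute-force least squares: pick the grid point whose predicted
    TDOAs best match the observed ones."""
    best, best_err = None, float("inf")
    n = int(extent / step)
    for xi in range(n + 1):
        for yi in range(n + 1):
            p = (xi * step, yi * step)
            d0 = dist(p, hydrophones[0]) / SOUND_SPEED
            err = sum((dist(p, h) / SOUND_SPEED - d0 - t) ** 2
                      for h, t in zip(hydrophones, tdoas))
            if err < best_err:
                best, best_err = p, err
    return best

est = locate(tdoas)
print("estimated source:", est)
```

With clean simulated timings the estimate lands on the true position; with real recordings you'd first need accurate arrival times (e.g. from cross-correlating the hydrophone signals), which is the hard part.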
It seems to me that one would have to record not only one whale but essentially all whales (if this does travel across oceans, then studying local whales might miss a lot of data). Then you could do some deep analysis on the correlation of whale recordings with the actions of the whale population. Of course, you could just study a pod or local group of whales to try to decipher short-range communication, but depending on how often they communicate long-range, this might have too much noise.
You could look at ocean currents and surface temperature to see if there's a relationship there. I can imagine whales would be interested in collaborating on a regional biodensity forecast.
This is a good point - it seems not improbable that a whale's sense of place would have more to do with its location in an ocean current than with its x, y coordinates on a plane. Sort of like describing your position in a car on a highway: you would most naturally say "I'm in the fast lane between Shelbyville and Springfield" rather than "I'm at lat/lng such-and-such."
So to attempt to find correlations with the whale's absolute position may (who knows!?) be looking for a signal the whale isn't sending.
Ocean currents and swells are also how ancient Polynesians navigated the ocean so precisely over such huge distances. They developed the ability to read them like maps. Unfortunately, these skills are almost totally extinct. I believe there's only one Hawaiian "master navigator" in the old ways left alive. That intelligent animals could also survive by using this information is not at all unbelievable. Very cool and interesting to think about.
It really is incredible. It's worth giving serious consideration to pausing human activity in the oceans until we really understand what the whales are doing and saying.
Quite a lot of people, myself included, don't agree that what is "better for humanity" is the best option when it comes at a massive cost to other aspects of the natural world.
I'm now curious how one would estimate the number of megadeaths that would result from the elimination of all oceangoing trade and ocean fishing. Heck, what percentage of just Japan's 127 million people would starve?