There have been two interesting studies that further support the hypothesis of a somewhat complex communication system. One was done in Florida with wild dolphins that hunt cooperatively: one dolphin drives the fish while the others act as a barrier. Such cooperation requires sophisticated acoustic communication. The other was an experiment with dolphins in managed care. The dolphins were asked to push two separate buttons at the same time, but they could not see each other, so they had to communicate acoustically to get the timing right and push the buttons together.
We do, actually, for dolphins at least. Researchers ran an experiment in which dolphins had to coordinate their actions (push two buttons together) without seeing each other, so they coordinated acoustically, communicating when to push the buttons.
Motivation could be an issue. What if they do not want to communicate with us? What if we are too slow for them? Dolphins produce burst pulses with an interclick interval of just a few milliseconds, i.e. hundreds of clicks per second, far faster than human hearing can resolve individual events. We cannot even hear these individual clicks; to us the burst pulse sounds like a creaky door. But they, supposedly, can hear the individual clicks. We do not understand or use echolocation, and they do, so a "dolphin Socrates" would probably have their own allegory of the cave, one not based on what individuals can see.
No, it is not that simple. Humpback whales have been observed interfering with orcas to protect gray whale calves from being killed. This has been observed several times, and researchers think that humpbacks just really cannot stand orcas harming other whales; read more here: https://www.smithsonianmag.com/science-nature/what-humpback-...
It is not easy; we have not yet cracked the code. And will they even have the motivation or any interest in communicating with us?
What we do know is that they produce a variety of signals, some of them very complex, and their communication is extremely fast, which is another obstacle to possible "communication". For example, we cannot generate a burst pulse with our current technology; we can only record theirs and play it back.
But many research groups are trying. We are starting the Dolphin Chat citizen science project on Zooniverse in a month or so, to classify and prepare a large dataset of bottlenose dolphin vocalizations for our deep learning model. You can check it out and even participate; it will definitely help you appreciate how complex their vocal repertoire is (and how "chatty" they are).
The obstacle is that we still do not fully understand, nor are we able to replicate, their sonar. Even the Navy, which has been studying dolphins for decades, still cannot duplicate biosonar (read more here: https://www.nationalgeographic.com/news/2019/5/140328-navy-d...)
The burst pulse is extremely complex: some can have 400 individual clicks in one pulse (our ear cannot resolve these single clicks; they merge, and to the human ear the pulse sounds like a creaky door), and a pulse can last a few seconds. Each click is broadband (it can go up to 100 kHz and beyond) and frequency modulated, with varied peak frequencies, center frequencies, RMS levels, and so on; some clicks even have two spectral peaks. It is extremely fast and extremely complex, and we cannot simply generate one; only a dolphin's sound-producing mechanism can.
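If you are curious what those per-click measurements look like in practice, here is a rough Python sketch of how one might compute RMS, peak frequency, and a centroid-style "center frequency" for a single recorded click. This is purely illustrative, not our actual analysis pipeline: the filename is made up, the spectral centroid stands in for center frequency, and you would need a very high sampling rate (several hundred kHz) to capture energy above 100 kHz.

    # Illustrative sketch only: rough per-click measurements for one recorded click.
    # "click.wav" is a made-up filename; a real recording would need a sampling
    # rate of several hundred kHz to capture energy above 100 kHz.
    import numpy as np
    from scipy.io import wavfile

    fs, click = wavfile.read("click.wav")       # fs = sampling rate in Hz
    click = click.astype(np.float64)
    if click.ndim > 1:                          # keep one channel if the file is stereo
        click = click[:, 0]
    peak = np.max(np.abs(click))
    if peak > 0:
        click = click / peak                    # normalize amplitude

    rms = np.sqrt(np.mean(click ** 2))          # RMS level of the (normalized) click

    spectrum = np.abs(np.fft.rfft(click)) ** 2              # power spectrum
    freqs = np.fft.rfftfreq(len(click), d=1.0 / fs)         # frequency axis in Hz

    peak_freq = freqs[np.argmax(spectrum)]                      # peak frequency
    center_freq = np.sum(freqs * spectrum) / np.sum(spectrum)   # energy-weighted centroid

    print(f"RMS {rms:.3f}, peak {peak_freq / 1000:.1f} kHz, "
          f"center {center_freq / 1000:.1f} kHz")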
It is not active yet; it will be in a month or so. You can also sign up for our newsletter at cetalingua.com and we will send a notification when the project goes live.
We are trying to do something similar (citizen science + AI) but for acoustic data; our first model can identify manatee calls and mastication (chewing) sounds. https://manatee-chat-demo.appspot.com/
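For anyone wondering what a model like that can look like under the hood, a common pattern is a small convolutional network over log-mel spectrograms of short clips. The sketch below is only an illustration of that general pattern in PyTorch, not our actual model: the clip length, sampling rate, layer sizes, and the three class labels are all made up for the example.

    # Illustrative sketch of a spectrogram-based call classifier (not our real model).
    # Assumes 1-second mono clips at 48 kHz and three example classes.
    import torch
    import torch.nn as nn
    import torchaudio

    CLASSES = ["manatee_call", "mastication", "background"]

    class CallClassifier(nn.Module):
        def __init__(self, n_classes: int = len(CLASSES)):
            super().__init__()
            # waveform -> mel spectrogram "image"
            self.mel = torchaudio.transforms.MelSpectrogram(
                sample_rate=48_000, n_fft=1024, hop_length=256, n_mels=64
            )
            # small CNN over the spectrogram
            self.features = nn.Sequential(
                nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.AdaptiveAvgPool2d(1),
            )
            self.head = nn.Linear(32, n_classes)

        def forward(self, waveform: torch.Tensor) -> torch.Tensor:
            # waveform: (batch, samples) -> log-mel spectrogram -> class logits
            spec = self.mel(waveform).unsqueeze(1).clamp(min=1e-10).log()
            return self.head(self.features(spec).flatten(1))

    # Example: classify one second of (here random) audio.
    model = CallClassifier()
    clip = torch.randn(1, 48_000)
    probs = model(clip).softmax(dim=-1)
    print(dict(zip(CLASSES, probs.squeeze().tolist())))

The hard part is not the network but the labeled data, which is exactly what the citizen science classifications are meant to provide.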
Humpback whales have mirror neurons (the ones implicated in "theory of mind"), so they might show "compassion". There is so much we still do not know about marine mammals...
Humpback whales appear to show empathy towards other species. We have 115 documented cases of them doing so (and that is a lot, given how hard it is to observe them).
>The meaning of a lot of human languages or alphabets is completely lost even if they were conceived by humans almost identical to us biologically. Hoping to understand alien languages without the aliens helping us is a pipe dream.
There has been limited success with parts of the Voynich Manuscript (some plant names?) and some interesting developments with rongorongo, but overall, yes, it is super hard. Hopefully, Natural Language Processing might help; some promising results were reported for Linear B.
(https://neurohive.io/en/news/researchers-use-machine-learnin...)
I'm not sure the manuscript really qualifies as a language, but it does highlight the difficulty of understanding an artificial construct coming from an intelligent mind.
We understand natural constructs because we kind of understand the natural laws that govern them. We're nowhere near understanding the laws that govern the intelligent, conscious mind. Even our own.
P.S. Good luck with your project. Sounds very interesting and right on the money for this conversation.