
> entropy is necessary for complexity to emerge.

Something about this particular line does not sit well with me.

How would we define the relationship between entropy, complexity and information?




Information thermodynamics is what you are looking for; it's the unification of thermodynamics and information theory. Bear with me because I'm not a physicist, but my understanding is that information needs a physical medium, and in that way it is similar to heat: you can describe it with the same statistical mechanics, or with fluctuation theorems, which are more precise.

My understanding is that this is pretty cool stuff that resolves Maxwell's demon and also sort of explains biological evolution. Apparently a system responding to its environment is performing computation: by changing its state as a function of a driving signal, it builds up memory about that signal, which can be interpreted as computing a model, a model that can be predictive of the future. And apparently how predictive that model is determines how thermodynamically efficient the system can be. Even the smallest molecular machines with memory must therefore conduct predictive inference to approach maximal energetic efficiency.

https://www.researchgate.net/publication/221703137_Thermodyn...
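
If I'm reading it right (I believe the linked paper is Still et al.'s "Thermodynamics of Prediction"), the core result can be sketched roughly like this; the notation is my paraphrase, not the authors' exact statement:

    % s_t is the system's state, x_t the driving signal, beta = 1/(k_B T)
    % instantaneous memory: information the state holds about the current signal
    I_{\mathrm{mem}}  = I[s_t ; x_t]
    % predictive power: information the state holds about the next signal value
    I_{\mathrm{pred}} = I[s_t ; x_{t+1}]
    % average dissipated work is bounded below by the non-predictive part of the memory
    \langle W_{\mathrm{diss}} \rangle \ge k_B T \, ( I_{\mathrm{mem}} - I_{\mathrm{pred}} )

So minimizing dissipation pushes whatever memory the system keeps to be predictive, which is the "prediction equals efficiency" point above.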


I agree it doesn’t feel right. But complexity, like life, does emerge even though the 2nd law holds. It is a matter of scale: the 2nd law only constrains the total entropy, so a subsystem can become more ordered as long as it exports enough entropy to its surroundings. Entropy does not mean everything becomes disordered. And now I defer to the physicists, because as an engineer I am going out of my lane…
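
Edit: to put the "matter of scale" point in symbols (the standard textbook statement, nothing exotic):

    \Delta S_{\mathrm{total}} = \Delta S_{\mathrm{system}} + \Delta S_{\mathrm{environment}} \ge 0
    % \Delta S_{\mathrm{system}} < 0 is allowed whenever the environment's increase compensates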


A minor lightbulb went off in my head: you might be interested in slide 17 here, from a presentation on Shape Dynamics [0]

tldr:

    (Shannon) entropy [1]: expected description length of a state of a system

    (Shape) complexity [0]: a 'clumped-ness' metric: clump sizes / inter-clump separations

    information: not sure anyone ever really resolved this in a satisfying way :^) [2]
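
To make the entropy bullet concrete, here's a toy Python sketch (my own example, not from the slides) of Shannon entropy as an expected description length in bits:

    import math

    def shannon_entropy_bits(probs):
        """Expected description length in bits: H = -sum(p * log2(p))."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # A fair coin takes 1 bit per outcome to describe on average...
    print(shannon_entropy_bits([0.5, 0.5]))    # 1.0
    # ...while a heavily biased coin takes far less, because the typical
    # outcome is cheap to describe.
    print(shannon_entropy_bits([0.99, 0.01]))  # ~0.081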

[0] https://philphys.net/wp-content/uploads/2019/Barbour-Saig_Su...

[1] https://physics.stackexchange.com/questions/606722/how-do-di...

[2] https://monoskop.org/images/2/2f/Shannon_Claude_E_1956_The_B...


thermodynamic entropy is to heat as Shannon entropy is to information?


Hmm, not entirely sure those terms map exactly. It's easier to point out that you can go from one to the other by setting the Hamiltonian to the negative logarithm of the probability density (or using the Boltzmann distribution to go the other way).
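
Spelled out (my sketch of the standard correspondence, with \beta = 1/(k_B T)):

    % Boltzmann distribution: probability density from the Hamiltonian
    p(x) = \frac{e^{-\beta H(x)}}{Z}, \qquad Z = \sum_x e^{-\beta H(x)}
    % the other direction: an effective Hamiltonian from a probability density,
    % up to the additive constant fixed by Z
    \beta H(x) = -\ln p(x) - \ln Z
    % and the Gibbs/Shannon entropy of p (in nats, times k_B) is the thermodynamic entropy
    S = -k_B \sum_x p(x) \ln p(x)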



