Information thermodynamics is what you are looking for: it's the unification of thermodynamics and information theory. Bear with me because I'm not a physicist, but my understanding is that information needs a physical medium, which makes it similar to heat, and you can describe it with the same statistical mechanics, or with fluctuation theorems, which are more precise.
My understanding is that this is pretty cool stuff that resolves Maxwell's demon and also sort of explains biological evolution. Apparently a system responding to its environment is performing computation: it changes its state as a function of a driving signal, which leaves behind memory about that signal, and that memory can be interpreted as a model, one that can be predictive of the future. Apparently how predictive that model is corresponds to how thermodynamically efficient the system is, so even the smallest molecular machines with memory must conduct predictive inference to approach maximal energetic efficiency.
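To put a rough number on the Maxwell's demon part (my own back-of-the-envelope, not from the original argument): Landauer's principle says erasing one bit of memory dissipates at least kT ln 2 of heat, which is the bill the demon has to pay when it clears its memory.

    # Back-of-the-envelope sketch (my own numbers, not from the thread):
    # Landauer's bound, the minimum heat dissipated to erase one bit of memory.
    import math

    k_B = 1.380649e-23      # Boltzmann constant, J/K
    T = 300.0               # room temperature, K

    landauer_per_bit = k_B * T * math.log(2)   # ~2.9e-21 J per bit erased
    print(f"Minimum erasure cost at {T} K: {landauer_per_bit:.3e} J/bit")

    # A demon that sorts N molecules gains at most N bits of information;
    # erasing that memory costs at least N * kT ln 2, cancelling the work gained.
    N = 1_000_000
    print(f"Erasing {N} bits costs at least {N * landauer_per_bit:.3e} J")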
I agree it doesn’t feel right. But complexity, like life, does emerge even though the 2nd law holds; it is a matter of scale. Rising entropy does not mean everything becomes disordered everywhere: local order can increase as long as total entropy goes up. And now I defer to the physicists, because as an engineer I am going out of my lane…
A minor lightbulb went off in my head: you might be interested in slide 17 here, from a presentation on Shape Dynamics [0]
tldr:
(Shannon) entropy [1]: expected description length of a state of a system (tiny worked example after this list)
(Shape) complexity [0]: a 'clumped-ness' metric: clump sizes / inter-clump separations
information: not sure anyone ever really resolved this in a satisfying way :^) [2]
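For the first item, here is a tiny worked example (mine, not from the slides [0] or [1]): Shannon entropy as the expected number of bits needed to describe which state the system is in.

    # Toy example (mine, not from [0]/[1]): Shannon entropy as the expected
    # number of bits needed to say which state the system is in.
    import math

    p = {"state_a": 0.5, "state_b": 0.25, "state_c": 0.25}

    # Optimal code lengths are -log2(p); the entropy is their expectation.
    entropy_bits = -sum(pi * math.log2(pi) for pi in p.values())
    print(entropy_bits)   # 1.5 bits: half the time 1 bit suffices, otherwise 2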
Hmm, not entirely sure those terms fit exactly. It's easier to point out that you can go from one to the other by setting the Hamiltonian to the negative logarithm of the probability density (or use the Boltzmann distribution to go the other way).
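A minimal numerical sketch of that correspondence (my own illustration, in units where k_B T = 1): set each state's energy to the negative log of its probability, and the Boltzmann weights built from those energies give you back the same distribution, with the Gibbs entropy equal to the Shannon entropy.

    # Sketch of the correspondence (my own example, units where k_B * T = 1):
    # Hamiltonian := -log(probability)  <=>  probability = Boltzmann weight.
    import math

    p = [0.5, 0.3, 0.15, 0.05]                # some probability distribution
    E = [-math.log(pi) for pi in p]           # "energies": E_i = -ln p_i

    Z = sum(math.exp(-Ei) for Ei in E)        # partition function (Z = 1 here)
    p_boltzmann = [math.exp(-Ei) / Z for Ei in E]

    shannon = -sum(pi * math.log(pi) for pi in p)   # Shannon/Gibbs entropy (nats)
    avg_E = sum(pi * Ei for pi, Ei in zip(p, E))    # <E>; equals S since -ln Z = 0

    print(p_boltzmann)        # recovers the original p (up to float error)
    print(shannon, avg_E)     # identical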
Something about this particular line does not sit well with me.
How would we define the relationship between entropy, complexity and information?