
thermodynamic entropy is to heat as Shannon entropy is to information?



Hmm, not entirely sure those terms fit exactly. It's easier to point out that you can go from one to the other by setting the Hamiltonian to the negative logarithm of the probability density (or by using the Boltzmann distribution to go the other way).
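The correspondence can be checked numerically: for a Boltzmann distribution p_i ∝ e^(−βE_i), the thermodynamic (Gibbs) entropy in units of k_B, S = β⟨E⟩ + ln Z, equals the Shannon entropy −Σ p_i ln p_i of that distribution. A minimal sketch (the energy levels and β here are arbitrary illustrative values, not from the comment):

```python
import numpy as np

# Arbitrary energy levels and inverse temperature (illustrative values)
E = np.array([0.0, 1.0, 2.0, 3.0])
beta = 0.7

# Boltzmann distribution: p_i = exp(-beta * E_i) / Z
w = np.exp(-beta * E)
Z = w.sum()
p = w / Z

# Shannon entropy of p (in nats)
shannon = -np.sum(p * np.log(p))

# Thermodynamic (Gibbs) entropy in units of k_B: S = beta * <E> + ln Z
gibbs = beta * np.sum(p * E) + np.log(Z)

print(np.isclose(shannon, gibbs))  # the two entropies coincide
```

Going the other way, defining a Hamiltonian H_i = −ln p_i for any distribution p makes p the Boltzmann distribution of H at β = 1, which is the direction the comment describes.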



