Boltzmann and Gibbs turn in their graves every time some information theorist mutilates their beloved entropy. Shannon and von Neumann were hacking out a new theory of communication, not doing real physics, and never meant to equate thermodynamic concepts with encoding techniques - but alas, dissertations are now written on it.
Entropy can't be a measure of uncertainty, because all the uncertainty is already in the probability distribution p(x) - multiplying it by its own logarithm and summing doesn't tell us anything new. If it did, it'd violate quantum physics principles, including the Bell inequalities and the Heisenberg uncertainty principle.
The article never mentions the simplest and most basic definition of entropy, i.e. its units (kJ/K), nor the third law of thermodynamics, which is the basis for its measurement.
“Every physicist knows what entropy is. Not one can write it down in words.” - Clifford Truesdell
Gibbs’ entropy is derived from “the probability that an unspecified system of the ensemble (i.e. one of which we only know that it belongs to the ensemble) will lie within the given limits” in phase space. That’s the “coefficient of probability” of the phase; its logarithm is the “index of probability” of the phase; the average of that is the entropy.
Of course the probability distribution corresponds to the uncertainty. That’s why the entropy is defined from the probability distribution.
Your claim sounds like saying that the area of a polygon cannot be a measure of its extension because the extension is given by the shape and calculating the area doesn’t tell us anything new.
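To make the polygon analogy concrete: entropy condenses the spread of a distribution into a single number, just as area condenses a shape's extension. A minimal sketch (the function name and example distributions are mine, for illustration) computing Shannon entropy H(p) = -Σ p·log₂ p in bits:

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum(p_i * log2(p_i)), in bits.
    Terms with p_i == 0 are skipped (lim p->0 of p*log p is 0)."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# A sharply peaked distribution: little uncertainty about the outcome.
peaked = [0.97, 0.01, 0.01, 0.01]
# A uniform distribution over 4 outcomes: maximal uncertainty.
uniform = [0.25, 0.25, 0.25, 0.25]

print(shannon_entropy(peaked))   # small, close to 0 bits
print(shannon_entropy(uniform))  # exactly 2.0 bits = log2(4)
```

Both distributions "contain" all their own uncertainty, but the single summary number lets you compare them, exactly as area lets you compare polygons without restating their vertices.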
"It's tempting, if the only tool you have is a hammer, to treat everything as if it were a nail." Maslow 1966. The essay is about physics, but all comments are about formatting and LLMs.
The conversational tone of the essay is misleading too. Hilbert, Minkowski, and Poincaré had done all the heavy mathematical lifting and held Einstein's hand all through 1915. As mathematicians they wouldn't qualify for the Nobel Prize, so they made no claim to the discovery of GR.
FOSS has impoverished generations of programmers. As originally championed by Stallman and others in the '80s, its scope was foundational and infrastructural codebases, i.e. OSes, firmware, compilers, databases, and later browsers, etc. - anything that would bind you to a vendor and inevitably to the Deep State.
He was a visionary and we must all give credit where it's due. However, FOSS as applied to apps, games, and tools is nothing more than tech slavery, which is why nowadays it's championed by FAANG and the Deep State (some countries, like Israel and India, commercialize FOSS source code as a matter of statecraft).
I’m not disagreeing with you on the eventual outcome, but what makes you think Stallman and his cronies ever intended to limit the scope of the Free Software movement to foundational software?
The only limit in scope I’ve heard about is “devices intended for installing software”.
In China and other places where you want to squeeze all the performance from GPUs of a generation or two ago. But it's not a portable skill set (Google won't hire you), so be careful what you wish for...
Industrial/commercial adoption of LLMs is quite varied and depends critically on the match between quality and criticality.
In the healthcare, engineering, construction, manufacturing, and aviation industries, adoption is mostly on the admin side for low-priority busywork - virtually no doctors, pharmacists, nurses, engineers, technicians, or even salespeople use LLMs on the job. LLM products have serious quality issues and are decades behind industrial databases, simulation tools, and diagnostic tools.
On the other hand, in academia, consulting, publishing, journalism, marketing, advertising, law, and insurance, it's widely adopted and surging. For example, Westlaw's CoCounsel is better at drafting and case-law briefing than all but the most experienced litigators. Not only has it not replaced lawyers, it's causing a boom in hiring, since it drastically reduces the time and cost of training new associates and allows firms to take on more clients with more sophisticated casework.
By many accounts Gerard 't Hooft is the most underrated living physicist. His contributions far exceed those of any string theorist, and I'd put him above Feynman, Gell-Mann, and Weinberg in both the depth and breadth of his physics (his superdeterminism notwithstanding).