
So one thing I've never understood is how you can "count" microstates, or the bits required to describe them, when the relevant physical parameters all seem to be real numbers. For instance, a gas of N atoms is described by 6N real numbers (3D position and velocity) regardless of how hot it is. The article talks about quanta of energy, but that seems like a simplification at best: a given interaction might be quantized, but quanta in general (e.g. photons) exist on a real-number spectrum, so it's possible to have an uncountable infinity of energy packet sizes, right? (That's the point of a black body, right?)

What am I missing? Is this a weird measure theory thing, where hot objects have even bigger uncountable infinities of states and we get rid of them all with something like a change of variables? If you told me spacetime was secretly a cellular automaton I could deal with it, but real numbers ruin everything.




I asked the same question of my professor when I took thermodynamics as an undergrad. In that class we were told that a particle in a box of size 2 can be in twice as many places as a particle in a box of size 1. But in real analysis we learn there are just as many numbers between 0 and 1 as there are between 0 and 2. The answer I was given, in true physicist's fashion, was to hand-wave it: there's an intuitive notion that twice as big means twice as many places to be, so just accept it and let the mathematicians cry over our abuse of the reals.

The true answer is that "quanta of energy" is not a simplification. The idea that physical variables like energy and position come in discrete units is the "quantum" in quantum physics. If you imagine the position of a particle in a box of size 1 to be discretized into n states, then a box of size 2 really would have 2n states. So all of your concerns are moot, because quantum mechanics replaces all the uncountable sets with countable ones.

But this still leaves us with the issue that Boltzmann did all this work before quantum mechanics existed, so there must be some useful classical notion of "bigger uncountable infinities". The answer, as far as I know, is that you can always approximate classical physics with finite-precision variables so long as the precision is high enough (replace the reals with floats). The idea of counting states works for any arbitrarily precise (but still finite) discretization of the variables, and as far as physicists care, an arbitrarily precise approximation is the same as the real thing.
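To make the floats-for-reals point concrete, here's a toy sketch (my own, not from the article): the state count depends on the arbitrary precision you choose, but the ratio of counts between two box sizes, and hence the entropy difference, does not.

    import math

    # Discretize position in a 1D box at finite precision dx and count states.
    def count_states(box_length, dx):
        return round(box_length / dx)

    for dx in (1e-3, 1e-6, 1e-9):
        s1 = math.log(count_states(1.0, dx))  # "entropy" of a box of size 1
        s2 = math.log(count_states(2.0, dx))  # "entropy" of a box of size 2
        print(dx, s2 - s1)                    # ~log(2) = 0.693..., for every dx

The arbitrary choice of dx cancels out of any difference of entropies, which is why the hand-waving is harmless in practice.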


> energy and position come in discrete units

This is not true; position and time are not quantized in the Standard Model (string theory is not canonical). I think a better way to think about it is not in terms of size but in terms of time: a particle in a bigger box can, on average, go on a longer random walk without hitting a wall, so it takes longer to exhaust (to arbitrary closeness) the phase space of a bigger box.


I thought the Planck constant somehow tied in with the smallest unit of length allowed. Sort of like the pixels of 3-space.


This is an excellent, and puzzling, question! Let me try to provide some insight into how physicists think about such paradoxes by addressing the specific example you mention: the presumed uncountable infinity of different possible photon energies in a finite range of frequencies.

In the case of blackbody radiation, when physicists analyze the set of possible photon energies more carefully, they find that there is only an infinite number of different possible photon energies (for a finite range of frequencies) if the volume of space containing the photons is infinite. In any finite volume of space, if we allow ourselves to place boundary conditions on the electromagnetic fields at the edges of that volume (for example, suppose we think of our volume as a cube with mirrored walls), we find that there are only a finite number of oscillating modes of the electromagnetic field in any finite range of frequency. In the case of a cube, there is a lowest frequency of radiation whose wavelength will allow it to form a standing wave in the box, and the other allowed modes are built from whole numbers of half-wavelengths along each axis, so they form a discrete set. The density of possible photon states per unit of frequency then turns out to be proportional to the volume of space we allow to hold the photons.

(By the way, this is also precisely related to the quantum uncertainty relation between momentum and position. To confine a photon to a volume of space, the uncertainty in its momentum must be of the same order as the momentum carried by the lowest-frequency standing waves compatible with the container.)
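You can check the volume scaling by brute force. A hypothetical sketch (units where c = 1, one polarization, boundary details glossed over): the standing waves of a mirrored cube of side L have frequencies proportional to sqrt(nx^2 + ny^2 + nz^2)/L for positive integers nx, ny, nz, so the number of modes below any fixed cutoff is finite and grows like L^3.

    import itertools

    def count_modes(L, f_max, c=1.0):
        # Modes of a mirrored cube: f = c * sqrt(nx^2 + ny^2 + nz^2) / (2 * L)
        # for positive integers nx, ny, nz (polarization ignored for simplicity).
        r2_max = (2 * L * f_max / c) ** 2
        n = int(r2_max ** 0.5)
        return sum(1 for nx, ny, nz in itertools.product(range(1, n + 1), repeat=3)
                   if nx * nx + ny * ny + nz * nz <= r2_max)

    # Doubling the side (8x the volume) gives ~8x the modes below the cutoff:
    print(count_modes(L=1.0, f_max=20))  # ~33,500
    print(count_modes(L=2.0, f_max=20))  # ~268,000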


I think part of my mistake was not thinking of the thermal energy packets (phonons?) as waves in boxes, with the box being the boundary of whatever thing has the energy. Which is still weird for a gas expanding into a vacuum I guess, but works for a hot solid object or a confined gas.


This is a very good question, which goes to the heart of statistical physics. We use phase spaces for this (typically a 6N-dimensional space in which each microstate is represented by a point). The system has a probability of being in (or rather, very close to) each microstate, which depends on several factors, such as the conditions (isolated system, constant pressure, temperature, number of particles, etc.). Counting microstates is "just" calculating integrals of that probability weight over the phase space. Of course, most of the time this is impossible to do exactly, so we have tools to approximate these integrals. There are a lot of subtleties, but that's the general idea.

The phase space does not change with temperature, so there's nothing weird like the space getting bigger. But the probability of each microstate might change, as high-energy states become more accessible.
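In symbols (the standard canonical-ensemble expressions, which I believe is what's being described here): with β = 1/kT and Hamiltonian H, the probability density over the 6N-dimensional phase space is

    p(x) = e^{-βH(x)} / Z,    Z = ∫ e^{-βH(x)} d^{6N}x

Raising T (lowering β) flattens p(x), so more of the fixed phase space carries non-negligible weight; the space itself never changes.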


So would it make sense to think of a microstate as a region of phase space, a point and those points "very close to" it? And "increasing number of microstates" just means a larger number of these regions have non-negligible probabilities? In continuous terms you would see this as the distribution flattening out. I might be having trouble visualising what we're integrating, since if it's a probability the integral over the whole phase space can only be 1, right?


Yes, that is the principle. The probability of a single point is zero because an integral over a point is zero, hence “very close to it” (in an infinitesimal volume around the point).

The integral of the probability over the phase space is indeed 1. This is the purpose of the partition function, which is the normalisation factor. The weight function is not normalised a priori.
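A minimal discrete sketch of that normalisation (made-up energy levels, units where k_B = 1):

    import math

    energies = [0.0, 1.0, 2.0, 3.0]  # made-up levels, purely for illustration
    T = 1.5

    weights = [math.exp(-E / T) for E in energies]  # weight function: not normalised
    Z = sum(weights)                                # partition function = normalisation
    probs = [w / Z for w in weights]

    print(Z)           # the normalisation factor
    print(sum(probs))  # 1.0, as a probability distribution requires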


That helps a lot. Thanks!


Good point. I think what's missing is how electron energy states work in quantum mechanics. The Wikipedia page (linked below) has a pretty good explanation:

"A quantum mechanical system or particle that is bound—that is, confined spatially—can only take on certain discrete values of energy, called energy levels. This contrasts with classical particles, which can have any amount of energy. The term is commonly used for the energy levels of electrons in atoms, ions, or molecules, which are bound by the electric field of the nucleus, but can also refer to energy levels of nuclei or vibrational or rotational energy levels in molecules. The energy spectrum of a system with such discrete energy levels is said to be quantized". (https://en.wikipedia.org/wiki/Energy_level)

To put things another way: while there could in theory be infinitely many possible sizes of energy quanta, the energy states available to bound matter are in fact discrete.

Disclaimer: I am an engineer, not a physicist.
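For what it's worth, the standard textbook example of that discreteness is the particle in a 1D box, whose levels are E_n = n^2 h^2 / (8 m L^2): only integer n is allowed, so the spectrum is a countable list rather than a continuum. A quick sketch (an electron in a 1 nm box; the numbers are my own choice):

    H_PLANCK = 6.62607015e-34  # Planck constant (J*s)
    M_E = 9.1093837e-31        # electron mass (kg)

    def energy_level(n, L=1e-9, m=M_E):
        """Energy (J) of level n for a particle in a 1D box of width L."""
        return n ** 2 * H_PLANCK ** 2 / (8 * m * L ** 2)

    # Discrete levels roughly 6e-20 J apart; nothing in between is allowed.
    print([energy_level(n) for n in (1, 2, 3)])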


What you say is true, but also incomplete. We are perfectly able to quantify the accessible states in purely classical systems, such as ideal gases, without requiring discrete energy levels. The trick is to think of a continuous probability density instead of discrete probabilities. This framework is very general and does not depend on the quantum-ness of what you look at.

Even in some systems that actually follow quantum mechanics (such as phonons or electrons in a material, or photons in a black body), we often use continuous probabilities (densities of states) because it’s much more convenient when you have lots of particles.
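As an example of that continuous framework, here's a rough sketch (my own numbers: nitrogen at 300 K) using the classical Maxwell-Boltzmann speed density. "Counting" here is just integrating the density; no discrete levels appear anywhere.

    import math

    K_B = 1.380649e-23  # Boltzmann constant (J/K)

    def mb_density(v, m, T):
        # Maxwell-Boltzmann speed density for a classical ideal gas.
        a = m / (2 * math.pi * K_B * T)
        return 4 * math.pi * a ** 1.5 * v ** 2 * math.exp(-m * v ** 2 / (2 * K_B * T))

    m_n2, T = 4.65e-26, 300.0  # N2 molecular mass (kg), room temperature
    dv = 1.0                   # crude rectangle-rule step (m/s)
    total = sum(mb_density(v, m_n2, T) * dv for v in range(0, 5000))
    fast = sum(mb_density(v, m_n2, T) * dv for v in range(1000, 5000))
    print(total)         # ~1.0: the continuous density is normalised
    print(fast / total)  # fraction of molecules faster than 1000 m/s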


IANAP, but there are two different things that come to mind.

Even if the number of states were infinite, as long as there's a reasonable probability distribution you can integrate over the states that have some property versus another (say: solid vs. not).

Secondly, energy is quantized (at a very, very small level) in reality. YMMV though; I tried googling this and read some stuff about waves being quantized and particles not, and about how it depends on what system you're looking at, and I had to drop out of quantum physics when I took it :-)


For quantum objects, the distinction between a wave and a particle is not very meaningful. The energy levels of an electron around a nucleus are discrete, regardless of whether the electron behaves more like a wave or more like a particle in the specific experiment you’re doing.

Otherwise, you're right: we count states by integrating probability distributions (sometimes discrete, sometimes continuous, and often very complex).



