Hacker News

> Quantisation is a nearly well understood process

The layman’s explanation I received for this, perhaps incorrectly, is that you fill in variables for factors that you don’t understand or can’t measure with whatever value makes the rest of the theory / equation work. Is that at all correct?




No, that sounds more like a (very distorted) description of regularization and/or renormalization.

Regularization techniques take effective field theories like the Standard Model, which are known to be inaccurate at high enough energies, and isolate their well-behaved low-energy behavior from the ill-understood high-energy contributions.

For example: classical electrodynamics, interpreted naively, says that the total energy stored in an electron's electric field is infinite, since the field strength blows up to infinity at the position of the electron itself. As this isn't actually true, we know there must be physics going on very close to the electron that our theory doesn't account for. But even if our theory isn't the Final Truth, it's still capable of making perfectly good predictions far away from the electron, and we can't just sit around waiting for quantum mechanics to be discovered.

So we make the structure of the electron a parameter of our theory: it can't tell you what the force between two electrons is, but if you posit that the electron's charge is distributed like so, it can tell you what the force between two electrons would be.

We then use empirical data to reabsorb our new degree of freedom: this is renormalization. Instead of trying to predict the force between two electrons, we measure it, and then work backwards to figure out what the electron's charge distribution would have to look like to produce that force. In this particular case: the electron behaves as if it were a sphere about 10^-15 meters across. This is of course not the actual structure of the electron, but it reproduces the same low-energy classical physics.
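The working-backwards step above can be reproduced with a few lines of arithmetic: assume the electron's entire measured rest energy is the electrostatic field energy of a charged sphere, then solve for the radius. (This is the standard "classical electron radius" calculation, up to a geometry-dependent factor of order one; the constants are CODATA values in SI units.)

```python
import math

# Measured constants (SI units, CODATA values)
e = 1.602176634e-19      # elementary charge, C
eps0 = 8.8541878128e-12  # vacuum permittivity, F/m
m_e = 9.1093837015e-31   # electron mass, kg
c = 299792458.0          # speed of light, m/s

# Demand that the field energy e^2 / (4*pi*eps0*r) equal the rest
# energy m_e*c^2, and solve for r: the classical electron radius.
r_e = e**2 / (4 * math.pi * eps0 * m_e * c**2)
print(f"r_e ~ {r_e:.3e} m")  # ~2.8e-15 m, i.e. "about 10^-15 meters across"
```

The measured mass goes in, an effective size comes out; nothing here claims the electron is actually a sphere of that size, which is exactly the parent comment's point.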


> We then use empirical data to reabsorb our new degree of freedom: this is renormalization

Notably, in a less rigorous field, this process would be called "doing science to determine physical properties of things", which most people consider an acceptable pastime for physicists. As it turns out we can be pretty confident there's something going on that the theory can't account for, but it's funny that so many of the Monday-morning quantum physicists think all this math stuff is just confusing the issue and pure, blind empiricism is the way to go.


I hope that's a misunderstanding or miscommunication. It might be useful in some limited contexts, but it would point to gaps in the theory (at minimum).


There are 19 free parameters in the Standard Model. These include such values as the mass of various particles (electron, muon, tau, six quarks, Higgs), Higgs vacuum expectation value, the strength of several interactions, and the mixing angles and phases for certain interactions.

You can rearrange some of the parameters so that they're not free, and redefine each of these in terms of some other related parameter — your choices here are pretty arbitrary; these values being chosen as the 'free' ones does not represent some fundamental truth.

We do not have some underlying deeper theory that indicates why these numbers are the way they are. Why is the electron mass ~511 keV while the up quark is ~2.2 MeV? Why not 1.9 MeV? Why not 2.3 MeV? The Standard Model doesn't generally explain it, nor does it even really intend to explain it, just to describe it.
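To make that concrete, here's a sketch of how the theory consumes these numbers in practice: as a lookup table of measured inputs, not derived outputs. (Approximate PDG central values; the variable names and the selection of parameters are my own illustration, not an official list.)

```python
# A handful of the Standard Model's free parameters, as the theory
# actually uses them: measured inputs, not predictions.
# Approximate PDG central values; masses in MeV unless noted.
measured_inputs = {
    "electron_mass": 0.511,         # why 0.511? the model doesn't say
    "muon_mass": 105.66,
    "tau_mass": 1776.9,
    "up_quark_mass": 2.2,           # why not 1.9 or 2.3? same answer
    "down_quark_mass": 4.7,
    "higgs_vev_GeV": 246.0,         # GeV
    "fine_structure_inv": 137.036,  # dimensionless, 1/alpha
}

# The muon/electron mass ratio is a famously "just measured" number
# with no accepted derivation from first principles:
print(measured_inputs["muon_mass"] / measured_inputs["electron_mass"])
```

Ratios like that one (~207) are exactly the kind of thing a deeper theory would be expected to explain, and the Standard Model simply doesn't try.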


One theory, which I find quite plausible, is that there are many universes, all of which have variations on these constants, and that our universe is the way it is because we are here to observe it: most other configurations of these constants wouldn't support life (the strong anthropic principle).


It's one that comes up, but first consider that it's not fundamentally clear to the layman that these values are actually completely "free parameters".

Imagine, if you will, benchmarking some unknown CPU and determining that the fused multiply-add operation takes, idk, 17 times as long to execute as an increment operation. We might postulate that there are other CPUs out there where it takes a different amount of time, arbitrary amounts of time! Alternatively, we might gain knowledge of the underlying CPU architecture and understand that fused multiply-add is implemented with a certain set of transistors, and that it's fundamentally more complex, though there's room for some variability based on the specific implementation. In such a world this "free parameter" is set as it is for a very specific reason: a transistor arrangement.
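Black-box benchmarking in the spirit of the analogy might look like the sketch below. (In CPython the measured ratio reflects interpreter overhead as much as hardware, which is rather the point: a measured ratio alone tells you nothing about the transistors, or bytecode, underneath it.)

```python
import timeit

# Measure the relative cost of two operations without any knowledge
# of what implements them "underneath".
t_madd = timeit.timeit("x * y + z", setup="x, y, z = 1.1, 2.2, 3.3",
                       number=1_000_000)
t_inc = timeit.timeit("x + 1", setup="x = 1", number=1_000_000)

# The ratio is our "free parameter": an empirical fact about this
# machine, with no visibility into why it takes that value.
print(f"multiply-add / increment ratio: {t_madd / t_inc:.2f}")
```

The number you get is stable and reproducible on one machine, yet explaining it requires a lower-level theory (the microarchitecture) that the benchmark itself can never reveal.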

We have limited visibility into what's actually happening "underneath" our laws of physics. Some of the values we see could be truly arbitrary. Some of them might actually be controlled by some other field and change over time (though we haven't seen evidence of that so it seems less likely). Some of them could be a deeper artifact of the way the universe works.

If you look at things like string theory, which do try to describe things in more detail and dial the free parameters down to just one ("the length of the fundamental string", more or less), we are left with something frustratingly nonspecific until you locate a more specific solution within the broader string-theory solution space and call it "the laws of physics." That specific location might indeed seem quite arbitrary; the question then becomes: what relation does this hundreds-of-dimensional solution space have to our concepts of physical reality? And can other areas of that landscape be probed in any way meaningful to our experience of physics?


In terms of the parameters of the standard model, in any other field setting parameters with theory rather than experimentation would be ridiculous. But somehow when it comes to quantum mechanics everyone rediscovers their reductionist hat and demands that every theory neatly explain all the details without anyone actually entering a laboratory.

I'm not saying I'm not a little miffed there isn't yet a nice, lower-level theory that predicts all of QM, but man is it frustrating when people flip-flop to and from pure empiricism at the drop of a hat, based on a layman's understanding of a mature, developed field.


All correct.

Oh ... and if things are still a bit floppy, just fix the gauge and you are good.



