
I hope that's a misunderstanding or miscommunication. It might be useful in some limited contexts, but it would point to gaps in the theory (at minimum).



There are 19 free parameters in the Standard Model. These include the masses of various particles (electron, muon, tau, the six quarks, the Higgs), the Higgs vacuum expectation value, the strengths of several interactions, and the mixing angles and phases of certain interactions.
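For concreteness, here is one conventional way of enumerating those 19 (assuming massless neutrinos), sketched as a Python dict. The values are approximate, and as noted below, exactly which quantities you call the free ones is itself a convention:

    # One conventional counting of the 19 free parameters of the
    # Standard Model (massless neutrinos). Values are approximate.
    SM_FREE_PARAMETERS = {
        # 9 fermion masses (GeV)
        "m_e": 0.000511, "m_mu": 0.1057, "m_tau": 1.777,
        "m_u": 0.0022, "m_d": 0.0047, "m_s": 0.095,
        "m_c": 1.27, "m_b": 4.18, "m_t": 173.0,
        # 2 Higgs-sector parameters (GeV)
        "m_H": 125.25,           # Higgs mass
        "v": 246.22,             # Higgs vacuum expectation value
        # 3 gauge couplings (roughly, at the Z mass)
        "alpha_em": 1 / 127.9,   # electromagnetic
        "alpha_s": 0.118,        # strong
        "sin2_theta_W": 0.231,   # weak mixing angle
        # 4 CKM quark-mixing parameters (radians)
        "theta_12": 0.227, "theta_23": 0.042, "theta_13": 0.0037,
        "delta_CP": 1.2,
        # 1 QCD vacuum angle (measured to be < ~1e-10)
        "theta_QCD": 0.0,
    }
    assert len(SM_FREE_PARAMETERS) == 19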

You can rearrange some of the parameters so that they're not free, redefining each of them in terms of some other related parameter; your choices here are pretty arbitrary, and the particular values designated as 'free' don't represent some fundamental truth.
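A concrete example of that freedom, sketched in Python: the Higgs vacuum expectation value can be traded for the Fermi constant G_F measured in muon decay, via the tree-level relation v = (sqrt(2) * G_F)^(-1/2). Neither quantity is more fundamental; which one you call "free" is bookkeeping.

    # Trading one parameterization for another: compute the Higgs
    # vev v from the measured Fermi constant G_F (tree level).
    G_F = 1.1663787e-5           # GeV^-2, from the muon lifetime

    v = (2 ** 0.5 * G_F) ** -0.5
    print(f"Higgs vev: {v:.2f} GeV")   # ~246.22 GeV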

We do not have some underlying, deeper theory that indicates why these numbers are the way they are. Why is the electron mass ~511 keV while the up quark is ~2.2 MeV? Why not 1.9 MeV? Why not 2.3 MeV? The Standard Model doesn't generally explain it, nor does it really intend to; it just describes it.


One theory, which I find quite plausible, is that there are many universes, each with its own variation on these constants, and that the reason our universe is the way it is, is that we are here to observe it: most other configurations of these constants wouldn't support life (the strong anthropic principle).
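As a toy illustration of that selection effect in Python (the "supports life" window below is completely made up, purely to show the logic):

    import random

    # Toy anthropic selection: sample "universes" with a random value
    # for some constant; assume, purely for illustration, that only a
    # narrow window of values permits observers.
    LIFE_WINDOW = (0.49, 0.51)

    universes = [random.uniform(0.0, 1.0) for _ in range(1_000_000)]
    habitable = [c for c in universes if LIFE_WINDOW[0] <= c <= LIFE_WINDOW[1]]

    # Every observer measures a constant inside the narrow window,
    # even though ~98% of the universes lie outside it.
    print(f"habitable fraction: {len(habitable) / len(universes):.1%}")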


It's one that comes up, but first consider that it's not at all clear, despite the name, that these values really are completely "free parameters" at a fundamental level.

Imagine, if you will, benchmarking some unknown CPU and determining that the fused multiply-add operation takes, say, 17 times as long to execute as an increment operation. We might postulate that there are other CPUs out there where it takes a different amount of time, arbitrary amounts of time! Alternatively, we might gain knowledge of the underlying CPU architecture and understand that fused multiply-add is implemented with a certain set of transistors, and that it's fundamentally more complex, though there's room for some variability based on the specific implementation. In such a world this "free parameter" is set as it is for a very specific reason: a transistor arrangement.
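Here's a minimal sketch of that benchmarking step in Python. The interpreter overhead dominates, so the ratio says little about the underlying silicon; it only illustrates the methodology of measuring one operation relative to another:

    import timeit

    # Time a multiply-add against an increment and report the ratio.
    # On a real unknown CPU you'd do this with hand-written assembly;
    # this is just the shape of the experiment.
    n = 1_000_000
    t_fma = timeit.timeit("x = a * b + c", setup="a, b, c = 1.1, 2.2, 3.3", number=n)
    t_inc = timeit.timeit("x += 1", setup="x = 0", number=n)

    print(f"multiply-add / increment: {t_fma / t_inc:.2f}x")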

We have limited visibility into what's actually happening "underneath" our laws of physics. Some of the values we see could be truly arbitrary. Some of them might actually be controlled by some other field and change over time (though we haven't seen evidence of that so it seems less likely). Some of them could be a deeper artifact of the way the universe works.

If you look at something like string theory, which does try to describe what's underneath in more detail and dial the free parameters down to just one ("length of the fundamental string", more or less), we are left with something frustratingly nonspecific until you locate a more specific solution within the broader string-theory solution space and call it "the laws of physics." That specific location might indeed seem quite arbitrary; the question then becomes: what relation does this hundreds-of-dimensional solution space have to our concepts of physical reality? And can other areas of that landscape be probed in any way meaningful to our experience of physics?


As for the parameters of the Standard Model: in any other field, demanding that parameters be set by theory rather than by experiment would be considered ridiculous. But somehow when it comes to quantum mechanics everyone rediscovers their reductionist hat and demands that every theory neatly explain all the details without anyone actually entering a laboratory.

I'm not saying I'm not a little miffed there isn't yet a nice, lower-level theory that predicts all of QM, but man is it frustrating when people flip-flop to and from pure empiricism at the drop of a hat, based on a layman's understanding of a mature, developed field.



