I think a little context here is in order: from a 10,000-foot view, the particles you can see depend on the amount of energy present in the system, which is why we spend billions of dollars accelerating elementary particles (protons, in the case of the LHC) to high energies and literally SMASH them together, so we can see what flies out.
We see four distinct forces in the universe: the strong and weak nuclear forces, electromagnetism, and gravity. Our Standard Model predicts that these forces become more unified at higher energy levels:
1. Electromagnetism merges with the weak nuclear force at 246 GeV (see the note just after this list)
2. The electroweak force merges with the strong nuclear force at around 10^16 GeV ('grand unification')
3. Finally, all of these forces, gravity included, are predicted to become unified, which is to say indistinguishable, at the Planck energy, around 10^19 GeV.
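A quick note on item 1 (my addition, standard textbook material, not from the comment above): the 246 GeV figure is not arbitrary. It is the electroweak vacuum expectation value, fixed directly by the measured Fermi constant:

$$ v = \left(\sqrt{2}\,G_F\right)^{-1/2} \approx \left(\sqrt{2}\times 1.166\times 10^{-5}\ \mathrm{GeV}^{-2}\right)^{-1/2} \approx 246\ \mathrm{GeV} $$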
As we get closer to predicted unification energies, we see different mixes of particles, and the Higgs boson is in fact the particle responsible for electroweak symmetry breaking, with a mass of around 126 GeV.
The problem is that the LHC produces collisions of around 10^4 GeV, so from our current energy scale up to the next unification, with the strong force, we're off by a factor of 10^12.
A back-of-the-envelope estimate is that a supercollider able to reach grand-unification energies with our current technology would be around the size of the solar system.
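To make that back-of-the-envelope concrete, here's a minimal sketch, assuming LHC-class dipole magnets and the standard magnetic-rigidity relation p[GeV/c] ≈ 0.3·B[T]·ρ[m]. The LHC figures (8.33 T field, ~2804 m bending radius) are public machine parameters; the solar-system comparison is my own illustration:

```python
# Back-of-the-envelope: bending radius needed to hold a 1e16 GeV proton beam
# with LHC-class magnets, using the rigidity relation p[GeV/c] ~ 0.3 * B[T] * rho[m].

AU_M = 1.496e11          # astronomical unit, m
LIGHT_YEAR_M = 9.461e15  # light-year, m

B_TESLA = 8.33    # LHC dipole field
E_GUT_GEV = 1e16  # rough grand-unification scale

# Sanity check against the LHC (dipole bending radius ~2804 m):
print(f"LHC check: {0.3 * B_TESLA * 2804 / 1e3:.1f} TeV per beam (actual: 7 TeV)")

rho_m = E_GUT_GEV / (0.3 * B_TESLA)
print(f"GUT-scale ring: rho ~ {rho_m:.1e} m "
      f"= {rho_m / AU_M:,.0f} AU = {rho_m / LIGHT_YEAR_M:.2f} light-years")
# ~4e15 m, i.e. tens of thousands of AU. That is Oort-cloud territory, so
# "around the size of the solar system" is fair if you count the Oort cloud,
# and it assumes magnets no better than today's.
```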
Hence the article: particle smashing is a brute-force approach to investigating new physics, but there is now an extremely wide gulf between what we have discovered and what we think lies next, so we need to be more clever than brute force.
This explanation needs a couple of major caveats: The Standard Model does predict unification at ~250 GeV, but it does NOT predict unification at either 10^16 GeV or 10^19 GeV. Those predictions come from particular extensions of the Standard Model (grand unified theories or various quantum gravity models, respectively) which have been studied by particle theorists for their simplicity.
Which is to say: Those unification scales come from speculative models, not from tested physical law. We do not know with any certainty what lies beyond the energy scales we've tested. You should take with a generous grain of salt any claims that we have physical models that work across an additional 14 orders of magnitude. Consider the range of phenomena that we've discovered in moving from the ~1 cm scale of a glass of water to the 10^-14 cm scale where quantum chromodynamics kicks in.
The Planck scale comes from simple dimensional analysis that combines the gravitational constant with the Planck constant (and the speed of light). It doesn't really depend on the details of a grand unified theory. It was first calculated to be 10^19 GeV in 1899 by Max Planck, well before any quantum field theory (well before quantum mechanics, for that matter). Because it combines the Planck constant with the gravitational constant, we know that it corresponds to the energy scale at which quantum phenomena unify with gravitational phenomena. Currently the only theory that describes this unification is string theory, but as the OP correctly noted, it's far from any possible experimental confirmation.
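Spelled out (my addition, with standard values of the constants): combining ħ, c, and G into a quantity with units of energy gives

$$ E_P = \sqrt{\frac{\hbar c^5}{G}} = \sqrt{\frac{(1.055\times 10^{-34}\,\mathrm{J\,s})\,(2.998\times 10^{8}\,\mathrm{m/s})^5}{6.674\times 10^{-11}\,\mathrm{m^3\,kg^{-1}\,s^{-2}}}} \approx 1.96\times 10^{9}\ \mathrm{J} \approx 1.22\times 10^{19}\ \mathrm{GeV} $$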
I was at a seminar by Alexander Polyakov, one of the founders of string theory, who compared string theory to Democritus's knowledge of atoms. He said that even though the atomic theory of Democritus was essentially correct, it was about 2,000 years ahead of any possible experimental verification. In his opinion, that is how long it will take us to verify our grand unified theory experimentally.
I've never quite believed this argument. It's worked in a lot of circumstances, I agree, but it requires the dimensionless constants to be of order 1. This is usually a reasonable assumption, but it's known to be false in gravity! The cosmological constant ought to be the leading order term in the EH action, but it's too small by ~100 orders of magnitude.
The argument that dimensionless constants are of order 1 works well for the theories that have been tested experimentally so far. We don't know the value of the cosmological constant; I am not even certain that it is not exactly zero. But it's not dimensionless: it has dimension length^-2, and is usually expressed in units of the Planck length (currently thought to be ~10^-122 in those units, I think). If it indeed turns out to be nonzero, it is a new fundamental constant, much like the gravitational constant or the speed of light.
I didn't understand your statement about the EH action: the cosmological term there has a dimensionless coefficient of order 1. Are you talking about some effective action arising from a string theory model?
Mmm... The particulars of unification depend on how you extend the SM, but the general idea of unification at ~10^16 GeV is based on the energy dependence of the electroweak and QCD coupling constants, which comes directly from the SM. Basically, if you plot those coupling constants vs. log(E), they merge at circa 10^16 GeV (see the sketch below). That's the baseline, and it is pretty much pure SM; the details are filled in differently by different grand unified theories.
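A minimal sketch of that plot, using the standard one-loop SM beta coefficients b = (41/10, -19/6, -7) in the usual SU(5) normalization and rough measured couplings at the Z mass; the specific crossing energies it prints are my illustrative numbers, not the commenter's:

```python
import math

# One-loop running: 1/alpha_i(E) = 1/alpha_i(M_Z) - b_i/(2*pi) * ln(E/M_Z)
M_Z = 91.19  # GeV
INV_ALPHA_MZ = {"U(1)": 59.0, "SU(2)": 29.6, "SU(3)": 8.45}  # approx. measured values
B = {"U(1)": 41 / 10, "SU(2)": -19 / 6, "SU(3)": -7.0}       # SM one-loop coefficients

def inv_alpha(group, energy_gev):
    """Inverse coupling of a gauge group at the given energy."""
    return INV_ALPHA_MZ[group] - B[group] / (2 * math.pi) * math.log(energy_gev / M_Z)

def crossing_energy(g1, g2):
    """Energy at which two inverse couplings meet (solving the linear running)."""
    t = 2 * math.pi * (INV_ALPHA_MZ[g1] - INV_ALPHA_MZ[g2]) / (B[g1] - B[g2])
    return M_Z * math.exp(t)

for g1, g2 in [("U(1)", "SU(2)"), ("U(1)", "SU(3)"), ("SU(2)", "SU(3)")]:
    e = crossing_energy(g1, g2)
    print(f"{g1} and {g2} cross at ~{e:.1e} GeV (1/alpha ~ {inv_alpha(g1, e):.1f})")
# In the pure SM the three crossings land between roughly 1e13 and 1e17 GeV but
# do not quite coincide; supersymmetric extensions are popular partly because
# they make all three meet near 2e16 GeV.
```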
I usually just use the upvote button, but in this case I must join some other commenters in expressing explicit gratitude for your easy-to-understand explanation.
Minor clarification. The standard model does not describe gravity. It ignores it, which is fine at LHC energies because it's orders of magnitude weaker than the other three forces.
At the Planck energy scale, this is no longer the case: the relativistic mass of particles becomes so big that gravitational interaction is too strong to be ignored, and the SM loses its ability to predict how particles interact, decay, combine, etc.
> the relativistic mass of particles becomes so big that gravitational interaction is too strong to be ignored
It's not relativistic mass that's the key factor: relativistic mass is frame dependent, and it's not the source of gravity. The relevant factor is stress-energy: energy density, momentum density, pressure, and other stresses. The key factor at the Planck energy scale is that the density of stress-energy is high enough that we can no longer have confidence that classical General Relativity is an accurate description of gravity; we expect to see quantum gravity phenomena at that stress-energy density.
I always understood that the Stress–energy tensor in general relativity has a momentum component, so two electrons whizzing past each other at near the speed of light would exert a stronger gravitational pull between them, in all frames of reference, than if the electrons were 'at rest', relative to each other? I admit I didn't study GR so happy to be corrected!
> I always understood that the Stress–energy tensor in general relativity has a momentum component, so two electrons whizzing past each other at near the speed of light would exert a stronger gravitational pull between them, in all frames of reference, than if the electrons were 'at rest', relative to each other?
The stress-energy tensor does have a momentum component, but remember that all components of a tensor are frame-dependent. So is "gravitational pull". Obviously the trajectories of two particles passing each other at relativistic speeds will be different from the trajectories of two particles that are at rest relative to each other at some instant; but the difference is not quite as simple as "more gravitational pull", although the two particles having relativistic velocities does mean that the center of mass energy of the system is larger than it would be if both particles started out at rest.
(Actually, the electromagnetic interaction between electrons is so much stronger than the gravitational that the gravitational effects are negligible in the scenario as you state it; but we could eliminate that issue by considering, say, two neutrons instead. My comments above assume that the scenario has been modified accordingly, which is why I said "particles" instead of "electrons".)
The solar system is already close to a vacuum. Makes me wonder if a new solar system sized accelerator could just be a series of nodes orbiting far out to aim and accelerate the particles around.
Still way too hard for us but also way easier than building a tube going around the solar system.
The tube isn't just to provide a vacuum; it's there to provide continuous acceleration of the particles while keeping the charged particles in the particle beam from dispersing.
There's another issue though.
The reason that accelerators are circular is so that you can keep applying an electric field to accelerate the particles. The catch is that when you accelerate a charged particle, it emits a bit of radiation, and this radiation increases with the amount of acceleration it's undergoing. The problem is that while you can keep nudging the particles faster and faster tangentially to the track, the circular track requires you to increasingly pull the particles back toward the center of the track (i.e. apply a greater centripetal force).
This means that you increasingly lose energy to what is called 'synchrotron radiation', and that there is a maximum speed (or energy) to which you can accelerate particles with a given radius. Hence if you want more energetic particles in your collisions, you need bigger colliders.
So if you had just a bunch of point-to-point satellites, you'd lose the benefit of having a greater radius (which doesn't actually apply anymore anyway, since you're not doing continuous acceleration), because you'd have to increase the amount of acceleration applied at each satellite to pass the beam to the next one.
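To put rough numbers on the synchrotron-loss point above, here is a minimal sketch using the classical formula for energy lost per turn, U0 = e²γ⁴/(3ε₀ρ) (taking β ≈ 1). The LHC bending radius is a public figure; the proton-vs-electron comparison is my own illustration:

```python
E_CHARGE = 1.602e-19  # elementary charge, C
EPS0 = 8.854e-12      # vacuum permittivity, F/m

def loss_per_turn_ev(energy_gev, mass_gev, bending_radius_m):
    """Classical synchrotron loss per turn, U0 = e^2 * gamma^4 / (3 * eps0 * rho),
    taking beta ~ 1 for ultra-relativistic particles."""
    gamma = energy_gev / mass_gev
    u0_joules = E_CHARGE**2 * gamma**4 / (3 * EPS0 * bending_radius_m)
    return u0_joules / E_CHARGE  # convert J -> eV

RHO_LHC = 2804.0  # m, LHC dipole bending radius (public machine parameter)
print(f"protons at 7 TeV:   {loss_per_turn_ev(7000, 0.938, RHO_LHC):,.0f} eV/turn")
print(f"electrons at 7 TeV: {loss_per_turn_ev(7000, 0.000511, RHO_LHC):.2e} eV/turn")
# Protons lose only ~7 keV per turn, which the RF cavities easily replace.
# Electrons at the same energy would lose ~(m_p/m_e)^4 ~ 1e13 times more,
# far beyond anything you could pump back in; that is why the highest-energy
# circular colliders use protons.
```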
What if we just built a really long, straight particle accelerator in space, and we just shot all our highly accelerated particles out of the solar system? Aside from really ruining some alien's day in a billion years, what's the disadvantage of doing this?
The nice thing about rings (or other layouts like racetracks, etc.) is that the particles come around and go through the same accelerator structure multiple times. Secondly, if you want to collide two beams, the advantage of a ring is that each particle has multiple chances of scattering off the other beam, as it "comes around" to the intersection region millions of times.
That being said, linear accelerators are a thing, as they have other advantages.
No. Not at all. You can expect particle densities anywhere from 5 cm^-3 to 80 cm^-3 if you stay within 1 astronomical unit of the Sun. Besides, all that plasma is accompanied by a frozen-in magnetic field, plus you have energetic-particle events, recurrent fast solar wind emerging from coronal holes, and a plethora of other effects and structures which are terribly interesting to study but which will ruin your high-energy physics experiment even if you somehow acquire god-like powers and build a huge non-contiguous accelerator.
With a space-based facility, it seems like we'd be able to observe the types of collisions we're talking about without bothering to build an accelerator at all, if we only knew when and where to look for the events.
People can get slightly better vacuum pressures in elaborate lab settings than are present on the moon. With trivial amounts of effort (and non-trivial sums of $), you can get within a couple orders of magnitude. Source: have done so myself.
The difference is that space is filled with lots of other stuff that makes running experiments less than ideal.
Charged particles radiate away energy if you accelerate them (force them onto a curved path); this energy loss is larger for larger accelerations / smaller radii.
To bend the particle beam you need magnetic fields. Particles with more energy need larger fields. Larger radius means less field strength for given energy or larger energy for given field strength.
So a circular accelerator needs strong magnets nearly everywhere to force the beam onto a circular path. If you put it in outer space, it's not as though you could just bounce the beam between some small number of spacecraft; that wouldn't gain you much, if anything. You would need something very similar to the structure of the LHC.
>> We keep GPS satellites in a very precise orbit.
I think it's more like GPS orbits are precisely known, not so precisely controlled. If you have fixed ground stations, the GPS sats can accurately figure out where they are (like GPS in reverse). If the GPS sats can transmit their location and time accurately, your receiver can figure out your position accurately. The only things that need to be well controlled in this are the ground stations' locations, which are not moving.
Yup, early in my career I worked in the group at JHU/APL that did the satellite-location calculations for the Navy's version of GPS.
One of the research projects to help with this (in effect, to have carefully controlled orbits) was a "drag-free" satellite. Basically, just put the satellite in a ball. The ball still has drag, but the satellite inside does not. Of course, as soon as the drag-free satellite moves with respect to the ball, you have to tweak the orbit of the ball to keep the satellite at its center.
> We keep GPS satellites in a very precise orbit. Maybe that can scale up?
Source for this? It was my understanding they weren’t, hence why the almanac data is so important, but I’m not positive that assumption is valid so I’d love a source to back up your claim.
There are cosmic rays many times more energetic than anything we could hope to create on Earth going through space all the time; we'd just need a detector out there.
There's the ISS-CREAM cosmic-ray detector on the space station that can "measure elemental spectra of Z = 1–26 nuclei over the energy range 10^12 to >10^15 eV". Given that CREAM is roughly 2 m × 1 m × 1 m and weighs 1,258 kg, while ATLAS on the LHC is "46 metres long, 25 metres in diameter, and weighs about 7,000 tonnes", it may not be possible to put LHC-class detectors in space.
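On how those cosmic-ray energies compare to colliders: for a cosmic-ray proton hitting a proton at rest, the usable center-of-mass energy is √s ≈ √(2·E·m_p c²). A quick sketch (the 10^20 eV figure is roughly the highest-energy cosmic rays ever observed; this comparison is my addition, not from the comment above):

```python
import math

M_PROTON_EV = 0.938e9  # proton rest energy, eV

def sqrt_s_fixed_target_ev(beam_energy_ev):
    """Center-of-mass energy when an ultra-relativistic proton hits one at rest:
    sqrt(s) ~= sqrt(2 * E_beam * m_p c^2)."""
    return math.sqrt(2.0 * beam_energy_ev * M_PROTON_EV)

# From CREAM's upper range up to the highest-energy cosmic rays ever observed
for e_ev in (1e15, 1e18, 1e20):
    print(f"{e_ev:.0e} eV cosmic ray -> sqrt(s) ~ "
          f"{sqrt_s_fixed_target_ev(e_ev) / 1e12:.1f} TeV")
# ~1.4, ~43, and ~433 TeV respectively, versus 14 TeV at the LHC; only the
# rarest ultra-high-energy cosmic rays probe beyond collider energies, and at
# a rate and with an "experimental setup" we cannot control.
```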
You put everything in a way that someone with high-school physics can mostly understand the issue. Would it be possible for you (or anyone else) to explain what the merging of forces is? Or is it beyond high school?
It's probably helpful to think of this as phase transitions. At Planck-scale "temperatures" everything is mixed together, but when the system cools and undergoes a phase transition, we start seeing different behavior.
Sounds possibly like an invalid analogy, but in fact there's substantial overlap in the theories. If you want to read more try reading wikipedia articles on "ising model", "spontaneous symmetry breaking", "higgs mechanism".
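To make the analogy slightly more concrete (my addition, standard textbook material, not from the comment above): the Higgs potential has exactly the shape used in those phase-transition models,

$$ V(\phi) = -\mu^2 |\phi|^2 + \lambda |\phi|^4, \qquad |\phi|_{\min} = \sqrt{\frac{\mu^2}{2\lambda}} = \frac{v}{\sqrt{2}}, \quad v \approx 246\ \mathrm{GeV}. $$

At high "temperature" the field sits at zero and the electroweak symmetry is intact; as the universe cools, the field settles into one of the nonzero minima (much like spins in the Ising model picking an alignment), the symmetry is "broken", and the W and Z bosons acquire mass while the photon stays massless.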
It turns out that seemingly distinct forces are related in some clever way. The most approachable example of this (though it doesn't quite extend to the other unifications listed above) is electricity and magnetism.
Before Faraday's experiments, we understood static electricity (charged objects attract/repel charged objects of opposite/like sign, proportionally to the product of the charges, and falling off with the square of the distance) and magnetism (like poles repel, opposite poles attract).
Once we saw that moving electric charges generate magnetic fields and vice versa, we opened the door to realizing that these are both just aspects of the same electromagnetic field.
Thank you! That's probably the simplest possible explanation. "Related like electricity and magnetism are". How far is it from the "actual reality"? Would you (or anybody else) try giving a more difficult explanation?
One possibility is in using space based detectors. Essentially, you put the detectors up above the atmosphere in a relatively 'neutral' area of space to avoid background from other sources [0]. Then, using this floating detector, you use other sources in the universe to do the colliding for you. Something like an astrophysical jet[1] coming out of a black hole, some neutron stars colliding, some super novae collapsing, etc. Taking all these detectors together, you can then get the results you want.
The major issue is that you have nearly no idea when/where these events will occur, and they are really rare to begin with. Add the distances that these things occur at and your error bars go through the roof[2]. Still, over a VERY long time and with a LOT of these detectors spread about the Sol system (and likely a bit further out), you can get the results you'd need.
It's not cheap or easy or timely. But with 12 orders of magnitude to go, it's a lot cheaper, easier, and more timely than trying to build an accelerator with Jupiter in the way :)
[0] as in, not next to the Sun, Earth, Moon, near gravitational perturbers, etc.
How does this map to the idea that we haven't unified the fundamental forces? It sounds as if we understand exactly where and how they diverge... surely it isn't just that we haven't experimentally verified this?
There are lots of possible models of unification (a short list of where the electroweak and QCD couplings unify is given here[1], not to mention many more models where gravity unifies), and without experimental data, we don't know which one is correct.
Is there some reason why new phenomena can't occur except at these energies? Couldn't there be something interesting that isn't the unification of forces?
I asked a similar question a couple weeks ago and was pointed to neutrino–lepton scattering. If you search, you will see that an electron can interact with a neutrino.
I find it fascinating that the prevalent comments on this are variations of "that can't be right, we still don't understand [things such as dark energy, etc.]." It demonstrates one of the most pernicious biases in science.
Particle accelerators have been the mainstay and principal tool of particle physics, and have produced amazing results. Further, there are obvious questions in fundamental physics that the Standard Model revealed by these experiments does not answer. But it does not follow that these further questions can be answered via accelerators that we humans can construct. The universe has made no promise to us that its mysteries are accessible, least of all through some particular method. We had a particular reason to believe the LHC would reveal new physics (the Higgs boson prediction), and we have no such reason for the proposed future accelerator; our evaluation should change as a result.
My guess would be that astrophysics is a better route to understanding the mysteries of fundamental physics; astronomers can measure distant phenomena with surprising accuracy already, and objects such as black holes are one of the few cases where exotic phenomena are currently widely expected to occur. Perhaps the money is better spent on a truly gigantic space telescope?
The James Webb Space Telescope was originally supposed to cost only 1 billion USD, but is now around 10 billion. Still, we could build two more of those for the price of a new LHC.
And the JWST is going to give us insights about galaxy formation, exoplanets, and more!
I'm pretty sure someone said everything in physics had been discovered just before the 20th-century boom in theoretical physics, too.
Michelson (of Michelson-Morley fame), apparently, IIRC. I'd warrant that despite his credentials Yang (of Yang-Mills fame) is making a similar mistake.
But I do think we're likely to see a retarding of the pace of change in fundamental physics (I think we're on to a reflective phase, where science consolidates, and social change progresses; leading to more focus on higher-order sciences - biology and such).
The difference is that back then there were experimental observations that couldn't be explained, and they led to completely new branches of physics like general relativity and quantum theory. Right now there is nothing that can point the way. General relativity, quantum mechanics, and electromagnetism are extremely successful with no obvious flaws (other than not always working well with one another), and the few things about which we have no clue (dark energy, dark matter, etc.) are so removed from our capabilities that they're intractable.
There are a lot of dark matter searches out there which would disagree with your last statement.
But further than that, we still have a lot of puzzles to solve. Proton radius, muon g-2, proton spin, ...
And they'll keep searching... and finding nothing. :o) And is solving those puzzles worth the tens of billions of dollars needed to build the next generation of accelerators? How much further would that same amount of money go if invested elsewhere?
Hell, invest a few billion in building a true quantum computer or working towards true AI (just to name two areas where there are many more puzzles and obvious directions to explore) and see how much more impact that has and how much faster you can solve those same physics puzzles.
We don't know how to build a true quantum computer. But if you suggest we should also spend money on that, I agree. I don't think it's a smart idea to compare the validity of physics sub-fields. Look at the total amount of money that is spent on science compared to any other program. It's a joke. And it's not that we the people don't want it. Ask them; ask how much they think should be spent on science. It's multiples of what's actually spent.
It's not a matter of comparing validities... just a matter of where can we get the most bang for our buck? The 20th century was the century of physics and we made tremendous progress in all areas of Physics. Precisely because of this progress, it is now time to look elsewhere. Throwing more and more money exploring increasingly esoteric regions of particle physics just because that's what we are used to doing isn't the best policy.
And while I do agree that much more money should be spent on Science, I don't think particle physics is the right area in which to invest. We can make much much more progress in other fields.
You can't separate A from B in the above. For instance dark matter and relativity. Dark matter was invented (or at least expanded) precisely because certain rotational characteristics of distant galaxies seem to contradict what we'd expect based on relativity and gravity. But we believe these theories are correct, so there 'must' just be a bunch of matter there we can't see or measure. And it must be invisible to the point that it only interacts with itself and gravity -- dark matter by another name. However, another recent discovery now indicates that galactic rotational curves are fully explained by visible matter alone, which poses a direct contradiction to the dark matter model that can't be so easily massaged into it. [1]
The point of this is that 'something' is very wrong, and nobody really knows what it is. And this something could be something incredibly fundamental. If it turns out dark matter simply does not exist then we'll 'set physics back' (it would be progress, but the effect would be the same) many decades, but also open the door to an explosion of new knowledge and exploration.
Then there are questions like: why is the universe's expansion accelerating? As you mention, the bandaid here is dark energy, but again A cannot be separated from B. There are even open questions about things such as the Big Bang. Why is the cosmic microwave background radiation nearly homogeneous, even in areas of space that should not be causally connected (light has not had time to go from A to B since the birth of the universe)? One explanation for this is cosmic inflation, which in turn created its own dark-matter analog, the 'inflaton', that's even more bizarre than dark matter. But once again it's just a bandaid to try to explain something that goes against everything we'd otherwise expect.
There's an immense amount of room for progress in physics by removing these bandaids, either with testable physical explanations or with better bandaids. And this is just scratching the surface. There's so much we have no clue about. Even basic observations remain inexplicable: fast radio bursts, the rapid dimming of stars such as KIC 8462852, and so on.
I would speculate that there are probably more 'known unknowns' in physics today than ever before. Of course all of the questions seem incredibly difficult to answer, but this is what it always looks like when you're at any given state in science. We only see the answers as obvious or easy once we already know what they are. And then the next questions are the ones that suddenly take the place of seeming incredibly difficult to answer, only for this pattern to repeat to seemingly no end.
Several years ago, shortly after the Higgs discovery, I sat down with a couple of CERN people, and their comment was that the current problem is that we need more theoretical physicists coming up with measurable predictions. As many of the other posters state, obviously there is more to learn, but we could learn faster if we had better (and more) predictions about what we might find.
It's harder and harder to do without well-developed intuition. To develop good intuition, we need to make a good mental model, but we got a good (but incomplete) mathematical model instead.
I agree and I think the main reason is the anti-realist camp in physics which has taken hold in the past 100 years.
This view probably developed because the realist models of quantum mechanics were too difficult for people to accept. It was easier to say that intuitive models are outside the scope of science than to change their set of inconsistent preconceptions.
Could you expand on your comment; I thought a (small?) majority of theoreticians were, shall we say, "super-realists" such that they view the Many-Worlds Interpretation as actuality rather than merely a model?
When you say "intuitive models" do you mean ones that echo somewhat Newtonian/Classical physics?
Perhaps part of the problem is that schools follow a method of teaching physics where they teach what is easy to teach, knowing it's wrong, and then teach something a little more complex, knowing that's wrong too, etc.. Perhaps we need to start off teaching that the world is non-Newtonian, that it has Quantum weirdness and relativistic effects (according to our best models), and allow that to underlie our intuitions.
Many worlds is one of the more popular realist models, but it's quite rare for physicists to claim they think it represents reality. It's not taught in any textbooks I'm aware of and even now papers often dismiss experiments having multiple outcomes without even mentioning that they are doing so, coming to absurd conclusions as a result.
The situation has certainly improved a lot since the days of Everett, who was basically dismissed as a crackpot.
I've seen Hawking quoted as saying MWI is obviously right.
My statement was based on an IoP survey result I saw some years ago (I'm a theoretical physics graduate); I'll try and dig it up as my recollection could be wrong.
Hawking said it was right but that it's "just probability". It's not clear what he really thought about all those worlds being real. He didn't seem to believe in quantum immortality, so I don't think he took it seriously.
There have been a few surveys over the years, but mostly of conference participants. The results depend heavily on which conference you attend.
As a layman, from your statement, what do we have now [if we don't have enough theoretical physicists]? Is it that we have enough theoretical scientists, but the predictions are not testable? Or not enough talent?
Please try to be at least a little bit skeptical when a handful of naysayers claim that an entire field's worth of very smart people are all spending their careers on pointless work. Those naysayers could be right, obviously! (Even as experts we have to recognize the possibility, and to the many laymen looking on from outside it's even harder to judge.)
But if you read Peter Woit's dismissive rants and accept them as gospel truth, just remember that in doing that you're choosing to believe that hundreds (thousands?) of dedicated, passionate, deeply knowledgeable physicists are either painfully deluded or deliberately running some sort of collective scam. You're choosing to believe that either none of us have bothered to read Woit's objections or that we've all failed to accept them even though they're right.
I'm not sure what our motivation for that is supposed to be. I promise you: people who go into fundamental physics aren't in it because it's an easy road to a steady paycheck! We do this because we're driven to understand the building blocks of the universe. If I were to become convinced that my current approach to trying to understand the universe was fundamentally a dead end, I'd be wasting my life if I continued to study it! I'd be throwing away everything I've trained for, everything in science that I care about. And I have no idea what Woit thinks I'm getting in return.
So your call. Either Woit's right and we're all knowingly wasting our lives, or the situation isn't as cut and dried as Woit claims it is.
I think you forgot one case: it's psychologically extremely hard to abandon something you've been working on / believing in for a huge part of your life and start over from scratch, no matter how smart you are (I'd say the smarter you are, the harder it is to believe you're completely wrong). So teachers could be tempted to hang on to the narrowest hope that their work still has meaning, and keep taking on new PhD students, and this could last for a very long time.
Which is probably why it's hard to convince String Theory haters to re-evaluate their opinions, i.e. non-testable explanations.
One, the claim that string theory makes no testable predictions is just plain false. The predictions are just out of reach of current technology.
Two, we should probably be more suspicious of explanations that rely on folk (interpretations of) psychological effects. How strong is the sunk-cost fallacy? Is it sufficient to overcome the motivation to shift subfields? At what rate should we expect theorists to transition out of string theory, under the assumption that Woit is right?
Three, string theory has been very useful already. Its development has provided us with novel mathematical tools for studying a plethora of applied quantum phenomena.
Yours is a fantastic, spot-on comment. It's easy to get swayed by controversial, non-mainstream opinions, especially when you're not knowledgeable about the field. One trick that I'm forcing myself to use is to weigh the evidence of what people do vs. what they say.
String theorists may not be vocal in countering the objections raised, or they may simply consider getting into a debate a waste of time. But the fact that hundreds of smart people are choosing to spend time on something is _some_ evidence that the project is not completely misguided.
Of course, the positive evidence has to be balanced with the evidence of groupthink and the sunk-cost fallacy.
I could not disagree more with the general statement of the author.
Actually, we are now going from a boring phase of particle physics, where the theory was able to predict everything we were able to measure afterwards, to a phase where no theorist has a clue what might happen.
This phase is called "exploration". It consists of many blue-sky shots, many of which will probably yield zero results but which have to be taken in order to find out what is really going on. Theorists had a good run with prediction; now it's the experimentalists' turn to lead the way with exploration by producing new data.
When the muon (basically a heavy version of the electron) was first discovered, it 'seemed so incongruous and surprising at the time, that Nobel laureate I. I. Rabi famously quipped, "Who ordered that?"' [Wikipedia]
The new CERN collider has got to be built. To risk missing the next great "Who ordered that?" moment would be grossly negligent.
I feel this comment assumes that experimentalists weren't "exploring" this whole time.
There are many, many, MANY papers that fly out of these particle physics experiments. Most of the papers are about how they do not find stuff.
These papers take the form "Search for the pair production of X in proton-proton collisions at √s = 13 TeV", or are more generic papers where they just look for anomalies against Standard Model predictions. Fun fact: nothing found yet! This was the case at the Tevatron, where countless experimentalists pored over the data trying to find the slightest hint of something new. It also is the case today (so far) with the LHC.
To suggest that we're going from a "boring" phase to a new one where theory doesn't know kinda misses the point that experimentalists have already been doing this.
Usually, when we reach the limits of theoretical physics, we have unexplained experimental results; and when the experimental side starts getting boring, we have theories that can be used to make new predictions. Right now, the problem is that there is no clear direction in which to proceed. No new theory predicts anything that can be feasibly tested, and there's nothing experimentally surprising that can't be explained theoretically. Until there's a breakthrough, a theoretical framework with testable experimental predictions, there is no point in wasting a few more billion to build bigger holes in the ground in Switzerland.
While I am a theoretical physicist by training, I have no doubt that those same billions could be much better used by many other fields such as biology, chemistry, computer science, ai, etc...
This is the classic theorist talking who apparently knows nothing about the real world. It takes a lot of expertise to build a particle accelerator. This expertise will literally die off, if we stop building accelerators for say the next 50 years. The next generation will have to start at zero again and it will take 80+ years to build the following one instead of 20.
In my PhD thesis I had to partly rebuild an experiment that was done 35 years ago. (We were not able to improve the systematic errors. The statistics were vastly improved due to modern electronics, though.) All the people who were involved in the old experiment were either retired or dead. The practical problems we encountered were "theoretically trivial" but plentiful and time-consuming (e.g. finding the right glue, which has the right optical density and does not dissolve the radiation-hardened wavelength shifters). If we had had just one person to talk to who had done the old experiment, we could have done it in a quarter of the time or better.
> This is the classic theorist talking who apparently knows nothing about the real world. This expertise will literally die off, if we stop building accelerators for say the next 50 years.
If we're talking about that being the best argument in the real world for building accelerators at the moment, let's spend $200 million on a project to archive every conceivable piece of information related to building an accelerator and save 99% of our budget.
In my case the problem was indeed that things that were "obvious" were not documented. Some of those "obvious" things were in fact the result of decades of trial and error. In the end, the student did it in that particular way because it had always been done that way, and for that reason didn't find it necessary to document it.
But even if you document everything, many essential things will be gone, especially knowledge about things that have been tried but don't work, because failures are usually not published or documented.
It is a little bit like having a document with runes of an ancient language: maybe you can work out the meaning of the words and sentences, but you will never find out how the language actually sounded.
I think people underestimate how incredibly hands-on experimental physics is.
It's not just about bolting, gluing, and connecting things you can buy to a few things you have to make. It's cutting-edge applied engineering with some of the closest tolerances you'll find anywhere, supported with post-doc math and often with similarly cutting-edge software development.
It gets a lot less attention than theoretical physics, which is a shame, because there wouldn't be any theoretical physics without it.
I'm all for investing in cutting-edge engineering to build something new. I just want the new thing that we're building to actually be useful in itself. A quantum computer, room-temperature superconductors, (closer to) human-level AI, etc. are all useful things in and of themselves. A particle accelerator, a lot less so.
Trust me, "quantum computer, room temperature super conductors, (closer to) human level AI" get plenty of funding to the point where people don't even know where to put money.
> I'm all for investing in cutting edge engineering to build something new.
If you had asked oven engineers 70 years ago to build the most efficient oven they could, they probably would have built something 2 or 3 percent better than what was then on the market. But none of those oven engineers would have built a microwave.
Hence, in order to solve the engineering problems of tomorrow, we have to do blue-sky fundamental research today, because the solution to many problems is not always obvious.
> This is the classic theorist talking who apparently knows nothing about the real world. It takes a lot of expertise to build a particle accelerator. This expertise will literally die off, if we stop building accelerators for say the next 50 years. The next generation will have to start at zero again and it will take 80+ years to build the following one instead of 20.
That expertise can be preserved to a large degree. I'm reminded of a story I read about the F-22. Lockheed shut down the production line, but as they did so, they extensively documented every procedure -- to the point of making DVDs of someone performing each production step -- in case they ever needed to restart it again.
I think you're greatly underestimating how difficult it is to preserve this sort of knowledge. A particle accelerator is not a case where you're going to build exactly the same thing again, like you might do with an F-22. Any new accelerator is going to inevitably try to do something new. Therefore, the most important kind of knowledge is the knowledge of the planning and development process, and of all the things that went wrong (or could have gone wrong, but didn't), and how the design was changed to accommodate those. I think this sort of knowledge is incredibly difficult to capture, in large part because people usually don't think about it explicitly; it's almost a reflex as much as anything else.
There was a paper a while back about Google's software engineering practices, and one of the interesting points that I didn't see discussed much was that "most software at Google gets rewritten every few years" [1]. The paper includes some justifications, but I think one of the biggest reasons is that it's very hard to institutionalize the sort of knowledge that you get by building something yourself. For the same reasons, I think Lockheed's documentation effort might be sufficient to build an F-22 identical to what they do now, but wouldn't be sufficient to design a new plane inspired by the F-22; and similarly knowing exactly how to build an LHC wouldn't be sufficient to build the next big accelerator.
> but I think one of the biggest reasons is that it's very hard to institutionalize the sort of knowledge that you get by building something yourself.
I think that's true, but that fact doesn't really lend much support to the idea of building another large accelerator now rather than in a few generations' time. Maybe this generation has the time to build another big accelerator using its existing experience, but why build one at all if there aren't clear goals for what it will accomplish and the experience will be lost regardless? All you'd be doing is spending money to spread out the re-learning process.
So your argument for spending 20 billion isn't even the science that you might get out of it, but the engineering knowledge you'll preserve? That seems like an even worse reason to do it.
Process and operational knowledge is the most valuable form of knowledge we possess. It is essentially the only form of knowledge obtainable solely through practice, and we're always only one generation away from losing it.
We can witness the huge cost of rediscovering this knowledge in space exploration right now.
Is that process and operational knowledge so useful to society that it's worth that big money, though? It does not transfer so easily to other, more useful projects. Compare launching the LHC with something like launching ITER, or solving environmental problems: much more useful projects, but the value of LHC process and operational knowledge there is minimal. It is not clear to me that we need to sustain CERN on life support just because they got good at building big particle accelerators. Times change, priorities shift.
> ...I have no doubt that those same billions could be much better used by many other fields such as biology, chemistry,...
Has there been any speculative fiction exploring what a fiat economic system would look like if, instead of bankers placing bets on various business ventures, we had scientists and engineers describing what possibilities need to be ruled out (or in) to move the boundaries of human knowledge forward, with those science and engineering efforts being the entities that get first crack at fiat issuance as it flows outward into the "real" economy?
I do think it's worthwhile to question building an incredibly expensive particle collider without a specific purpose in mind, when the money could go towards fields like fusion energy, quantum computing, or cancer research
We have more than enough money to do all of the above. Science is a sideshow; it absorbs a nearly infinitesimal part of the economy, and could be easily funded by a military funding decrease small enough that it'd scarcely be noticed.
Yes, a larger collider is unlikely to find much, but it's cheap enough that we should do it anyway. As bets go, it's a good one.
The argument the article is trying to make is that as bets go, it's actually not a good one. It's not obvious we should spend the money on a larger collider for no other reason than "because scientific progress." There is much more nuance to it than that.
You have to consider more than the upfront cost. These machines are incredibly expensive to run and consume a tremendous amount of energy. The long term environmental cost needs to be factored in as well.
No, it's not. The LHC uses as much power as a small-to-medium-sized city [0]. The next-generation collider will obviously require a lot more energy.
What's your point, anyway? We should only consider the cost and environmental impact of something if it's comparable to the most costly and impactful endeavors out there?
To first order, environmental impacts are roughly proportional to the amount of energy used, which is proportional to the amount of money spent. To first order.
I am not sure why the above comment is being downvoted.
I remember ... early 90s ... when the Superconducting Super Collider was being proposed/started in Waxahachie Texas. The original claim to the broader scientific community funded by NSF, DOE, and DOD was that building this wouldn't impact other science funding.
Then my thesis advisor's grant was reduced as part of the cost savings to move money around for the SSC.
So ... far from being a good expenditure, real science was cut to make room for a project that ultimately was shut down. Hit me directly, as I couldn't take a research assistant position with my advisor, I had to take a teaching position to provide me income and tuition support.
These were not fun times.
Sabine's article is quite good, and she asks a meaningful set of questions. A new collider is probably not the best use of funds... though... honestly... I'd like to see a helluva lot more money pushed to real science, so we can build the infrastructure (non-retired) scientists need and fund the software they need to develop.
Doubling or trebling the NSF budget would help. Similarly for NIH, CDC, and others.
Not that I think we should repeat the funding mistakes of the past (Ph.D. in physics in the '90s; think 1,000 applicants for each open tenure-track position, and hundreds of applicants for each national lab position). We should make sure we are doing quality work and enable researchers to take risks. The current grant process doesn't really allow this.
Eh, the SSC example is a case of poor scoping and oversight; the arguments Hossenfelder makes are more fundamental.
Still, it is a bit weird to get upset about the Future Circular Collider when the International Linear Collider is the next big high-energy physics project.
The point that I was trying to make, which apparently must be downvoted into oblivion, is that comparing big numbers to really huge numbers leads to some very distorted conclusions. No matter how really huge the defense budget is[1], spending $20 on a drinking straw is a lot.
Speaking as someone who has been on the receiving end of lots of research money throughout my career: still, $6 billion is a lot of money, and it should not be wasted. I agree with the original article that a new particle collider probably is not what we should do with this money.
[1] (and yes, I agree it's too much; go fight that battle if you like, but you probably won't win a single cent back)
What criticisms of the military budget always fail to take into account is that the military budget funds science too. The US government is the largest sponsor of science in the world, and the military currently funds more science, inflation-adjusted dollar for dollar, than it did even during the Cold War.
The military's budget for Research, Development, Testing, and Evaluation in 2016 was $69B, or 11.9% of its total budget.
Development, testing, and evaluation in a military context do not sound like fundamental science at all...
Sure, there will be research, and some of it will be open, but all the research behind closed doors does not enjoy the scrutiny of normal research, so there is plenty of opportunity to waste money on nonsense "research"...
Observing the superior tech used by even the standard military, it's obvious that there is a fuckton of pure research going on; most of it will be top secret for many years, though. The NSA records this message and will apply some superhuman quantum algorithms to it, and there's nothing I can do about it. Hail our petrodollar overlords!
Agreed, but good luck convincing Congress. You have to deal with the reality of the situation. Sure, the money exists, but you won't get it. So what do you do with the money you _can_ get?
The U.S. isn't a member state, but the US government (through the NSF and DOE) contributes a ton of money to the research, development, and operation of the LHC and its major experiments.
What's the budget towards fusion energy research? Given the political difficulties in funding fundamental research, would that be a better sideshow for making a positive difference in the world if we had to prioritize one or the other?
The accelerator itself is mostly CERN's responsibility, but detectors are usually international collaborations. The US contributed a lot to the CMS detector at the LHC (as did Russia, another non-member state).
Put it all toward life extension and physicists can spend hundreds of years deeply researching these problems instead of having to start from scratch every generation only to get only 20-40 years of useful working life per individual.
Can someone please help me understand why anyone outside of a small circle of curious science buffs should care about particle physics?
It seems the field has advanced so far that EVEN IF new discoveries emerge, they would be of no practical value.
I'm not saying it's not interesting (I like reading about it, FWIW), and I'm not saying it will never ever prove useful. But from a resource-allocation perspective, the tens of billions of dollars required for high-energy experiments seem much better spent on other areas of physics. That is, until civilization advances far enough that understanding the depths of particle physics or cosmology becomes relevant.
> It seems the field has advanced so far that EVEN IF new discoveries emerge, they would be of no practical value
Particle physicists would disagree. A complete understanding of quantum mechanics, squared with general relativity, is very likely to have practical applications.
Cliche analogy: if you thought the world was flat and ships were falling off the end of the ocean, you might be investing your money in world-edge-detection; and you might say there's no point in studying the edge of the world itself because no practical value can come of it. You wouldn't have any idea that circumnavigation was (relatively) easy once you understood more about the nature of the world.
Well, it's searching for the unknown, plus competitive FOMO (fear of missing out). Searching for the unknown has worked great in the past. Physicists are under enormous pressure to publish good papers, and to this end they will push hard against the unknown with whatever cleverness, directions, ideas, means, techniques, etc. they have. They may move to applied physics, research in engineering, etc.
Maybe the real way of progressing is not building bigger accelerators with the same technology (and of course an LHC successor would need several new evolutionary developments), but finding ways of producing higher-energy collisions without resorting to accelerating particles in a tube.
We're not even sure there's anything in those higher energy ranges. There are certainly knowledge gaps, but maybe a bigger collider is not the answer; at least not until we have some other hints and new theoretical ideas.
I still think we have a lot to learn from Bose-Einstein condensates.
If we're hitting a wall with respect to increasing energy in a system as much as possible, maybe we should make sure we've picked all of the low-hanging fruit from decreasing energy in a system as much as possible.
It seems very strange to say that "particle physics may be done", given that there are several enormous mysteries and problems in physics that clearly indicate our knowledge is incomplete.
A few examples: QFT and GR are not integrated, dark matter, dark energy, the vacuum catastrophe, and so on...
One could make the argument that a larger collider is not the best way to attack these (I have no idea), but that is a different statement.
The author's argument is not that there are no questions still needing answers, but that there is no reason at all to believe a larger collider, one we are capable of building with today's technology, will answer any of them. (This point is made explicit in the comments.)
If the funding for your field is predicated on building a larger collider to discover new particles, then you're in trouble. That is the sense in which 'particle physics may be done' is meant in this context.
OK, I see where the argument is coming from, thanks. I am having trouble buying the idea that without new colliders there is nothing left to do in the field of particle physics, or even with the Standard Model. But I could be off base here, or perhaps my interpretation of the article is overly literal.
Well, there is good news: the Universe contains structures capable of accelerating particles to the energies we want to observe, and it blasts them in our direction occasionally. It may take a lot longer, but we can observe those.
Collider physics is not the whole of particle physics. There are in fact known inconsistencies in the standard model (for example, experimental evidence shows neutrinos have mass[0]) that remain unexplained and need further study.
It takes a truly reactionary definition of "standard model" to claim that it's inconsistent with neutrino masses. The extension is straightforward and was not part of the original formulation simply because there was no experimental data requiring it when Weinberg first wrote down his model of leptons.
He also didn't include quarks, so by the same logic, the "standard model" is inconsistent with those too.
There's also the whole issue of figuring out what dark matter is made out of. Neutrinos account for a small part of it, but the rest is completely unknown.
Dark matter is a supersolid that fills 'empty' space, strongly interacts with ordinary matter and is displaced by ordinary matter. What is referred to geometrically as curved spacetime physically exists in nature as the state of displacement of the supersolid dark matter. The state of displacement of the supersolid dark matter is gravity.
The supersolid dark matter displaced by a galaxy pushes back, causing the stars in the outer arms of the galaxy to orbit the galactic center at the rate in which they do.
Displaced supersolid dark matter is curved spacetime.
That simply isn't true. Not discovering a new particle outside of the standard model, not discovering supersymmetry, or bumping against an energy desert does NOT spell the end of the line for particle physics, any more than GR was over once gravitational waves were discovered. What you guys don't understand is that model building is not a hard activity at all. SU(5), SO(10), E6, whatever: graduate students do those calculations all the time. What is hard is understanding concepts, and there are more than a few surprises hidden away inside the standard model. Strange matter is poorly understood. There may be ways to pretty up the standard model that haven't been discovered yet. People like to go around complaining how "ugly" it is. It's not. It's prettier and smarter than they are; they're just jealous. Particle physics is doing just great, and the only thing wrong with it is that it is dramatically underfunded.
Before we throw huge money, we have to make sure we are really using it the best way. These ever larger colliders are big investments. The Higgs was an obvious hole missing in the puzzle, let's find another obvious hole in the puzzle.
Saying particle physics is dead because we've discovered all the basic particles we could with our resources is a lot like saying biology is dead because we discovered DNA.
Stephen Hawking said something to the effect that since the Higgs was discovered, particle physics is less interesting. To him, but not to me. Particle physics is better off than ever. Experimentally, things are going to change immensely: deep learning is revolutionizing particle physics, and it's only going to get better. Technology gets better. Regarding theory, I personally think particle physics is in the same awkward stage as calculus before Weierstrass, Dedekind, and Cauchy made their pioneering discoveries in analysis. Even after that there was still much to do; that was just the beginning: the Lebesgue and Daniell integrals, Moore–Smith convergence. We're at the very beginning stages of particle physics theory. Nobody even knows what the Feynman integral really IS. Most of the mathematics we currently use is sophisticated but, in another sense, quite dippy. Even superstring theory is suffused with what we'll likely look back on as broken maths. All the easier discoveries have been made, and now the game is going to be different. That's a great place to be.
yeah but at what cost? how many more billions of taxpayer money need to be poured into this only to further push the theoretical constraints of our parameter space?
Perhaps quantum computers will help break the deadlock? They'll make it possible to extrapolate the consequences of theories that were previously intractable. This might mean that a large number of candidate theories could be tested wholesale to see if they match existing evidence. In particular, I know that it's currently very hard to get predictions out of quantum chromodynamics, which means that the standard model hasn't yet been fully tested.
This is unfair. Obviously our knowledge is incomplete. We have yet to make that all important connection between general relativity and qft and as far as I can tell, next steps towards that process are not clear. So banging shit together even harder to see if it can produce some anomalies doesn't seem like the worst possible idea, and at the very least it might rule out more possibilities.
The author's point is that you wouldn't expect to see quantum gravity at any scale we could feasibly probe for the next few centuries. If one can't come up with a theoretical justification for such an anomaly then perhaps that money should be put towards a more promising project.
Opportunity cost plus future budgeting. If we build a bigger collider and find nothing useful, it’ll be that much harder to persuade lawmakers to fund an even bigger one later.
Well, as far as I know (I might be wrong) she hasn't done any HEP phenomenology, so maybe at this moment she doesn't get that electroweak precision tests at a Higgs factory aren't a question of if, but when. She has enough training to understand why, so it'd be an interesting exercise for her to explain why that would be futile, blogging standards aside.
I'm surprised this has been downvoted. I just can't understand how the opinion of a physicist on something they've never worked on can be relevant just because they have a blog. Sabine is on some kind of crusade nowadays; that's not science, it's something else. You can't dismiss the whole field of particle physics just because; get into the details you're supposed to be aware of, and then you'll be adding something more than noise. And I'm sorry she thinks there's nothing worthwhile to be found with upcoming feasible experiments; that's incredibly disappointing, to say the least. The SM is an effective theory; the details of any underlying physics are encoded in observables that we aren't done measuring yet. How is that not important?
I can only hope we've reached "the end of the line."
I believe little additional money should be directed to more powerful particle accelerators until compelling evidence of something worth pursuing comes out of theoretical physics. Meanwhile, there are many problems where the investment will provide a predictable and useful return.
You misunderstand: I am not critical of physics experimentation in general; I am critical of the never-ending pursuit of ever more powerful particle accelerators.
At this time I oppose further pursuit of ever-bigger accelerators which draw interest and money away from other more productive areas of science.
Scientifically, discoveries of new kinds of particles and their strange behaviour. On the practical side, accelerators and generators of those particles for imaging and radiation treatment.
This quote has nothing to do with the article's point. The article is not making a claim that there are no open questions remaining in particle physics.
Yeah, the context of the quote is that he's making the argument that more exact measurements are enabling us to measure deviations between physical theory and reality, and that these deviations are where new science lies.
In other words, having already determined in broad strokes how the mechanics of the world work (conservation of mass, conservation of energy, thermodynamics, etc.), it now falls to us to make increasingly precise measurements of physical systems. Assuming these laws always apply, we then need to figure out why our precise measurements do not match up with the predictions of our science.
He's not talking about the end of science at all, and it's pretty uncharitable to think Michelson would ever say such a thing.
A lot of stuff in cosmology about the curvature of the universe, dark energy, and expansion. Some of the quarks were not observed until after that. Observations with high-energy neutrino beams. Controllable quantum systems were barely imagined in '84, and now they are "common"; we might even be able to scale them to the sizes necessary for quantum computing in the next decade.
Perhaps it's time to stop building colliders bigger and bigger, start building the ultimate big one (say, around the whole planet), and concentrate on doing math and other kinds of experiments until it's finished.
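For scale, here's a crude estimate (my own sketch, and it assumes beam energy grows roughly linearly with ring circumference at fixed bending-magnet field, which is the standard synchrotron scaling):

    # How much energy would an Earth-circumference ring buy us?
    LHC_CIRCUMFERENCE_KM = 27.0
    LHC_ENERGY_GEV = 1.4e4          # ~14 TeV collision energy
    EARTH_CIRCUMFERENCE_KM = 4.0e4

    scale = EARTH_CIRCUMFERENCE_KM / LHC_CIRCUMFERENCE_KM
    energy = LHC_ENERGY_GEV * scale
    print(f"Planet-sized collider: ~{energy:.1e} GeV")  # ~2e7 GeV

Even a ring around the whole planet only buys about three orders of magnitude, still roughly nine short of the ~10^16 GeV grand-unification scale.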
"There is nothing new to be discovered in physics now. All that remains is more and more precise measurement."
- Kelvin, 1900
Prof. Stephen Hawking mentions this in 'Brief Answers to the Big Questions', in the general context that physics was thought to be a closed subject in the early 20th century, before going on to describe how theoretical physics unfolded the universe.
Yes, we get it. You're the second person to cite that quote in this thread. But the article is not saying there is nothing new to be discovered in (particle) physics.
My bad that I didn't see the earlier comment. Interesting that it was attributed to Albert Michelson; it seems he made a similar statement as well.
As far as the article is concerned, I didn't quote this to suggest it says particle physics is at its end. I just didn't agree with:
> But with the Higgs found, the next larger collider has no good motivation
Theoretical physicists were in consensus that, to find precise answers about what happened during the Big Bang, we would need a high-energy collider the size of our solar system; that didn't deter them from building the LHC.
This is reminiscent of the plot of Cixin Liu’s “Three Body Problem” trilogy of science fiction novels, where an alien race actively interferes with particle physics experiments on earth in order to limit the progress of physics research. Highly recommended for fans of hard sci-fi.
But they also stretch out an elementary particle to gargantuan proportions in order to write out a circuit board on its surface, so I don't know if it's really a bastion of hard science fiction. It's closer to fantasy about sciencey stuff.
Even "hard" sci-fi can make a few changes to the universe, or speculate about something. Hardness is a spectrum anyhow, but a lot of us see hardness as being more about having rules, following them carefully, and letting the rules participate in the driving of the story itself, rather than that the rules must be exactly and only the rules we currently believe the universe follows. (Which are, incidentally, known to be incomplete, so even that's not really attainable.)
Greg Egan, for instance, has several stories that can only be described as quite hard, but are based around entirely different physics, not merely tweaks to ours, or are based around particular models of black holes that can't be proved, etc.
I've dipped back into reading SF after a long time of only reading the classics (Heinlein et al), and the Three-Body Problem, which I read a few days ago, did seem to me to fit into that mould.
But what are some other modern "hard SF" books you've enjoyed?
Recent: James S.A. Corey (what The Expanse is based on). Daniel Suarez would be popular in these circles, but his is more Crichton-y science than next-100-years. Some of Stephenson's books (Seveneves, etc.). Others: Stephen Baxter, Richard Morgan (Altered Carbon).
All great, but I'm more old-school: William Gibson, James P. Hogan, Greg Bear (the Eon series), Mary Doria Russell (The Sparrow).
This comment is a pretty big spoiler to the entire first book. I agree it's a great series; would you consider making an edit so as to be a little less specific?
While I do believe our knowledge of particle physics is still incomplete, the money spent plumbing its depths seems disproportionately high. When was fusion energy meant to be produced again?
Nuclear fusion won't work, because stars are not powered by fusion. What happens in stars is that heavier elements capture protons until they eventually reach lead, where the chain keeps cycling: lead-206 captures a proton and turns into bismuth-207, which decays into lead-207; that captures another proton and turns into bismuth-208, which captures yet another proton and turns into polonium-209; polonium-209 either decays into lead-205 (releasing a helium nucleus) or captures another proton, and the resulting astatine-210 decays into polonium-210, which decays back into lead-206 plus a helium nucleus.
Which is why direct fusion of hydrogen/deuterium will never work at the expected temperatures and at a controllable rate; it can only work in explosions, if at all.
First of all, we can do and have done fusion on Earth, from H-bombs, which are quite calculable, to inertial confinement fusion (ICF), also quite calculable.
Second, proton capture for heavy nuclei is endothermic (so you cool down), and there is no way "cycling" can produce energy, since that would be a perpetual energy source. That's why fusion stops in iron-rich stars (fusion is exothermic until you reach Fe-56). We have A LOT of astro observations to back this up.
The heavy elements that do get produced are generally produced by neutron capture (the r-process), not proton capture. This is an endothermic process and happens during large energy releases such as neutron star mergers and potentially core-collapse supernovae (although more observational evidence is still needed there).
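To make the exothermic/endothermic bookkeeping concrete, here's a quick Q-value check using standard tabulated atomic masses (the script is just illustrative arithmetic on my part, not anything from the thread):

    # Q-value of a reaction = (mass of reactants - mass of products) * c^2.
    # Positive Q means energy is released (exothermic).
    U_TO_MEV = 931.494  # energy equivalent of 1 atomic mass unit

    masses = {
        "H-2": 2.014102,   # deuterium, in u
        "H-3": 3.016049,   # tritium
        "He-4": 4.002602,
        "n": 1.008665,     # free neutron
    }

    # The classic D + T -> He-4 + n reaction used in H-bombs and ICF:
    q = (masses["H-2"] + masses["H-3"]
         - masses["He-4"] - masses["n"]) * U_TO_MEV
    print(f"D + T -> He-4 + n releases ~{q:.1f} MeV")  # ~17.6 MeV, exothermic

The same bookkeeping, applied along the binding-energy-per-nucleon curve, is what puts the exothermic/endothermic crossover for fusion around Fe-56.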
It doesn't seem to be working, so it's reasonable to expect it's wrong.
>Second, proton capture for heavy nuclei is endothermic (so you cool down)
No, it isn't. Whoever calculated that iron is where it ends forgot to add the binding energy of the added proton. The binding energy per nucleon decreases, but there is one nucleon more, so it's still a net gain.
> there is no way "cycling" can produce energy since that would be a perpetual energy source.
It wouldn't be; you're using up hydrogen.
>We have A LOT of astro observations to back this up.