You're right, but it's also important to remind everyone that this type of fusion research is only relevant for fundamental physics and for nuclear weapons research. This is not remotely a plausible path to fusion power generation. And, the NIF is part of the branch of the US government that handles nuclear weapons.
Worth pointing out that "the part of the branch of the US government that handles nuclear weapons" is the Department of Energy, which of course also handles fusion energy research.
Is there a physical law that indicates it's not feasible? Because if not, this is just like saying multiplying a bunch of matrices is not a feasible way to build a machine that speaks English. These things are unpredictable.
I think it'd simply be the fact that the facility isn't aimed at achieving breakeven; they're mainly interested in performing tests that validate the viability of the nuclear stockpile without having to test the bombs directly.
For example, there isn't really a way for the facility to extract the released energy and generate electricity from it (since the pellet has to be compressed equally from all sides by the lasers).
Similarly, the lasers they're using are pretty old and inefficient by modern standards. They're sticking with them because electricity-to-laser efficiency isn't the bottleneck in their system; laser-to-pellet efficiency is (along with the stability and accuracy of their optics, etc.). But if they were concerned with power generation, electricity-to-laser efficiency would obviously be important.
Basically, while the general concept of this kind of fusion reactor might be potentially viable, this specific facility likely is not (with its current mandate).
Laser efficiency is important if you're actually building a power plant, but for an experimental facility, it's easy enough to correct for the inefficiency of your old lasers. That doesn't make your research inapplicable to power plants.
I don't think so. The point of these tests is to maintain the deterrent power of nukes by showing that they are still intact.
The West doing this by putting effort into all sorts of extremely advanced machinery like NIF, in contrast to Russia having to resort to setting off nukes, would be a convenient situation for propaganda, since it only makes Russia look even more like a warmonger.
As for the advantage provided, IIRC the US has been performing "dry" tests, where a bomb with no nuclear material is detonated to verify the trigger mechanisms. That, combined with the tests at NIF and other facilities to verify the viability of the nuclear material, should be comparable in terms of verifying functionality.
The point of the tests is to verify the state of the pits after long storage. Plutonium is a bitch to manage because it is alpha-active, and alpha particles are, in essence, helium nuclei, so over time a piece of plutonium fills with helium cavities and develops internal stresses. It's also very complex in terms of its crystalline properties (it has several stable crystal forms), which are affected by temperature and, yes, those internal stresses. While it is modelled as much as possible, no one truly knows anymore how good the pits that have been kept in storage for decades really are. They certainly still work, but whether they are good enough to properly initiate a secondary, no one really knows. Testing could be very instrumental in finding that out.
Well, somewhat. The amount of precision that the physical world demands in the construction of the fuel pellets, and the amount of energy involved in using them, guarantee that you need an extremely expensive mechanical process for the fuel. So, while you probably can physically extract energy from the pellets, it'd be like a steam train powered by gold bars instead of coal.
For what it's worth, the claim that improving laser efficiency should be straightforward sounds right to me. That they ignored laser efficiency to focus on ignition sounds like a principled approach to research: pick a specific target and focus exclusively on that target.
We've already produced lasers with about a 100x efficiency improvement over what the NIF currently uses (65% vs 0.5%). Same wavelength, but obviously very different power levels.
If you count the efficiency of the steam turbines that actually generate electricity from the fusion's thermal energy, you'd need something like 750 MJ of fusion energy to break even (assuming your steam turbines are 40% efficient).
Given that you'd want to actually generate electricity rather than just break even, we're talking about three orders of magnitude rather than two.
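To make that arithmetic concrete, here's a minimal sketch in Python. The ~300 MJ wall-plug figure is an assumption implied by the 750 MJ and 40% numbers above, not an official NIF specification:

    # Rough breakeven arithmetic -- all figures approximate or assumed.
    wall_plug_energy_mj = 300.0   # assumed electricity drawn to fire the lasers
    turbine_efficiency = 0.40     # assumed steam-turbine efficiency

    # Fusion yield needed just to regenerate the electricity that was consumed:
    breakeven_yield_mj = wall_plug_energy_mj / turbine_efficiency
    print(breakeven_yield_mj)     # 750.0 MJ, versus the few MJ per shot discussed here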
The first computer fit in a hangar, consumed an enormous amount of energy, and provided a tiny fraction of the computing power that you now have in your smartphone.
While you are right, sometimes I can't help but feel like Moore's Law (etc) has done us a disservice by making it so we compare every kind of technological progress to the progress in computer hardware (or I guess electronics more broadly) and expect that kind of progress in other domains. Are there any other fields that have experienced the same sort of staggering, exponential improvement? Off the top of my head, think of say, food/agriculture, biology, aerospace engineering, construction engineering, etc. All have seen steady, impressive improvements, but nothing comparable to the steady (over many decades), yet exponential improvement of Moore's Law - nothing comparable to going from room-sized computers to having 1000x the compute power in a smartphone chip.
(EDIT: This isn't to say that those fields are worse, or the scientists there less skilled, or something. They're just different domains. "Increase transistor density" may simply be an easier problem to solve - despite being an incredibly difficult problem - than the issues in those fields.)
I'm going off on a tangent a bit, but all I'm trying to say is, I feel like "if electronics manufacturing can improve at X rate, then surely Y field can also improve at that rate" is a bit of a fallacy.
Of course you're right in general but the fusion triple product actually did increase exponentially, at a faster pace than Moore's Law, from 1970 to 2000. Then for a while everybody decided to put most of the money in a giant construction project in France that still isn't finished. Now we're partway back to the system of competing smaller projects that we had during the exponential period.
Lasers have also been improving dramatically. In particular the power of fast lasers has been going up exponentially.
A computer simply automates something you can do with your bare hands: calculate. Manipulating the strong nuclear force is not even comparable.
My opinion about fusion is that by the time they figure it out (which I think could eventually be done, if we invest a large portion of humanity's knowledge and wealth), it won't even be worth it. We could have almost-free energy now with fission, and renewables keep getting better. Fusing atoms (and getting more energy back) will be an astonishing feat when we accomplish it, but it won't offer much benefit over existing power generation. For instance, financially it would take a lifetime to ever recover the costs invested. Even once it's figured out, it will still take decades to build the plants, which will be buggy first-generation models (that still involve dangerous radiation, just more manageable). I really wanted it to succeed (20 years ago, say), but now I think it's a lost cause.
For computing devices, being smaller typically means using less energy as well, so it's a bit different from a power generation facility, where the whole point is power.
The article linked says the laser energy is 2 MJ. So even a 100% efficient laser would only have a 2x gain. And some quick googling gets me 80-90% as max feasible laser efficiency.
And you would probably need more like a 10x gain to make it feasible so would need another order of magnitude from something beyond laser efficiency. Can you trigger more fusion with the same laser energy by scaling the system up?
> The article linked says the laser energy is 2 MJ. So even a 100% efficient laser would only have a 2x gain. And some quick googling gets me 80-90% as max feasible laser efficiency.
This doesn't sound right to me. The NIF's laser efficiency is less than 1%, so an 80% efficient laser would be a ~100x gain.
Edit: Actually, I'm not positive I'm reading this right. It says the laser was less than 1% efficient in 1996; there may have been upgrades since then...
The point is that the fusion reaction produced 2x the energy that the laser fed into it. So a 100% efficient laser (which is not physically possible) injecting 2 MJ into the pellet would mean 4 MJ of fusion energy out for 2 MJ of electricity in. Then you need some way to turn that energy into electricity, for which no realistic design exists in the case of ICF, so you'll lose more power there.
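To separate the two "gain" numbers being compared in this subthread, here's a small sketch; the 1% and 80% laser efficiencies are the figures quoted above, not measured values:

    # Target gain vs. wall-plug gain, using the rough figures from this thread.
    laser_energy_mj = 2.0                    # laser energy delivered to the pellet
    fusion_yield_mj = 2 * laser_energy_mj    # "2x the laser energy", per the parent

    target_gain = fusion_yield_mj / laser_energy_mj    # 2.0

    nif_laser_eff = 0.01      # ~1% wall-plug efficiency quoted for NIF's lasers
    modern_laser_eff = 0.80   # hypothetical "80-90% max feasible" laser

    wall_plug_gain_nif = fusion_yield_mj / (laser_energy_mj / nif_laser_eff)        # ~0.02
    wall_plug_gain_modern = fusion_yield_mj / (laser_energy_mj / modern_laser_eff)  # ~1.6

    print(target_gain, wall_plug_gain_nif, wall_plug_gain_modern)

So the "2x" and the "~100x" answer different questions: the former is gain at the target, the latter is the headroom between NIF's wall-plug efficiency and a modern laser's.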
I'm generally on team NIF as a laser guy, but my biggest gripe is that the ignition calculation they employ uses the UV light entering the hohlraum, which ignores the 3-omega frequency tripling applied as the light (originally IR at 1053 nm) enters the main target chamber. That frequency tripling will always rob a lot of energy from the lasers, as it generally does in normal laser setups. Losses from other parts of the system, like the amplification stages and other general losses, are understandable because it's a laser and that's unavoidable. But they absolutely should include the loss from the frequency tripling, because that's an added-on step (it improves penetration into the walls of the hohlraum), even though counting it will push them below the ignition threshold again.
The loss there is about a factor of 1/2 or so, so they'd have to improve things by that much.
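For scale, here's a rough sketch of what counting that conversion loss does to the bookkeeping; the ~50% conversion figure is the parent's estimate, and the other numbers are the approximate figures from this thread:

    # Counting "gain" against the IR energy before frequency tripling (sketch).
    uv_energy_mj = 2.0                   # 3-omega UV energy delivered to the hohlraum
    fusion_yield_mj = 2 * uv_energy_mj   # roughly 2x the UV energy, per the thread
    conversion_eff = 0.5                 # assumed 1053 nm -> 351 nm tripling efficiency

    ir_energy_mj = uv_energy_mj / conversion_eff   # ~4 MJ of IR light before tripling

    print(fusion_yield_mj / uv_energy_mj)   # ~2.0: the headline "ignition" gain
    print(fusion_yield_mj / ir_energy_mj)   # ~1.0: barely at the threshold, if that

With slightly less generous numbers for either the yield or the conversion step, that second ratio drops below 1, which is the point above.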
It's not even laser energy break even if you count the energy used to actually generate the laser. It's only break even if you just count the energy from the laser going into the fuel pellet.
Prior to this, ignition had only been achieved in hydrogen bombs.
Basically, they confirmed that it is possible to have a controlled fusion reaction where the reaction puts out more energy than was put into the reaction, a prerequisite step to being able to put out more energy than was put into the entire machine.
Everyone assumed that controlled ignition was possible, but it's still meaningful to be able to prove it experimentally, particularly since now they can probe the limits and understand how different factors affect the result.
I attended a presentation by the NIF guys earlier this year at a conference, where IIRC one of the bigger challenges was the optics.
Due to the amount of energy being put through them (particularly since it was pulsed), any imperfections would be amplified, quickly rendering the component unusable. They ended up developing an entire automated system for fixing these using an approach I can't recall.
So I guess the obstacles to reaching breakeven (which this facility is not specifically aiming for; its main purpose is to ensure our hydrogen bombs still work) are the electricity-to-laser efficiency (IIRC these lasers are pretty old now and less efficient than modern ones), making optics that can better tolerate the energy, getting the timing right so that the pellet is compressed equally (any imbalances manifest as reduced efficiency), and making better pellets (since of course, pellet manufacturing is also an energy-intensive process at the moment).
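Those losses multiply rather than add, which a toy efficiency chain makes easy to see; every value below is an illustrative assumption, not a NIF specification:

    # Toy chain from wall-plug electricity to electricity back out (all values assumed).
    wall_to_laser = 0.01         # electricity -> laser light, ~1% for NIF-era lasers
    target_gain = 2.0            # fusion yield / laser energy on target ("2x")
    thermal_to_electric = 0.40   # assumed steam-turbine efficiency

    electricity_out_per_mj_in = wall_to_laser * target_gain * thermal_to_electric
    print(electricity_out_per_mj_in)   # ~0.008 MJ back for every 1 MJ of electricity in

Every factor in that chain has to improve, and the electricity-to-laser term is the one with the most obvious headroom.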
The former is a milestone for the latter. In order to demonstrate that net energy from the process is even possible, they have to show that the ignition itself can reach breakeven. By analogy, you have to show that combustion is possible before spending even more money on building a giant gas-fired power plant.
It's a small milestone, but it's a very important stepping stone if there's going to be any future for it. Getting it to the commercial power plant stage is a much more holistic problem that will probably take 10x more investment which no one wants to spend sight unseen.
While it's true that total net energy is what ultimately matters, the article points out that 99% of the energy that goes towards the lasers is wasted. So it seems like a logical next milestone.