http://www.nature.com/nature/journal/vaop/ncurrent/full/natu...
They would like to charge you $32 to look at it, nature-ally.
The fusion yield is (?) 14 kilojoules (inferring this from physicist Mark Herrman's "5 million billion fusions" [WaPo], at 18 MeV per fusion), which is a moderate improvement over the 8 kilojoule achievement from last fall:
https://news.ycombinator.com/item?id=6459289
[WaPo] http://www.washingtonpost.com/national/health-science/fusion...
The "1%" energy efficiency figure [a] is misleading: 14 kJ is about 1% of the 1.5 MJ of ultraviolet light hitting the fuel capsule. But creating that UV pulse consumed 3 MJ of infrared light, which in turn took 400 MJ from the flash lamps driving the IR laser. So the system efficiency is more like 0.003% (and, throwing in hypothetical turbines to generate electricity, 0.001%).
https://en.wikipedia.org/wiki/National_Ignition_Facility#NIF...
[a] I'm referring to this: "while more energy came from fusion than went into the hydrogen fuel, only about 1 percent of the laser's energy ever reached the fuel."
(Update: from yosyp's link, the fusion figures were 14.4 kJ and 17.3 kJ on two different runs:
https://news.ycombinator.com/item?id=7227950
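For anyone who wants to check the arithmetic, here's the back-of-the-envelope in Python. The 5e15 fusions and 18 MeV per fusion are just the figures from the quote, the 1.5 MJ / 3 MJ / 400 MJ chain is from the Wikipedia link, and the ~35% turbine efficiency is my own assumed ballpark:

    MEV_TO_J = 1.602e-13              # joules per MeV

    fusions = 5e15                    # "5 million billion fusions" [WaPo]
    yield_j = fusions * 18 * MEV_TO_J
    print(yield_j / 1e3)              # ~14.4 kJ of fusion yield

    uv_on_capsule = 1.5e6             # J of UV light actually hitting the capsule
    ir_laser      = 3e6               # J of IR light before frequency conversion
    flash_lamps   = 400e6             # J drawn by the flash lamps

    print(yield_j / uv_on_capsule)        # ~1%: "laser energy reaching the fuel"
    print(yield_j / flash_lamps)          # ~0.0036%: system-level efficiency
    print(0.35 * yield_j / flash_lamps)   # ~0.001% with an assumed ~35% turbine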
As has already been mentioned in threads like this before, the NIF was built using the technology that was available back when NIF was planned and started. Flash-lamp-pumped lasers are bigger, more expensive and much less efficient, i.e. a mere few percent efficiency at best vs. tens of percent for modern laser-diode-pumped IR lasers. (Even with such an improvement, though, I still think laser-driven fusion is inferior to many other forms of inertial confinement and much less likely to result in system-level break-even. My personal favorite, the Sandia Z machine, is on the order of 15% efficiency in generating its X-ray pulse, and schemes like the dense plasma focus or the polywell bypass the energy-to-radiation conversion stage altogether and thus avoid the related losses.)
Upvoted for the sentiment.
The actual paper marks just another incremental step in a triple marathon that we seem to be running veeeery slowly.
Also, if you do "pull an Aaron Swartz", please change the ending - we need a very public Supreme Court-administered kick in the teeth for the whole out-of-control copyright regime and its criminal-sentencing farce.
I accessed it over a VPN link into campus, signed in with my user ID. The md5sums of the link provided above (and the others in the thread) are identical to the copy I downloaded from my campus IP block. Unless you're also on the campus of Iowa State, I don't think they're watermarking this paper.
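For what it's worth, this kind of comparison is easy to reproduce yourself; a rough sketch in Python (the filenames here are made up):

    import hashlib

    def md5_of(path):
        # hash the file in chunks so a large PDF doesn't need to fit in memory
        h = hashlib.md5()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(8192), b""):
                h.update(chunk)
        return h.hexdigest()

    # identical digests => byte-for-byte identical files, so no per-user watermark
    print(md5_of("nature_paper_campus.pdf") == md5_of("nature_paper_linked.pdf"))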
By terrorists you mean those illusory guys that politicians use as the bogeyman to beat down the rationality of ordinary people by provoking fear, so that they are legally allowed to do virtually anything with nobody to stop them?
Much more likely we're going to be killed by diabetes, or by falling down the stairs. The propaganda is out of proportion to the size of the threat; meanwhile, other types of "predators" may escape our sight. Not minimizing the problem, just saying that there are bigger ones that deserve more attention.
... and overreacting won't help anybody, I think. So, back to the original topic...
Muahahaha Ha! I'm not American. I'll read it TWICE and cost you TWICE the taxpayer dollars, then I'll share it with all my friends and soon all the dollars in America will be GONE. Better call the president and tell him to start enacting measures, 'cause there are no brakes on THIS terrorist train!
No offense to anyone, but this link should be deleted. You may not like the system, but HN isn't the place to protest IP laws with civil disobedience. A lot of YC companies have their own IP they'd like to protect I'm sure.
"Lawrence Livermore National Laboratory (LLNL) is a Federally Funded Research and Development Center (FFRDC) founded by the University of California in 1952" [1].
My tax dollars paid for this research; I should be able to read it.
As far as YC is concerned, AirBNB and Uber encourage people to break the law all the time. Some rules are just stupid and outdated, don't make sense, and should be broken. This is "Hacker" News, after all.
And that's a silly comparison. Copyright law already excludes from protection works created in the course of the official duties of officers of the federal government.
That's not this work, but arguably the original motives and intent of the law are subverted if what's happening here is the government outsourcing the creation of the work to a third party.
This isn't protesting the law with civil disobedience; it is breaking the law because the direct expected value of breaking the law (people having access to this specific paper) is greater than the expected cost of breaking the law (~0).
Why can't it be both? Saying that the law is imposing costs (equal to the value of breaking it here), and that you're against it enough to flout it, is a protest. Protests don't need to be accompanied by a manifesto.
Are you a psychopath? In what world does personal expected value versus expected cost of breaking the law ever become a good way to make moral decisions? If you have the chance to get a promotion by killing your boss but have near zero chance of being caught due to some circumstance, would you kill your boss? What the fuck.
Why call your opponent names and become emotional? It's not effective.
> In what world does personal expected value versus expected cost of breaking the law ever become a good way to make moral decisions?
There are at least two counterarguments to this.
1) The law is written for precisely this reason. Most people do live by a cost vs benefit mindset, and the penalties of laws are generally tuned until people stop committing the crime. I wouldn't want to live in a world where the penalty for murder was only a week in prison.
2) Anyone who uses the law as a way to guide their own moral compass probably isn't very moral, because throughout history every society has had laws which are later agreed to be evil.
In this case it is clear he is talking about utilitarian expected value rather than personal expected value. He receives no personal benefit by uploading the paper, but reading it provides value to others at ~0 cost to anyone.
Actually, I was thinking in terms of personal expected value (namely, a sense of satisfaction and of doing good); however, I can see the argument that uploading it is also net positive globally.
That's just silly. If you kill your boss, who's going to promote you, and who is going to pay out the higher salary? No, the rational thing to do is to wear his skin to impersonate him and sell all the company's assets. Then eat him.
Your first sentence is a subjective value statement (that I probably don't agree with). Both your first and second sentence are irrelevant to my argument.
How is saying that drug patents last 20 years while copyright on a tweet lasts the lifetime of the author plus, what, 80 years a subjective statement?
IMO, IP is a loaded term that distorts a wide range of issues. IM, for Intellectual Monopoly, is much closer to the truth. To be clear, I have no problem with the idea of copyright, or with a well-run power company operating as a local monopoly, but those sit outside of normal business. And the goal should be finding rules that best serve society, not the owners of said monopoly.
Except those companies didn't use publicly-funded researchers to both write and review the papers they put in their blogs (seriously, who buys printed copies anymore).
The point was the research is being performed by a government funded lab not private money.
If the public is paying for the research should they not have access to the published papers resulting from it without paying $32 to a private company for a PDF download?
The PR guys often work on a different schedule than the actual science, but those times do usually correlate well with the funding deadlines for some strange reason... It's probably work as usual at the NIF because everyone is working on a big, long, grinding project that's slowly making progress rather than bashing out stunning Eureka moments complete with Dom Perignon showers like you see in a Hollywood flick.
They are working towards ignition as measured by total neutron output. As soon as they detect a jump significant enough to warrant an ignition claim, there will be a Eureka moment with Dom Perignon.
Right now they are 3 orders of magnitude off (they get ~10^15 neutrons; they want 10^18, or better yet 10^19).
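Rough sense of scale, assuming plain D-T yield at 17.6 MeV per fusion (one neutron each); the ~1.5 MJ of UV on target is the figure quoted elsewhere in this thread:

    MEV_TO_J = 1.602e-13

    for neutrons in (1e15, 1e18, 1e19):
        print(neutrons, neutrons * 17.6 * MEV_TO_J / 1e3, "kJ")
    # ~2.8 kJ at 1e15, ~2.8 MJ at 1e18, ~28 MJ at 1e19 -
    # i.e. the upper targets would comfortably exceed the ~1.5 MJ of UV on target.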
17.3 kJ for the latest shot, according to the paper. But only some 8 kJ of that came from fusion (the balance came from the laser input).
D-T fusions are about 17.6 MeV, but 80% of that is in neutrons, which escape unless captured in a special blanket. The 8 kJ reported was all alphas. So 8 kJ / 3.5 MeV = 1.4e16 fusions, vs. 5e15 in the quote. He was probably talking about a previous shot.
For comparison, a typical AA battery stores 16 kJ.
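A quick check of those numbers; a minimal sketch, using the standard 3.5 MeV alpha / 14.1 MeV neutron split of a D-T fusion:

    MEV_TO_J = 1.602e-13

    alpha_mev, neutron_mev = 3.5, 14.1           # energy split of one D-T fusion
    print(neutron_mev / (alpha_mev + neutron_mev))  # ~0.80 -> "80% in neutrons"

    alpha_heating_j = 8e3                        # ~8 kJ of alphas reported
    fusions = alpha_heating_j / (alpha_mev * MEV_TO_J)
    print(fusions)                               # ~1.4e16, vs 5e15 in the quote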
You're misreading. E_{DT} is not the D+T fusion yield; it is the heat absorbed by the D+T fuel target (sum of external heating and self-heating from fusion alphas). E_{fusion} is the fusion yield.
E_{fusion} is the calculated neutron yield Y_{total}=6.1e15 times 17.6 MeV: 17.2 kJ + rounding. Check out page 3, right column.
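i.e., roughly (a minimal check of the paper's number):

    MEV_TO_J = 1.602e-13
    Y_total = 6.1e15                   # neutron yield from the paper
    E_fusion = Y_total * 17.6 * MEV_TO_J
    print(E_fusion / 1e3)              # ~17.2 kJ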
As I understand, they are hoping to achieve fusion ignition, where no additional external energy will be required to generate power. So current efficiency calculations are irrelevant in that case.
They also say this isn't possible with their current techniques.
That's fine by me, it was not very long ago that we were unable to get more out of the fuel than we put into the fuel. That's step one to fusion ignition, and was done for the very first time last fall, so as far as I'm concerned they are still fleshing out step one.
> led NIF's critics to label the facility an enormous waste of taxpayer dollars
> government shifted NIF away from its fusion goals to focus on its other mission: simulating the conditions inside nuclear weapons
I think right there lies the problem with our world. People take more issue with a multi-billion-dollar research facility for science than with one for military applications. If we spent a small fraction of the world's military spending on these big, as Google likes to put it, moonshot projects, we could probably solve some really fundamental world problems (i.e. energy, climate change) in the near future vs. waiting many, many decades (if not centuries).
I don't want to argue with the main thrust of your argument (because I agree with it), but at least part of the reason why people had an issue with the NIF as energy research, and less so as a nuclear test simulator, is that the NIF as a device is pretty clearly designed to simulate nuclear detonations. If the problem is "you're wasting taxpayer money", the issue isn't the dollar amount spent, it's the return on the money. As a science experiment, the return of the NIF is probably pretty crappy. As a nuclear arms tester, it's almost invaluable (infinite return!!!) if you accept the dual propositions that America should obey its nuclear test treaty obligations, and that America gains significant benefits from maintaining its nuclear arsenal.
The NIF was clearly built to maintain America's nuclear arms capabilities, both in providing employment to nuclear physicists, as well as maintaining the viability of the current stockpile. It was practically transparent from the beginning.
It's not quite that clearcut. Since the 1970s a lot of fusion scientists have been keenly interested in laser fusion for power production. It's just that they had to give the program a military justification in order to get funding.
Source: Search for the Ultimate Energy Source: A History of the U.S. Fusion Energy Program, by Stephen O. Dean, who was a major figure in the U.S. fusion program.
There are two ways you can take 'return on investment' - the holistic, qualitative way (if x has a tiny chance of improving the whole world then it's worth it) or the narrow, quantitative way (once we research it, we'll probably have to give x away basically for free, it's likely not to work out anyway, and it will take years, so it's practically valueless to our organisation on this year's financial model).
Unfortunately the latter is used almost exclusively, even in public organisations, which should be able to take the broader view of the public good.
"Return on investment" generally means what you get back for what you put in. Both quality and quantity matter and I think this is often recognized, but if you're getting essentially nothing, it doesn't matter what quality of nothing you get.
Yeah - perhaps to better word what I said: 'return on investment' for a public body should include a broad definition of 'what you get back', as in 'what the public gets back', not just 'what the agency gets back'. The latter is generally easier to quantify, and easy-to-quantify things are easy to consider and contrast.
The B-2 Bomber is child's play compared to the F-35's $1 trillion program [1], but the comparison to the LHC's price tag really puts things in perspective.
It must be a resounding success - the UK is about to commit to buying some - like we committed to buying Skybolt and the F-111 (cancelling the awesome home grown TSR 2 in the process):
> Yeah... I see no benefit in those, having a few more colliders instead would be really great.
That's because you're seeing it through the lens of our time. Remember, the B-2, and many other programs including the F-22, were devised during the trailing end of the Cold War (edit: that's actually a lie; the call for contracts and other such nonsense that would lead to the B-2 started in the 70s, and the F-22 program had existed in some form since the very early 80s). Whatever your conviction about whether or not such things should exist, governments (and their militaries in particular) have a great deal of inertia, both in terms of expenditures and policies. So while both programs were in full swing during the collapse of the USSR, it was primarily that earlier era that laid the groundwork for such expenses.
Personally, I'm a fan of both. I love the hardware used in high energy physics. And I love military hardware.
If I were about 15 years younger and lived in a wetter climate, I'd go outside and splash in the mud right now. ;)
Here's another article that looks at some acquisition programs just by the US Marines. Navy, Army and Air Force acquisitions are often much bigger. Scroll for the tables.
Just so you know, the purpose of the NIF was for military testing right from the beginning. The reason it initially got funded (shortly after the end of the cold war) was to help with stockpile stewardship (making sure our aging weapons still work via simulation, without having to do actual weapon tests).
One of my favorite novels, Carter Scholz's _Radiance_, spends much of its length on NIF and this controversy, actually: http://www.gwern.net/docs/2002-radiance
Oh, it's not that exciting. Much of the book is on the corruption of science by funding pressures and sponsors, so it's more of a slow corrosive burn than action.
This seems like a myth to me. Radar? Rocket engines? Turing? Stealth bomber technology? Nuclear powered aircraft carriers? ICBMs? Stuff that was happening regardless of a war and stuff that was nearly useless.
Perhaps military first adopters are a net positive; like, the first buyers of a technology who will pay top dollar for working prototypes. But what on the original innovation side can you actually attribute?
Rocket engines were known before WWII (a thousand years ago in China, and efforts like Goddard in the early 20th C), but you can't disregard the Nazi Germany invention of the V2 just because of that; it was a big deal, not just a fleshing out of prototypes.
Later stealth, and nuclear carriers, and ICBMs, were a result of the cold war. I don't see how it matters whether the war is hot or cold.
I'll grant you Turing (aside from crypto), not that anyone is claiming differently.
The Internet directly arose from the Arpanet, which was purely a DARPA (Department of Defense) project, and there are lots of other well known examples.
The problem with sinking a lot of money into science is that it only pays dividends if solutions to the big problems actually exist. They may not. It may in fact not be possible to do better-than-breakeven fusion on a small (relative to a star) scale. The only way to find out for sure is the hard (and expensive) way.
By way of contrast, when it comes to military technology you're not competing with nature, only with other humans, and so a "solution" (if you'll excuse the irony) is almost certain to be found by one side or the other.
> It may in fact not be possible to do better-than-breakeven fusion on a small (relative to a star) scale.
The H-bomb is better-than-breakeven and much smaller than star scale. What you say may be true for tokamak-like confinement schemes, whereas the H-bomb is inertial confinement, and it works. And it is pretty clear that inertial confinement at a scale several orders of magnitude smaller than an H-bomb would work too. Look at the Sandia Z-machine - an order of magnitude up, and it would reach breakeven. Not that anybody really needs it :)
OK, let me be more precise: it may not be possible to do better-than-breakeven fusion on a small scale in a way that makes the resulting energy safely harnessable for useful work.
> we could probably solve some really fundamental world problems (i.e. energy, climate change) in the near future
I have a feeling that if we had fusion plants providing limitless cheap power, we'd get back on an 8% yearly growth track for energy consumption, like with oil consumption before 1970.
Then within a century or so, we'd be producing an amount of power equivalent to 20% of what the earth receives from the sun. It would give a whole new meaning to "global warming".
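The arithmetic behind that, as a rough sketch; the ~18 TW of current world power consumption and ~174,000 TW of sunlight intercepted by the Earth are my assumed ballpark figures, not from the article:

    import math

    current_tw = 18.0        # rough present-day world primary power consumption
    solar_tw   = 174_000.0   # rough solar power intercepted by the Earth
    growth     = 0.08        # 8% per year, as with oil before 1970

    target_tw = 0.20 * solar_tw
    years = math.log(target_tw / current_tw) / math.log(1 + growth)
    print(years)             # ~98 years, i.e. "within a century or so"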
It is a tiny bit unreasonable to project continuous exponential growth in human energy usage on such a ridiculously long timescale. It's like arguing that Moore's law will continue for another 200 years.
However that growth is precisely what the present human civilization and economic system are predicated on.
That growth will stop sometime and a considerable amount of present pain seems to be that global growth is slowing markedly (and has been since the 1970s).
Yes, and when you're a child your caloric intake and your yearly increase in height are very tightly correlated too. Lots of society is built around this growth, such as the expectation of having to buy bigger clothes every year. But even a child can understand that one day they will stop growing and eat approximately the same amount of food for the rest of their life.
It would be "strange" to argue that, as a teenager, you were doomed to a life of misery because in order to keep up with this so-far exponential growth rate you'd have to eat entire planets by the time you were 1000 years old. It is similarly bizarre to argue that human society is in some sort of trouble because we can only double our energy consumption a few dozen more times.
This isn't even bringing up economic concepts like diminishing returns to utility: a person who makes 100k per year does not spend twice as much energy as someone who makes 50k per year. There are plenty of other things to buy that make us happy, and that can only get even more true when we're living in some hypothetical future with hundreds of times the economic productivity of today.
Humans run through a well-understood lifetime cycle of growth, maturity, and decline. Contemporary orthodox neoclassical economics doesn't generally recognize this, though it's beginning to. See Tim Worstall at the Telegraph UK: "Infinite growth on a finite planet? Easy-peasy!" http://blogs.telegraph.co.uk/finance/timworstall/100017248/i...
Your assertion that a person earning 100k doesn't have twice the energy (or environmental) footprint of someone earning 50k really needs some sort of factual support.
And your hypothetical future with 100s of times today's economic productivity flies in the face of the law of diminishing returns which you've invoked yourself.
Free energy buys us a century or more of energy growth. Nobody has any idea what anyone would use that much per capita energy for -- how much A/C, food and transportation are you capable of consuming?
Fortunately, if we had a lot more cheap energy we could probably get to space pretty easily too. So a lot of the growth wouldn't be happening on this planet.
Covered by the linked article to an extent and Asimov IIRC. Exponential growth requires FTL travel within a period shorter than recorded history even with free energy + space travel.
I don't think fusion power plants are going to be that much cheaper to run than current nuclear power plants. Fuel costs are a small part of the operating costs of a nuclear plant (16% to 28%) and the capex costs probably won't be that different (e.g. they will still be based on heating water to create steam to power turbines to turn generators).
I totally agree with you, but now I'm wondering what the upside to fusion is over fission, besides being a great science landmark. No radioactive inputs or outputs?
Stuff like no possibility of meltdowns, no direct long-lived radioactive waste, plus cheap and abundant fuel.
Fusion reactions can only be self-sustaining at incredibly high temperatures and pressures. If the containment mechanism loses power, the reaction stops. Unlike fission reactions, at least as used in current reactors, which have the potential to keep going unless we do something active to stop them. Yeah, the modern, well-designed ones have lots of designed-in safeguards against this, void coefficients and all, but it's safer still to be running fusion.
Hydrogen and its isotopes in, with other isotopes or helium out, is better than assorted radioactive heavy-metal isotopes with relatively long half-lives. The only possibility for dangerous waste from a fusion reactor is whatever transmutation takes place in reactor components from the neutron flux. Seems likely to be better, but I don't think anybody has really investigated what this part will look like in a running fusion reactor with reaction rates high enough for commercial power production.
I guess availability of fuel is one big advantage, along with the cleanup of a fusion plant hopefully being a lot easier and cheaper than a fission reactor:
> Then within a century or so, we'd be producing an amount of power equivalent to 20% of what the earth receives from the sun. It would give a whole new meaning to "global warming".
Assuming we don't use the new energy in significant part to colonize other planets, with the colonies representing a significant part of the growth in human energy consumption.
An abundance of energy would result in new technologies, engineering and infrastructure to use it.
Water shortages could be addressed with large scale desalination.
Shipping/transportation would become real cheap without significant fuel costs.
Food issues could be addressed by commercial hydroponics.
Exactly. There is so much incentive to keep using more and more power that we'd eventually cook ourselves in waste heat. You can't beat thermodynamics.
We could also tackle those fundamental world problems if we had about 100 Bill Gateses. But we don't, because of how much the government loots.
I'm not an anarchist, so I do believe there should be a government that provides military defense. I'm not a progressive, so I don't believe the government should try to loot people to try to solve other people's pet problems.
Just want to toss in, I think military applications are much easier for the layman to conceptualize. You hear science research and your gut doesn't send any strong signals.
This sentiment is interesting "Over the past few years, NIF has been getting a fat 'F.'" Perhaps now that 'grade' will change.
Actually, I think framing it as a grade is beyond silly; it is irresponsible. Giving a letter grade to a long-term scientific project makes little sense. It is not a one-shot thing with a predefined notion of correctness. What can we compare such a grade to?
Instead, we should be asking what we've learned and how the project has advanced science.
Is this even basic science? It sounds more like an engineering problem, albeit a very hard one. I am not a physicist, but I would consider this more like the Manhattan Project -- the science being done is not of the 'deeper understanding of nature' type, but rather of the 'how do you make a controllable explosion' type.
Given that, the way to evaluate the project is not on grants received, articles published, or citations in premier journals, but rather on 'Did you make a controlled explosion? If not, when will you be able to?' And I think, so far, an 'F' is fair.
It's definitely still a science. For example, there are still basic fluid-mechanics questions we don't fully understand about how plasmas behave. Also there's this journal: http://www.ans.org/pubs/journals/fst/
Disclaimer: I worked in a fusion lab in undergrad.
"What was happening, of course, was that all the boys had decided to work on this and to stop their research in science. All science stopped during the war except the little bit that was done at Los Alamos. And that was not much science; it was mostly engineering."
Good way of putting it so that people can understand. Feynman was a scientist, not an engineer, so that's a clue that the Manhattan project involved science, not just engineering.
The same is true of fusion research. That's why it's "research", not merely "development".
In particular, the physics of plasma instabilities has always been quite imperfectly understood, and that has been one of the key problems with controlled fusion since the 1950s.
The Manhattan project was a boy band or had a significant boy band component. It clearly involved a number of amateur male musicians[1][2].
Science is about understanding, engineering is about building something.
If the Manhattan project had resulted in no new understanding, but had created a nuclear bomb it would have been a success. The science was incidental to the success of the project.
Engineering is built on science, so there will often be scientists involved in Engineering projects; however, the assumption with an Engineering project is that the science is already understood (or at least significantly understood), and so the main effort is in the design and construction.
The problem with Science projects is that they are hard to sell to the public. That's because it's hard for someone who isn't a domain expert to put a value on their significance. This is why a lot of Science projects are dressed up as Engineering projects[1] to sell them to the public.
It's pretty clear that the above fusion research is a Science project, i.e. about understanding.
If this facility is being used to produce new bombs then that is an engineering project, but if it is just being used to gain understanding about existing bombs then it's science.
You mistake my point -- the nature of the work Feynman did at Los Alamos was different than the nature of the work he did at the Institute for Advanced Study. All of the work was science, but at Los Alamos the target was a working bomb, at Princeton it was a deeper and more fundamental understanding of nature, and thus criteria for judging is different.
The Manhattan Project can be judged by the simple question, is there a working bomb? Evaluating basic research is much harder.
That's not what was said. OP said it's not about fundamental science, which is somewhat true. Nuclear weapons research isn't fundamental, although it does require some fundamental research.
That's not enough. Learning what doesn't work is sort of useful, but only if it ever leads to something that DOES work. There's an infinite supply of dead ends out there. You need a higher bar than 'learned something' if that something is a negative.
It depends what question you are asking. If the question is "should we fund projects even if we don't know if or when they will have demonstrable results?" then I would argue that, yes, some portion of research should go towards that.
How long is "long enough" to deem that a line of experimentation "didn't work out?" There is no period long enough. Sometimes a negative result is quite useful down the road. Some research just comes together when the right things are learned and tried. So any notion of a "dead end" is really just a tentative assessment, frozen in time.
Without harsh assessments, a massive project will eventually dedicate some of its resources into self-perpetuation, that is justifying its existence rather than producing results.
It's a risk we need to take (or mitigate) if we want to have a steady flow of new breakthroughs. Science, just as programming or any other creative discipline, is best done when you have more money than you need and nobody is looking at your hands.
I'm not a scientist, but I am a programmer, and I find that learning new things is very often a process of eliminating dead ends through trial and error. That's how you narrow it down to the thing that actually does work.
It sounds like you're saying "don't bother exploring dead ends, either do it right the first time or give up," which is a lot like saying "I just want a penthouse apartment, stop wasting time building a foundation."
Yes, I think what you are saying can work well for programming. But with certain scientific questions, there are many theories that could work. It isn't just a matter of ruling things out (though that helps too). One key part of the scientific process is generating a theoretical insight (perhaps) that lets you look at something in a new way. Another key part is figuring out how to test that idea.
> You need a higher bar than 'learned something' if that something is a negative.
Not really. The point of big scientific research is that at the moment a project is being conceived, no one has any idea where it will go and whether it will bring anything useful. Yet in time, it always brings breakthroughs.
I'll put it bluntly: we need more trust. The right way to do science is to keep throwing money at it, no questions asked[0]. IMO the reason we've seen so much progress during the last two wars was not because war per se stimulates research, but because during a conflict the military has an infinite budget and starts throwing more money at research that can be used, so scientists can do whatever they want without justifying it to the general public[1]. Free of money and management issues, creative minds could focus on doing fundamental research in whatever they felt like doing[2].
Come to think of it, science is best done in the same conditions programming (or any creative endeavour) is - when you're free to think, don't feel the pressure of looming deadlines, don't have to deal with managerial bullshit, and have enough money not to have to think about it (both in life and in the cost of work materials).
So yes, I advocate some kind of almost blind trust. Just throw some part of taxpayer money at R&D, no questions asked. It's a long-time investment.
[0] - Of course there's a need to build in a system to limit the amount of parasites who will feed on free money; but they ultimately cannot be avoided, we can only try to limit their numbers.
[1] - I think it was Feynman who pointed out that the problem with science in the 90's was that scientists suddenly found themselves accountable to the general public and struggled to explain why their research was useful.
[2] - If that last paragraph sounds off to you, please note that I'm skipping over a lot of points related to creativity needing to be free of trivial bullshit, intrinsic motivation beating monetary rewards/prestige in performance, etc.
No. Feynman actually criticised scientists who weren't prepared to explain and justify their work to the public. He says this in one of the bbc videos, can't remember which, as part of a critique of the social sciences.
Standing on the shoulders of giants is often a way of saying that something that "DOES work" was derived from something that didn't work, or perhaps that it was derived from the difficult yet iterative steps of others.
Look at the Large Hadron Collider. Do you suppose they scrub from their data stores all tests that fail to produce a "working" result? Are you suggesting that trying and failing produces irrelevant knowledge?
I would assume that they are only working with a narrow subset of dead ends. They probably have promising leads and work on those. I think it's a constant process of refinement.
Newspapers love to act as if they call the shots. NPR in particular styles itself as the stalwart of the disenfranchised - has its shit together more than Occupy, but is unsullied by big corporate interests. They're not much worse than other papers, but unfortunately, they sometimes keep their readers disenfranchised in order to maintain the illusion.
I mean, why don't they admit that this is four-month-old news? Why is this getting sudden publication? Is their agenda as transparent as informing the public, or is there a strategic element to it?
This is newsworthy despite "They didn't get more fusion power out than they put in with the laser"?
After decades of work they are orders of magnitude away from break-even. Makes me wonder if the goal is actually break-even and/or power generation. Not really; but it's a revealing question.
I'm no great fan of the budgets these huge projects pull down; I think the bang-for-the-buck is greater elsewhere. But modelling nuclear weapons is an even greater waste of time. It's all kind of a sad epitaph for national science and technology initiatives.
Assuming the modest proposition that fusion energy is possible, why not make fusion energy a 'man on the moon' kind of national goal? It's hardly in doubt that we need a large source of clean energy. Is it a failure of imagination? Is it a failure of the political system - Can't get Bubba to vote for no fusion thing. Is the status quo energy system resistant to change? Whatever, the NIF thing just makes me depressed.
> It's hardly in doubt that we need a large source of clean energy.
From one of my nuke profs: depending on how fusion is achieved, it's not altogether clear that it would be any cleaner than fission. Something like 16 MeV neutrons are going to cause neutron activation in tons of shielding. So all we do is swap tons of spent fuel from fission plants for tons of activated material from fusion plants, assuming we can even get there.
It is newsworthy. We generated fusion by shining light at molecules, which had never been done before. Aside from that, if you read the article you became slightly more science literate.
Fusion with lasers has been done before, by the same team. The difference is that they improved the experiment and this time got ~50% more energy. Or, to put it another way, the efficiency improved from ~0.6% to 1%, or less depending on what you define as the input energy. More details in another comment: https://news.ycombinator.com/item?id=7227620
I think it is newsworthy, but not for the technical details. It's newsworthy because it's such a huge budget, because it's been neutered, because fusion energy research is pathetically funded, and because NIF is like a mean old dog lying in front of the fireplace, farting from time to time - it creams off the resources and makes alternatives unappealing.
"Strictly speaking, while more energy came from fusion than went into the hydrogen fuel, only about 1 percent of the laser's energy ever reached the fuel. Useful levels of fusion are still a long way off."
The rest was lost to energy conversion losses. Yeah, not the best breakthrough I've heard today. Their best breakthrough was this CAD-flythrough video: https://www.youtube.com/watch?v=1Sp1sDpn_M0
It was always my understanding that achieving nuclear fusion wasn't the problem. You can do that with a tabletop device like a Farnsworth Fusor.
The hard part is getting more energy out of it than you put in.
lessee:
refinedgeek, 10/8/2013: "However NIF has announced today that, for the first time ever for any fusion experiment, their reaction released more energy than what was pumped into it;" where there's a link to: http://www.bbc.co.uk/news/science-environment-24429621
which says (10/7/2013) "The BBC understands that during an experiment in late September, the amount of energy released through the fusion reaction exceeded the amount of energy being absorbed by the fuel - the first time this had been achieved at any fusion facility in the world."
meanwhile NPR on 2/12/2014 says: "Omar Hurricane, a researcher at the lab, says that for the first time, they've produced significant amounts of fusion by zapping a target with their laser. 'We've gotten more energy out of the fusion fuel than we put into the fusion fuel,' he says."
p.s. Here's an article from October 5, 2012 which in a very biased way dismissed the possibility of this happening.... then it happened, TWO DAYS LATER [1]
> The lab's leaders predict that "ignition" - the point where the 192 lasers actually deliver more energy than they consume - could occur as early as next year.
Hurricane says no one knows for sure whether NIF can really reach the point of ignition. "It's not up to me; it's up to Mother Nature," he says. "But we're certainly going to try."
One of the proposed drives for the Daedalus starship design involved laser fusion. The pellet would detonate and would push against a magnetic nozzle to produce thrust, like the Orion nuclear craft, but more efficient. Also, the expanding magnetic fields could cause induced currents in networks of wires designed to capture energy, powering the ship and the next detonation.
I thought it was kind of hard to take the article seriously after that. I know it's the guy's real name, so I feel bad about this (as bad as I can feel for someone with an awesome name!), but it doesn't change that it sounds like something from a comic book. A fusion scientist named Omar Hurricane? Come on! Next thing we hear, there's been a horrible lab accident involving high-powered lasers and lead scientist Omar Hurricane has died. Only he hasn't really; he escaped death and went into hiding, avoiding public shame while also training to use his newfound weather-controlling laser eyes. But for good or for evil? Is Mr. Hurricane a villain or a hero? Buy the next issue to find out!
Given that hydrogen bombs are fusion bombs such a claim can be leveled at all fusion research. Maybe nuclear weapons research is the way they get the US government to shell out for fusion research, like the optics researchers did with SDI.
Actually, most of the energy in most H-bomb designs comes from fission - but the neutrons that cause this fission come from the fusion of the secondary - so fusion is the key component of the whole thing working.
A notable exception to this was the Soviet Tsar Bomba which used non-fissioning tampers in its multiple tertiary (and probably its single secondary) stages - resulting in an explosion of over 50Mt where 97% came from fusion:
All research into nuclear energy can be used for nuclear weapons. A nuclear reactor is just a well-cooled bomb that isn't supposed to explode violently (so to speak).
Cool! If we ever have WWIII, I'm going to drop a well-cooled H-bomb that uses laser-based confinement mechanism to not explode violently on my enemies. In fact, I await their preemptive surrender even now.
It all fits into my evil plan! You see, I plan to use this "laser" to turn the moon into a weapon. With this "death star", all the governments of the world will be powerless against me! If they try to stop me, I'll use the "death star" to create "nuclear fusion"! [holds right pinky to the corner of his mouth]