
Look at your paycheck, and imagine the number is 15% lower. Should answer the question.



If the alternative pays 90% less, then 15% isn't that big a difference. I.e. if I lose my job, I'd rather the new one pay 85% of my old salary than 10%.


It does. And no, it's not that significant.

I'm not a nuclear fanboy (well, I wish nuclear made sense, since I like the idea of free energy, but renewables are just as good), but I think it's important to make accurate arguments.


An accurate argument would include the financial aspects of +/- 15% in generated output and the impact that has on grid capacity....


With such a small difference, the burden of proof in this case is probably on the person arguing that it is a big difference, since you'd generally expect a roughly linear relationship between these.

In fact, I'd suspect that downtime is less of a problem for nuclear, since most of it is schedulable, as opposed to renewables, where downtime is random and driven by conditions. Since it's schedulable, it can be done in the off season, or different plants can be planned to be out at different periods.
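To put a number on the "roughly linear" claim, here is a minimal sketch in Python, using a hypothetical 1 GW plant (the plant size and the capacity factors are illustrative assumptions, not figures from the thread):

```python
# Illustrative only: compare annual output of a hypothetical 1 GW plant
# at two capacity factors. Output is linear in the capacity factor.
HOURS_PER_YEAR = 8760
CAPACITY_MW = 1000  # assumed plant size, purely for illustration

def annual_output_mwh(capacity_factor: float) -> float:
    """Annual generation in MWh."""
    return CAPACITY_MW * HOURS_PER_YEAR * capacity_factor

high = annual_output_mwh(0.95)  # ~8,322,000 MWh
low = annual_output_mwh(0.80)   # ~7,008,000 MWh
print(f"Relative shortfall: {(high - low) / high:.1%}")  # ~15.8%
```

On those assumed numbers, going from a 95% to an 80% capacity factor means roughly 16% less energy sold over the year; whether that is "significant" is exactly what the rest of the thread argues about.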


You actually have no idea about electricity transmission. For long term problems, look no further than France. Short term, look up how electricity generation, transmission, markets and grid stability are linked. Wikipedia is a good start. Followed by studies about 100% renewable grids, those explain why baseload is much less of an issue than people think. And too much inflexible capacity, aka baseload, can actually be a problem itself.


While I'm not a practicing economist, I have a bachelor's degree in economics and took multiple classes on the economics of electricity markets.

While all of those are certainly relevant if you're comparing nuclear and other sources of power, I fail to see how that's relevant to the question of whether there's a significant difference between 80 and 95% capacity factor over a year.


Now imagine you work in financial controlling at a power plant operator, and your yearly generated output is 5% lower than whatever the plan called for. What do you think would happen?

- investors, management and the board saying "no biggie, 5% is not relevant"

- something else

Companies do restructurings and mass lay-offs to save less than 5% on bottom-line costs; they are incredibly happy when the top line grows by 5%, and worried, if not in crisis mode, when the top line declines by 5%. And for the financing of a new power plant, those 5% are the difference between the investment being a good one or a bad one...


Hold on, you're an economist and you're telling me the difference between a capacity factor of 95% and 85% is not relevant? What do you think the ROIs and margins are for nuclear investments? Considering that nuclear costs are almost completely dominated by the upfront investment, I'd be surprised if that difference were not the difference between comfortably making a profit on your investment and losing a large amount of money.
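A back-of-the-envelope sketch of that argument, in Python. Every figure here (plant size, overnight cost, fixed opex, power price) is a hypothetical assumption chosen only to show the mechanism, not real project data:

```python
# Hypothetical sketch: how a swing in capacity factor hits the return on an
# investment dominated by upfront cost. All figures are illustrative assumptions.
HOURS_PER_YEAR = 8760
CAPACITY_MW = 1000                  # assumed plant size
CAPEX = 8_000_000_000               # assumed overnight cost, $
FIXED_OPEX_PER_YEAR = 200_000_000   # assumed $/year, roughly independent of output
PRICE_PER_MWH = 70                  # assumed average wholesale price, $/MWh

def annual_margin(capacity_factor: float) -> float:
    """Revenue minus fixed costs for one year (variable costs ignored)."""
    revenue = CAPACITY_MW * HOURS_PER_YEAR * capacity_factor * PRICE_PER_MWH
    return revenue - FIXED_OPEX_PER_YEAR

for cf in (0.95, 0.85):
    margin = annual_margin(cf)
    print(f"CF {cf:.0%}: margin ${margin / 1e6:.0f}M/yr, "
          f"simple payback {CAPEX / margin:.1f} years")
```

On assumptions like these, dropping the capacity factor from 95% to 85% cuts the annual margin by roughly 16% and stretches the simple payback from about 21 to about 25 years; because the costs are almost all fixed, that kind of swing is plausibly what separates a marginal project that pays off from one that doesn't.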



