TFA says that heat flux out of the supervolcano is the main mitigating factor in build-up of explosive potential inside the caldera.
Therefore, increasing heat flux by 35% would (we think) prevent Yellowstone, for example, from ever erupting again, based on current heat influx and the perceived rate of eruptions (n=3).
So, inject more water to cool it more quickly, and, bonus, use the hot water to create power at $0.1/kWh, up to 20 GW.
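As a rough, hedged sketch of what those numbers would imply (assuming the 20 GW were all salable electric output, which may not be what TFA means; the capacity factor below is my guess):

    # Back-of-the-envelope revenue, using TFA's $0.10/kWh and 20 GW figures.
    # Assumes the full 20 GW is electric output; capacity factor is a guess.
    power_gw = 20
    price_usd_per_kwh = 0.10
    capacity_factor = 0.9          # assumption: geothermal runs near-baseload
    hours_per_year = 8766
    kwh_per_year = power_gw * 1e6 * capacity_factor * hours_per_year
    print(f"~${kwh_per_year * price_usd_per_kwh / 1e9:.0f}B per year")   # roughly $16B/yr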
What's terrifying is how quickly we're drying up the water elsewhere in the country, and the existing water in Yellowstone is the primary method of cooling the magma [tfa]. If we dry it out, it's boom time a lot sooner.
They have a whole section on this, but I'm fairly certain this is the next environmental disaster movie plot: increasing water use dries out the only heat sink preventing a volcano that destroys life in the Western Hemisphere and causes a near-extinction of humans everywhere else.
That is very interesting. I live in Portland, Oregon. Recently we were driving out to a park, looked over, and saw a mountain we had never seen before. It literally took us 5 minutes, looking at Google Maps, to realize it was Mount Hood, a mountain we had seen hundreds of times before. But we didn't recognize it because it was snow free. In 21 years we'd never seen it without its cap of snow and ice, and so we couldn't recognize it. It was shocking, to say the least, and it really hit home what is going on with water in the region.
As a side tangent, Oregon is a part of the world where volcanoes do just sort of spring up practically overnight, geologically speaking. Mt Bachelor was formed between 18,000 and 8,000 years ago, according to Wikipedia.
They can also disappear literally overnight. Mt Mazama erupted about 7,700 years ago and is now Crater Lake. There were a lot of eruptions around Newberry caldera around that same time, and there have been eruptions as recently as about 1,300 years ago.
Good point. Also worth noting that that was comparatively mild; according to Wikipedia, St. Helens ejected a little more than a cubic mile of material, whereas Mt. Mazama is estimated to have ejected about 27 cubic miles of material. It's hard to imagine that kind of explosion.
Mt. Mazama seems to have been a bit of an outlier, though:
> The United States Geological Survey has referred to the Mazama eruption of 7,700 years ago as the largest explosive eruption within the Cascades in the past million years, and one of the largest eruptions during the Holocene epoch.
Seems kind of improbable that the biggest anything happened in the most recent 0.77% of the observed time interval, which makes me wonder if there may have been others we just don't know about. Improbable events do happen though, so maybe it's just dumb luck (if we want to call it that) that we had such an event within relatively recent history.
> Seems kind of improbable that the biggest anything happened in the most recent 0.77% of the observed time interval,
The biggest has to happen sometime, and whenever it does will be an improbable time for it to have occurred if the probability is uniform and the resolution of time tracked is high relative to the total interval.
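A quick Monte Carlo of that point, as a sketch under the assumption that eruption sizes are independent of their timing (the eruption count per interval is a placeholder, not a real Cascades statistic):

    import random
    # If size and timing are independent, the largest of N eruptions is equally
    # likely to be any of them, so it lands in the most recent 0.77% of the
    # interval with probability ~0.77% (about 1 in 130).
    N, TRIALS, WINDOW = 100, 100_000, 0.0077
    hits = 0
    for _ in range(TRIALS):
        events = [(random.random(), random.random()) for _ in range(N)]  # (time, size)
        time_of_biggest = max(events, key=lambda e: e[1])[0]
        hits += time_of_biggest > 1 - WINDOW
    print(hits / TRIALS)   # ~0.0077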
Only 20 GW thermal? In my head, nuclear power plants emit roughly 4 GW thermal, which is converted to 1 GW of electric energy, so cooling Yellowstone seems well within our technological capabilities... with open-cycle water cooling.
To save water, I wonder if you could use another system like air shafts or a closed-cycle water system, similar to how nuclear power plants have primary and secondary coolant loops -- the primary loops being closed and the secondary loops being "mostly closed" or having some other mechanism for discarding heat.
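For scale, a hedged conversion using the reactor ratio mentioned above (the efficiencies are assumptions; real geothermal plants at these temperatures could do worse):

    # Rough electric output from 20 GW(thermal) at assumed steam-cycle efficiencies.
    thermal_gw = 20
    for efficiency in (0.25, 0.33):    # the 4:1 ratio above, and a typical ~1/3
        print(f"{efficiency:.0%} efficiency -> ~{thermal_gw * efficiency:.1f} GW electric")
    # i.e. roughly 5-7 GW electric, and a thermal load comparable to ~5 reactors
    # at the 4 GW(thermal) figure mentioned above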
Yes, the cooling aspect is what TFA says is doable. It's the heat transmission / drilling part that's hard (it's 400 C or hotter and 1+ km down).
They have some reasonable solutions. We'll never do it. Environmental conservation groups would see it as a capitalist cash grab, and the right would see it as a waste of money and energy based on alarmist scientists getting too much press. And anyway, we'll drain all the water out of the west coast in 50 years and there'll be nothing left to set up a pump / capture plant.
Water in the Yellowstone ecosystem comes from local snowpack -- the area contains the headwaters for the Yellowstone and Snake rivers and so is a net exporter of water.
I worked in Brian Wilcox's section, at JPL. He reviewed a couple of my concepts and of course dismissed them as silly.
My impression of him as a classic polymath is reinforced by the fact that he wrote seminal texts on both volcano defusing and rover wheel diameter optimization, both of which have come up in recent discussions.
He has the signature "first principles" approach of using mostly physics with some clever choices of approach to make it work out nicely.
The year 1816 was known as "The Year Without A Summer" because it was 3°C cooler (in France) than normal. Crops failed, and there was much misery throughout the world.
Scientists later determined that the 1816 temperature anomaly was caused by volcanic eruptions in 1815 and 1814, which sprayed volcanic ash and sulfuric acid into the atmosphere, thereby shading the oceans and preventing them from storing the usual amount of heat over those two years.
Our modern civilization is more vulnerable to an average once-a-millennium volcanic eruption than it's ever been, on account of our numbers and our dependency on supply chains for our sustenance.
From the linked article: "In fact, the U.N. Food and Agriculture Organization estimated the 2012 worldwide food storage to last for 74 days." (pg. 9)
Grains are useful because they store well. Storage is not economical, but it is essential for reducing the suffering which would result from an 1816-like scenario in our modern world.
Edit: the 1816 anomaly is classified as a volcanic winter [1]. Solar minimums have also resulted in years/decades with cooler summers than normal.
> Our modern civilization is more vulnerable to an average once-a-millennium volcanic eruption than it's ever been, on account of our numbers and our dependency on supply chains for our sustenance.
I think it's more robust than ever, first because of our ability to ship food around the world. Second, most of our food is grown to be eaten by animals that we then eat; if we ate that food directly, we could feed far more people than we do through the meat. So if such an event reduced capacity, unless it's too extreme, we'd still have lots of food left.
To reassign crops for feedstock to human consumption is not as easy as it sounds. If Covid-19 has taught us anything, it is that our global food supply chain is a complex beast.
Also, just a reminder to those lucky enough to survive this pandemic: a marginal increase in food prices (plus inflation) due to supply chain disruption might leave poor people with less food on their tables.
> To reassign crops for feedstock to human consumption is not as easy as it sounds.
It is done all the time. In dry years, when grain prices increase, meat producers rush to sell their cattle and the extra grain does get into people's diets. The reverse happens all the time too.
On the consumer side, a change in food prices leads first to a change in meat consumption, which couples nicely with the production side. Of course, any price change pushes some people over the starvation line (in one direction or the other), but this is something societies can fight.
It's just not done on the scale that a supervolcano would imply.
> To reassign crops for feedstock to human consumption is not as easy as it sounds.
This is correct, but feedstock type foods are easier to store and re-route than other food with a shorter shelf life. Over the pandemic I never had an issue finding basics like flour or potatoes. Foods that became scarce were meat, dairy and highly-processed foods.
> Over the pandemic I never had an issue finding basics like flour
Flour was almost impossible to buy early in the pandemic, in the US at least. It seemed like more of a demand-side issue, as everyone (including me) started baking bread at home. It took months for stores to have full stocks of flour again.
One pound bags of flour coming to grocery stores through the grocery supply chains were in high demand, because previously almost no one bought flour on that channel. Meanwhile, bakeries kept on trucking along, AFAICT, with an uninterrupted supply of 50lb bags from the non-retail supply chain.
At least in the midwest that was just the retail small bags of flour. I noticed the same as other comments - none of the restaurant supply stores around me were ever out of 50lb bags of flour, which is what I ended up buying. Yeast was a little more scarce however, even the bulk bags.
Of course it's hard, but note that the volcanic eruptions happened a year before the climate effects, so there's a year to prepare. Ideally you plant human-edible food in the first place. As for "those lucky enough to survive this pandemic": most people didn't die. Only a small minority actually died from Covid. It's been significantly more deaths than usual, which put a lot of strain on the healthcare system, but it's not been something like the plague.
Corn raised for feed is not the same as corn raised for human food. You could probably eat it after some form of processing. But judging by our slow response to the pandemic, I doubt we could pivot fast enough before it rotted.
Sweet corn (what we eat) is usually cooked but is edible raw. Feed corn is tough as nails. Field corn is more starch than sugar and has a much lower moisture content. Almost all corn is field corn, more than 95% of the total crop. You can eat field corn, but it's about as appealing as hardtack.
Does the example of the world's recent adaptations to COVID-19 give you confidence that "we" would make the necessary adjustments to redirect livestock feeds to humans in an emergency?
I doubt that for any famine of the last 200 years, including those that killed tens of millions, the problem has been, "not enough food in the world". The problem has been, "failed institutions don't put the food where it's most needed".
Vaccines are even easier to ship than food. Are they getting to where they're most needed?
There were three periods in the COVID-19 response. The initial response was limited, because people were still unsure whether the virus was a serious threat. Then, in March 2020, we saw determined efforts to stop the virus that were unprecedented in speed and scale. People were playing it safe, because it was obvious that the threat was serious, but nobody knew how serious it would be. Around May 2020, it became clear that the threat was not that serious after all. The focus then switched to the cost-effectiveness of the response from various points of view.
The worst case for COVID-19 was maybe 1% of the population dead, 10% with serious long-term health consequences, a global depression, and a handful of coups, revolutions, and civil wars. It would have been bad, but only on the level we should expect a few times in a century anyway. It was serious enough to warrant a large-scale response but not serious enough to mobilize the entire society.
Unless anyone pulls a Stalin and intentionally mismanages things to starve people we'll probably be fine. Modern communications and commerce channels should let us route around stupidity unless we're prevented from doing so.
If someone does pull a Stalin at any meaningful scale that's probably not gonna be possible to keep on the down low and heads will roll, likely literally.
Note: "a supervolcanic eruption
has not occurred in the Holocene (past approx. 12,000 years) and, therefore, modern
human civilizations have not witnessed such an eruption."
This seems to be from 2015. That's not to say that it's out of date but when it starts with "There has been a significant effort over the last two decades", it's good to know which "two decades" it's talking about.
Often it's the reaction to external threats which proves decisive. A rotten civilisation doubles down on its rotten values, fearing to try something effective.
But in the case of supervolcanoes our recently improved ability to grow food in vertical farms under LED light might be of great help. Ideally backed by nuclear fusion since solar would be less effective under the attenuated sunlight. Obviously, we don't have that power source yet, but, hey, there are a dozen high quality fusion projects out there, what are the chances they'll all fail?
There may be trouble ahead / But while there's starlight and music and truth and romance / Embrace the future and dance!
I'm a geologist who has worked a little in Enhanced Geothermal Systems and (independently) in volcanology, though I am a specialist in neither (I mostly work on earthquakes and probabilistic seismic hazard analysis). I think this paper is quite interesting, but the caveats, though only briefly noted, are actually enormous.
A key one is here: "Beyond human intervention, huge pulses of heat energy into the magma chamber may at times precipitate eruptions, with brief periods where the heat flux is so large that engineering solutions would be impractical. If heat flow were sufficiently massive then it may be impossible to mitigate supervolcanic eruptions."
From what we know about better-studied volcanic systems (which are volcanic arc systems like the Cascades in the US or Mount Fuji in Japan), the heat flux into the upper magma chambers comes in the form of episodic injections of extremely hot basalt from the deeper magma plumbing systems (5-20? km depth). The size of these injections likely scales with the size of the volcanic system, so for a mantle plume they may be enormous. The heat flux is not steady state, and the shallow crust / upper magma chamber does not just heat up linearly with time until the eruption threshold is reached. We definitely have no control over the deep basalt plumbing systems and can't drill anywhere near that deep and hot (Deepwater Horizon, which by most metrics is the deepest well drilled, reached 12 km depth in cold crust, not getting above 120 deg C [1]), and in the Yellowstone area those depths would almost certainly be well above 500 C (perhaps 900-1100 C).
Additionally, the failure mode for hydrothermal cooling of the volcano is really bad. Explosive eruptions are basically triggered by the release of volatiles in magmatic systems (expanding gases in a magma froth as H2O and CO2 come out of solution). As is stated a few times, the increase in temperature with depth above the magma chamber basically follows the increase in boiling point of water, so the system is essentially on a critical threshold. Perturbations to this can be quite bad because any volatilization of fluids in the system can cause a small phreatic (partially hydrothermal) eruption that could remove some shallow mass, which decreases the confining pressure at deeper depths and causes further volatilization due to decompression, leading to a runaway cascade of larger and larger eruptions culminating in a major magma-chamber-emptying eruption.
They mention this here: "in any realistic system there is a possibility that, as we artificially extract heat energy out of the magma chamber, we could cause phase changes (e.g. volatiles coming out of solution) that would reduce the overall density causing expansion and cracking in the overburden, possibly opening a channel to the surface and precipitating an eruption."
Now also consider that n=3 over 2.1 million years. If this is a Poisson process, then the probability of an eruption in any given year is ~1.4e-6, or about 1 in 700,000. Though the most recent 3 events are quasiperiodic (mean 0.7 M years apart, stdev 0.075 M years), before these 3 eruptions the eruptive center was in a different spot farther to the southwest, and there was a 2.3 M year quiescence between events [2]. I don't think there is any clarity over whether the next major eruption would occur in the same magma chamber as the previous several events or would form to the northeast towards Red Lodge, MT. So even in a periodic, geographically stationary system, which is sort of the most dangerous scenario, we could say that we're 90% of the way to the next eruption, which therefore has an annual chance of happening of roughly 1 in 70,000.
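A minimal sketch of that arithmetic (both estimates are just the back-of-the-envelope assumptions above, not a real hazard model):

    # Annual eruption probability under the two toy models described above.
    poisson_rate = 3 / 2.1e6                  # 3 caldera-forming eruptions in 2.1 Myr
    print(f"Poisson: ~1 in {1 / poisson_rate:,.0f} per year")         # ~1 in 700,000
    # Quasiperiodic: ~0.7 Myr mean recurrence, ~90% elapsed -> ~70 kyr remaining
    remaining_years = 0.1 * 0.7e6
    print(f"Quasiperiodic: ~1 in {remaining_years:,.0f} per year")    # ~1 in 70,000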
I think that any sort of mitigation effort is still going to be trying to address a rather low-probability event with uncertain technology, possibly in the wrong geographical location, and could trigger the very event we are trying to forestall -- all at a cost of billions (pocket change relative to the costs of an eruption, sure) and the major modification of the first and probably most famous national park (and therefore the permanent tarring of otherwise promising geothermal technology in the eyes of environmentalists). Not sure what the value proposition of all of this is; maybe our efforts could be spent elsewhere...
> Additionally, the failure mode for hydrothermal cooling of the volcano is really bad. Explosive eruptions are basically triggered by the release of volatiles in magmatic systems (expanding gases in a magma froth as H2O and CO2 come out of solution). As is stated a few times, the increase in temperature with depth above the magma chamber basically follows the increase in boiling point of water, so the system is essentially on a critical threshold. Perturbations to this can be quite bad because any volatilization of fluids in the system can cause a small phreatic (partially hydrothermal) eruption that could remove some shallow mass, which decreases the confining pressure at deeper depths and causes further volatilization due to decompression, leading to a runaway cascade of larger and larger eruptions culminating in a major magma-chamber-emptying eruption.
Why not just drill further away then, along the periphery? Shouldn't any additional geothermal exploitation necessarily help cool the thing? Yellowstone is a growing fever, and if it's not safe to put coldpacks on the forehead we may as well put them on the arms and legs.
Fascinating to see some of the research put toward inward-looking events like supervolcanoes. Part of me wonders why bother since the Yellowstone one has such a low chance of erupting, but then again, same can be said about an asteroid impact, and that seems like it has slightly higher awareness in the public sphere.
Also, is there a reason why this is an effort from NASA, and not, say, the USGS?
The best idea that comes to my mind is to go vegetarian and to learn to distribute food better.
We are already producing way more food than people need. If we just stopped feeding it to cows, pigs and chickens and readjusted production toward a more vegetarian diet, there would be plenty to go around.
You mean a mycophile diet? Mushrooms will save humanity. High protein, don't need the sun for energy, live off of refuse, can remediate waste, some have medicinal properties, etc...
There ought to be serious R&D toward ways to survive a global agriculture outage at scale (supervolcanoes aren't the only threat) -- maybe mushrooms could save humanity then, I don't know.
If crops fail, it's going to depend on what crops fail. We won't have the option of radically reapportioning our crops on the fly; while a pre-emptive move to vegetarianism might help, that's a radical solution for a problem that may happen every 20,000 years.
So, consider what happens. If crops like alfalfa end up more resilient, we probably won't eat those past their sprout stage. However, if they can feed rabbits, they may be incredibly valuable as an option.
That said, we would absolutely be looking to prioritize vegetables that can be processed for either human or animal consumption, even if they're not particularly palatable. I think the real problem, though, is the calories that would be available from such crops.
I think it's fair to say that if the food dries up, we would cull herds early and prioritize the most efficient food sources. We would likely be eating much less meat, if at all. We would not have room to be picky with our food.
> We would likely be eating much less meat, if at all.
In the short term, we would probably end up eating a lot more meat -- most of the hundreds of millions of cows, pigs and chickens we could cull wouldn't be slaughtered at the optimal time, being too young or too old, but we'd still be insane not to preserve as much of the meat as possible.
> We won't have the option of radically reapportioning our crops on the fly...
We can't turn alfalfa into wheat or rice, nor grow rice or wheat in all places we now have pastures, but we will have the option to redirect the corn and soybeans we use for livestock feed. These feeds are produced to lower standards -- for instance, there are higher limits on residual pesticides -- but in a catastrophe it's better than nothing.
I wonder if we have the milling capacity on hand to convert feed maize to corn meal for a one- or two-year crop shortage. It needs to be milled and nixtamalized before it's fit for human consumption.
It might be more efficient to distribute it as feed and just raise a bunch more egg layers in suburban lots. Probably faster than trying to stand up a bunch of mills right quick, too.
The above poster asked you to clarify your point in good faith and this is how you respond? Clearly he was asking for details about how vegetarianism would produce a more robust food supply.
It would be best to approach the supervolcano problem with defense in depth.
Increasing food supply production, efficiency, and resiliency are commendable goals in themselves and should be pursued in conjunction with figuring out how to defuse the supervolcano.
Bitcoin mining has been suggested as the solution for incentivizing the effort without massive government subsidies. It might be economically impossible to fund the effort if there is not enough electricity demand in the local area, resulting in huge costs without returns. I.e., Bitcoin mining could be more profitable than selling electricity to the local grid, and could make the whole effort actually profitable.
The heat is waste. The problem is how to make profit from it to finance the project, not to make moral judgements about the use of electricity. You could build energy-intensive data centers too, if that makes you feel better.
It's not a moral judgment, it's an economic judgment.
Mining bitcoins costs lots of economic value (energy, hardware, personnel) and produces virtually none. It's extremely uneconomical in the purest sense of the term.
Economic value is not an objective phenomenon. It isn't like mass or energy or charge or fluid flow. It follows no conservation laws.
It's a subjective, often ephemeral phenomenon that emerges from the interactions of buyers and sellers. When I say "x has value y" all I'm saying is that in current market conditions, some human would be willing to sell x at price y and another human would be willing to buy x at price y.
So your statement that bitcoin mining is bad because it "costs value" and doesn't "produce value" is not really meaningful. And even if it were, the simple fact is that bitcoin mining is often profitable, depending on the cost of watts and ASICs.
Economic value is rooted in both perceived and real, non-negotiable needs (water, shelter, food, etc).
Economic idealists tend to skip over the latter, probably because they have never faced poverty.
Rather than unscrambling numbers in pursuit of a diseased fantasy, compute could be put to work solving complex allocation problems, simulations, training, etc. to advance economic efficiency. For food production & distribution, water management, natural disaster mitigation, housing allocation, etc.
Good money is a prerequisite for a functioning economy. The majority of the world's countries don't have access to good money, and that's why Bitcoin is really important. It really does provide water, shelter and food to these people.
Mining produces blocks, and miners are rewarded with bitcoin, which have a certain positive value because people want to buy them. The operation generates more value than it consumes, and therefore it is economical.
I bet any argument you could give for that statement could be equally applied to dollars.
Bitcoin is valued because it functions as an [imperfect] form of money. People value it (aka are willing to exchange their dollars for it) because it lets them transact with other people in a way that is, barring massive computational power, immutable and uncensorable, and because it was the first system to do this and therefore has the longest history of transactions. There are plenty of valid criticisms of bitcoin, but it would be idiotic to deny that a currency which functions in this way would be valueless or not in demand. Some people are certainly fools, but nobody has to be a fool in order to explain a world in which bitcoin commands a high price.
What would happen if nobody wanted to buy or receive BTC tomorrow? Some people would be very angry.
Now imagine what would happen if the dollar ceased to have any value? Unimaginable, isn't it? Economies would collapse, the whole world would be in disarray.
Economies wouldn't collapse because people can switch to bitcoin. In free markets, people always try to switch to a stronger currency when their own currency loses value. Bitcoin is volatile, but it's still better than a currency that loses value in the long term.
Recently, the president of Turkey said that they are at war with Bitcoin. This is exactly what happens when cronies manipulate the money supply for their own benefit but can't do anything to stop people from choosing a better currency. The cronies get demonetized and finally they lose power. The same thing is happening in many African countries.
We know that in a free market, people tend to choose money that has certain properties. Historically, gold has been the best form of money, and it has been chosen by a free market, not by legislation. The most important property of Bitcoin is that it can't be debased arbitrarily, which is similar to gold. Bitcoin can be also easily stored and transferred, which makes it better than gold.
So, as long as people keep choosing to use Bitcoin rather than something else, it will also gain or hold its value. It is not immune to losing value, but for that there would have to be an even better currency, or it would have to be banned. Both of these options might be impossible.
I don't think banning it is that implausible. The energy costs of transferring it are not trivial, making small transfers impractical. That cost also makes it politically interesting to suppress it.
The metric of energy per transaction doesn't actually make sense, because there can be an almost unlimited number of layer 2 transactions, whether they're on the Lightning Network [0] or something else. Most Bitcoin transactions already happen on the Lightning Network. Also, the energy used in mining is mostly independent of the number of on-chain transactions.
The energy is not used to append new transactions, but to keep the complete transaction history objectively immutable, i.e. the network is designed to defend digital property rights with energy.
I've been out of the btc loop since around 2016 but I remember a lot of people were very negative about lightning network back then. The main contentions were that it was not actually trustless and it would create a semi-centralized hierarchy of third parties rather than being p2p, leading to big bank-like actors that could veto your transactions.
Has this changed significantly? Just asking because you seem knowledgeable.
I think most of that talk was just speculation coming from the Bitcoin Cash camp. It's an open and permissionless network, so it's always possible to route around bad actors, for example if someone tries to censor transactions. Transactions are onion-routed anonymously, so individual nodes don't see where a transaction originally comes from or where it is going.
LN is trustless when you run your own node. When using a third-party node you have to trust the wallet/node provider to some extent.
Thanks. Yeah a lot of what the BCH people were saying at the time made sense to me, the fact that BCH flopped notwithstanding. I'm always suspicious of attempts to transact crypto in a way that isn't "on-chain" because it just intuitively seems like a way for our existing financial/legal power structures to undermine the ultimate authority of the blockchain. I'll have to do some reading on how LN actually works in current year, onion routing sounds like a great idea assuming that, unlike tor, it would be cost-prohibitive for one actor to just run 80% of the nodes and deanonymize you.