This reminds me of the desirability of pre-WWII battleship steel in particle physics experiments. Because of the detection sensitivity required, these experiments have to be built from materials with as little background radiation as possible, so that the materials don't mask the actual signal of interest. Ever since the first atomic bombs were detonated, sufficiently low-radioactivity steel has been much harder to find. A large portion of the available low-background steel supply comes from battleships built before the bombs [1].
Note that there is a similar particle-physics demand for lead from Roman-Empire warships ("ancient lead"). But in that case, it doesn't have anything to do with anthropogenic radiation. Rather, lead in the ground is constantly kept slightly radioactive by ongoing background decays. But lead that has been pulled out of the ground and allowed to sit unmolested for a thousand years will have had its residual radioactivity decay away.
I saw a talk on one of these experiments while I was a grad student. Apparently their lead source had "historical significance" according to the government of France, and getting it out of the country required bureaucratic maneuvers of questionable legality.
The real shame is that ancient lead isn't the only alternative, just the cheapest. Lead could be spun in centrifuges and then recast to remove such contaminants. This has been done with silicon, in that case to remove unwanted isotopes. It is a wildly expensive process, but at least it is a process that could produce an infinite amount of product, unlike ancient lead, which is in finite supply.
But what compound? PbH4 isn't stable. PbF4 is barely stable. Maybe Pb(CH3)4? Naively, I'd guess you'd get even better mass resolution: the molecular mass of UF6 is ~352, vs ~267 for Pb(CH3)4. Good use for all those leftover centrifuges ;)
You're getting downvoted because you're nitpicking vocabulary. Most people understood "infinite amount of product" to mean "functionally infinite, given the low demand and large supply."
The presence of cesium-137 and strontium-90 has also been used to reveal art forgeries. Before atomic bombs were tested, these isotopes were essentially absent from nature. Art produced before 1945 will not contain these isotopes, while art produced after that time usually will.
How does this work? The old art still exists but how does it avoid exposure? Why is exposure only relevant at the time of creation? Will old paint used in modern times display the isotope?
The issue is usually in contamination of raw materials.
Similar to carbon-14 dating - the isotopic ratio of carbon in the atmosphere is maintained by cosmic radiation hitting the carbon-12 and carbon-13 atoms, but as soon as the carbon is incorporated into solid matter it ceases to be part of the process that renews the carbon-14 levels.
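The decay arithmetic behind C-14 dating is simple enough to sketch (a toy illustration; 5,730 years is the commonly used half-life):

```python
import math

C14_HALF_LIFE_YEARS = 5730  # commonly used half-life of carbon-14

def c14_age(fraction_remaining: float) -> float:
    """Years since the sample stopped exchanging carbon with the
    atmosphere, given the fraction of its original C-14 still present."""
    # N/N0 = 2^(-t / t_half)  =>  t = t_half * log2(N0 / N)
    return C14_HALF_LIFE_YEARS * math.log2(1.0 / fraction_remaining)

print(round(c14_age(0.5)))   # one half-life: 5730
print(round(c14_age(0.25)))  # two half-lives: 11460
```

Bomb-era samples break this because the atmospheric baseline (the N0 above) itself jumped, which is exactly the "everything after 1950 is present date" problem mentioned elsewhere in the thread.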
And one source for this steel is courtesy of Admiral von Reuter, who managed to scuttle his fleet, interned under British guard in the Orkney archipelago, in June 1919. Better to sink than serve the enemy, right?
Another interesting story involving the scuttling of a ship is the Battle of the River Plate (https://en.wikipedia.org/wiki/Battle_of_the_River_Plate). The German ship, the Admiral Graf Spee, was forced into a neutral port due to damage. Rather than be forced out to face an overwhelming force they were certain to lose against, Captain Langsdorff instead scuttled the ship and committed suicide.
I couldn't find a straight answer to this the last time I looked, but it sounds to me like this is more a matter of cost than an actual hard need for pre-WWII steel.
We could mine ore that hasn't been exposed to the atmosphere since then, and use air (either purified or found in some air pockets underground) for the furnace behind an airlock.
This would be much more expensive than just finding some pre-WWII steel (there's a lot of that around), but I don't think it's the case that if we somehow didn't have it we'd be screwed.
You should draw the opposite conclusion. The fact that atoms are so tiny that almost every notable event has a detectable effect on isotope ratios means you should downgrade the meaningfulness of the OP article. (See my other comment on ancient lead, which has nothing to do with nuclear tech.) Likewise, when seismic technology improves and we can sense explosions from farther away, we don't conclude that building demolition is somehow more important than we previously thought.
Yeah, a similar observation is that there are more atoms in a glass of water than glasses of water in all the oceans. So, dumping a glass of (somehow trackable) water into the ocean would eventually cause every glass of water to contain atoms from that glass. See also: http://samkean.com/books/caesars-last-breath/
Or it takes 10 years for one of your breaths to be spread around the atmosphere, and everyone is very likely to be breathing at least one molecule of that breath in every breath they take.
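The glass-vs-ocean claim survives a back-of-the-envelope check (glass size and ocean volume are rough assumed figures):

```python
AVOGADRO = 6.022e23             # molecules per mole
GLASS_LITERS = 0.25             # assume a 250 ml glass
WATER_MOLAR_MASS_G = 18.015     # g/mol for H2O
OCEAN_VOLUME_LITERS = 1.335e21  # ~1.335 billion km^3 of ocean

# Molecules in one glass: volume -> grams -> moles -> molecules
molecules_per_glass = (GLASS_LITERS * 1000 / WATER_MOLAR_MASS_G) * AVOGADRO
glasses_in_ocean = OCEAN_VOLUME_LITERS / GLASS_LITERS

print(f"{molecules_per_glass:.1e}")  # ~8.4e24
print(f"{glasses_in_ocean:.1e}")     # ~5.3e21
# Each glass holds ~1500x more molecules than there are glasses of ocean.
```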
Not necessarily a bad practice. Most of the graphs I've seen that use this technique do so to make the changes in value visible, not to deceive the viewer.
Doubling is pretty significant, as is the rate of increase prior to, and dropoff after, 1963.
If some sort of catastrophe befalls our current civilization and future archaeologists attempt to C14 date our fossils and artifacts derived from carbon sequestered during this era, their results will be wildly unreliable.
There are many important changes that are hard to see on a 0-based graph.
If you plot human body temperature in kelvins, the difference between someone with hypothermia and someone with a raging fever is relatively slight (under 2%).
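That "under 2%" figure checks out if you take, say, mild hypothermia at 35 °C and a high fever at 40 °C (thresholds assumed here for illustration):

```python
def celsius_to_kelvin(c: float) -> float:
    return c + 273.15

hypothermia_k = celsius_to_kelvin(35.0)  # mild hypothermia threshold
fever_k = celsius_to_kelvin(40.0)        # a dangerously high fever

spread_pct = 100 * (fever_k - hypothermia_k) / hypothermia_k
print(f"{spread_pct:.2f}%")  # ~1.62%: nearly invisible on a 0-based Kelvin axis
```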
0 is only relevant for linear utility/impact functions. The effect of an isotope isn't necessarily linear in its magnitude from 0. The baseline should be something that matters, like a maximum safe dose.
If you have a baseline and a scale that indicates an actual effect, then by all means use it. That’s not the case here, nor has it been the case for any other such graph I can remember seeing. Universally, the bottom is chosen as the minimum value in the data being plotted.
Only if you’re trying to show the CMB and galaxies in the same image. Since doing so would be misleading as to the brightness of the CMB, that seems like a feature, not a bug.
No, if you insist on a uniform colormap that includes 0K (or whatever measurement you're showing the CMB with) then there's absolutely no way to make the CMB visible. The average temperature is 2.72548 K with variations on the order of 0.00057 K. Its incredible uniformity is one of its more important features.
A paleontologist friend of mine often quips that in his field, everything from the 1950s onward is considered “present date”, because the additional radiation pretty much ruined any chance of accurate carbon dating after the atmospheric testing era.
Granted, paleontology looks back in the span of millions to billions of years, so those ancient fossils are ok, but I imagine future scientists will grind their teeth an awful lot when trying to make scientific sense of our present day.
Just about every living thing that has lived since 1945 has had traces of the atomic age within its body, in the form of isotopes that were far less common prior to atmospheric nuclear tests.
This struck me as very surprising. Straight Dope's take[1]:
> While coal-burning power plants are responsible for producing most of the sulfur dioxide out there (and thus acid rain), they don’t contribute that much of the compounds that actually cause silver to tarnish, namely hydrogen sulfide—best known as a key player in the smell of rotten eggs and flatulence—and the similarly pungent carbonyl sulfide. About 90 percent of the hydrogen sulfide and more than two-thirds of the carbonyl sulfide in our atmosphere come from (you guessed it) volcanoes, salt marshland, undersea vents, and other natural sources.
The process of creating steel requires heating it with coke in a large furnace. A large volume of atmospheric air is pumped into the furnace, and it is this atmospheric air that introduces the radioactive particles to the mix.
There are vacuum furnaces that lack this atmospheric air, but their process is much more expensive.
> From 1856 until the mid 20th century, steel was produced in the Bessemer process where air was forced into Bessemer converters converting the pig iron into steel. By the mid-20th century, many steelworks had switched to the BOS process which uses pure oxygen instead of air. However as both processes use atmospheric gas, they are susceptible to contamination from airborne particulates. Present-day air carries radionuclides, such as cobalt-60, which are deposited into the steel giving it a weak radioactive signature.
Forgive my ignorance, as I really don't know anything about this, but would it not be possible, when using post-WWII steel, to take before and after readings and diff them?
When they say “bathed in radioactive cloud” they actually mean “wind carried trace amounts of radioactive materials”. Anytime anything “nuclear” is involved, the reporting gets very poetic.
Particularly visible in the graph of radioactivity from various times. Fukushima isn't even a visible blip - it's just that our tools are unbelievably sensitive.
Yeah, I wish they would point out that the pollution from China regularly swamps the radiation detectors in the US that are monitoring Fukushima.
"Bathed in a radioactive cloud" actually holds more for Chinese coal pollution than anything from a nuclear plant with the possible exception of Chernobyl.
According to those sources, there's a major difference between radioactive sulfur (from the nuclear plant) and sulfur dioxide (from coal plants).
Apples vs oranges.
Sunlight does effectively produce radiation burns in your skin, however. And that definitely causes cancer. If we treated sunlight the way we treat barely detectable nuclear radiation, we’d never go outside and we’d use blackout curtains.
Just want to point out that low level radiation is probably good for you. It's cool that they found it, but this is not any kind of health danger.
Low level radiation probably acts as a beneficial stressor, like exercise or fasting [1]. Although this level is so low it probably does nothing at all.
You also probably don't have to worry about the radiation you get from bananas or plane flights. And it's possible those dumb sounding radioactive water spas might have actually been helpful.
Fukushima is awful for the nearby region and still a cautionary tale. But it didn't poison the whole world.
Here's a very straightforward study, "Evidence That Lifelong Low Dose Rates of Ionizing Radiation Increase Lifespan in Long- and Short-Lived Dogs" [1].
Two studies on beagles:
"One exposed the dogs to whole-body cobalt-60 γ-radiation."
"The other evaluated dogs whose lungs were exposed to α-particle radiation from plutonium."
For both studies, excess radiation improved their lifespans by 20-50 percent. It is a substantial beneficial effect. Above a beneficial level of radiation their lifespans shortened to the level of dogs who were not exposed at all and then to substantial reductions in lifespans.
The lack of public trust in science is a real problem, and mostly unjustified or misdirected. That said, nothing reinforces the evil scientist image like irradiating puppies.
I don’t know, given p-hacking and reproducibility problems, science “journalism” that tends to report the results of single experiments as confirmed findings, and pay-gating of research behind for-profit journals, the scientific establishment does have some credibility challenges atm.
It is pretty dark. There's even a cute beagle picture in a graph. Apparently beagles have been the preferred model animal for radiation studies going back to the 1950s.
Then you do a controlled blind study on extending the lifespans of humans eating bacon from irradiated pigs. Wastage stopped, science advanced, lifespans extended (maybe).
Back in the early 20th century, American doctors lobbied very hard to ensure that the radium-coated water dispensers being sold did indeed contain enough radioactive material to provide their claimed "health" benefits.
I just got back from vacation in Germany, and while I was doing research for the trip I found a spa near one of the places I was staying that does radon therapy. From what I could gather on the web site, you sit for multiple sessions in a room with walls made of crushed radon-producing rock, reading, chatting with your neighbors, or having a snooze. Also, they ensure the radon levels comply with the German Spas Association's guidelines and you need a doctor's prescription to get treated.
I recall seeing a documentary about early glow in the dark wristwatches with hand painted radium dials. The painters would occasionally lick the brush tips to make a fine point. A large percentage of them died from terrible forms of lip, mouth and throat cancer.
It sort of reeks of unsoundness, doesn't it? Like saying CRISPR-Cas in moderation is possibly good for you.
Remember that one time when that massage felt like being kicked in the back by a boxer? The assumption behind low-level radiation being healthy goes like this:
We are creatures of a naturally radioactive environment. Actually, the environment was much more radioactive in the ancient past - which is why we search living fossils and extinct lineages for DNA-repair mechanisms, as those creatures would have had to be much more cancer-resistant back then. But I digress...
DNA damage - that is what radiation inflicts. Now, this is super obvious damage that kicks the repair mechanisms alive, and those mechanisms also fix a lot of other, not-so-obvious damage - or kill the cell.
> Fukushima is awful for the nearby region and still a cautionary tale. But it didn't poison the whole world.
I would be careful with that last statement because seafood from Japan is global.
Only time will tell the true impact of Fukushima's radioactive water.
"More than 1 million tons of radiation-laced water is already being kept on-site in an ever-expanding forest of hundreds of hulking steel tanks—and so far, there’s no plan to deal with them."[1]
12 US gallons = 100 lbs and 1 million tons is equal to 2.4 million gallons
"Tons of radioactive water" is a silly measurement. The water itself isn't radioactive, it has radioactive material mixed in with it. If you quadrupled the amount of water, you would have 4 million tons of radioactive water but it'd be the same amount of radioactivity, just more diluted.
It's even sillier because "radioactive water" isn't a measurement of the amount of radiation or a description of the decay process that reduces the danger over time
Just a correction on this, 1 million tons is equal to around 240 million gallons [0], which is roughly the volume of 362 olympic sized swimming pools[1].
A typical Olympic pool (assuming 50 m x 25 m x 2 m; the depth actually isn't standardized) contains 2500 m^3 of water. 1 m^3 of water has a mass of 1 t (in typical conditions).
So 10^6 tons of water corresponds to 400 olympic swimming pools. That's quite a lot of water (still not much compared to the volume of the oceans, of course).
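The gap between 362 and 400 pools mostly comes down to short vs metric tons (plus the assumed pool volume); with metric tons the arithmetic is clean:

```python
POOL_M3 = 50 * 25 * 2      # 2500 m^3, assuming a 2 m deep Olympic pool
M3_PER_METRIC_TON = 1.0    # 1 t of water occupies ~1 m^3
GAL_PER_M3 = 264.17        # US gallons per cubic metre

water_m3 = 1e6 * M3_PER_METRIC_TON
pools = water_m3 / POOL_M3
print(pools)                           # 400.0 pools
print(f"{water_m3 * GAL_PER_M3:.2e}")  # ~2.6e8 US gallons

# With short tons (907.2 kg each) you get ~240 million gallons instead,
# matching the parent comment's figure.
short_ton_m3 = 1e6 * 0.9072
print(f"{short_ton_m3 * GAL_PER_M3:.2e}")
```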
It's worse than that; > 100 tons of groundwater contaminated by the ongoing crisis still run to the ocean daily. [1] They've made progress in reducing the amount (it used to be hundreds of tons daily) but it's still a problem. Now there's talk of dumping what's been stored into the ocean anyways.
100 tons is roughly 100 cubic metres, or a cube 4.65 m on a side. In a year that's a cube roughly 33.2 m on a side. Given that the concern seems to be tritium, with a half-life of ~12 years, if you want to reduce the radiation to about 1/1000th you'd need 10 halvings, so ~120 such cubes. That's a lot, but if you were to build containment basins only one cube deep, that'd be a basin 365 meters on each side and 33.2 m deep. That's not nothing, but it's likely also excessive.
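A sketch of that arithmetic (tritium half-life and the one-cube-deep basin geometry as assumed above):

```python
import math

DAILY_M3 = 100.0                 # contaminated groundwater per day
YEARLY_M3 = DAILY_M3 * 365       # 36,500 m^3 per year
TRITIUM_HALF_LIFE_Y = 12.3       # tritium physical half-life

cube_side = YEARLY_M3 ** (1 / 3)        # one year's inflow as a cube
halvings = math.log2(1000)              # ~10 halvings for a 1/1000 reduction
storage_years = halvings * TRITIUM_HALF_LIFE_Y

# A single basin one cube deep, sized to hold ~120 years of inflow:
basin_side = math.sqrt(120 * YEARLY_M3 / cube_side)

print(round(cube_side, 1))   # ~33.2 m
print(round(storage_years))  # ~123 years
print(round(basin_side))     # ~363 m per side
```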
Are you asking me what is the problem with 100+ tons of reactor-contaminated groundwater effluent entering the sea on a daily basis? On the surface this seems like a weak attempt at a strawman, please clarify.
Yes, I'm asking that. Because all sea water is irradiated anyway, the effects depend on the amount of radiation in the water. So, what problems is this water entering the sea causing?
My understanding is not all isotopes are equally hazardous. The stuff coming with the reactor-contaminated water is particularly harmful to the marine environment and people of Japan, and not normally present at such levels in the surrounding waters.
The half-life as I understand it is relatively short for the especially harmful stuff, but that's not so helpful if leaks are ongoing.
Yes, he is, and I am as well. First of all, irradiated doesn't mean radioactive. Most food in the US is irradiated for health purposes before sale at grocery stores.
How is it contaminated? What's the distribution of contaminants? Do they actually have a significant effect?
You could consider this an artificially selected population, since to keep an airline transport pilot license it is required to maintain a current FAA medical. I'm sure lots of pilots smoke, drink and eat poorly, but they try to avoid getting grounded and career-ended by their physician.
There was a good study and results that just came out about this in the past couple of weeks and it backs up that airline crew have higher cancer rates:
I agree that the level found in the wine is inconsequential, but it's worth pointing out that studies of radiation workers have found that the linear no threshold model does a pretty reasonable job of explaining non-leukemia cancer risk, at least around the levels of radiation equivalent to, say, a multi-decade career as a flight attendant. Granted, the error bars are pretty big, and you have to get to levels equivalent to a trip to Mars for it to get appreciably above background.
TLDR: It's overenthusiastic to think the literature has concluded that low-level radiation is probably good for you, but good experiments are hard.
"Low level radiation probably acts as a beneficial stressor, like exercise or fasting ..."
I have proposed that the common flu that we all get every 12-18 months serves the same purpose and that attempts to avoid this (via the influenza vaccine) could have unintended consequences for the long term health of our immune system ... but I have never been met with anything but incredulity ...
The common cold that you get every 12 to 18 months is a different thing from influenza. A lot of people die of the flu, but its incidence isn't actually high enough that you'd get it every year (every 10 years would be quite often, actually).
I knew someone who felt exactly the same way, so don't despair :-)
I'm not quite so extreme, but I do avoid flu shots about one out of every three years for the same reason. In any case, I don't get the flu very often.
What? A vaccine is nothing if an outside stressor on the immune system! It is designed to provoke the exact immune system response the real virus would, so it can react more quickly to future infections.
Unless you mean that the upper respiratory tract inflammation itself is what would cause the beneficial effect?
Yes, that's the common refrain - that the vaccine itself serves as that same stressor - and of course that is true, on the date that you take the vaccine.
But from then on, while you certainly maintain information from the vaccine in your immune system, you aren't ramping up the actual mechanisms of your immune system like you do when you actually experience influenza.
Think of the temperature raise, the white blood cell production, possible gastro-intestinal reactions - these are all bodily systems being ramped up to a level of performance they don't typically see.
My thought is that if I "successfully" avoid the flu for, say, the next 40 years, those bodily systems will not be as robust - even if I maintain the informational content of the vaccine itself.
Please, please, do not confuse this or lump this in with kooky anti-science, vaccine opponents. I'm just thinking out loud about how stressors on body systems work - which is what my parent was discussing.
For some context, a becquerel (Bq) is a single decay per second (note the graph shows mBq/l). One gram of Cs-137 has an activity of ~3 TBq.
Though Cs-137 has a half-life of ~30 yrs, it has a biological half-life of ~70 days (30 if you are treated with Prussian blue). It distributes fairly uniformly through the body, though it has higher concentrations in soft tissues, which does pose greater damage (see: sievert).
That being said, these are such low quantities you'd die of alcohol many times over before increasing your chance of cancer by 1% over the course of your lifetime.
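Both of the figures above check out against the standard formulas (a sketch; half-life and molar-mass values are approximate):

```python
import math

AVOGADRO = 6.022e23
CS137_HALF_LIFE_Y = 30.17
SECONDS_PER_YEAR = 365.25 * 24 * 3600

# Specific activity of 1 g of pure Cs-137: A = lambda * N
atoms_per_gram = AVOGADRO / 137.0
decay_const = math.log(2) / (CS137_HALF_LIFE_Y * SECONDS_PER_YEAR)
activity_bq = atoms_per_gram * decay_const
print(f"{activity_bq:.2e}")  # ~3.2e12 Bq, i.e. ~3.2 TBq per gram

# Effective half-life in the body: 1/T_eff = 1/T_phys + 1/T_bio
t_phys_days = CS137_HALF_LIFE_Y * 365.25
t_bio_days = 70.0
t_eff = 1 / (1 / t_phys_days + 1 / t_bio_days)
print(round(t_eff, 1))  # ~69.6 days: biological elimination dominates
```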
Per Wikipedia, dogs given a dose of 140 MBq/kg all died within 33 days, while dogs given half that dosage (70 MBq/kg) all survived.
Very interesting read. I did a double take that a lack of Cesium-137 meant "post-1980" (not "pre-1980") since the levels have dropped off since the testing in the 50s and 60s.
Even more interesting than other sources is the fact that many spinal and similar surgeons reach their lifetime radiation exposure limit in just 10 years of practice. Many (most?) end up with cataracts or other telltale symptoms of radiation exposure. It's because of all the x-rays needed for minimally invasive surgery.
I apologize for not posting it in the first place.
Ul Haque M, Shufflebarger HL, O'Brien M, Macagno A. Radiation exposure during pedicle screw placement in adolescent idiopathic scoliosis: is fluoroscopy safe? Spine. 2006;31(21):2516-20.
A similar example of how sensitive measurement tools are: wine from the Livermore area is slightly higher in tritium than wine from elsewhere in California, ostensibly because of proximity to LLNL.
If it was made from grain harvested in the year or so after Fukushima, of course. Literally every plant grown downstream of the plume (it's not clear if North America is the only such region, just the first one across the ocean -- it's possible this is a global thing) is going to show the signature.
People are misunderstanding this. The point here isn't to point out a specific toxin in wine, it was to point out the pervasive reach of this isotope and its utility as a dating mechanism for long-stored agricultural products like wine.
Whiskey is aged in barrels using white oak from North America. The mash is made of grains like corn, barley, rye, etc. I'm guessing the grains came from areas upwind of the plume. Everything pretty much blew out onto the Pacific.
That's a different problem. Basically, the whiskey makers underestimated demand 12 years or so ago when they were making that batch. Now it's sold out practically everywhere, hard to find even in Japan.
AFAIK the main reason is that Japanese whiskeys started getting awards and demand exploded. Even here in Japan they're expensive, to the extent that scotch has a better bang to the buck ratio.
I hope you understand the different types of radioactivity and what they do to your body once they are incorporated into your tissues long-term. Something like 5,000 inhaled particles raising the probability of cancer by 1%... Enjoy your banana simplifications!
> The team began their study with the conventional measurement of cesium-137 levels in the unopened bottles. That showed levels to be indistinguishable from background noise.
And yet I bet wine snobs are gonna pretend their palates are so refined they can taste it now.
This wouldn't surprise me even a little. I was living in Anaheim at the time of the disaster, and background radiation levels were elevated for so long that I got bored of checking. A normal reading might have been 0.08-0.15 µSv/h, and it was super common to see readings in the 0.40-0.50 µSv/h range for months at a time.
After the disaster my professor measured radioactivity to try to determine whether there was any increase in background radiation from man-made sources. Unfortunately, even our liquid-nitrogen-cooled germanium detector couldn't find anything. I think you need a mass spectrometer to really detect any change in California. (Fukushima itself is a different matter!)
Variation in radiation levels of the magnitude you are showing can be caused by many different sources, most of which are natural. Sun spots, changes in the upper atmosphere, cat sleeping on the detector, etc...
Edit: We could definitely still detect the consequences of Chernobyl and atmospheric weapons testing. Fukushima didn't come close to touching the contamination from THAT.
If I'm reading it correctly, they can date wines based on the amount of radiation given off by them, and line it up with a chart for amount of radiation the atmosphere took in during events.
How could you properly date a bottle if it was in Fukushima during the event, vs a 1960's bottle during the Cuban Missile Crisis?
> the levels of cesium-137 are barely detectable, and even then, only if the wine is destroyed.
The article seems less about the toxicity of the cesium levels, and more about their use as a signature in dating the wine, which I find is still interesting.
> The article seems less about the toxicity of the cesium levels, and more about their use as a signature in dating the wine, which I find is still interesting.
And rightly so, by far the most toxic thing in these bottles is alcohol, not vanishingly small traces of cesium.
If the bottle was sealed, it won't pick up measurable cesium. That arrives via atmospheric transport, being picked up by rain water and eventually grape plants.
The units of measure here are mBq/l: millibecquerels per liter. The paper reports 14 mBq/l in CA wine immediately after Fukushima, so the common 750 ml bottle of 2011 CA wine would average one cesium-137 decay event roughly every 95 seconds.
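A quick check of that rate (using the 14 mBq/l figure and a 750 ml bottle):

```python
BOTTLE_L = 0.75
ACTIVITY_MBQ_PER_L = 14.0  # reported 2011 CA wine level after Fukushima

bottle_bq = ACTIVITY_MBQ_PER_L * 1e-3 * BOTTLE_L  # ~0.0105 decays per second
seconds_per_decay = 1 / bottle_bq
print(round(seconds_per_decay, 1))  # ~95.2 seconds between decays
```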
[1] https://en.wikipedia.org/wiki/Low-background_steel