jasoncrawford's comments

I disagree; this article makes a good case for the term “Dark Ages”: https://slatestarcodex.com/2017/10/15/were-there-dark-ages


This Scott Alexander essay says that the FDA is too strict on some things and not strict enough on others, which is probably right: https://astralcodexten.substack.com/p/adumbrations-of-aducan...


I think regulations can work when they enforce very well-supported, long-established best practices, such that not following them amounts to negligence.

They might also work better if they say “you can't do it that way, which is known to be unsafe,” as opposed to “you must do it this way, which is the only safe thing.”

Note also that regulatory standards are not the only mechanism in the law to create safety. Liability law can be very effective at creating safety, by giving the right incentives to the right parties, but liability law doesn't tell anyone what to do—only what will happen to you if you cause harm.


Companies are very good at making slight changes to bad ideas and calling them "new ideas". The plastics industry runs on this.


> The living standard of kings hasn't quite changed since 2000 BC.

Not true. Ancient/medieval kings still got smallpox, peed in chamber pots, and shivered in the winter cold. They couldn't hear the music of great performers who had died, or travel to another continent for a weekend trip. Etc.


Arthur C. Clarke's First Law: “When a distinguished but elderly scientist states that something is possible, they are almost certainly right. When they state that something is impossible, they are very probably wrong.” https://en.wikipedia.org/wiki/Clarke%27s_three_laws


Thanks. I think Clarke overstates things a bit (unless you take a very extreme view of what "impossible" means), so I prefer my version. It also seems possible that engineers are more likely to hedge their bets by calling things "unworkable" or "impractical", because what actually is and isn't possible in theory is more the realm of the researcher than the practitioner.


A lot of the impossibilities are relative to the current state of the art.

"X is impossible with current metalurgy". "Y is impossible with current best integer factorization algorithms".

Absolute impossibilities do exist too, but you basically need to prove that that particular thing goes against laws of physics/maths.


An example of this:

Andy Grove in 1992: the idea of a wireless personal communicator in every pocket is "a pipe dream driven by greed."


This is a truly weird claim to have made at least 85 years after the idea entered the public sphere, and at least 19 years after a phone intended to be carried by a person, not a car, was demoed by Motorola.

Did he also pooh-pooh the idea of a public internet, and wait until the '80s to do so?


Unless we also had progress in energy technology that could overcome this, which unfortunately we haven't had.


You can't make something cheap enough to overcome quadratic growth in demand for very long.

Not even CPU cycles.


It's not always assumed to be exponential. It was assumed exponential by Paul Romer because long-term economic growth is pretty exponential. Much subsequent work has followed that model, but not all of it.


But TFP is not output. Exponential output growth can come directly from exponential input growth, which would be a trivial observation.
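For instance (a toy growth-accounting sketch with made-up numbers, not anything from the article): with Cobb-Douglas output Y = A * K^0.3 * L^0.7, output grows exponentially purely because the inputs do, while TFP (the A term) stays flat.

    import numpy as np

    years = np.arange(0, 51)
    A = np.ones(years.size)           # TFP held constant
    K = 100 * 1.03 ** years           # capital grows 3% per year
    L = 50 * 1.02 ** years            # labor grows 2% per year
    Y = A * K**0.3 * L**0.7           # output

    growth = Y[1:] / Y[:-1] - 1
    print("output growth, year 1: ", round(float(growth[0]), 4))
    print("output growth, year 50:", round(float(growth[-1]), 4))
    # Both are ~0.3*3% + 0.7*2% = 2.3% per year, with zero TFP growth.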


> long-term economic growth is pretty exponential

Yeah, that graph with the largest time scale in the article strongly disagrees.


Yes, but if your data is actually exponential, the linear segments are not going to be better approximations than an exponential curve. That's what's going on here.


I'm not sure that's true in general, or even frequently. In fact, I'd say it's provably false in general.

The big issue is that you get MANY more curve-fitting parameters to play with if you use a piece-wise linear model vs. an exponential model. (You get to choose HOW MANY breaks to make, what the slope is for each section, and WHERE to make the breaks.)

So... let's say you created some synthetic data using an underlying exponential plus a normally distributed random number. Obviously, the BEST predictive model is an exponential one. However, for any arbitrary number of observations, I guarantee you there's trivially at least one piece-wise linear model that will have less error than the exponential one: consider the one that is simply a straight line between EVERY point. Obviously that has zero error on the observed data, unlike the exponential model. Yet it has very little predictive power compared to the exponential model.

Now, that's not what was done here... but there are actually quite a few parameters in the form of where to make the breaks and how many to make. Doesn't seem like a fair comparison.
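A quick toy version of the synthetic-data point above (my own sketch, using multiplicative noise for convenience): the connect-every-point piecewise-linear "model" has zero training error but mostly tracks the noise, while the 2-parameter exponential fit recovers the underlying curve at the held-out points.

    import numpy as np

    rng = np.random.default_rng(0)
    t = np.arange(60, dtype=float)
    truth = np.exp(0.05 * t)
    y = truth * np.exp(rng.normal(0, 0.1, size=t.size))  # exponential times noise

    test = np.arange(t.size) % 5 == 2   # hold out every 5th (interior) point
    train = ~test

    # Exponential model: 2 parameters, least squares on log(y).
    slope, intercept = np.polyfit(t[train], np.log(y[train]), 1)
    exp_pred = np.exp(intercept + slope * t[test])

    # Piecewise-linear "model": a straight line between every training point.
    pl_train = np.interp(t[train], t[train], y[train])  # reproduces the data exactly
    pl_pred = np.interp(t[test], t[train], y[train])

    rmse = lambda a, b: float(np.sqrt(np.mean((a - b) ** 2)))
    print("piecewise-linear, training RMSE:      ", rmse(pl_train, y[train]))  # 0.0
    print("piecewise-linear, error vs true curve:", rmse(pl_pred, truth[test]))
    print("exponential fit, error vs true curve: ", rmse(exp_pred, truth[test]))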


Good point.

The paper does cross-validate the models, and I am told that cross-validation properly penalizes overfitting with too many parameters… but I don't understand the statistics well enough here.
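Roughly, cross-validation penalizes extra parameters because each candidate model is repeatedly refit on part of the data and scored only on points it never saw, so memorizing the training points buys nothing. A minimal sketch of the mechanics (toy data and my own code, not the paper's procedure):

    import numpy as np

    rng = np.random.default_rng(1)
    t = np.sort(rng.uniform(0, 60, 80))
    y = np.exp(0.05 * t) * np.exp(rng.normal(0, 0.1, size=t.size))

    def cv_rmse(fit, predict, k=5):
        # Average out-of-fold error: fit on k-1 folds, score on the held-out fold.
        folds = np.array_split(rng.permutation(t.size), k)
        errs = []
        for fold in folds:
            train = np.setdiff1d(np.arange(t.size), fold)
            model = fit(t[train], y[train])
            pred = predict(model, t[fold])
            errs.append(np.sqrt(np.mean((pred - y[fold]) ** 2)))
        return float(np.mean(errs))

    # Two parameters vs. one "parameter" per data point.
    exp_cv = cv_rmse(lambda x, z: np.polyfit(x, np.log(z), 1),
                     lambda m, x: np.exp(np.polyval(m, x)))
    dots_cv = cv_rmse(lambda x, z: (x, z),
                      lambda m, x: np.interp(x, m[0], m[1]))
    print("exponential fit, CV RMSE:       ", round(exp_cv, 3))
    print("connect-the-dots model, CV RMSE:", round(dots_cv, 3))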


For any sampled data you'll get a guaranteed 100% fit by making it piecewise constant with N fragments, where N = number of data points. It says nothing about the function you sampled; it's just a way to cheat by overfitting.


The dangerous bit is that an exponential curve will also be a fairly good fit for a logistic function that's not yet fully observed.
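A quick illustration of that (toy numbers, my own sketch): fit a plain exponential to only the early portion of a logistic curve and it matches well in-sample, then diverges wildly once the logistic starts to saturate.

    import numpy as np

    rng = np.random.default_rng(3)
    logistic = lambda t: 100.0 / (1.0 + np.exp(-0.1 * (t - 80)))  # saturates at 100

    t_obs = np.arange(0, 40, dtype=float)  # only the early, pre-inflection part
    y_obs = logistic(t_obs) * np.exp(rng.normal(0, 0.05, size=t_obs.size))

    # Fit a plain exponential to the observed portion (least squares on log y).
    slope, intercept = np.polyfit(t_obs, np.log(y_obs), 1)
    exp_fit = lambda t: np.exp(intercept + slope * t)

    # In-sample, the exponential looks like a fine description of the data...
    in_err = np.max(np.abs(exp_fit(t_obs) - logistic(t_obs)) / logistic(t_obs))
    print("max relative error over the observed range:", round(float(in_err), 3))

    # ...but the two curves part ways badly once saturation kicks in.
    for t in (60.0, 100.0, 160.0):
        print(f"t={t:5.0f}  logistic={logistic(t):7.1f}  exponential fit={exp_fit(t):11.1f}")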


Every apparent exponential in the real universe must actually be a logistic or some other bounded curve.


Otherwise, we would have about a hundred trillion people infected with Covid by now.
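(Rough arithmetic: with an early doubling time of about three days and no saturation, 2^47 ≈ 1.4 × 10^14, so a single case would pass a hundred trillion after roughly 47 × 3 ≈ 140 days.)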


Except for the size of the Hubble volume /s

Cosmic inflation should guarantee an exponentially growing observable universe.


Yes, and the one thing you expect with a logistic curve is arguments about whether it's linear or exponential.

But with the amount of noise in economic data, I don't think it's evidence of anything.


True, fair point.

Yeah, like the article mentions, they're basically making an analogy to the idea of “punctuated equilibrium” from evolutionary biology. Here’s a good exploration of how punctuated equilibrium works vs. the alternative, which is called gradualism:

https://gvpress.com/journals/IJBSBT/vol3_no4/3.pdf


NB: "Note that both of these charts are on a log scale."

Appears to apply to the two preceding linear-scale charts.


With noise and enough segments, the linear functions can certainly fit better.


If you zoom in far enough every curve looks linear.



> Henri Poincaré famously described them as "monsters" and called Weierstrass' work "an outrage against common sense", while Charles Hermite wrote that they were a "lamentable scourge".

I wonder if there is a really long compound German word for "an achievement whose greatness is best measured by the degree to which it disgusts experts in the field."


Heh. Reminds me of how gamers manage to find exploits to cheese speed runs while developers react in dismay.


Not a bad description of the history of analysis. Turns out function spaces are absolutely full of gross things that don't quite fit nicely into your theory.


Wow that’s super cool, didn’t know about this


Thank you :)


Breeder reactors, as others have pointed out. Oklo is a modern design based on the concept: https://www.nrc.gov/reactors/new-reactors/col/aurora-oklo.ht...

Also see “Problem free nuclear power and global change”: https://www.osti.gov/biblio/614877

And “Nuclear fission power for 21st century needs”: https://www.sciencedirect.com/science/article/abs/pii/S01491...


You can see the table here: https://people.physics.anu.edu.au/~ecs103/chart/?ShowStable=...

When I click on the lead-208 box, it credits data from:

NUBASE2020: https://doi.org/10.1088/1674-1137/abddae

AME2020: https://doi.org/10.1088/1674-1137/abddb0

It also only says that alpha decay is “possible”, not that it happens with any frequency—in fact, it lists Pb-208 as stable.

