
Let's say all of that happens. Is there any reason to believe that will lead to "a positive feedback loop of self-improvement cycles, each successive and more intelligent generation appearing more and more rapidly, causing a rapid increase ("explosion") in intelligence which would ultimately result in a powerful superintelligence, qualitatively far surpassing all human intelligence"?

Or is it more reasonable to suppose that 1) all those improvements might buy us a factor of 10 in efficiency, but not an intelligence that can buy us another factor of 10, 2) each doubling of ability will take much more than a doubling of CPU cycles and RAM, and 3) growth will asymptotically approach some upper limit?
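
A toy sketch to make 2) and 3) concrete (every number here is made up, it's just the shape of the two stories): if each improvement makes the next one easier you get a runaway, and if each improvement makes the next one harder you get a logistic curve that flattens against a ceiling.

  # Illustrative only: hypothetical growth rules, not a model of any real system.

  def explosion(steps, gain=1.1):
      # each generation improves faster than the last one did
      x, out = 1.0, []
      for _ in range(steps):
          x *= gain      # this generation's improvement
          gain *= 1.01   # ...which also speeds up the next one
          out.append(x)
      return out

  def diminishing(steps, ceiling=10.0, rate=0.5):
      # logistic growth: progress slows as x approaches the ceiling
      x, out = 1.0, []
      for _ in range(steps):
          x += rate * x * (1 - x / ceiling)
          out.append(x)
      return out

  print(f"runaway after 50 steps:     {explosion(50)[-1]:,.0f}")    # tens of millions
  print(f"diminishing after 50 steps: {diminishing(50)[-1]:.2f}")   # pinned near 10

The whole disagreement is about which of those two update rules better describes self-improvement; the simulation just shows how far apart the outcomes are after the same number of steps.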



A lot of tech improvements follow s-curves, or waves of s-curves, not exponentials. Any given problem domain will still saturate along its own s-curve, but overall intelligence/progress might look exponential if the waves keep coming.
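
A quick way to see the "waves of s-curves" point (the parameters are arbitrary): stack logistic curves where each new wave matures later and peaks higher, and the sum tracks an exponential until the waves stop arriving.

  import math

  # Arbitrary parameters, chosen only to show the shape of the argument.
  def s_curve(t, midpoint, ceiling):
      return ceiling / (1 + math.exp(-(t - midpoint)))

  def total(t, waves=5):
      # wave k matures 10 units later and peaks 4x higher than wave k-1
      return sum(s_curve(t, 10 * k, 4 ** k) for k in range(waves))

  for t in range(0, 70, 10):
      print(t, round(total(t), 1))   # ~4x per 10 units, then flat at 341

Whether the real-world envelope is exponential or eventually flat depends entirely on whether new waves keep arriving, which is exactly the open question.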


Yeah sure, there are physical limits, and things like mining, manufacturing, and logistics are really slow compared to just thinking. But that's just time and money; I don't see why it wouldn't happen. You build a machine that makes machines. First you have 1, then 2, then 4, then 8, then 16, and so on (rough numbers below).

Then, when the Earth is used up, you look to space. Space travel is a lot easier when you don't need to keep a meat bag alive.
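
Rough numbers for the 1, 2, 4, 8, ... count above (the budget and per-machine cost are invented): the doubling is brutally fast, but it also hits whatever material cap exists in a handful of cycles, which is where the s-curve point from the parent comment bites.

  MATERIAL_BUDGET = 1_000_000   # units of raw material available (invented number)
  COST_PER_MACHINE = 1          # material consumed per copy (invented number)

  machines, used, cycles = 1, 1, 0
  while used + machines * COST_PER_MACHINE <= MATERIAL_BUDGET:
      used += machines * COST_PER_MACHINE   # every machine builds one copy
      machines *= 2                         # 1, 2, 4, 8, 16, ...
      cycles += 1
  print(f"{machines:,} machines after {cycles} cycles; material cap reached")
  # -> 524,288 machines after 19 cycles

Nineteen doublings exhaust a million units of material, so "just time and money" runs into the cap almost immediately unless the budget itself keeps growing, e.g. by going off-world.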



