Hacker News

> Each increase in intelligence capability requires substantially more resources (Computing power, training data, electricity, hardware, time, etc). Being smart doesn’t just directly result in these becoming available with no limit. Any significant attempts to increase the availability of these to a level that mattered would almost certainly draw attention.

We know that "intelligence" can devise software optimizations and higher efficiency computing hardware, because humans do it.

Now suppose we had machines that could do it. Not any better, just the same, but for $10,000 a year in computing resources instead of $200,000 in salary and benefits. Then we would expect 20 years' worth of progress in one year, wouldn't we? Spend the same money and get 20x the advancement.

Or, equivalently, 20 months' worth of advancement in one month.

With current human effort we've been getting roughly double the computing power every 18 months, and the most recent gains have come as performance per watt. At 20x the pace, that first doubling would take less than a month.

For the first month.

After which we'd have computers with twice the performance per watt, meaning twice the research speed for the same power budget, so the next doubling would take less than two weeks.
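The compounding argument above can be sketched as a toy model. All of the numbers here are the comment's own illustrative assumptions (18-month doublings, a 20x cost advantage, each doubling halving the next interval), not measurements:

```python
# Toy model of recursively compounding hardware doublings.
# Assumptions (from the comment, purely illustrative):
#  - Human-driven R&D doubles performance per watt every 18 months.
#  - AI researchers cost $10k/year vs $200k/year, a 20x speedup.
#  - Each doubling of performance per watt doubles research speed again.

HUMAN_DOUBLING_MONTHS = 18.0
COST_SPEEDUP = 200_000 / 10_000  # 20x more research per dollar

def doubling_schedule(n: int) -> list[float]:
    """Return the month at which each of the first n doublings completes."""
    times = []
    elapsed = 0.0
    interval = HUMAN_DOUBLING_MONTHS / COST_SPEEDUP  # 0.9 months at first
    for _ in range(n):
        elapsed += interval
        times.append(elapsed)
        interval /= 2  # doubled hardware halves the next interval
    return times

for i, t in enumerate(doubling_schedule(6), start=1):
    print(f"doubling {i}: month {t:.3f}")
```

The intervals form a geometric series (0.9 + 0.45 + 0.225 + ...) that sums to 0.9 / (1 - 1/2) = 1.8 months, so under these assumptions the model packs infinitely many doublings into finite time: exactly the "true exponential curve" singularity the next paragraph sets aside.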

You're quickly going to hit real bottlenecks. Maybe shortly after this happens we can design hardware twice as fast as what we had a second ago, every second, but we can't manufacture it that fast.

With a true exponential curve you would have a singularity. Put that aside. What happens if we "only" get a thousand years' worth of advancement in one year?




I would say that if we experienced that, we would likely see societal collapse well before the singularity became a problem. At that point the singularity could be just as likely to save humanity as to doom it.





