Hacker News

This guy's dig at AI is a little conflicted with the exponential-hangover idea. Sure, we can only simulate a 300-neuron worm right now, but if we ever achieve a computationally bound solution and experience exponential growth at a rate similar to Moore's Law, in 50 years we could be watching real AI cat videos on YouTube, complete with awful UI controls that customise the feline personality in real time. In JavaScript, of course.



100 years ago we couldn't compute. 150 years ago we didn't even know neurons existed. I wouldn't be so quick to discount the progress here.


> if we ever achieve a computationally bound solution and experience exponential growth at a rate similar to Moore's Law,

His point is that we won't continue the curve, and he's offered evidence/opinion that we're already starting to plateau.

Which means AI would need to be solved within ~current computational bounds.


We're plateauing on single-processor performance, not on parallel computing.


It's also worth noting that the brain itself is ticking at something around 200 Hz tops, but it's massively parallel.
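A rough back-of-envelope makes the point. Assuming ~86 billion neurons and a peak firing rate around 200 Hz (both ballpark textbook figures, not from the comment above), the aggregate event rate is huge even though each unit is slow:

```python
# Back-of-envelope: aggregate event rate of a slow but massively
# parallel system. Neuron count and firing rate are rough ballpark
# assumptions, not measurements.
NEURONS = 86e9       # ~86 billion neurons in a human brain (assumed)
MAX_RATE_HZ = 200    # rough peak firing rate per neuron

events_per_second = NEURONS * MAX_RATE_HZ
print(f"{events_per_second:.1e} spike events/s")  # 1.7e+13
```

So even at a clock a million times slower than a CPU, the sheer width of the parallelism dominates.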


Not physically, not yet.

But it seems like high-performance parallel computing paradigms are forgotten and reinvented every year.


Yup. Better software needed. But as an indication of how close we're getting: Hans Moravec, a robot builder and robotics professor, did a quite reasonable calculation of the processing power needed to build something equivalent to a human brain, using computer-style algorithms rather than nerve simulation, and came up with 100 million MIPS, i.e. 100 teraflops (http://www.transhumanist.com/volume1/moravec.htm)

If you compare that with 2015 hardware, the Nvidia Titan X GPU does about 7 teraflops and costs about $1,000, so with 15 Titans in a $20k system a hacker can have a reasonable go at building a human-level AI (excluding memory hardware, that is). It's only recently that that kind of power has come down to hacker budget levels. It'll take a while to sort out the software.
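The arithmetic above can be sketched directly. The 100-teraflop target and the 7-teraflop, $1,000 Titan X figures come from the comment; the rest is just division:

```python
import math

# Moravec's brain-equivalent estimate vs. 2015 hardware,
# using the figures quoted in the comment above.
TARGET_TFLOPS = 100     # ~100 million MIPS per Moravec's estimate
TITAN_X_TFLOPS = 7      # Nvidia Titan X, circa 2015
TITAN_X_PRICE = 1000    # USD, roughly

gpus_needed = math.ceil(TARGET_TFLOPS / TITAN_X_TFLOPS)
gpu_cost = gpus_needed * TITAN_X_PRICE
print(gpus_needed, gpu_cost)  # 15 GPUs, $15,000 in cards alone
```

The remaining ~$5k of the quoted $20k system would cover the host machine, PSUs, and interconnect.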

Though some recent software stories like https://news.ycombinator.com/item?id=9584325 and https://news.ycombinator.com/item?id=9736598 show progress.


Awesome! Thanks for those links. :)


I found this video about DeepMind quite interesting; it kind of shows where things have got to. Their AI algorithm can learn to play Space Invaders and Breakout better than people, starting from just being fed the pixels. It doesn't do well at Pac-Man though, because they haven't cracked getting it to understand spatial layout and plan ahead. https://www.youtube.com/watch?v=xN1d3qHMIEQ


Update: Nvidia's Pascal GPU should do 28 teraflops and is due out in 2016, and given that much of the brain is not active at any one time, that's probably getting to similar processing power. It uses 1.3kW, so you can heat your room with it too.
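Redoing the same division with the quoted Pascal figure (28 teraflops is the number given above, not a confirmed spec):

```python
import math

TARGET_TFLOPS = 100   # Moravec's brain-equivalent estimate
PASCAL_TFLOPS = 28    # Pascal figure as quoted in the comment above

gpus_needed = math.ceil(TARGET_TFLOPS / PASCAL_TFLOPS)
print(gpus_needed)  # 4 cards, down from 15 Titans
```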



