
Deep learning was an advance. I think the fundamental achievement is a way to use all that parallel processing power and data. Inconceivable amounts of data can give seemingly magical results. Yes, overfitting and generalizing are still problems.

I basically agree with you about the 20-year hype cycle, but when compute power reaches parity with human brain hardware (Kurzweil predicts about 2029), one barrier is removed.




Human and computer hardware are not comparable; after all, even with the latest chips, a computer is just (many) von Neumann machine(s) operating on a very big (shared) tape. To model the human brain in such a machine would require the brain to be discretizable, which, given its essentially biochemical nature, is not possible, and certainly not by 2029.


It depends on the resolution of discretization required; Kurzweil's prediction is premised on his estimate of that resolution.

Note that engineering fluid simulation (CFD) makes these discretization choices for PDEs all the time, based on application requirements.
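To make that concrete, here's a toy sketch (mine, not from the thread) of the kind of choice CFD codes face: an explicit finite-difference scheme for the 1D heat equation u_t = alpha * u_xx. The grid spacing dx is the discretization resolution; a coarser grid is cheaper but less faithful, and the time step dt is constrained by a stability limit tied to dx.

```python
def heat_step(u, alpha, dx, dt):
    """One explicit Euler step for u_t = alpha * u_xx on a uniform grid.

    Endpoints are held fixed (Dirichlet boundary conditions). The ratio
    r = alpha*dt/dx^2 must stay <= 0.5 or the scheme blows up -- this is
    the classic coupling between spatial resolution and time step.
    """
    r = alpha * dt / dx**2
    assert r <= 0.5, "time step too large for this grid (unstable)"
    return [u[0]] + [
        u[i] + r * (u[i - 1] - 2 * u[i] + u[i + 1])
        for i in range(1, len(u) - 1)
    ] + [u[-1]]

# Example: an initial heat spike diffuses outward symmetrically.
u = [0.0, 0.0, 1.0, 0.0, 0.0]
for _ in range(10):
    u = heat_step(u, alpha=1.0, dx=0.1, dt=0.004)  # r = 0.4, stable
```

Halving dx here would force dt down by a factor of four to stay stable, which is exactly the cost/accuracy trade-off the comment alludes to: the "right" resolution is set by the application, not by the physics alone.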



