
I’m an outsider, but what’s the recent “big idea”? It seems to be throwing more compute power and larger training sets at essentially an old technique. This has led to a big improvement in performance, but I don’t see the big conceptual breakthrough.



I think that Bayesian Program Learning, as pioneered by Lake et al., is a big idea.

I think that GANs probably qualify, although you can see the idea emerging in the SAB series of conferences in the '90s if you read the papers.

On the other hand, I do see a lot of small innovations that are enabling many people to create incremental improvements and applications. I feel that the exploration of the field has been weak, and that our overall knowledge is limited and not widely shared. Perhaps the improvements will turn out to be the actual breakthrough: MCMC search for Bayesian reasoning, causality, counterfactuals, GPUs, TPUs, and FPGAs, access to very large training data sets, forward training, and so on.
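To make the MCMC point concrete, here is a minimal sketch of random-walk Metropolis-Hastings, the workhorse MCMC algorithm: it draws samples from a distribution given only an unnormalized log-density, which is exactly the situation in Bayesian posterior inference. The target density and step size here are illustrative choices, not from the thread.

```python
import math
import random

def metropolis_hastings(log_density, x0, n_samples, step=1.0, seed=0):
    """Sample from an unnormalized log-density via random-walk Metropolis."""
    rng = random.Random(seed)
    x = x0
    samples = []
    for _ in range(n_samples):
        proposal = x + rng.gauss(0.0, step)
        # Accept with probability min(1, p(proposal) / p(x)),
        # computed in log space for numerical stability.
        if math.log(rng.random() + 1e-300) < log_density(proposal) - log_density(x):
            x = proposal
        samples.append(x)
    return samples

# Illustrative target: a standard normal, log p(x) = -x^2/2 up to a constant.
samples = metropolis_hastings(lambda x: -0.5 * x * x, x0=0.0, n_samples=20000)
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
```

The same loop works unchanged for any Bayesian model whose log-posterior you can evaluate pointwise, which is why MCMC made Bayesian reasoning practical well before the current deep-learning wave.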



