
So basically pruning the neural network makes it faster. Isn’t this effectively what we are doing since birth? As babies we just have a bunch of semi-random connections, and then we prune them away as we learn.

And in AI it’s basically the same. The fewer hidden nodes in your neural network, the better, as long as there are enough nodes to accomplish the task.
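
To make that concrete, here’s a minimal sketch of magnitude-based pruning in NumPy. This is just one common pruning strategy, and the layer shape and the 10% keep fraction are invented for illustration:

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical dense layer: 256 inputs -> 128 hidden nodes.
    W = rng.normal(size=(256, 128))

    def prune_by_magnitude(weights, keep_fraction=0.1):
        """Zero out all but the largest-magnitude keep_fraction of weights."""
        k = int(weights.size * keep_fraction)
        threshold = np.partition(np.abs(weights).ravel(), -k)[-k]
        mask = np.abs(weights) >= threshold
        return weights * mask, mask

    W_pruned, mask = prune_by_magnitude(W)
    print(f"kept {mask.sum()} of {mask.size} connections ({mask.mean():.0%})")

The zeroed weights can then be skipped entirely with a sparse representation, which is where the speedup comes from.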



Except the mind keeps creating new neural connections throughout life. The idea that neuroplasticity is a babies-only phenomenon is a common myth. In fact, we can see new connections being formed whenever new skills and memories are formed, even in the elderly, and everyone has the capacity to make new connections at any time. Sure, the RATE at which those connections are made declines over time; we are constantly decelerating in our ability to add new skills.


Making new connections doesn't preclude pruning old ones (or pruning new ones for that matter). And in fact the brain is doing both all the time.

I just didn't think it was relevant to this discussion to go deeply into the details of neurogenesis.


Babies are born with the capacity to learn language, motor skills, etc. The brain is highly structured at birth for these capacities. So no, it isn't random but probabilistic: the structure of the brain and its connections is probabilistically shaped by evolution. Or is that what you meant by the "semi" in semi-random?

In ML, think of evolution as hyper-parameter tuning. Most ML models have a defined structure but are tuned via learning. Pruning is probably one such tuning strategy in the brain; see the toy sketch below.
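
Here’s a toy illustration of that analogy, with scikit-learn’s MLPClassifier standing in for "the brain" (the dataset and layer sizes are made up). The hidden-layer size is the structure a search process picks, while the weights inside it are tuned by learning:

    from sklearn.datasets import make_classification
    from sklearn.model_selection import cross_val_score
    from sklearn.neural_network import MLPClassifier

    X, y = make_classification(n_samples=500, random_state=0)

    # "Evolution" picks the structure (hidden size); learning tunes the weights.
    for hidden in (4, 16, 64):
        clf = MLPClassifier(hidden_layer_sizes=(hidden,), max_iter=1000,
                            random_state=0)
        score = cross_val_score(clf, X, y, cv=3).mean()
        print(f"hidden={hidden:>3}  cv accuracy={score:.3f}")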


Yes, that’s what I meant by semi-random. I was trying to keep it as layman-friendly as possible. :)


Something... something... Kolmogorov complexity... lost in thought.



