
On the first day of our introduction to NNs we were asked to build all the logic gates out of artificial neurons, and then told "If you have all the gates, you can do all computations."
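The constructions we wrote looked roughly like this (a minimal sketch in plain Python; the specific weights and function names are just illustrative, not what we used in class):

  def neuron(inputs, weights, bias):
      # weighted sum followed by a hard threshold (step activation)
      total = sum(x * w for x, w in zip(inputs, weights)) + bias
      return 1 if total > 0 else 0

  def AND(a, b):  return neuron([a, b], [1, 1],  -1.5)
  def OR(a, b):   return neuron([a, b], [1, 1],  -0.5)
  def NAND(a, b): return neuron([a, b], [-1, -1], 1.5)

  # XOR is not linearly separable, so it needs a second layer of neurons
  def XOR(a, b):  return AND(NAND(a, b), OR(a, b))

  for a in (0, 1):
      for b in (0, 1):
          print(a, b, AND(a, b), OR(a, b), NAND(a, b), XOR(a, b))

Since NAND alone is functionally complete, networks of such neurons can express any Boolean circuit, which is where the "all gates, all computations" claim comes from.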

I've got to admit, I'm sort of taking that at face value, because I don't know enough computer science to a) tell whether it's true and b) know what "f: X -> Y only for closed domains" means.




I think the easiest way to think about this is in terms of natural numbers, i.e. 1, 2, 3, 4, ...

When you only have a fixed width, i.e. a static feed-forward network, there is an upper limit on the data you can represent and compute on.

E.g. if the highest number you can represent is 1,000, then you will need a new NN if you want to do computations on 1,001.
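A rough sketch of what I mean (the 1,000 cap is just the example number from above, and the encoding is hypothetical):

  MAX_VALUE = 1000          # say the output layer is fixed at build time to cover 0..1000

  def encode_target(n, max_value=MAX_VALUE):
      if n > max_value:
          raise ValueError(f"{n} is outside the fixed range 0..{max_value}; "
                           "handling it means building (and retraining) a new network")
      one_hot = [0] * (max_value + 1)
      one_hot[n] = 1
      return one_hot

  encode_target(1000)       # fine, fits the architecture
  try:
      encode_target(1001)
  except ValueError as e:
      print(e)              # the architecture simply has no slot for 1,001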

... or use an inductive structure, like the one a recurrent neural network has.
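Roughly, the inductive trick looks like this (a hand-written cell standing in for a learned one, purely to show the shape):

  def cell(state, digit):
      # fixed-size update rule reused at every step (here: build up a decimal number)
      return state * 10 + digit

  def run(digits):
      state = 0
      for d in digits:          # unrolls as far as the input is long
          state = cell(state, d)
      return state

  print(run([1, 0, 0, 1]))      # 1001 -- no new network needed for longer inputs
  print(run([4, 2]))            # 42   -- same cell, shorter sequence

The same small set of parameters is applied once per input symbol, so the network is not tied to a fixed input width.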



