Having a feedback loop, reflex reactions, and the ability to assess the situation and recover dynamically would be much more useful.
Agreed. But none of that implies learning. So why are you talking about learning in the beginning of your comment?
This happens all too often in AI conversations. Learning gives you a special and powerful kind of flexibility, of course. But the inability to learn doesn't imply a system can't cope with an enormous range of situations. A robot that's unable to learn could be programmed with enough flexibility to walk on any surface it could possibly encounter.
I meant that a learning robot could adapt to a situation: try possible solutions, measure success, and adjust, whereas a preprogrammed robot can only ever try what it has been programmed with. In my analogy of falling down a hill, the learning robot might not be able to stop itself on the first try, but it could hopefully adopt a strategy that lets it regain control of the situation.
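The try-measure-adapt loop described here can be sketched in a few lines. Everything below is invented for illustration: the `simulate_stumble` stand-in for physics, the 0.6 "sweet spot" braking strength, and the scoring are all assumptions, not a real robotics API.

```python
import random


def simulate_stumble(brake_strength, rng):
    """Hypothetical physics stand-in: score how badly a stumble ends.

    Too little braking leaves the robot sliding; too much pitches it
    forward. The 0.6 sweet spot is arbitrary, chosen only for this sketch.
    Lower scores are better.
    """
    overshoot = abs(brake_strength - 0.6)
    return overshoot + rng.uniform(0, 0.05)  # noise stands in for varying terrain


def learn_to_recover(trials=50, seed=0):
    """Trial-and-error loop: propose a strategy, measure it, keep the best."""
    rng = random.Random(seed)
    best_strength, best_score = None, float("inf")
    for _ in range(trials):
        strength = rng.uniform(0.0, 1.0)         # try a possible solution
        score = simulate_stumble(strength, rng)  # measure success
        if score < best_score:                   # adapt: keep what worked
            best_strength, best_score = strength, score
    return best_strength
```

A preprogrammed robot, in these terms, is the same loop with `trials=1` and a fixed `strength` chosen by its designer: it performs well only if that one built-in value happens to suit the hill it is actually on.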
I think it does imply learning. Human babies "learn to walk", by using the feedback from lots of little experiments to improve their ability to navigate uncertain and varying terrain.
Basically, flexibility and the ability to deal with novel situations is close to synonymous with the ability to learn.