> Basic information theory tells us that any noisy continuous system is equivalent to a discrete system of a certain resolution/bit-depth.
Technically you are right - but practically you are wrong. 'Technically', any logical or even emotional reasoning can be modeled with if-else structures (perhaps with some randomness added). But why haven't we managed to create human-like reasoning that way? Because that approach is not 'practically' useful. That is why the most powerful ML solutions today are built with neural networks, not Prolog.
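For what it's worth, the 'technically right' part of the parent's claim can be made concrete with Shannon's capacity formula for a Gaussian channel: a noisy analog signal carries only a finite number of bits per sample. A minimal sketch (the 20 dB figure is an illustrative assumption, not a measured neuron SNR):

```python
import math

def effective_bits(snr_db: float) -> float:
    # Shannon capacity of a Gaussian channel: 0.5 * log2(1 + SNR) bits per sample.
    # This is the information-theoretic ceiling on what a noisy continuous
    # signal can convey, i.e. its equivalent discrete bit-depth.
    snr_linear = 10 ** (snr_db / 10)
    return 0.5 * math.log2(1 + snr_linear)

# A signal with ~20 dB SNR carries only a few bits per sample:
print(effective_bits(20))  # ≈ 3.33 bits
```

So the equivalence to a discrete system of limited bit-depth is real - the disagreement here is only about whether that equivalence is practically exploitable.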
> A few bits are sufficient to describe the output of a real-world neuron.
Possibly. But no computer has so far been able to even remotely simulate or emulate what is actually going on inside a real-world neuron - and that would be required to fundamentally understand the output.
> except for a (potentially) reduced energy consumption
And that is quite a big deal. In a parallelized system, any per-unit energy saving is multiplied across every unit, so at scale the effect becomes enormous!
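The scaling argument is simple arithmetic: total energy is per-operation energy times operations times units, so a per-operation saving is applied once for every unit in the fleet. A toy sketch with hypothetical numbers (1 pJ/op, 10^9 ops per unit, 10,000 units - all assumptions for illustration):

```python
def total_energy_joules(energy_per_op_j: float, ops_per_unit: float, n_units: int) -> float:
    # Total energy scales linearly with the number of parallel units,
    # so a per-operation saving is multiplied by the whole fleet.
    return energy_per_op_j * ops_per_unit * n_units

# Halving per-operation energy across 10,000 parallel units
# halves the total energy bill of the entire system:
before = total_energy_joules(1e-12, 1e9, 10_000)    # 10 J
after = total_energy_joules(0.5e-12, 1e9, 10_000)   # 5 J
```

The saving is multiplicative in the number of units, which is why per-neuron energy efficiency matters so much more in massively parallel hardware than in a single processor.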
> But on a theoretical level, nothing changes.
But on a practical level, everything will change.