If you’re comparing hardware, you’re probably right. Our big blob of neurons has a completely different mode of operation compared with the silicon circuits in the computers we know.
However, if you go higher up the ladder of abstraction, the story changes, I think. Thinking in terms of software design and architecture, you can start using a similar vocabulary.
We can talk about systems and sub-systems, foreground and background jobs, interfaces and telemetry, sequential and parallel processing, latency, efficiency versus accuracy, overfitting, etc.
The underlying implementation might be completely different, but there are similarities, and thinking about them can be useful.
> Thinking in terms of software design and architecture, you can start using a similar vocabulary.
But I think that's exactly the example of where the metaphor starts to give us bad information.
The biggest difference between the brain and a computer is that the brain is fundamentally parallel. Not "more threads on a GPU" parallel, but parallel in the sense that the processing the brain does is the manifestation of a mass of information-processing units all acting at once and interacting with one another.
To give an example of where using the computer metaphor gives rise to very misleading assumptions because of this: in many cases with a computer, more data equals more cost. If you need to iterate through a million data points to get a result vs. a thousand, it's going to take a lot more time to reach the answer. But with the brain it's precisely the opposite. More data points means a denser network of interconnections which can give a better answer sooner.
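To make the "more data equals more cost" half of that concrete, here's a minimal illustrative sketch in Python (the function name and numbers are my own, not from the posts above): a naive linear scan does work proportional to the size of the dataset, so a million points cost a thousand times more than a thousand points.

```python
# Hedged sketch: on a serial machine, a naive scan's cost grows
# linearly with the dataset. Names here are illustrative only.

def linear_scan(query, points):
    """Find the point closest to `query`, counting how many points we examine."""
    examined = 0
    best = None
    for p in points:
        examined += 1
        if best is None or abs(p - query) < abs(best - query):
            best = p
    return best, examined

# A thousand points vs. a million: the scan does 1000x the work.
_, small_cost = linear_scan(500, range(1_000))
_, large_cost = linear_scan(500, range(1_000_000))
print(small_cost, large_cost)  # 1000 1000000
```

The brain, on the commenter's account, inverts this relationship: more data points mean a denser network, not a longer loop.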
While I've lately tried to avoid drawing this metaphor, what your last sentence describes is the act of using a deeper neural network trained on a larger, cleaner, more representative dataset.
ANNs are a better metaphor (well, they should be; they're biologically inspired) but are still woefully inadequate as analogies for what the brain is actually doing.