Sorry, I wasn't meaning to quote mine. I honestly didn't think that part was relevant to the point I was making.
Neural networks have taken some inspiration from biology, but that doesn't account for the vast bulk of the work. Biology analogies play a much bigger part in popular explanations than they did in the development of the technology.
(edit) Here, for example, take a look at the original paper on perceptrons. It's a whole lot of math. Since it was released during one of the periodic waves of AI enthusiasm, there's also a fair bit of psychology talk, but not much indication that Rosenblatt was simply trying to copy a brain he otherwise didn't understand.
https://blogs.umass.edu/brain-wars/files/2016/03/rosenblatt-...
Something similar happens with deep learning. It's grounded in theory, most notably the universal approximation theorem: a mathematical proof that even a one-hidden-layer MLP can, in principle, approximate any continuous function to arbitrary accuracy. You just won't see much of that if you aren't actually reading the papers, because most people aren't that interested in vector calculus. It's much easier to explain things by analogy.
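That approximation claim is easy to see in action. Here's a minimal sketch (my own, not from any paper): a one-hidden-layer MLP trained with plain full-batch gradient descent in NumPy to fit sin(x). The hidden size, learning rate, and step count are arbitrary illustrative choices, not anything canonical.

```python
# Minimal demo of the universal-approximation idea: a single-hidden-layer
# tanh MLP, trained with hand-written backprop, fitting sin(x).
# All hyperparameters below are arbitrary choices for illustration.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-np.pi, np.pi, 200).reshape(-1, 1)
y = np.sin(x)

h = 16                                   # hidden units
W1 = rng.normal(0, 1.0, (1, h))
b1 = np.zeros(h)
W2 = rng.normal(0, 0.5, (h, 1))
b2 = np.zeros(1)

lr = 0.05
for step in range(5000):
    z = np.tanh(x @ W1 + b1)             # hidden activations
    pred = z @ W2 + b2
    err = pred - y
    loss = np.mean(err ** 2)
    # backprop: gradient of mean squared error through both layers
    d_pred = 2 * err / len(x)
    dW2 = z.T @ d_pred
    db2 = d_pred.sum(0)
    d_z = d_pred @ W2.T * (1 - z ** 2)   # tanh' = 1 - tanh^2
    dW1 = x.T @ d_z
    db1 = d_z.sum(0)
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

print(f"final MSE: {loss:.4f}")
```

A constant predictor at zero would sit at an MSE of about 0.5 on this data; the trained network ends up far below that, which is the theorem's promise playing out on one concrete function.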