David Hume's Enquiry posits that our experience (and behavior) is heavily determined by networks of perceptrons, but he's never going to get credit for that.
That would have been speculative with a dash of poetic license, I presume? Please correct me if I'm wrong, but I don't think Hume proposed any mathematical model in this regard. Would love to see some references. Thanks
I'm not sure there's any short quote that summarizes it. But as I read him, thought is composed of ideas, which are themselves "no more than the compounding, transposing, augmenting, or diminishing" of other ideas. And ideas are very simple things, fundamentally based on impressions like awareness of the colour red or of pain. And these ideas relate to each other in terms of "only three Principles of Connexion among Ideas, viz. Resemblance, Contiguity in Time or Place, and Cause or Effect".
I think equating his 'ideas' with perceptrons might be anachronistic. But it does seem to suggest complex and abstract ideas are composites, built of networks of networks of simpler ideas, the connections between them being of a simple nature as well, perhaps all boiling down to nothing more than many stacked layers of basic sensations being present or absent, plus how they are sequenced and associated.
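If it helps make that concrete, here's a toy sketch in Python of what "networks of networks of simpler ideas" might look like in perceptron terms. Everything in it (the impression names, weights, and thresholds) is invented purely for illustration, not anything Hume proposed:

```python
# A toy sketch: treat simple "impressions" as binary inputs and "ideas" as
# thresholded combinations of them, stacked in layers. All names, weights,
# and thresholds below are made up for illustration.

def perceptron(inputs, weights, bias):
    # Fires (returns 1) when the weighted sum of inputs crosses the threshold.
    return 1 if sum(w * x for w, x in zip(weights, inputs)) + bias > 0 else 0

# Layer 1: "impressions" -- raw sensations, present (1) or absent (0).
impressions = {"red": 1, "round": 1, "sweet_smell": 1, "pain": 0}
x = [impressions["red"], impressions["round"],
     impressions["sweet_smell"], impressions["pain"]]

# Layer 2: simple "ideas" compounded from impressions.
looks_like_apple = perceptron(x, [1, 1, 0, 0], -1.5)   # red AND round
smells_edible    = perceptron(x, [0, 0, 1, -1], -0.5)  # sweet smell, no pain

# Layer 3: a more complex idea compounded from simpler ideas.
idea_of_apple = perceptron([looks_like_apple, smells_edible], [1, 1], -1.5)

print(idea_of_apple)  # 1 -- the composite "idea" is active
```

The point is only the structure: simple presences and absences at the bottom, compounded upward by simple connexions into more abstract composites.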
It's not completely revolutionary; much of it follows Aristotle. The particular emphasis on 'ideas' and their 'connexions' being everything to thought is perhaps original to Hume. This is in sections 2 and 3. Makes me think of Leucippus and his atoms. Not really our atoms, and yet still uncanny in their resemblance, especially with so little to go on. Disclaimer: I haven't read it in some 20 years besides skimming for the above quotes, and I'm not a philosopher.
I think the more someone considers the whole work, the clearer it becomes that Hume really did have a coherent model in mind, but the section "Of Probability" is a spot that might ring a bell for reasons that will be obvious:
I think the section on animal intelligence helps spell it out too (I also have to wonder whether people just ignore that part when they claim he's an "empiricist" in the usually understood sense, or maybe they just failed to fill in the missing shade of blue):
Thanks for the share! Does anyone have similar recommendations (books | links | videos) on the history of statistical inference? Would love to know more about the evolution of statistical thought all the way from Pearson, Fisher, etc.
The Theory That Would Not Die by McGrayne was fun; it's about the history of Bayesian statistics.
Also not exactly what was asked, but so good: James Gleick's "The Information". It's about the history of information theory, or how to argue statistically about information. It really starts from the beginning, e.g. how some tribes used drums to talk to each other over distances of kilometers.
I've got to the bit where he mentions the TV show on which he demonstrated a talking NN with 20,000 parameters. Would love to find that online (I tried).