In 17th century, Leibniz dreamed of a machine that could calculate ideas (2019) (ieee.org)
152 points by MichaelMoser123 on July 29, 2023 | 51 comments



Leibniz followed very closely in the footsteps of the Neoplatonists and he was what you'd call a rationalist's rationalist. He would later be rebuked by Hume (the famous is-ought problem made moral ("ought") problems fundamentally distinct from rational ("is") problems) and Kant would put the nail in the coffin of the rationalist-empiricist debate in the next century (with his earth-shattering Critique of Pure Reason). And if that wasn't enough, as the logical positivists of the early 20th century were still clinging to some form of mathematical completeness, Gödel proved that the project dreamed up by Leibniz (and more distantly by Plato) was a dead end, to Wittgenstein's, Russell's and many others' dismay. Some things (even true ones!) are simply unprovable.

I love this story as it spans more than 2000 years, and even though the idea itself proved to be untenable, this search gave us the Enlightenment, the Industrial Revolution, the computer age, and beyond.


>and Kant would put the nail in the coffin of the rationalist-empiricist debate in the next century (with his earth-shattering Critique of Pure Reason)

Kant was mostly convincing to himself (of whom he was a great fan) and to Kantians. His arguments were hardly definitive.

>Some things (even true ones!) are simply unprovable

Within the context of a system with certain algebraic properties.


> Kant was mostly convincing for himself (to whom he was a great fan of) and Kantians

This is shown to be false by the historical record. Kant was, unquestionably, the most influential thinker of his day.

Centuries later, Einstein himself felt that it was necessary to write about how Kant was wrong (given modern physics and general relativity), which shows how convincing and influential Kant's arguments were; there's no point to debunking things no one believes.

You're probably looking at Kant with contemporary eyes, thereby deciding that he's obviously wrong and completely unconvincing. But no one looked at Kant with contemporary eyes until contemporary times, and there's a very long road of extremely important and influential scientists/mathematicians/philosophers replying to Kant; which is how our contemporary perspective came to exist.

>> Some things (even true ones!) are simply unprovable

> Within the context of a system with certain algebraic properties.

Right, that's what "provable" means. There's no such thing as a proof, outside of formal inferential systems.


>You're probably looking at Kant with contemporary eyes, thereby deciding that he's obviously wrong and completely unconvincing

Rather with late 19th and 20th century eyes, those of continental philosophy (as opposed to analytic philosophy).

>There's no such thing as a proof, outside of formal inferential systems

Well, only if we constrain it to formal axiomatic proofs. But there are exhaustive proofs (which don't need axiomatic steps), as there are proofs by example, and even proving by non-axiomatic argumentation and fuzzy evidence (as in a court). Leibniz's/Russell's/Hilbert's axiomatic mathematical proofs are not the only game in town, nor do they exhaust human reasoning.


> with late 19th and 20th century eyes

Oh, so centuries later? Right, just as I said.

> those of continental philosophy (as opposed to analytic philosophy).

W. V. O. Quine, one of the most influential analytic philosophers (and mathematician, responsible for an early attempt to formalize mathematical axioms), still felt the need to write against Kant in one of his most influential papers ever (Two Dogmas of Empiricism), well into the 20th century.

> only if we constrain it to formal axiomatic proofs

Umm, no. Inferential systems need not be axiomatic. I was explicitly speaking about inferential systems, not axiomatic.

> there are proofs by example

No, there aren't. There are proofs by contradiction, wherein the truth of a proven statement refutes the candidate-truth of some other hypothetical candidate-truth. But there's no such thing as "proof by example".

> proving by non-axiomatic argumentation and fuzzy evidence (as in a court)

Again, no. Not even judges refer to this as "proof". Evidence is not proof. You're (maybe on purpose?) conflating "deduction" in particular with "reasoning" in general.

> Leibniz's/Russel's/Hilbert's axiomatic mathematical proofs are not the only game in town, nor do they exhaust human reasoning.

Ahhh, there it is. You're mixing up several concepts here. I suspect because you seem to think your pride (i.e. ability to prove things) is on the line. But it isn't. I'm just an autistic spectrum person, trying to make sure people use words in ways that make sense and follow well-established precedent. It appears we have two wildly different projects.


>W. V. O. Quine, one of the most influential analytic philosophers (and mathematician, responsible for an early attempt to formalize mathematical axioms), still felt the need to write against Kant in one of his most influential papers ever (Two Dogmas of Empiricism), well into the 20th century.

Yes, because Kant was pre-analytical, so ultimately not compatible with the analytical program as it developed in the 20th century. But the continental side started the pushback quite a bit earlier (of course from a different perspective).

>No, there aren't. There are proofs by contradiction, wherein the truth of a proven statement refutes the candidate-truth of some other hypothetical candidate-truth. But there's no such thing as "proof by example".

Any statement to the effect of "A thing with X, Y, Z, ... properties exists" can be proven by an example of such a thing. "There are numbers larger than 9 divisible by 2"? Yes, here is 10, or 56, or .... Any number that fulfills both those properties proves that the statement is true. Those types of proofs are called "existential proofs".
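
To make that concrete, here is a minimal sketch of such an existence proof in Lean 4 (my own illustration, not anything from the article):

    -- "There is a natural number greater than 9 that is divisible by 2."
    -- Exhibiting the witness 10 and mechanically checking both properties settles it.
    example : ∃ n : Nat, n > 9 ∧ n % 2 = 0 :=
      ⟨10, by decide⟩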

>Not even judges refer to this as "proof"

Ever heard the phrase "prove beyond any reasonable doubt"? "Burden of proof"? All terms used in law across many cultures, including the US.

>Inferential systems need not be axiomatic. I was explicitly speaking about inferential systems, not axiomatic.

You might have, but Gödel's proof (which we were discussing) is formulated over axiomatic systems, though.

>I'm just an autistic spectrum person, trying to make sure people use words in ways that make sense and follow well-established precedent. It appears we have two wildly different projects.

Well, and I'm a fellow dweller trying to make sure we don't miss the forest for the trees (as we so often do), and don't force human expression, including the use of words and the ability to prove something, into artificial constraints or some strict adherence to a narrow concept of officialdom ("well-established precedent").

Keep in mind that this part of the subthread started with my answer to the praise of Kant, whose philosophy was quite far from using a "formal inferential system". My original point was about how Kant was only a definitive influence on a certain lineage of thinkers.

Then came an answer to clarify what Gödel proved. My point there was that his proof applies to axiomatic systems with certain algebraic properties, not to every system of reasoning. It's a common trope in pseudo-science to take Gödel's incompleteness theorems to apply way beyond their scope (even in social matters).

It's under that light that I described alternative ways people prove things, to showcase that this limitation doesn't apply everywhere something is proven (to a societally satisfactory degree, not necessarily via formal proofs).


What's an inferential system that is not axiomatic?

You mean something like natural deduction? Or Hilbert-style calculus?

I.e. logical calculi that one sets up before one sets up a theory (with "initial statements" as axioms)?


> Within the context of a system with certain algebraic properties.

This downplays the importance of the set of systems for which his proof holds and makes it sound like it applies to some obscure branch of mathematics.

It applies to a huge set of important systems, not least of which is any system that is sufficiently expressive as to uniquely identify the natural numbers.


> Within the context of a system with certain algebraic properties.

The algebraic properties are those that formalize arithmetic, so this encompasses almost any reasonable system.


> the project dreamed up by Leibniz (and more distantly by Plato)

More distant than that. It all comes from Euclid. Who, you know, was actually tremendously successful in that project.


> More distant than that. It all comes from Euclid. Who, you know, was actually tremendously successful in that project.

I of course know of his Elements, but is there any evidence that Euclid was an axiomatic reductionist? Was he trying to turn everything into an axiomatic system? I regrettably don't know enough about him and should probably rectify that (any book suggestions?).


I don't have any real suggestions, but the two important Euclidean books were the Elements and the Data; the latter is about what exists, and the former is about the relations between what exists.

You're right that Plato was the first to write that there are non-mathematical relationships, and to try to formalize them, but what he meant by "non-mathematical" basically meant "non-geometric"; recall that we're talking about a few hundred years before the invention of algebra.

This laid the seed for Aristotle to formally declare that logic is its own discipline, rather than just the method used in geometry, and this is when we see these projects extend to everything, but no longer as axiomatic pursuits.



I love your coherence. Can you recommend a book (if you have read it) that explains in detail what you have summarized?


If you want all the details, I can recommend a philosophy degree :P


Hmm. Do I understand your comment correctly in that thoughts should be either rational or not?

I think different kinds of thinking have their applications in different contexts. Gödel's theorems are hardly ever relevant to most of mathematics and not in the least to computers (which are finite).

I also doubt that industrialism has anything to do with Leibniz or Hume. That part of history is most likely fuelled by greed for money, not for philosophical thought.


> Do I understand your comment correctly in that thoughts should be either rational or not?

You didn't understand correctly. "Rationalist", in the OP, is a historical label which can retrospectively be applied to a specific set of thinkers, who defended specific beliefs, from the 16th to the 18th centuries.

None of them were claiming they were the only people who thought "rationally"; rather, they were mostly claiming that only rational thought could reach truth, and empirical evidence need not factor into it.

In modern parlance, the Rationalists believed that all knowledge would end up being deductive, like math.

> Gödel's theorems are hardly ever relevant to most of mathematics and not in the least to computers (which are finite).

https://en.m.wikipedia.org/wiki/Halting_problem
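
For the skeptics, a minimal Python sketch of the diagonal argument behind that link; the names halts and diagonal are purely illustrative, since no correct, total halts can actually be written:

    def halts(program, program_input):
        # Hypothetically returns True iff program(program_input) eventually halts.
        raise NotImplementedError("assumed only for the sake of contradiction")

    def diagonal(program):
        # Do the opposite of whatever halts predicts about program run on itself.
        if halts(program, program):
            while True:      # predicted to halt -> loop forever
                pass
        return               # predicted to loop -> halt immediately

    # diagonal(diagonal) contradicts any answer halts(diagonal, diagonal) could give,
    # so no such general decider exists.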

> I also doubt that industrialism has anything to do with Leibniz or Hume. That part of history is most likely fuelled by greed for money, not for philosophical thought.

In this time period, the intellectual circles that Leibniz and Hume frequented are exactly the circles that gave rise to modern economics, the ability to measure longitude at sea, the ability to calculate rates of change over time, using steam to power machines, etc. In other words, we're literally talking about all of the intellectual developments that directly led to the industrial revolution.


> None of them were claiming they were the only people who thought "rationally"; rather, they were mostly claiming that only rational thought could reach truth, and empirical evidence need not factor into it.

This is not my understanding. Rather, the disagreement relates to the primacy and necessity of rational thought in some places. Check out https://plato.stanford.edu/entries/rationalism-empiricism/ and you can find two examples of possible interpretations of each position. To be a rationalist does not mean to say that empirical evidence need not factor into knowledge, but just that some is unobtainable via experience, or that gained via rational thought is superior.

There's no doubt multiple views people have held that fall under each umbrella, but I just wanted to highlight that I think the rationalist does allow for some sources of knowledge that are obtainable only via experience. And I can't see how they could deny that experience is required for some knowledge.

(I may have gotten some subtleties wrong, it's a long time since I looked into this)


Thank you for explaining the difference between rationalism and empiricism.

My comment was probably not very clear -- I wished to verify the opinion of the author who wrote the previous comment, not philosophy as a whole. The specific statement that I doubted was "this search gave us ..., the industrial revolution, the computer age, and beyond."

My point about these philosophical thoughts in relation to industry and computers is that it is all quite academic. The halting problem is very interesting to some computer scientists, but it is, IMHO, not at all relevant to produce working solutions in any industry.

However, I do stand corrected with respect to your last paragraph! Leibniz did indeed offer many practical tools, for instance through his differential calculus.

I still wonder about the impact of Hume and Plato on computers. I'd still dare say that these developments could have taken place without academic philosophical thought. That does not take away from the search being amazing and very interesting indeed.


I wonder how his vector math was? Because it sounds like he had the conceptual underpinnings of the altar we’ve all been praying at lately.

It’s really humbling how a fundamentally simple algorithm interpreting a ridiculously complex and vast data set is capable of a simulacrum of thought.

That data set, of course, is the formalisation of human culture, taken from our works. It raises the question of whether our fundamental algorithm for parsing that data is really that much different. We cannot think without tokens, symbols.

We can be, and we can feel, but we cannot think without them. So, where is the intelligence, really? Is it in our heads, or in the data?

Is the data the computation, like an unfathomably vast choose-your-own-adventure book, or is the computation the data, like a one time pad decryption algorithm that creates a universe simulator from the chaotic arrangement of atoms in a beach full of sand?

Is this really “artificial” intelligence, at all? Or just the steam engine of human intellect?


> I wonder how his vector math was?

He was born around the same time as the invention of analytical geometry (i.e. the idea that it is possible to do geometry using algebra), and "vector math" (or linear algebra) came several decades later.

So, his vector math was non-existent.


The Baroque Cycle contains a fun digression describing the combinatorial logic behind this machine.


I loved that section and wasn't sure how much liberty Stephenson might have been taking. Not much, it seems.


Leibniz was envisioning something like Douglas Lenat's Cyc project. In his grant proposal to the French court, his first intended stage was to reduce all of human knowledge to propositional form. He estimated it would take about 5 years given a small clerical staff and a good library.


What do we dream of now?

I've been reading 70s and 80s sci-fi and loving all of the ideas of the future they had. I don't see them today, but I don't know what to read.


Mastering biological processes so that we can ensure that there is enough food for everyone (of a reasonable variety)[1] and that maybe the plants can do well and clean up our environment and make it nicer to live in?[2] Maybe to achieve long enough human lifespans that we can think of interstellar travel?[3]

1 - Hal Clement's _Space Lash_, short story "Raindrop"

2 - L.E. Modesitt, Jr's The Forever Hero trilogy (as well as the "The Mechanic" from Space Lash)

3 - Poul Anderson's _The Boat of a Million Years_


I'd say Iain M. Banks' Culture series [0]. Arguably also written in the 90s, but it draws a bit more of a positive picture. I'd love to live in an environment like that.

[0] https://en.m.wikipedia.org/wiki/Culture_series


I think cyberpunk was pretty spot on?


That's the 90s though, right? What about now?


Now we dream of commercializing a hype for 5 years and moving to the next one.


Don't forget sequels to hypes from the eighties.


Related: Dijkstra's "Under the spell of Leibniz's Dream"

https://www.cs.utexas.edu/users/EWD/transcriptions/EWD12xx/E...


> Swift’s point was that language is not a formal system that represents human thought, but a messy and ambiguous form of expression.

Yep even the greatest minds are susceptible to false dichotomy. We now have linguistics and computer science, yet some aspects of thought remain forever intractably messy.


Also, Newton's method/gradient descent + calculus play a big role in backpropagation/ML (Leibniz was one of the inventors of calculus).
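
For anyone who wants the connection spelled out, here is a minimal gradient-descent sketch in Python; the toy objective f(x) = (x - 3)^2 and the step size are illustrative, not from the article:

    # Minimize f(x) = (x - 3)^2 using its derivative f'(x) = 2 * (x - 3).
    def f_prime(x):
        return 2 * (x - 3)

    x = 0.0                  # initial guess
    learning_rate = 0.1
    for _ in range(100):
        x -= learning_rate * f_prime(x)   # step against the gradient

    print(round(x, 4))       # approaches the minimizer x = 3

Backpropagation is this same idea applied via the chain rule to millions (or billions) of parameters at once.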


I sometimes wonder if Turing picked factorial as the example in his 1949 "Checking a Large Routine"[0] purely at hazard, or possibly as an homage to Leibniz, who was fond of them[1]?

[0] https://turingarchive.kings.cam.ac.uk/publications-lectures-...

[1] see "Dissertatio de arte combinatoria...", 1666 https://ia800906.us.archive.org/BookReader/BookReaderImages....


(2019)

Note that this article precedes both ChatGPT and GPT-3. When it was written, Leibniz' idea of a machine reasoning by manipulating symbols was still science fiction. Now it is very much reality.


Does ChatGPT actually manipulate symbols, or does it string together strings of characters based on what most frequently occurs next? I haven't seen anything that indicates actual working w/ symbols/logic/ideas.


"string together strings of characters" = "manipulate symbols", no?


There is a big difference between

"War and Peace"

as a string of ISO-Latin1 characters which are frequently found in that order, and "novel by Tolstoy which contains many insights into the human condition" _and_ understanding the symbolism of said insights.

A grade school student who has read and understood _War and Peace_ would be able to write a paper which is original to the degree that it did not previously exist as that exact sequence of characters, even if it had no original insights, while ChatGPT would regurgitate the most frequent combinations of characters which the metadata and so forth indicate are in the context of writing about that novel.


I think it is working with embeddings, meaning that each token (a word or subword) is represented by a very long vector of floating point numbers. These vectors are the result of something like word2vec. The input text is transformed into these embedding vectors and lined up in sequence (that's called the context window), then this gargantuan ML model starts to process this gargantuan input sequence.
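
Roughly, a toy embedding lookup looks like this (random numbers and a four-word vocabulary, purely for illustration):

    import numpy as np

    vocabulary = {"the": 0, "machine": 1, "calculates": 2, "ideas": 3}
    embedding_dim = 4
    # In a real model this table is learned, e.g. by word2vec or jointly with the transformer.
    embedding_table = np.random.rand(len(vocabulary), embedding_dim)

    tokens = ["the", "machine", "calculates", "ideas"]
    token_ids = [vocabulary[t] for t in tokens]
    context = embedding_table[token_ids]   # the "context window": one vector per token

    print(context.shape)                   # (4, 4): sequence length x embedding_dim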


Yes, but it's a processing by association, not understanding.


s/frequently/likely

(Large language models are not plain Markov chains, contrary to popular belief)


Having parameters for "temperature" and so forth, while it might provide sometimes useful functionality, does not change the basic nature.


Are the billions of parameters all symbols, or are you referencing math manipulation as symbol manipulation?


Both the inputs and outputs are symbols: text.


The machine is considering text the same way as RGB pixel values in the annealing process. Are these pixel values symbols?

Maybe it’s all information and symbols are a small way to grasp at it but not the complete way.


Leibniz dreamed that we would be able to completely determine if some claim was true or not. We can't do that for all claims, but at least in mathematics, we can prove with absolute certainty that some axioms will lead you to other claims. So at least a little piece of his dream is a reality.


I was about to mention Ramon Llull, but it's there.



Newton has left the chat


Unless a symbol is as fuzzy as the bray of a donkey of Buridan.



