
Given many people don’t have an inner monologue and function just fine, it’s more likely inner monologue is a product of the reasoning process and not its mechanism.



It’s commonly conjectured that the emergence of human-level reasoning wouldn’t have been possible without the development of language. Personally, I’m able to suppress “word thoughts” in my head (for a short time), but then I lose almost all of my reasoning ability. I could imagine that reasoning is language-based even when it’s not conscious for some people. An internal process being there, and being conscious of it, are two separate things. We would be happy with an AI using an internal monologue without it being conscious of that monologue.


Maybe, but symbolic thought can get pretty far away from what we generally call "language." I bet you can reason through 1+3x=22 pretty easily without any words whatsoever, or through the sound of one ascending octave after another, or the approximate G-force induced on your body if you take the next turn without applying the brakes.

All of these forms of reasoning are true and useful calculations: when we talk about "intuition" what we usually mean is that we have a lot of experience and internal reasoning about a subject, but we struggle to translate it to and from the "language" part of our brain. Nonetheless, any social dancer will tell you that a dialog is possible just by receiving and inducing g-forces alone. You can reason this way about abstract concepts like orbits without ever touching a single word or concrete symbol.

Edit: the key aspect of reasoning, imho, is the ability to make predictions and introspect them against a database of other predictions, using an adversarial heuristic to weight the most plausibly useful results. Perhaps our pattern matching AIs of today just lack sufficient "experience" to do what we call reasoning.


>I bet you can reason 1+3x=22 pretty easily without any words whatsoever

I've tried to do it, but I can't. I had to do something like "ok, so we subtract one from both sides and then it's easy, 3*7=21". Maybe I could do 2+8 but I still think the word ten "aloud".


I was able to do it with no words. I 'saw' the steps as if on a piece of paper: I saw 3x=22-1=21, then x=21/3=7. But I have a degree in applied math; perhaps not vocalizing internally is just a matter of extreme familiarity. It also happened very quickly, so perhaps there was no time to vocalize anyway.
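For what it's worth, the two steps being described can be written out mechanically. A minimal Python sketch (plain integer arithmetic, nothing assumed beyond the equation itself):

```python
# Solving 1 + 3x = 22, in the two steps described above.
a, b, c = 1, 3, 22   # equation: a + b*x = c

# Step 1: subtract a from both sides -> b*x = c - a
rhs = c - a          # 22 - 1 = 21

# Step 2: divide both sides by b -> x = (c - a) / b
x = rhs // b         # 21 / 3 = 7

print(x)             # -> 7
```

Whether the brain carries this out verbally, visually, or as a single pattern-matched jump is exactly what's being debated here.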


To be fair, math is a language in itself... with many dialects, come to think of it.

At the end of the day though, thought requires communication, even if internal. Even physics is modelled as some sort of 'message passing' when we try to unravel what causality really is. Similar to how a processor has cycles, I think something similar (but unsynced) happens as part of what we call 'thinking'.


A decent number of folks can jump straight to the answer on something so straightforward, no steps.


Most people can't do 1 + 3x = 22 without any words or symbols. People who can don't realize that most people can't. I'd argue one isn't using logic when they do that, it's just very good pattern matching.


It's also possible to do mentally by visualizing it rather than internal monologue. You can imagine the 1 on the left arcing over to the right, cancelling the 22 to 21, then the 3 moving under the 21 and the 21 descending through the 3 to become 7.


yup. I considered myself an /extremely/ verbal person when reasoning, but what I do with the above feels closest to 'moving the 1', almost like balancing a mental scale.

I never really noticed that before. I'm not great at math, fwiw.


A decent number of folks just have the answer pop into their head, no logic or thinking required.


Regarding “1+3x=22”, I’m actually not sure, the number words certainly appear in my head when solving the equation. But even then, I would count “1+3x=22” as constituting language. Perception of sound, G-forces, and dancing don’t perform type-2 reasoning by themselves, so I don’t think your argument applies there.

Regarding your edit, no, I think the key aspect of the kind of reasoning we are missing in current AI is the ability to hold the reasoning in your mind, and to iterate on it and evaluate it (judge it) within your mind.


It is very difficult to use words to discuss the semantics of non-verbal or non-symbolic thought. I was pointing at several plausible spaces for semiotics, and at how these could be spaces for reasoning, in the hopes that one of them might be relatable.

If you use words in your mind when you use math, and you use words in your mind when you make or listen to music, etc., then it is very difficult to find a common ground where it is possible to see that these other realms of thought are capable of not only prediction, but also producing evidence that leads to judgement. That is to say, the key aspects of "reasoning." I picked them because I thought they had broad enough appeal to be relatable, and because I do not personally hear words in my head when doing any of these activities, whether it's calculus or tango, but I still find calculus and tango to be places where reasoning occurs.

Some of them, like math or music, are closer to the kind of symbolic thought we use when we discuss things with words. Others, like the experience of g-forces, are not. I present them as a sliding scale between "word based" reasoning and "non-linguistic" reasoning. Perhaps you can think of a realm that better fits for your personal experience of intuition, and inspect whether these intuitions are capable of "real" reasoning in the absence of language, or whether intuition should never be trusted even when you have a great deal of experience in that area. Perhaps in your estimation, anything that cannot produce evidence that is articulable in word form is suspect.

Personally, I find all these methods, including language, to be suspect. I don't find language to be especially better at producing the kind of evidence for prediction, correct judgement, or discourse for reasoning than other methods, unless you reduce "reasoning" to tautologically require it.

One of the best tools of language is that we have writing that allows easy inspection or iteration of the written content; but these things are possible in other realms, too, it's just that we didn't have great tools for introspecting and iterating on their "ideas" except within our own minds. These days, those tools are readily available in many more realms of human insight.


i feel it is worth pointing out, as another commenter highlighted, language and even symbolic more abstract languages, bring about fluency if you've practiced "speaking and writing" it enough.

i think native speakers hardly "think" about the steps necessary to form a grammatically correct expression, and most of the time just "know".

fluency is not the same as lacking an internal framework for interpreting, or thinking in terms of, symbols.


Brains are weird. I reason almost entirely non-verbally, and I would absolutely struggle if I had to laboriously express every thought in words. It's part of the reason I don't work well in teams. So slow!


What defines the boundaries of internal vs external? Certainly nothing about llm weights or ops should.


Language is a serialization of our brain's "world model" structures.


The existence of an "inner monologue" isn't really a falsifiable claim. Some people claim to have one while other people claim not to, but we can't test the truth of these claims.


Feynman came up with a potential test by tracking the things he could and couldn't do while counting seconds (via internal monologue). He found he generally could not talk while counting.

He then had others try and found that one of his mathematician friends was able to talk just fine while counting because it turned out he was counting visually.

https://www.youtube.com/watch?v=Cj4y0EUlU-Y


In this particular case, is there any reason why we simply can't take their word for it? This is not a case of where if I say "weak" or "strong", most people pick strong because no one wants to be weak, even if the context is unknown (nuclear force for example).


> In this particular case, is there any reason why we simply can't take their word for it?

My concern is that if we take their word for it, we're actually buying into two assumptions which (AFAIK) are both unproven:

1. That "Internal Monologues" (not consciously forced by attention) exist in the first place, as opposed to being false-memories generated after-the-fact by our brain to explain/document a non-language process that just occurred. (Similar to how our conscious brains pretend that we were in control of certain fast reflexes.)

2. Some people truly don't have them, as opposed to just not being aware of them.


Not only are they unproven, but are ultimately not provable at all. Some people will say yes, some people will say no. Probably we can take their word for it, but in the simplest case they could just lie (in either direction) and we would have no way to tell.

In short, maybe these inner monologues exist and maybe they don't, but science can't comment on that. That said, it is clearly something we are interested in, but it will need to be addressed in some other way (i.e. religion, ideology, etc.).


> but are ultimately not provable at all

No, they are potentially falsifiable as we get better at scanning, identifying, and intervening in brain activity.

Just off the top of my head here, suppose we create a table puzzle problem that (in itself) doesn't require language to understand, like ones we make for certain animals. Have human subjects (silently) solve it. Afterwards, quiz the solvers about their internal monologue--or lack thereof--dividing them into two groups and noting the words used.

Now change to a second puzzle of similar style and the same overall difficulty. Stun/anesthetize the language-centers of subjects, to deny access to any of the monologue-words (validating this intervention will involve other research), and then test them on the second problem.

* If performance is constant for both groups, that suggests the monologue is illusory or at least not needed for this kind/scope of problem.

* If performance drops for both groups, that suggests the no-monologue people might just not be as aware of a linguistic process that's actually happening.

* If performance drops for monologue-subjects, that suggests it's a real and important difference in modes of logical thought.

* If some other combination happens, you have a mysterious and exciting new line of research.


Sure, there is stuff we can do to tease around the edges (similar problems crop up all the time in psychology and sociology) but we will always have to weaken the claim in order to do experiments relating to it.


> Probably we can take their word for it, but in the simplest case they could just lie (in either direction) and we would have no way to tell.

Individually, no, but in general, for people to consistently lie about this particular thing at scale would be extremely unusual, given that people rarely lie if there's no reason for it. Going by this baseline, you could assume upward of 50% of replies are honest (even if mistaken); otherwise you'd have to explain why you believe people would suddenly lie about that particular thing.


I've heard a theory where the inner monologue was emergent, and some of the first people to recognize the 'voice in their heads' attributed it to god/angels/etc

There's conspiratorial lying and lying from ignorance; the latter requires much less credulity to believe.


That theory is the "bicameral mind"; I think it's even discussed elsewhere in this thread.


Because we can't be sure that two people interpret "inner monologue" the same way, or whether they think it describes a phenomenon that actually isn't different between them and other people.

For example, I can think of interpretations of "I picture objects that I'm thinking about" that range from me not experiencing the phenomenon to me indeed experiencing the phenomenon.

To say that you're not experiencing something that other people are experiencing in their head is a solipsistic notion where you hypothesize an experience that you imagine others are having and then discard it for being different than yours.


And here I thought this was solved decades ago - I need to find the source, but I read about an old study where people described their experience, and the answers were all over the "range from me not experiencing the phenomenon to me indeed experiencing the phenomenon".

Then again, it's trivially reproducible - people self-report all variants of inner monologue, including lack of it, whenever a question about it pops up on-line. Same is the case with imagination - aphantasia is a thing (I would know, I have it).


I'm responding to "why can't we just take their word for it?"

That you and I can come up with different ways to describe our subjective experience in conversation doesn't mean that we have a different subjective experience.

Especially not when relayed by a species that's frequently convinced it has a trending mental disorder from TikTok.


We can keep talking about it, and assuming we're both honest, we'll arrive at the answer to whether or not our subjective experiences differ. To fail at that would require us to have so little in common that we wouldn't be able to communicate at all. Which is obviously not the case, neither for us, nor for almost every possible pair of humans currently alive.


On the other hand, a deep one on one discussion isn't what's happening in casual debates online about to what degree each of us has an inner monologue. And because we don't have so little in common, I would be resistant to concluding that my subjective experience is so different than everyone else's. To claim that I'm different requires me to have an accurate model of what other people are experiencing, not just an accurate report of what I'm experiencing.

Look up examples of this on reddit and you'll find a lot of larping. I would take most of it with a grain of salt as you should with any story-telling you encounter on social media.

If we're so reliable, there wouldn't be fake mental illness epidemics on TikTok regarding experiences far more concrete than fuzzy notions like inner monologue.


> "why we simply can't take their word for it"?

As someone who was involved in spiritual practice of "stopping internal dialogue" for years, I can tell you that one learns that that dialogue (or monologue, pretty much the same thing) is quite subtle and complex, essentially multi-layered.

Typically, when you think that you "think about nothing at all" it's just the most surface layer that has stopped, and more subtle talking to yourself is still going on. It takes training just to become able to notice and recognize it.

After all, it's just such a constant and monotone hum at the back of one's mind, one learns to completely ignore it.

So no, I would not take the word of people who were not trained to notice their internal monologue when they claim they haven't any :-)


> is there any reason why we simply can't take their word for it?

because if we give them a problem to solve in their head and ask them to just give us the answer, they will. By problem I mean planning a trip, a meal, how to pay the mortgage, etc. It's impossible to plan without an internal monologue. Even if some people claim theirs is 'in images'.


'It's impossible to plan without an internal monologue.' - Sorry, but I disagree with this. I have no 'internal voice' or monologue - whenever I see a problem, my brain actually and fully models it using images. I believe 25% of the population doesn't have the internal monologue which you're referring to, and this has been tested and confirmed. I highly recommend listening to this Lex Fridman podcast episode to get a full grasp on the complexities of language modelling and general modelling in the human brain: https://www.youtube.com/watch?v=F3Jd9GI6XqE


Sure, I do mention thinking in images in my original comment and count it as some type of internal monologue. I personally do not believe it's all images, as that would preclude using highly abstract concepts. But I might be wrong, and it might be 100% images. That being said, it does count as an internal monologue.


Can you draw a picture of an example of what you see when you think about something?


Sure - with the Alice example, when I saw the problem I came up with a really simple visual example of Alice having 3 sisters and 4 brothers. When I visualized it I saw Alice standing next to 3 other women (her sisters) and her 4 brothers standing close by. When I imagined asking her brother how many sisters he has, I could see that Alice was standing there next to 3 other women and thus came up with the answer of 4. Does this make sense?


This could account for why some people are much better at say geometry than algebra.

I'm the opposite. I rarely visualize things I read, be it geometry or poetry. I can read a detailed description of a person or an item in a novel, and I don't really "see" anything.

But I have an active inner monologue. I "hear" myself saying the words when reading or writing, I "talk" to myself when solving problems or just thinking about stuff.

Especially when programming I'm reasoning by discussing the problem with myself, the only difference being that I usually don't open my mouth and vocalize the discussion. Though sometimes when I'm alone I do just that.


> It's impossible to plan without an internal monologue.

Of course it isn't impossible, and this is backed by what we know about paleoanthropology and other instances of cognition in animals: humans were making stone tools millions of years ago, which takes planning in the form of imagining what you want the tool to look like, how you will make it, and what it will be used for. It's exceedingly likely we had this ability long before complex speech evolved. Apes also use and make tools, which requires planning, and I don't think they have an internal monologue going on. Birds from the corvid family can do some pretty advanced problem solving that requires planning. Cetaceans might be an exception, because they appear to have some form of language, but that is a pretty wild claim not really backed by any kind of science as we understand it today.


Animals cannot manipulate abstract concepts, nor can they make long-term plans. No crow can plan an international trip spanning a couple of weeks and two change-overs. And some people definitely can't do it start to end, but they can at least plan the first 5-7 steps.

Also, maybe inner monologue is not a binary have/have not, but maybe it is on a continuum.


Not sure. Migratory birds seem to manage this just fine. Not only do they make multiple stops to eat and rest, they also navigate around bad weather and still make it to their intended destination (at least most of the time).


> Migratory birds seem to manage this just fine

Instincts.


Yes, no one is disputing that animals are less intelligent and have a lesser capacity for planning than humans do, but the post you're replying to is disputing the claim that planning is done solely through internal narrative/monologue, which is easily disproved by pointing to the examples I did. There are many more in nature.


> It's impossible to plan without an internal monologue

I once had a teacher claim that people who claimed to have aphantasia were lying, because those people have read books, and it is impossible to read a book without picturing the scene in your mind's eye. Are you citing the same source that she was?


I wish I had such a teacher, because I'd have learned the term "aphantasia", instead of worrying all my youth that I was doing reading wrong, as I could never picture anything I was reading in my mind (and as a result, I found scenery descriptions to be mind-numbingly boring).


>> It's impossible to plan without an internal monologue

That's quite the claim.


> It's impossible to plan without an internal monologue.

How can science make this claim if it can't prove (or disprove) the existence of an internal monologue?


Well, I remember Richard Feynman came up with an interesting experiment. He found he could not count objects while reading some text aloud at the same time. He had to name the numbers, and that was impossible if his speech was already engaged.

He thought this was universal, but when he tried the experiment with friends, he discovered a guy who could count while reading aloud. When Feynman asked him how he did it, it turned out that instead of "pronouncing" the numbers, the guy was "seeing" colored numbers in his imagination, so his speech was not involved.

I suppose this experiment can be modified and generalized to at least shed some light on this problem.


Perhaps there's confusion in how we are using the word "monologue." I took it to mean an internal conversation, a dialogue where the problem is perhaps solved using a dialectic method. Since one can solve a problem by following some memorized steps, no conversation required, this is perhaps not a good test, or we mean different things when we say "monologue."


> The existence of an "inner monologue" isn't really a falsifiable claim.

Another possibility is that inner-monologues (ones not forced by conscious effort) do exist, but are just a kind of false-memory, something one part of our brain generates after-the-fact to explain/document the outcome of another non-language part.

Kind of like how certain reflex-actions can occur before the decision-making areas of the brain light up, yet humans will believe that they sensed the event and made a thinking choice.


I think you’re using “inner monologue” too literally. It could be a progression of pictures, emotions, etc.


To make any progress on this question at all, we need first to come up with some definition of internal monologue. Even if we may need to modify it later, there has to be a starting point.

Otherwise, nothing can be established at all, because for any statement there always will be someone's understanding of "internal monologue" for which the statement is true, and someone's else understanding for which the statement is false...


I'm sure inner monologue just cashes out into the ability to reflect on your own thoughts. And for one to say that they're not having that experience also involves a claim about what they think other people are having which would make me doubly skeptical.

In practice, when you see people arguing about whether they have an "inner monologue" or can "mentally picture objects" on social media, it's more of a contest of who is the most unique in the world rather than anything that sheds clarity on our subjective experience.


With that definition even bacteria have inner monologue.


Can bacteria imagine pictures? Do they have emotions?

Why does this matter? Stop being so pedantic. We're talking about a progression of ideas. Talking in your head is one form of ideas, but people can easily solve problems by imagining them.


Initial thesis was - inner monologue is required for reasoning. If you define inner monologue to include everything brains do - the initial thesis becomes a tautology.


Hmm, looks to me like just trading some words for others. Do bacteria have ideas? Does the navigating system in your car? How do you know?

We need to be at least somewhat pedantic, otherwise it's impossible to know what we are even talking about, and no way to establish anything.


The fact that we don't actually have an understanding and framework for reasoning (e.g. whether inner monologue is a cause or an effect) means we are VERY far from general AI.

https://youtu.be/QGYbbLWn-IE?t=72


Have there ever been studies that demonstrate that those individuals don't simulate possible state transitions they'll go through in a different modality? I'd be curious if they visualize actions they'll take still, just not verbally.



