False memories, or why we’re so sure of things we’re wrong about (duke.edu)
230 points by Hooke on March 15, 2018 | 103 comments



The interesting thing is that society really strongly rewards confidence, even when you are wrong... but having a certain amount of self-doubt, I think, is fundamental to sanity for all the reasons in the article; sometimes, even when I'm really sure, it turns out that I'm just... wrong.


People like certainty. It's why someone who is 100% reliable is generally perceived as far more valuable than someone who is merely very reliable. People are far happier to follow a leader who has a 95% chance of succeeding than someone who has a 75% chance.

Confidence is basically just a word for "projecting one's own certainty". People like certainty, so they favor the person who knows what they're talking about over the person who's not quite sure.

But unfortunately, confidence isn't a perfect signal. Some people actively exploit the general human bias to find certainty to their own advantage by falsely projecting confidence.

This is a topic I find interesting. And to be fair, not everyone who projects false confidence is doing so maliciously; some (it seems to me) simply have a hard time holding non-absolutes in their head, and they tend to round their certainty about things to 100% or 0%.


> unfortunately, confidence isn't a perfect signal.

I think that's a major understatement.

There's the Dunning–Kruger effect https://en.wikipedia.org/wiki/Dunning–Kruger_effect

and the world is complex and it's very difficult to get a clear and definitive picture of it. The better your understanding, the more hedged and less definitive it will tend to be, and that does not tend to come across as confidence in your understanding.


You should read the link you shared, since it contradicts your claims.

Your post is an illustration of the false Dunning-Kruger effect that is astonishingly common in forum posts citing the Dunning-Kruger effect.

"The group of competent students underestimated their class ranks, while the group of incompetent students overestimated their ranks; yet the incompetent group did not estimate their class ranks as higher than the ranks estimated by the competent group."


That doesn't contradict my claim at all. It would only contradict it if I had claimed that the incompetent people estimated their rank above the competent group's. All that is required for my claim is that they overestimate their abilities.


Right - you claimed mere imperfection, in a limited case. But we also know that experts making predictions outside their field are on average more confident but less accurate than non-experts. So the "folk mythology" version of D-K actually does apply at times, just not to "people of low ability."


I think the problem boils down to how to take meaningful action without confidence.

E.g., if I admit that not only am I not confident, but I see the world as so complex that I can't possibly be confident about it, how can I decide what to do?

If my view of the world is correct, then my actions would at best have unpredictable results and at worst would be reckless. So the best course of action would be not to do anything.

However, that would strip me of all agency and any ability to do anything about the things in the world I see as problematic.


> if I admit that not only am I not confident, but I see the world as so complex that I can't possibly be confident about it, how can I decide what to do?

You get input from others with knowledge, consider things as much as you can, and recommend a course of action / make a decision based on what you know, understanding that it might not end up being the right one.

I've seen it happen often in small businesses and startups I've worked with, and I've seen people make what in hindsight were the wrong decisions. But that's just the way things are for everyone, really.

You can't see the future but have to make a decision anyway at some point.


>E.g., if I admit that not only am I not confident, but I see the world as so complex that I can't possibly be confident about it, how can I decide what to do?

You need to learn to face uncertainty. You know some of your data is incorrect, and that you sometimes make mistakes processing that data. But if you are calibrated correctly, you should be more sure of some things (that the sun will rise tomorrow) than others (that I'll get that job I just interviewed for). I think both of those things are true, but I'm very certain of the first; the second I'd say has a 60% chance of happening.

It's totally reasonable to make a bet that will cause me to lose something important if the sun doesn't come up tomorrow. It's much less reasonable to make a bet that will cause me to lose something important if I don't get that job.
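The betting logic above can be sketched as a quick expected-value calculation. This is a minimal illustration; the probabilities and stakes are invented for the example, not taken from anywhere:

```python
# Hypothetical sketch: why identical stakes are reasonable for a
# near-certain event but not for a 60%-likely one.

def expected_value(p_win: float, gain: float, loss: float) -> float:
    """Expected value of a bet paying `gain` with probability
    p_win and costing `loss` otherwise."""
    return p_win * gain - (1 - p_win) * loss

# Betting something important on the sun rising tomorrow:
# near-certain, so the bet has positive expected value.
sunrise_ev = expected_value(p_win=0.999999, gain=10, loss=1000)

# The same stakes on a 60%-likely job offer: strongly negative.
job_ev = expected_value(p_win=0.60, gain=10, loss=1000)

print(sunrise_ev > 0)  # the near-certain bet is worth taking
print(job_ev < 0)      # the 60% bet on the same stakes is not
```

The point of the sketch is that calibration, not raw confidence, determines which bets are sensible: the same stakes flip from reasonable to reckless as the probability drops.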

>If my view of the world is correct, then my actions would at best have unpredictable results and at worst would be reckless. So the best course of action would be not to do anything.

If you literally make no conscious effort to do anything? It's likely you will die of dehydration in the next few days.

Doing nothing is very likely to be much worse than randomly picking from actions deemed culturally reasonable... and you can probably do a lot better than random through weighing what uncertain information you have.

I mean, complexity is hard, but it's ultimately going to get you better decisions than a system where you are absolutely certain of things.


No, I agree it's not. But my experience from many years is that people want leadership that is confident about which way to go. Even though it might not be the right way. (I wonder if there is ever a right way because, as you correctly say, the world is utterly complex.)


Well, and that's precisely what we need to address: some of the things people want. I firmly believe we should teach about this in schools. Knowledge about biases is very relevant to critical thinking, and critical thinking to a fairer society.

^ I was writing more confidently and assertively than usual there ;)


Maybe. But then again it is really difficult to change human nature


> "That's right!" shouted Vroomfondel, "we demand rigidly defined areas of doubt and uncertainty!" — Douglas Adams

One of my favorite quotes of all time. We want certainty, and when that fails, certainty about the uncertainty.


> People like certainty. It's why someone who is 100% reliable is generally perceived as far more valuable than someone who is merely very reliable. People are far happier to follow a leader that has 95% chance of succeeding than someone who has a 75% chance.

I absolutely agree that people like certainty. I would even go so far as to say that they actually don't like the '95% chance' of anything. In that case, they would say "I believe this to be 100% correct", which is basically saying that you can't tell for sure but believe it to be 100% correct. Not an objective 'real' chance of maybe 90%.

Take religion, for example. Nobody can be 100% sure whether there is some sort of god or not. But you won't hear anybody make any % predictions, especially the ones who believe. They believe 100% for sure there is a god. Atheists are the same on that front, and believe 100% there is no god. They are unable to make a % prediction, or really don't want to because it would break something inside of them (their belief). Very strange. I like to work with chances a lot.


Probabilities concern how much information it is possible to have about an event: they are a description of what a perfectly rational person, under perfectly rational constraints, could know.

Since I do not understand how we could know anything about "God", there could be no 'probability' associated with it. What would it mean to say 50%? That, were the universe run 100 times -- 50 runs would have a god in them?

I am an atheist in exactly the same way that I am an a-leprechaun-ist. The proposition "God Exists" is one I believe to be false for extra-empirical reasons. Namely, that the nonexistence of God is consistent with every given observation -- and the content of the idea seems deeply anthropocentric and psychological.

Those ideas which have had no explanatory content (i.e., could not be used to predict anything) and which have had a clearly human psychological construction have all turned out to be "reasonably false". At the moment, holding that God does not exist is "unreasonably false" according to people (e.g., you), whereas holding that leprechauns do not exist is "reasonably false".

I believe this distinction to have everything to do with saving people's feelings and nothing to do with the content and nature of the ideas.

Objects which have no explanatory power (magic, gods, etc.) are "bugs" in human thinking. No one ought believe that any of them exist.


> Objects which have no explanatory power (magic, gods, etc.) are "bugs" in human thinking. No one ought believe that any of them exist.

To dismiss a pervasive category of human beliefs as a "bug" that serves no purpose is incredibly egotistical and arrogant. The sheer pervasiveness of these types of beliefs points to them serving an evolutionary / competitive purpose.

I believe your mistake is based on a lack of understanding of the reasons why we hold beliefs and the purposes of knowledge.

Belief and knowledge are not purely about making predictions.

They provide the rules by which we operate as individuals and as a society. They provide the basis for value attribution and define our self identity.

While some knowledge derives much of its value from its ability to make accurate predictions, that is not the core reason we have knowledge. Knowledge is fundamentally judged by the utility it provides to individuals and societies.

Newtonian physics is my favorite example. It is not the most predictively powerful version of our knowledge about physics (since we know there are areas where its predictions break down), yet we still teach Newtonian physics because it is simpler and easier to understand and use than General Relativity or Relativistic Statistical Mechanics.

> I am an atheist in exactly the same way that I am an a-leprechaun-ist. The proposition "God Exists" is one I believe to be false for extra-empirical reasons. Namely, that the inexistance of God is consistent with every given observation -- and the content of the idea seems deeply anthropocentric and psychological.

You claim that "God Exists" has no predictive power, this means that "God Does Not Exist" also has no predictive power. The existence of God is also consistent with every given observation. Thus your Atheism is another example of beliefs whose value arises not from their predictive power but from some other form of utility.

TLDR: It's not a "bug" it's a "feature".


It's a bug because it's false.

Suppose I say to you I have a magical creature in my bedroom. Surely you'd want to know more about it?

Well for every question you ask, I will describe a property it has that prevents me from answering your question.

Where is it? Everywhere. What does it weigh? It has no weight. How do I see it? It's invisible. And so on.

These are "buggy objects". They actually possess no properties. The term "magical creature in my bedroom" doesn't actually refer to anything. But we are still able to speak of it as if it does, and thus go around constructing speech around it: "I love my magical creature!"

I am an atheist because God clearly falls into this pattern of buggy-object-generation. Take some idea that seems concrete and remove from it every impactful dimension so that what remains seems something, but is in fact, nothing.

For the idea of God to even be true there ought to be something that it means for it to be true. Some property which, if absent, would make a difference.

False beliefs may have other "value". Fairies and sprites may be "fun" and "bring families together".

But they are pure bugs. Whatever feel-good factor you want to have can be had without them. I don't believe in any of them as a class. God just happens to fall into this group.

To claim that it is reasonable to believe that any object of this kind exists, I think, breaks reasonable thought to such a shattering degree. At that point I do not know what principles you have left to deny any form of nonsense.

That is, why is "leprechauns do not exist" overwhelmingly reasonable but "god does not exist" not?

There is no reason. At some point you need to draw a line and debug your reasoning. That isn't an empirical issue; it's a thinking issue.


> I am an atheist because God clearly falls into this pattern of buggy-object-generation.

Are you 100% sure our universe is not a simulation? Because if it were, that would open up a whole lot of realism to "God": a creator, a possible afterlife, etc.


The proposition "the universe is a simulation" is just a form of global scepticism.

I am reasonably sure that the best explanation for the causal structure of the world is the causal structure of the world -- not some other world which, in some grand mystical paranoia, behaves as-if it were the world.

Global scepticism is another "bug" of a different kind. It is a scepticism-loop, like conspiracy theories.

In rational thought, doubt attenuates; it doesn't build. The more events occur that pertain to a hypothesis, the more confidence you should have in that hypothesis.

The problem with global scepticism is that it declares, by assertion, that no event can reduce doubt -- for some magical reason. And so it makes escape from doubt, and explanation itself, impossible.

Magical reasoning of this kind is another error. Remove any unjustifiable extra-empirical premises that cause doubt to multiply rather than attenuate. If you spiral into doubt, you're in a loop. Delete the unjustifiable premise.

{Simulation, Cartesian Daemon, Dreaming...} is this set, now, of "empty substances" out of which every object is meant to be made. These are buggy.

Premises of reasoning which begin "everything is drawn from... a simulation/dream/blah" are thought traps that cause doubt to spiral. They are, of themselves, simply made up. That they cause these spirals of doubt is nothing profound; it's in the nature of the setup.


> The problem with global scepticism is that it declares, by assertion, that every event cannot reduce doubt -- for some magical reason.

This is true, you cannot and should not be able to decrease absolute doubt. All you can do is reduce relative doubt given a certain set of premises.

> The more events occur that pertain to a hypothesis, the more confidence you should have in that hypothesis.

This is rational only given the assumption of the Doctrine of Uniformity (https://en.wikipedia.org/wiki/Uniformitarianism).

That doctrine allows us to build pretty much all of our useful knowledge systems and is the basis for most of our certainty. However, if we are epistemologically honest with ourselves, all of our certainty exists only if that doctrine is true.

Your "magical reasoning" is what allows us to accept and use unprovable premises because we find the knowledge systems we can create with them provide functional utility. This is precisely why I call it a "feature" and not a "bug".


We have some options: "uniform and reliable" or one of an infinite varieties of "non-uniform and unreliable".

Neither starting point has any effect on what makes "the earth goes around the sun" true. It is true iff there is an earth and iff it goes around the sun.

If we are all deceived then that claim is false, regardless of its utility.

Utility is a criterion for the most general starting point of our reasoning: how are we going to think. It isn't a criterion for truth, nor for the truth of particular beliefs. Nor does the fact that utility features in my methodological criteria mean that I am not entitled to reasonably infer truth from the method.

The factory is made of steel (utility); its products are made of wood (truth). That utility features in the production mechanism does not mean it features in the product.

My observation about "the simulation hypothesis" is that it isn't a hypothesis at all. It is just an alternative formulation of how to think: i.e., to deny that evidence builds in support of conclusions about how the world works.

It is not mysterious, then, that "nothing seems to work" if we presume simulation. That's by design of the thought process. Since how I think is my choice, I can choose to use the steel factory over the playdo factory. And my products are clearly of a better quality for any person concerned with production (i.e., with the making of truth-claims).

Whether they are true or not, my method reasonably entitles me to infer that they are. The playdo method is a thought trap, and a pretty juvenile one at that.

This is the trick being played: simulationy people want you to think "simulation" is just like "the earth goes around the sun", that they are both healthy products of reasoning and thus plausible. The former is a bug in this sense: it is in fact a change to the method of truth-production; it isn't one among many plausible products (truths). It is broken machinery producing nonsense.


> If we are all deceived then that claim is false, regardless of its utility.

Not really; knowledge is relative. Our claim that "the earth goes around the sun" is equally true whether we are in a simulation or not. If we are in a simulation, there may not be a real earth, and real gravity may function differently than our gravity. Obviously this piece of knowledge is about our own earth and our own gravity.

If we are in a simulation, then the claims we make are about the simulation and are still provable. If one of us were removed from the simulation, they would potentially be forced to learn an entirely new set of rules and facts. Their knowledge about our simulation wouldn't be false, but it would have to be qualified as being about a particular simulation rather than about reality.

> The factory is made of steel (utility); its products are made of wood (truth). That utility features in the production mechanism does not mean it features in the product.

I don't understand this metaphor.

Truth and Utility are measures we use to evaluate beliefs or claims. In your metaphor, they would be the QA process in the factory. At best they are the QA process as the factory builds itself.

The Doctrine of Uniformity, the scientific method, mathematical proof, visual proof, trust in experts, holy texts, and really the entire sum of our existing knowledge are what make up the factory that produces our beliefs and claims.

> My observation about "the simulation hypothesis" is that it isnt a hypothesis at all.

It certainly is a hypothesis, but may not be a scientific (i.e. testable) hypothesis if there are no associated testable predictions.

I don't get people like you who seem to think there is some magical set of "true claims" that somehow exist outside of any knowledge structure. To me, claims can only be made within a knowledge structure and those claims can only have meaning (let alone truth) by being evaluated within that knowledge structure. I'd be curious to hear you explain how it could be anything else.

I personally think our sense of "truth" arose as an instinctive heuristic for estimating the utility of a claim given our limited existing knowledge and experience.


Strange you got downvoted, because I couldn't have said it better.

The western world is pretty focused on rational thinking. But that doesn't mean a society with some irrational thinking ingrained into it can't have a better chance of surviving.

And indeed, if you look at the world, religious thinking did a pretty good job of surviving in about every culture in the world. While the things they believe might not be 100% correct, they might give them an extra psychological edge to survive. All evidence points that way.


Falsehoods are helpful for survival? All of them? Or only the ones you care about? What about the ones other people care about?

Aunt Maple wants to use sugar water for her cancer treatment...

Hmm... strange, how much the set of valuable false things overlaps with the set of wishful things. And how the ones we ought to keep are those you like to wish for, and not those terrible things other people wish for.

One might wonder: is the false really that valuable? Do we need to revive the Norse Gods for their "value"? Or rather, is it much simpler: falsehood is a kind of oppression which limits our ability to act.

"The abolition of religion as the illusory happiness of the people is the demand for their real happiness."


> Falsehoods are helpful for survival?

Nothing I was discussing can be considered a falsehood. These are unprovable beliefs. They cannot be proved and they cannot be disproved. (or rather: we have yet to discover the means to prove or disprove them)

> Hmm.. strange, how much the value of false things overlaps with the number of wishful things. And how the ones we ought keep are those you like to wish for, and not those terrible wants other people wish for.

I believe you are making mistaken assumptions about what I believe.

The functional utility of wide classes of beliefs has changed rapidly over the last several hundred years. I am not arguing that everything people believe has or continues to have functional utility. In fact, I believe that for ideologies to succeed over time, they must repeatedly adapt and evolve to maintain their utility.

My issue is that Mjburgess dismissed an entire category of human cognitive function as a "bug" when it seems much more likely that such a pervasive trait is in fact an adaptive feature of human cognition.

I have great admiration for skeptics who attempt to disprove provably false beliefs. I also have no problem with atheists, and many of them are intellectual heroes of mine. However, I take issue with atheists who believe their belief in the 'non-existence of all gods' is somehow epistemologically superior to religious beliefs, despite it being similarly unprovable.


> Falsehoods are helpful for survival? All of them? Or only the ones you care about? What about the ones other people care about?

Doesn't really matter what you, me or anyone else thinks, no? Nature decides who survives.


I am concerned about what is true.

Nature does not determine importance. Survival is an event; it isn't a value.

We determine what matters. And I do, in particular, when I am speaking of the things I care about.


[flagged]


Posting like this will get you banned, so please don't do it again.

https://news.ycombinator.com/newsguidelines.html


Why do you say that?


At least in Christianity, doubt is a major component of the faith structure. Generally, the more thoughtful believers will recognise that the concept of God is something that exists outside the realm of things humans can assign probabilities to - and likewise for thoughtful atheists, they generally refer to the overwhelming improbability of a god existing, not to any particular number, certainly not 100%. It seems to me that people willing to place probability numbers on such things (outside thought experiments) generally don't understand probabilities (or, frankly, religion or atheism, either) very well.


So you are saying that the pope and priests consider a small chance that god doesn't exist? Sorry, I don't think so.

None of the religious people I know would say there is a small chance god doesn't exist. That would be a betrayal of their belief.


>> Generally, the more thoughtful believers will recognise that the concept of God is something that exists outside the realm of things humans can assign probabilities to


I'm having difficulty understanding your message here.

Is doubt a positive feature of christianity in your view?

Is 'more thoughtful believers' an oxymoron?

Is probability, or indeed statistics, a sensible approach to people's belief in things they can't prove?


Whether it's positive is irrelevant for the purposes of this discussion. It's there, so not confidently stating a 100% certainty in the existence of God is not something Christians just do as a matter of being Christians.

For the two latter questions, not sure what you're getting at, but, no and no.


> It's there, so not confidently stating a 100% certainty in the existence of God is not something Christians just do as a matter of being Christians.

Okay. I thought that 100% certainty in the existence of God (who is also Jesus Christ) is in fact something Christians do as a consequence of being Christians.

For the latter two questions, I was getting at the idea that being thoughtful belies being a believer (in the sense of 'believing something without evidence'), and questioning the reliance on statistical probability for any faith-based religious belief.


Belief, even acceptance, isn't interchangeable with certainty. "Now faith is the substance of things hoped for, the evidence of things not seen".

There are plenty of things you accept and believe without evidence. The world, and especially the inner lives of humans, isn't made up exclusively (in some aspects not even predominantly) of empirically observable facts.

The idea that believing is incompatible with being thoughtful is not worthy of someone who (presumably) considers themselves thoughtful. You can do better.


> Belief, even acceptance, isn't interchangeable with certainty.

Totally agree. I think.

> There are plenty of things you accept and believe without evidence.

Incorrect.


How do you know you love your spouse? (Or mother or whomever else you might say you love)?


My spouse that I've met? Ditto my mother? (there are also non-females that I love, btw.)

Unless you're going the solipsistic route, actually knowing (not in the biblical, but rather merely the physical / actual sense) people is a major factor.


I am going the solipsistic route, because it's the relevant one in getting to the bottom of what these fundamental things actually mean. The point is, love is something that can only exist inside our individual minds (what is a mind, even?). Scientifically, we barely know what it is (something to do with dopamine), much less are we able to measure it. When you love your spouse, you take on faith that the feelings you have inside are what love really feels like. You tell the little voice in the back of your head that asks "Is this really it? Ooh, the model in that ad is really good looking; perhaps if..." to shut up. But you don't go to the love doctor and have your heartbeat or whatever measured to empirically confirm that your feelings are truly love, and not merely affection, an infatuation, or lust.

And that's without getting into what "actually knowing" someone really means, and how you evaluate your spouse's claim that she loves you. At least you can feel your own feelings.

I'm not saying that these feelings aren't true, I'm sure they are. I'm contesting your assertion that you take nothing on faith.


>Take for example religion. Nobody can be 100% sure whether there is some sort of god or not. But you won't hear anybody make any % predictions, especially the ones who believe. They believe 100% for sure there is a god. Atheist are on that front the same, and believe 100% there is no god. They are unable to make a % prediction, or really don't want to because it would break something inside of them (their belief). Very strange. I like to work with chances a lot.

The question simply isn't meaningful enough to give a meaningful answer? The ultimate answer isn't going to be true or false - it's null.

Humans seem peculiarly blind to the notion that sometimes questions just aren't meaningful enough to produce a meaningful answer. Or at least, the humans I tend to run into are. Maybe there are tribes in Africa where this notion is obvious.


"Atheist are on that front the same, and believe 100% there is no god."

This is a false statement. The rest of your post is supposition and anecdote.


It's neither a false nor a true statement. It's a mostly non-meaningful statement.


I've noticed the same. I put it down to our evolutionary past.

In the past when we were all living in little groups in the wild, it mattered to the person who was making a statement whether he really knew that this new plant you'd come across was edible and not poisonous. He'd have seen someone eat it, personally tried it, or had some firsthand knowledge of whatever the situation was. So if a guy said "let's eat this thing, I'm sure it's good" he'd both have skin in the game AND actually have evidence.

Nowadays almost all knowledge is second hand. Trust me, quantum tunnelling is a thing, and the earth is flat. But we're still wired to believe in the confident guy.


Your plant example is not really convincing. It would make much more sense for only one person to try it, in small amounts, and then later everyone. And this is what indigenous people actually do.

Just having everyone "go for it" would be evolutionarily very hindering, as there are many, many poisonous plants out there.

It makes sense in other scenarios, though. Or just in general: should we take path A or B? A "maybe" decision won't get you anywhere in many situations. Like when you either have to go far in one direction, or in the other. Staying means starving, but finally reaching A or B could mean food. So it is better to choose and be determined, even when you don't know for sure whether the prey will really be there and not the stronger enemy tribe.

Still true today.

Just not in all the scenarios our "leaders" want us to believe...


Not just society, institutions, and groups, but individuals as well. It's probably hardwired in our brain.


Maybe? I find I have a "dangerous other" type reaction to people who seem too confident. I mean, I can get over it, but my initial reaction to a confident person tends to be quite negative; I pretty much immediately switch to "defense against salesmen" mode and only slowly start to consider that they might actually be competent after they have proven themselves, which is quite different, I think, from how I try to treat people in general.


I found that your reaction mainly arises among intellectuals, who are not the majority.

Also, it's only triggered by certain kinds of displays of confidence (like aggressive confidence, or confidence mixed with poor vocabulary/skill, etc.). E.g., the confidence of a humble but strong expert does not trigger the reaction.


> I found that your reaction mainly arises among intellectuals, who are not the majority.

That very much implies it's not hardwired, though, just that it requires unlearning.


I think it's more that you have an additional layer canceling the initial reaction.


> reaction to people who seem too confident

but people who seem merely reasonably confident slip under your radar, even if they are faking it.


Why would I care if someone is faking confidence any more than I would care if someone is faking feeling energetic or silly?

I mean, I guess I would feel a little weird if I thought you felt you needed to fake confidence around me, so there is that.

Is there an optimal level of confidence you can project to get the optimal social response from me, and is that optimal level other than the lowest level possible? Yeah, sure.

My main point was just that my reaction to people who are too confident is immediate, negative and extremely strong. I do my best to ignore it like all other bits of personality that aren't relevant to the role, of course, if it's a role where confidence doesn't matter... but I am human, and these things leak.

I was just bringing this up as a counterpoint to the "humans are hard wired to favor displays of confidence" because, as I mentioned before, I am human, and have a significantly negative (and difficult to consciously suppress) response to levels of confidence that seem to be normal in the marketing professions.


> I pretty much immediately switch to "defense against salesmen" mode

I have the same reaction, which is a result of working in and around sales for years. Before that though I think I was just as susceptible.


It's always a struggle to tease apart what is hard-wired into our brains and how our brains are socialized to operate.



The scariest thing about false memories is that police interrogation techniques are great at planting them.

Yet another reason to never talk to the police.

https://www.psychologicalscience.org/news/releases/people-ca...




I have read about cases like that in children, but they didn't really chalk it up to false memory implantation. It seemed mostly to come down to children just wanting to give the answer that the authority figure wants to hear. Or what the child perceives as what they want to hear.


That still sounds like false memory implanting. They're asking detailed questions and the child knows they want them to agree, so they say yes and start believing what the question said happened. I will say that the questioner wasn't necessarily trying to get this to happen, but they still would have implanted a memory.


"Every time you recall a memory, it becomes sensitive to disruption. Often that is used to incorporate new information into it." That's the blunt assessment from one of the world's leading experts on memory, Dr. Eric Kandel from Columbia University.

http://www.cbc.ca/news/health/scientists-explore-the-illusio...


Yes but is he recalling his research correctly?


A Bollywood movie comes to mind: the hero recreates an entire day by planting false memories in the minds of the people around him. He exploits the power of the visual sense to do so, since images stick quickly in the mind.

https://en.wikipedia.org/wiki/Drishyam_(2015_film)


My mother in law is a master confabulator. She makes up stories out of whole cloth and absolutely believes them. They range from the mundane to the fantastic to the impossible. Some examples:

One year she was disappointed that we didn't come to spend Christmas Eve with her. She never called to make plans because she imagined we'd just come over because "that's what we always do." In reality--we have never spent xmas eve together-- ever.

She insists there was a ghost in her house which played pranks on her and-- in full view of two other people levitated a potted plant and threw it at her.

and her masterpiece-- blaming me for her husband's death (of cancer) because I asked him to come to his daughter's graduation ceremony. I don't even know what to do with this one.

The maddening thing is, these people absolutely believe these things. They will argue until they are blue and tell you that you're crazy. It's hard to even blame them in the end.


Regarding the ghost thing, I assume it wasn't the case here, but it could be a life-or-death situation... see this video (I don't want to spoil it so I won't say more): https://www.ted.com/talks/carrie_poppy_a_scientific_approach...


Just in case anyone is interested (spoilers!): the talk is about a woman who thought her house was haunted, but it turned out she was suffering from carbon monoxide poisoning, due to a leak. Carbon monoxide can cause auditory hallucinations and feelings of dread and pressure on your chest.

Edit: added spoiler warning


This is also the topic of a famous reddit thread, where an OP thought he was being harassed by his landlord, but was actually delirious from CO poisoning.



You just spoiled the whole thing...


For n=1, at least, I appreciated the summary (even though I understand what you meant about spoiling it).

I habitually avoid videos on the internet, and with the summary, I was able to at least get the gist of the video, whereas otherwise, I would have never watched it.


> I habitually avoid videos on the internet

Even TED talks? :(


Yes, even TED talks. They're not magically better. If I can't read a summary to judge the quality of information, it's likely I can just skip the whole thing and save myself from having my time wasted.


I like transcripts. Some discussion boards that post videos also post transcripts. Summaries are cool too, depending. Works best with videos that are just talking.


There's a transcript on the same page as the video, even with translations.


TED talks are mostly circle jerks now.


When people are paying that much money to hear the talks, you've got to make sure you are telling them what they want to hear.


Yes, because I think it is more useful to just write about the details of the potential "life-or-death situation" instead of linking to a long video about it. Your original comment was very clickbait-like, "What you don't know about hauntings could kill you!" If the whole thing can be spoiled with a short summary of the first couple of minutes, it's probably not worth watching anyway.

Edit: I have now added a spoiler warning to my original comment (I hadn't thought about this possibility, sorry)


It's okay to spoil something when the information might keep people from dying.


There's something endlessly entertaining about the juxtaposition of a life or death situation and still not telling him outright just to maintain the suspense.


When I was young I had reoccurring dreams for a few months about a domestic violence issue. Years later after my parents had passed, the stories started to come out. It turns out that incident was real. My mind completely blocked it out and turned it into a dream. It was then I knew to not trust my memories.


I'd like to see a cross-section of data and metrics that acknowledge that some human beings lie. I searched the article for any mention of lie or untruth or manipulation and found nothing. Why is that concerning to me?

>Ultimately, we want to understand the psychological mechanisms underlying these kinds of cognitive processes to generate models that can contribute to computer programs and artificial intelligence.

"I'm sorry Dave, I can't do that." - Hal

A great start would be to use the espoused rubric on Duke students and/or faculty to see if they are willing to lie about bribery in NCAA basketball recruiting vis-à-vis success.


I agree, especially in this kind of study, where it's been acknowledged (e.g. https://www.tandfonline.com/doi/full/10.1080/09658211.2017.1...) that lies contribute to the formation of false memories.

For example, another commenter reports that his mother seemingly remembers things that never happened; the first thing I thought (because of similar experiences) was that perhaps his mother had lied to friends about family coming over to visit.


The classic Deese–Roediger–McDermott false memory study paradigm: https://en.wikipedia.org/wiki/Deese%E2%80%93Roediger%E2%80%9...

Basically, semantically related lures can lead to false memories for items not actually presented.

Interestingly, younger children do not have this type of false memory as frequently. That is theorized to be because of a) a reliance on verbatim memory traces, and b) under-developed semantic knowledge.


I'd argue a strong factor for young children is that they've formed only a small fraction of the concepts that adults have. They simply are unable to generate as many, or as elaborate, false memories because they lack the array of concepts required for the higher level of creativity an older human can put together.

Simply put, if you're aware of seven concepts, that's a hard limiting wall for your creativity when it comes to inventing false memories. What would the false memories consist of? They can only consist of what you know. And further restricting that limited set, is the ways in which you understand to arrange/rearrange what you know.

You understand the concept "table." You know the general visual characteristics that define a table. You know some of the ways a table is or can be used; however, you possess only a limited knowledge of tables and how they can be used. That limited knowledge of "table" restricts your ability to utilize the concept "table" in your false memories.


I'm using 'semantic' as an umbrella term to include facts AND concepts. So we are saying very similar things.


No one is aware of only 7 concepts after the first few days of life. Children are exceptionally "good" at mixing reality and fantasy and not having the critical thinking skills and knowledge to reject the reality of their dreams.


Another relevant theoretical understanding of false memories comes from Fuzzy Trace Theory of memory and cognition.

https://en.wikipedia.org/wiki/Fuzzy-trace_theory

and more generally: Confabulation https://en.wikipedia.org/wiki/Confabulation


Almost every time I watch a film for the second time I encounter a scene which seems to have been mysteriously messed up. I remember precisely how it was the first time I watched the film, but on the second occasion the dialogue and actions seem to have been changed so as to give an inferior overall result. I used to think that perhaps there were multiple versions of all these films, but every time I investigated I failed to discover any evidence of the existence of an alternative edition containing the details that I clearly remember, so I now think it's just the way my memory improves things.

On the other hand, perhaps in some cases there really are different editions of the film that use different takes for certain scenes, though the films I watch twice are not usually the big-budget productions that are well-known for having multiple editions.

Anyone else experience this?


Sometimes I re-watch classic sports games that didn’t end well for my team hoping my memory was wrong and that the game ends differently than I remembered. It never has. :-(


Relatedly, you might enjoy knowing of the Mandela Effect.

https://rationalwiki.org/wiki/Mandela_effect

https://www.reddit.com/r/MandelaEffect/


I recommend meditation. It's just delusion. Staying present is protection against actively overwriting fact with fiction.


When I was younger my memory was rock solid in that if I recalled something it was 100% true. As I got older my memory is not as certain, but at least I know my memory is not always correct.

The loss of the reliability of my memory is probably the thing I miss most about my youth.


https://quoteinvestigator.com/2010/10/10/twain-father/

"When I was a boy of fourteen, my father was so ignorant I could hardly stand to have the old man around. But when I got to be twenty-one, I was astonished at how much he had learned in seven years."


Or could it be that as a youth, you were simply more sure of your memory?

I have found that as I get older, I know much more than I ever have, but am also less sure than ever.


But how can you trust your memory when it tells you that your memory was once better? :>


Yes an interesting question. I do have written evidence from the time, but I also have a whole lot of independent memories of having a rock solid memory. I don’t seem to have false memories, just incomplete memories.

It is not like my memory is bad now (it is still better than nearly everyone I know), it just is not as good as it used to be. I have to write things now in my diary and use bookmarks when I read, rather than just remember which page I was up to.


> I’ve been interested in memory for as long as I can remember.

Eeh, so a wise-arse, eh?

Great interview. I'm reading Kurt Andersen's 'Fantasyland: How America Went Haywire' ... and the idea of false memories looms large in it.

I wonder about the mentality or outlook of people who approach any session of false memory exploration with a 'too open' mind.


The past is a faulty memory and the future is an illusion of ego.


The vividness of a memory is also misleading; people are always sure some memory is accurate because they can vividly recall every detail.

Except that the brain often will generate vivid details to replace or fill in gaps.

E.g. I had extremely vivid memories of a movie I hadn't seen for years, so I was surprised when I watched it again and I was very wrong in those vivid details. I had only retained the outline of things, some snapshots of things.


Personal story: I remember once seeing a spider in my childhood, as huge as an elephant, with its web spanning two large trees. I am sure I saw it, without a doubt. But my common sense says it's simply impossible.

I kept telling my dad this story when I was a kid, and he didn't believe it. But when I told it again much later, he too thought he might have seen it.

Maybe he formed those memories from my story.


is that you Frodo?


Elizabeth Loftus work on the subject is super interesting too, definitely recommended if you like the topic.


The Mengele effect ;)


No, it's the Mandela effect. Either that or some weird parallel universe stuff.


Weird, I'm pretty sure it was always called the Mengele effect but all Google results indicate it's always been called the Mandela effect.

/s



