We Can’t Get Over Ourselves: Reasons we fail to understand others (nautil.us)
82 points by dnetesn on Aug 27, 2015 | 26 comments



The biggest objection I have to self-reported studies and the conclusions drawn from them in articles like this is that, according to research in books like Thinking, Fast and Slow, self-reported facts (including beliefs) are often wrong. Revealed preferences will often show that people believe something different from what they report. And people will fervently believe they told you the right thing. Also, a lot of the findings of these studies end up not being repeatable or having only very small effects. I've gotten to the point where I unconsciously put little value in opinion polls and other self-reported data, unless I have a good reason to think otherwise.


I'm curious as to the rhetorical implications of all this research, particularly regarding modern media. A TON of dialog takes place on message boards and comment sections like this one, certainly more than on any traditional media. It would be interesting to see whether such discussions significantly affect popular opinion or if they're largely just outlets for what people already believe. Even a simple poll of "have you ever changed your mind based on information gathered from discussions on message board X?" would be enlightening. You'd need to conduct such surveys on specific boards/sites to get a clear picture though, as each has a unique community with different priorities. At the very least it would be nice to get a database of which forums have intelligent discussion and which are just full of windbags lol.

Obviously on a lot of boards the posters are just running their mouths, but it would be interesting to poll the impacts on the lurkers as well.


Which part of the article addressed that? It does talk about beliefs, but not about discussions. In one study they mentioned, persuasive arguments on some topics affected reported beliefs, as measured within that study, but that's not the same situation as a discussion response.

What I realized years ago is that everyone has beliefs, regardless of how religious they are, and most discussions DO involve people trying to 'help the other person to understand' their existing beliefs.

It's not just 'windbags'. It's almost everyone in most discussions.

And it absolutely is not something that is limited to online or any particular media versus a face-to-face discussion. People may be more free in expressing their beliefs online, but that doesn't mean they don't hold them in spoken conversation -- they might keep things to themselves.

There are no forums 'full of intelligent conversation'. People _are_ able to discuss things and sometimes gain new information or even change their minds, but the ratio of new information going in versus explaining why your belief is more correct than the other guy's is very small.

People _think_ that they are rational or 'scientific'. Generally though, people are good at rationalizing what they already believed, and only change their minds when all else fails. Including you and me.


None of it. I was commenting on the implications of the information that the article presented.

However, particularly with that line about Twitter, it seems to imply that modern methods of communication are more about confirmation than discussion. Take, in particular, the article's point about how much is lost in text-based communication: if I state an unpopular opinion, someone reading it might very well fill in the "blanks" with the picture and voice of a pimply 14-year-old, which would allow them to more easily dismiss said opinion. I think it would be interesting to more closely examine how prevalent such projective behavior is on message boards.

My personal experience has (unfortunately) been people will almost always stick to their guns online, even after their perspectives have been thoroughly and undeniably refuted, even after their own allies tell them they've lost. Best case scenario they'll lose composure completely and look like a fool rather than admit defeat. In person, however, people will often admit that an opponent may have a point and other such courtesies. While there is an argument to be made that this is all superficial courtesy, it would also stand to reason that in person an opponent has less room to rationalize, and can detect things like passion, enthusiasm, or level-headedness that are impossible to deny. These traits could lend subconscious credence to an opponent's argument.

Basically I just want to take the researchers' findings on text-based communication and examine the issue in greater detail.


They save the best for last: people's own beliefs cannot (with statistical certainty) be distinguished from people's beliefs about what God believes.

EDIT: Just to specify, the distinction cannot be made based on fMRI data during verbal responses to such questions. This isn't the best or only measure of brain activity, but it's still a scary result.


Seems obvious; if you believe in God (in the monotheistic, infallible sense), then, ipso facto, you must believe that everything that is true is believed by God and vice versa, so if you believe X is true, you must also believe that God believes X is true, and vice versa.


It would be interesting to meet someone who sincerely believes in god, does not despise god, and yet consistently acts contrary to (his idea of) god's morality.


I'm sure there are many people like that, if only because many people consistently act contrary to their own idea of their own morality. Somebody in this thread already mentioned revealed vs. stated beliefs and preferences.


The stereotypical Mafioso is a simple example, but folks like that are everywhere. Humans are complex creatures and frequently do things voluntarily that they don't morally agree with.


This would be an example of cognitive dissonance, which is very common.


This is probably just because people who believe a god exists have beforehand adopted beliefs that they were told are the god's. If one maintains, "I am a believer of religion X," it's tantamount to, "my opinions are those of X-god." Rather than doing some sort of modeling of the mind of god and filling in the details with one's own mind, it's easier to start with the premise, "as a (e.g.) Christian, my beliefs are God's beliefs," and just say what you believe.


This is the benign interpretation, i.e. causality flows from God to me, but the opposite direction, or both, is I think an equally likely interpretation.


That's why the most ironic passage in the bible is the bit about how "God created man in his own image."


"... and man returned the favor."


>beforehand adopted beliefs that they were told are the god's

That explanation is contradicted by the paragraph after.

The rest of your post seems right.


So true.

I can't count how often I've been in a situation where someone told me something religious/spiritual and I just thought he was joking.


"Twitter does not allow others to understand your deep thoughts and broad perspective. It only allows others to confirm how stupid they already think you are."


> Logically, this sum cannot exceed 100 percent.

The chance of the sum being exactly 100% is pretty low, considering that these are subjective opinions on what percentage one thinks he or she is personally responsible for a certain activity, and the spouses are asked separately. As humans, we have cognitive biases and we don't access memories the way a server does. Simply inferring that "people often overestimate their importance in this world" from the experimental finding that the couples' estimates often added up to a lot more than 100% is an oversimplification of the matter. But since this is general psychology, perhaps it is fine to have things this way.


Plus, it doesn't really mean that people overestimate more in big groups. If everyone overestimates by 5% they might just be bad at rounding.


"By the time you get to groups of eight, these MBAs were claiming nearly 140 percent productivity! "

OK but the 80/20 rule suggests a couple of the MBAs are doing most of the work, and the rest are overestimating their useful contribution. It's even possible that the 20%ers are underestimating their contribution, which fits the pattern of the Dunning-Kruger effect.
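To make the arithmetic behind the two comments above concrete (a throwaway sketch with made-up numbers, not data from the study): a constant per-person overclaim inflates the group total in proportion to group size, so a ~140% total at eight people is consistent with each member being off by only about 5 points.

    # Hypothetical illustration only -- invented numbers, not the study's data.
    # Each of n group members reports their true share (1/n) plus a fixed bias,
    # so the claimed total grows linearly with group size.
    def total_claimed(n, per_person_bias=0.05):
        return sum(1.0 / n + per_person_bias for _ in range(n))

    for n in (2, 4, 8):
        print(f"group of {n}: claimed total = {total_claimed(n):.0%}")
    # group of 2: 110%; group of 4: 120%; group of 8: 140%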


I've done some workshops on performance improvement. One of the exercises I do is to ask people to rate, on an absolute scale of 1 to 10, how well they are currently doing their job. 10 means "There is no possible room for improvement". 1 means "Just barely acceptable". I don't ask people to rate themselves as incompetent ;-). If you really think you are incompetent, then you can write 0, though.

After they write down the number, I ask them to consider another question. When you retire (as I usually had a young audience, this would be in about 30 years time), at what level will you be doing your job -- again on an absolute scale from 1 to 10?

Most people score themselves as a 7 or 8 in the first question. Then in the second question they have nowhere to go. I cruelly ask people if they intend not to improve very much over the next 30 years :-)

The problem is that most people do not follow directions. They don't judge themselves on an absolute scale. Their first thought is, "I'm doing well. In fact, I'm good at my job." So they think, "5 is average. A bit over that is 7. Obviously I've got to leave some room for improvement. I'm not perfect, so I'll leave 9 or 10 alone".

It's the same lens effect. I am actually asking a different question than they are answering. The interesting bit is that people skim over the internal logic and can often conflate my actual question with the question they answered.

Here is an example. Let's say you have 10 years of experience and you have 30 years left before retirement. How much more effective will you be 30 years from now (as a percentage)? By effective, I mean your ability to do your job and influence people so that projects are more successful and your company makes more money. Does it not seem reasonable that, with 4 times as much experience, you might be twice as effective? Or even more?

If you answered "yes", does that mean that you should make half as much money now (or even less!) than someone with 40 rather than 10 years of experience? What about your title? Compared to your 30 year more senior self, how do you rate? If you have "senior" in your title now, what should you have 30 years from now? (maybe "god", I suppose)

People conflate the questions "How well are you performing?", "What is your worth?", and "How much do you want to be valued?" They answer the last question and assign it to the first two. Because people want to be valued, they overestimate their worth and (more critically) become blind to the fact that they can improve. This limits their ability to increase performance.


>Most people score themselves as a 7 or 8 in the first question. Then in the second question they have nowhere to go. I cruelly ask people if they intend not to improve very much over the next 30 years :-)

The first question is worded so they assume you mean their current job, and the second one assumes they'll be doing the exact same job in 30 years.

The problem is you, not them.


Or maybe a lot of them were worried that if they didn't at least _claim_ to have contributed their share, they would ensure themselves a poor grade.



> One of the biggest barriers to understanding others is excessive egocentrism.

A 'Silicon Valley' problem!


A problem anywhere people spend most of their time being themselves instead of each other, i.e. all the places.



