List of cognitive biases (wikipedia.org)
168 points by mrb on Dec 4, 2013 | 62 comments



I wonder if a "cognitive bias bias" exists, where a rational solution is discounted in favor of an inferior one because you falsely attribute your initial solution to a cognitive bias, even though it is perfectly appropriate for the situation at hand?


There was a great post on lesswrong.com about this:

http://lesswrong.com/lw/he/knowing_about_biases_can_hurt_peo...


Very interesting. I have always wondered whether teaching people about cognitive biases can help them avoid them. At the very least, I am convinced that this may not be as easy as I thought.

Some relevant excerpts from the cited article:

>> If you're irrational to start with, having more knowledge can hurt you.

>> I've seen people severely messed up by their own knowledge of biases.

>> You can think of people who fit this description, right?

This makes sense now. If the brain is close to a random baseline at interpreting statements, how can you feed it more statements (about interpreting) and expect a positive outcome?

A question still remains: would it help to teach people about these biases when they are kids?


The impression I keep getting is that (with due respect to "the power of the context" http://www.vpri.org/pdf/m2004001_power.pdf) teaching someone about fallacies automatically makes them about 15 IQ stupider.


I am not sure what you are getting at. That the PARC people were too deep in awesomeness to productize it? I'm just grasping at straws; I haven't read the whole article.


I'm referring to the maxim often repeated by Alan Kay that "[point of view|context|a change of context] is worth [40|80] IQ".


Darn! I would be -150 by now. :-)


It's a one-off effect rather than -15 IQ per fallacy, though I wouldn't deny that adding more fallacies could have some cumulative effect. Maybe a deeper, rather than more extensive, understanding of fallacies, cognitive biases and so on may start to reverse the effect, though I wouldn't exactly bet on it doing so consistently.


It should be named the basilisk bias.


It's a similar idea to the argument from fallacy, wherein a conclusion is believed to be false because its argument contains a fallacy.


I do not think it is a common problem. Discounting other people's correct arguments because you diagnose a fallacy or the effects of a cognitive bias? More common.


Sounds like the opposite of the Dunning-Kruger effect[0]. Instead of someone with below-average skill overestimating their ability, someone with above-average skill underestimates theirs.

[0] http://en.wikipedia.org/wiki/Dunning%E2%80%93Kruger_effect


The DK effect is that everyone is bad at estimating their own ability; the lower skilled just have more error in their estimates. The DK effect doesn't apply to the person, it applies to the person in a context. I might think I am more skilled at auto repair than I actually am. And in fact, I might think that because I am a badass at writing code.

It might be that the DK effect is _more_ pronounced in someone who has mastered a skill that they think transfers to other problem domains.

But who knows, I am not a cognitive psychologist, I only play one on the internet.


I am not sure, but I think an illogical argument can be just illogical without being a bias. It is possible to reject the initial solution due to some bias (which may be any number of them), but it may also be that you are simply wrong to reject it.


Agreed. Have you watched the Japanese soap "Legal High 2"? To win the lawsuit, the lawyer doesn't care too much...


I sure am glad I don't suffer from any of these cognitive biases. I'm much better than the average person at identifying and squashing them in myself. Dunning-Kruger in particular!


Everywhere I look I see people falling for confirmation bias. Just as I suspected.


Much prior HN discussion on this article.

https://www.hnsearch.com/search#request/submissions&q=%22lis...

I guess the duplicate URL checker has a time-out period?


If you like learning about cognitive biases (I certainly do!), may I recommend spending some time with some of the articles on lesswrong.com -- I find that for me, sometimes reading online distracts me from work, so I have all of Eliezer's work in e-book format which I read via kindle. Here it is: http://lesswrong.com/lw/72m/an_epub_of_eliezers_blog_posts/


Heuristics & Biases is a great book with more detail:

http://www.amazon.com/Heuristics-Biases-Psychology-Intuitive...


Relevant: Thou shalt not commit logical fallacy! https://yourlogicalfallacyis.com/


The bias here is the belief that bias is a bad word.


That's because biases are, by definition, not correct.


Actually, I don't think that's true. As a noun:

> prejudice in favor of or against one thing, person, or group compared with another, usually in a way considered to be unfair.

As a verb:

> cause to feel or show inclination or prejudice for or against someone or something.

Biases may often lead to incorrect assumptions about things, but I'm fairly certain that a bias doesn't have to be incorrect by its very nature. Say that someone I love is on trial for some sort of crime. My natural bias would be to believe that they are innocent (unless I absolutely knew otherwise). Just because I'm biased towards their innocence doesn't automatically mean they are guilty.

Perhaps I misunderstood you though.


Addressing both you and the parent, I think it's more that bias implies a resistance to any other perspective. So it's bad after all, not because it's wrong per se, but because the biased person isn't open to considering whether it's right or wrong in the first place.


I attempt to bias myself towards auto-correcting feedback loops (physical, mental, computational).


Only if you don't take them into account. Mathematically speaking, you can have a perfectly unbiased output from a biased input, if you know the bias model and account for it.
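
To illustrate that point, here is a toy sketch in Python (hypothetical numbers, and it assumes the simplest possible bias model: a known constant additive offset). It is just a sketch of the statistical idea, not a claim about correcting human cognitive biases:

    # Toy sketch: a biased, noisy input whose bias model is known exactly.
    import random

    TRUE_VALUE = 10.0
    KNOWN_BIAS = 2.5   # the assumed, known bias model (hypothetical number)

    def biased_measurement():
        # Noisy input that systematically reads KNOWN_BIAS too high.
        return TRUE_VALUE + KNOWN_BIAS + random.gauss(0, 1.0)

    samples = [biased_measurement() for _ in range(100_000)]
    corrected = [x - KNOWN_BIAS for x in samples]  # account for the known bias

    print(sum(samples) / len(samples))      # ~12.5: biased input
    print(sum(corrected) / len(corrected))  # ~10.0: unbiased output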


Not true. There are upsides and downsides, my friend, and these biases are adaptations from evolution, where heuristics thrive.


I am so glad to see this list! Although we are intelligent beings, we emerged from the wilderness, molded by eons of Natural Selection. It is naive to think that our intellect is pure, that it has somehow emerged uncolored by our origins. More likely, our minds are highly biased and carefully calibrated toward protecting our own individual survival. This is a list of what makes us humans. Someday humanity might create artificial forms of intelligence, but it will no more resemble our own intelligence than a plane resembles a bird. This is a list of all the mental quirks that define us.


A really interesting article on how this relates to being an engineer is http://www.kitchensoap.com/2012/10/25/on-being-a-senior-engi.... At the time I read it I found it really interesting how few of the skills he talks about are technical - they're all "soft" skills. He has a section on cognitive biases and how they affect development.


Relating to entrepreneurship, the ambiguity effect seems particularly dangerous. There is no such thing as having perfect information. In fact, the more you know, the less opportunity and upside you can have in a venture. Why? Because the unknown equates to risk, and risk correlates with reward.

Not knowing enough is one of the major excuses to avoid starting a venture. Acting optimally on imperfect and limited information, on the other hand, is an entrepreneurial trait.



I'm a bit skeptical about the weight these biases carry in everyday life. Looking at all these biases, and the fact that no one is actually free of them (except perhaps an enlightened few), would suggest that we're all making choices which, most of the time, are based on assumptions that have little anchoring in reality. If that were so, shouldn't there be total chaos here on earth?


As heuristics for familiar cases, these cognitive biases are acceptable; why would you expect chaos from that, rather than suboptimal results? And yes, we definitely see suboptimal results.


I guess it depends on how detached from reality the average choice is when affected by these biases, and obviously it's no worse than suboptimal, as you said, otherwise we would not be getting anywhere. I guess just seeing that large number of biases made me feel like we're lucky to get through the day, let alone make progress as a civilisation :)


Maybe our world is in total chaos compared to some hypothetical Earth where everyone is perfectly rational. We are just used to it.


Barney Stinson pointed out the cheerleader effect before; I never thought that it could actually be a thing...


I won't believe it until it has a more formal name. "Organizational Attraction Bias", perhaps. :)


Yeah, I feel like half the biases listed are actually phenomena that are quite noticeable, just referred to with more casual terminology. Of course, statisticians wouldn't want to work with arguable terms, so they have to "jargon-ify" the words a bit.


Or in reverse, perhaps Isolation Minimization Bias.


Check out the Center for Applied Rationality (http://rationality.org/) for workshops on overcoming cognitive biases, building habits, achieving your goals, making better decisions, etc. It's a non-profit that my friends are running.


The Google effect is a very interesting one that I have noticed happening to myself. It is interesting to know that it is happening to other people.


While it is a memory bias in the sense that having a better memory is better than depending on Google, in practice I know that I have limited memory and so am better off remembering the things that are hard to find via Google [1]. I frequently don't remember other people's phone numbers anymore, for example, because my phone remembers them, AND I believe I am using this newly available memory elsewhere. Like with all other finite resources, I think it is about the production-possibility frontier [2] with memory, as opposed to a bias.

[1] One day interviewers will understand this and will stop asking questions that are just one Google/StackOverflow search away. :-)

[2] http://en.wikipedia.org/wiki/Production%E2%80%93possibility_...


I work on an IT Helpdesk, and I fear that if my clients figure this out, I may be kicked out the door :/


I am guessing you are saying that you are often dependent on Google for solving clients' problems. If so, I think that is something to be appreciated. Of the many IT help-desk professionals I have encountered (while working for big companies), only two would actually spend time researching the problem at hand. Others would just propose an OS reinstall or hardware replacement.


Many of these are natural defense mechanisms. If we acknowledged to ourselves or to each other how shitty certain experiences are, how we treat each other / get treated... it isn't even clear there would be an outlet anyway.

The good news is: if we don't like what we hear, we can choose one of these many distortions to make the story fit our needs.


I would like to find the same kind of list for scientific experiments. What would be on that list? Analogy, elimination...


Reads like a list of how human beings are. Was there news somewhere in there?


I heard about a book that discusses the various ways in which our brains fool ourselves. I've never actually seen it though. Anyone have a clue what book it might be?


Could it be "Thinking, Fast and Slow" by Daniel Kahneman? https://en.wikipedia.org/wiki/Thinking,_Fast_and_Slow


Correct. It's a great read. He weaves theory with storytelling better than any other author I've read.


It might be "You Are Not So Smart" by David McRaney.


I don't know why you'd link to Wikipedia for this; I often see instances where Wikipedia has incorrect information.


I don't know why you'd write a Hacker News comment for this; I often see instances where Hacker News has incorrect information.

Your comment is pretty irrelevant if you cannot find errors in the current page. Yes, Wikipedia is not accurate for some things, but what's wrong with the article posted? Also, although I might sound like a hypocrite, if you find something wrong on Wikipedia, go ahead and change it.


Personally, I see this list as not necessarily complete. For instance, as I see it, the term 'moral hazard' is as valid a cognitive bias as 'gambler's fallacy'; however, I would not dare to edit the page and add it in.

Another example is 'wilful ignorance': this is when trying to think about something is so mind-blowing that the mind just goes blank. Despite the name of the term, this is not 'wilful'; it is subconscious. It might kick in when the world goes to war or something serious like that. This is not the same as the 'ostrich effect', which is conscious ignorance, e.g. if someone asks me what shoes they should buy next and I really cannot be bothered to think about it, then I make a conscious decision to be ignorant.


In fact, the grandparent comment is an example of one of the biases listed there. :-)


I'm guessing that was the point.


Skimming the list and the cites, the page seems to be combining topics from a mishmash of different domains (mixing unconscious decision biases, rhetorical fallacies, misunderstandings of statistics, heuristics, and perception/memory effects), and heavily leaning on primary rather than secondary sources.

It's certainly a list of Wikipedia pages, and the pages might be good ones, but treating the list itself as even a tentative fact about the world is probably a mistake.


I'm confused. I would have sworn that the comment was jokingly demonstrating a cognitive bias.


Yeah, my joke was maybe too subtle.


It was a joke.


I thought I was reading a list of typical programmer thought patterns.



