Confirmation bias: The tendency to search for, interpret, favor, and recall information in a way that confirms one’s preexisting beliefs or hypotheses.
Irrational cognitive dissonance reduction: When information we gather from the world contradicts our beliefs, we tend to interpret that information in such a way that it no longer contradicts our beliefs.
Overconfidence bias: We have too much confidence in the correctness of our own answers or beliefs.
Dunning-Kruger effect: The tendency for lay people to overestimate their knowledge of something and for experts to underestimate their knowledge.
Bias blind spot: We detect reasoning errors much more easily in the reasoning of others than in our own reasoning.
Self-overestimation: We overestimate our own talents and prospects in life.
Belief bias: Accepting the validity of an argument simply because the conclusion sounds plausible or because you agree with the conclusion.
Hindsight bias: After an event has occurred, we overestimate the probability we would have assigned to it beforehand.
Stereotyping: Expecting an individual of a particular group to have certain characteristics (associated with said group) without having information about that person.
Choice-supportive bias (or post-purchase rationalization): We remember the choices we made in the past as being better than they actually were.
Endowment effect: We accord more value to something simply because we own it.
Bandwagon effect (= ingroup bias): We adopt beliefs too quickly when they come from people in our group and blindly follow the behavior/decisions/opinions of the group.
Anchoring: A given piece of information can strongly influence our estimates (even if there is no link between that information and our estimate).
Framing effect: Drawing different conclusions based on the same information because it is presented differently.
Loss aversion: We feel the negative impact of a loss more intensely than the positive impact of a gain of the same size.
Sunk cost fallacy: Taking into account incurred and non-recoverable costs in deciding whether to continue with a project (and thus continue to invest in it).
Statistical reasoning errors: Intuitively we perform poorly at estimating probability.
Base rate fallacy: We tend to ignore base rates in estimating the probability that something will occur. In general, we often turn a blind eye to general, implicit information and focus exclusively on specific, explicit information.
Availability bias: We overestimate the likelihood that something will occur when it is easy to recall or imagine.
Gambler’s fallacy: Expecting a statistical correction when that expectation is not justified.
Hyperactive pattern detection: Seeing patterns in random series.
Exponential reasoning errors: We underestimate exponential growth because we are used to linear growth.
John is left-handed. His mother tells him frequently that left-handed people are generally more intelligent and creative than right-handed people. He is becoming more and more convinced that his mother is right because he has met numerous intelligent and creative left-handers.
Joe is an avid basketball fan who likes to bet on games. He watched almost all of the NBA games for years and noticed the following: the team that scores the first point usually wins the game. Tonight, he bets on the match between the LA Lakers and the Chicago Bulls, and the Lakers score the first point. So, he puts his money on the Lakers.
Brian studies at university and is a member of a debate club where he and his fellow students analyze the ins and outs of American politics. At the first meeting they have after the election of Trump (about 5 weeks after the election), John says theatrically: ‘You didn’t have to be a political genius to see that Trump would be elected. You could see the frustration of white, poor Americans grow over the last decades and Trump played right into that’. Brian is quite puzzled by John’s claim because at their last meeting - 3 weeks before the election - everyone agreed unanimously that Hillary Clinton would be elected. Which bias does John succumb to?
Peter is a student in economics who has already saved some money. He invests his savings in shares of a biotech company. He buys the shares at 12 EUR per share. After a year, the price has dropped to 8 EUR per share. Peter refuses to sell his shares even though there are no indications that the share price will rise again. Which bias does he succumb to?
Kurt is 16 years old and just did an internship for 1 month in a garage. He mainly had to clean cars and he occasionally watched as cars were repaired. When the car of his father breaks down three months later, he is convinced that he can fix it. He sees himself as an accomplished mechanic.
Martin likes to smoke marijuana. A friend who also smokes tells him that marijuana is not harmful and even stimulates creativity. Martin decides to look it up. He consults google and types in: ‘marijuana enhances creativity’ and finds an endless series of websites, blogs and articles that state that marijuana indeed stimulates creativity. What reasoning error does he make here?
A few years later it appears that his friend Peter, also a soft drug user, is not admitted to drama school after failing a test that requires creativity. Peter’s parents send him to a psychologist. The psychologist attributes the bad performance in the creativity test to excessive marijuana use. Martin finds out about that and quickly decides that it is not so much the marijuana that made Peter fail the test, but chronic sleep deprivation caused by playing video games all night (albeit with a joint). What reasoning error does he make here?
You want to buy a new mobile phone. After looking around in the store for a long time, assisted by a diligent salesperson, you decide to buy model X. You are very happy with the purchase and proudly show the new phone to your friends. A friend asks you if that model has the new panoramic camera function. You quickly say ‘yes’, but you don’t really know. You hope so. In the evening you take the manual and discover that there is no panoramic function. "Damn it," you yell out in distress, but soon you figure that you do not really need it and that this panoramic function is a sales trick anyway. Everyone knows that you cannot take beautiful panoramic photos with mobile phones.
A marketing agency advises the sales department of Tesla to exaggerate how many people have already ordered a car. Which bias do they hope to exploit?
John is an American who does not believe climate change is happening. Recently he heard a Republican governor say: "Last month of August I had to put on a sweater almost every single night. Imagine ... In the middle of the summer! Therefore, climate change cannot be happening." John thought that this was a strong and valid argument and now uses it himself in discussions on climate change.
A new biotech company announces that it will bring a fantastic product to the market. The product gets a lot of attention in the media. It attracts a huge amount of investment, even though only 20% of starting biotech companies ever become profitable.
The Supermarket ‘Albert Hein’ has bought a new Chilean wine and hopes to make a nice profit selling it. They buy the wine for 2.50 EUR per bottle and want to sell it for 6 EUR per bottle. But the wine sells poorly. A marketing agency advises them to put the Chilean wine on the racks between their cheapest (and not very nice-looking) bottles and their much more expensive (and for most people unaffordable) bottles. Which bias are they trying to exploit here?
Linda and Mary begin a start-up. At the start they both invest 20,000 EUR in the project. After the first year, that money is completely spent (they have rented and furnished an office space, paid for marketing, etc.). Unfortunately, the start-up still hasn’t generated any returns. If they do not pump in extra capital, it will be game over. They decide to ask their parents for extra funds because they do not want to give up now, given that they have already invested so much effort and money. What reasoning error do they make?
Moreover, they add, we have read about a lot of start-ups that also faced difficult times in the beginning and are now very successful. What reasoning error do they make here?
5% of lifelong smokers get lung cancer. Recently an article was published in which the CEO of Philip Morris talks about the unjustified demonization of the tobacco industry. He stressed that tobacco is a leisure product and added that the vast majority of smokers - 95% - do not get lung cancer. Which bias does the CEO exploit here?
A friend of yours bought a ticket to a concert but cannot go. He asks you if you would buy it from him. You decide that you want to spend no more than 30 EUR. Your friend accepts and you buy the ticket for 30 EUR. A day before the concert someone offers you 40 EUR for the ticket. You refuse to sell it.
Sabine already has three children and is pregnant with her fourth. Her first three children are girls. She asks her two sisters to guess the gender. Sandra, Sabine’s eldest sister, thinks it will be a boy, because four girls in a row seems very improbable to her.
There is a new machine on the market that detects counterfeit money. A large marketing campaign is set up to sell the machine to stores. The campaign boasts that the machine detects 99.999% of fake money. That seems very accurate. Many shops therefore buy the machine. What, however, have they forgotten to check and what kind of reasoning error do they make?
Jeremy believes that women are bad drivers. The longer he has been driving, the more convinced he becomes of this because he has seen so many bad female drivers over the years.
Kurt believes that eating gluten is harmful and he is talking about it with his girlfriend Ann. He tries to convince her by pointing out that two of his friends have recently stopped eating gluten and reportedly feel much better. Ann took a course on critical thinking and points out that he is making a reasoning error. What reasoning error does Kurt make?
Ann explains the reasoning error he made, but Kurt refuses to see that something is wrong with his reasoning. ‘His thinking’, he tells Ann, ‘is always rational’. ‘In contrast to many others,’ he adds. What reasoning error does Kurt make here?
A long time ago there was an Indian Maharaja who loved to play chess. He was always looking for new opponents. To encourage people to play, he promised them a prize if they could win. Usually, it was a copper cup or a necklace for their wife. One day, a beggar came to the Maharaja to play chess. The Maharaja promised him a cup that he would receive if he won. The beggar, however, turned down the offer and said: "Honorable Maharaja, the only thing I want is a little rice. If I win, do you agree to put 1 grain of rice on the first square of the chessboard and then double the number on each next square (i.e. 2, 4, 8, etc.) until the whole chessboard is filled?" The Maharaja agreed and thought he would get off cheaply by giving a little bit of rice if he lost. What reasoning error does the Maharaja make? (Source: https://www.mathscareers.org.uk/the-rice-and-chessboard-legend/)
Self-overestimation and the confirmation bias
Self-overestimation: We overestimate our talents and prospects in life.
John overestimates his own intelligence and creativity as a left-handed person.
Confirmation bias: The tendency to search for, interpret, favor, and recall information in a way that confirms one’s preexisting beliefs or hypotheses.
John only sees and remembers confirmation for his belief: smart and creative left-handed people. This strengthens his conviction.
Hyperactive pattern detection, the confirmation bias and the overconfidence bias
Hyperactive pattern detection: Seeing patterns in random series.
There is no relationship between scoring the first point and winning the game; Joe is mistaken in thinking there is one. He sees a pattern that is not there.
Confirmation bias: Over the years Joe has seen and remembered mostly confirming instances of his belief (games won by teams that scored the first point) and has not noticed or quickly forgotten the disconfirming instances (games lost by teams that scored the first point).
Overconfidence bias: We have too much confidence in the correctness of our own answers (and predictions).
Joe overestimates the likelihood that he will predict the outcome of the game correctly and therefore goes ‘all in’.
Hindsight bias: After an event has actually occurred, we overestimate the probability we would have assigned to it beforehand.
John reasons afterwards that (before the election) he predicted Trump’s victory, but that was actually not the case.
Loss aversion: We feel the negative impact of a loss more intensely than the positive impact of a gain of the same size.
Peter’s loss feels uncomfortable to him, so he tends not to sell his shares below the purchase price, although he would do better to sell, since there is no indication that the share price will go up again.
Dunning-Kruger effect: The tendency for lay people to overestimate their knowledge of something and for experts to underestimate their knowledge.
Kurt is clearly no expert and overestimates his own knowledge.
Confirmation bias: Martin only looks for confirmation of his beliefs.
Irrational cognitive dissonance reduction: When information we gather from the world contradicts our beliefs, we tend to interpret that information in such a way that it no longer contradicts our beliefs.
Martin receives information that refutes his conviction but tries to save his conviction from falsification by giving a different interpretation to the facts.
Choice-supportive bias or post-purchase rationalization: We remember the choices we made in the past as being better than they actually were.
Because (without knowing it) you bought a phone without a panoramic function, you argue that you don’t want that function anyway (even though you would have liked that function). You rationalize that you made a good purchase, when in fact you did not.
Bandwagon effect: We adopt beliefs too quickly when they come from people in our group and blindly follow the behavior/decisions/opinions of the group.
The advertisement wants to make it appear that many people in your own group already bought this car hoping that you will jump on the bandwagon and follow suit.
Belief bias: Accepting the validity of an argument simply because the conclusion seems plausible or because you agree with the conclusion.
John adopts a bad argument uncritically (based on a subjective perception of temperature over a short period of time, we cannot determine global climate trends), because he agrees with the conclusion.
Base rate fallacy: We tend to ignore base rates in estimating the probability that something will occur. In general, we often turn a blind eye to general, implicit information and focus exclusively on specific, explicit information.
The investors do not or barely take into account the fact that only 20% of biotech start-ups ever become profitable (that 20% is the base rate).
Anchoring: A given piece of information can strongly influence our estimates (even if there is no link between that information and our estimate). (Anchoring is a form of framing).
When the wine is presented by itself, people apparently see it as too expensive for its quality (since the wine does not sell well). The hope is that next to a very expensive bottle the wine will seem affordable, and next to a very cheap-looking bottle it will seem to be of good quality.
Sunk cost fallacy: Taking into account incurred and non-recoverable costs in deciding whether or not to continue with a project (and thus continue to invest in it).
Linda and Mary base their decision to continue on the fact that they have already invested a lot of money. This is irrational, since they should only consider the expected return on the new investment.
Availability bias: We overestimate the likelihood that something will occur when it is easy to recall or imagine.
Because Linda and Mary have read a lot about successful start-ups (the only ones that are covered in the press), they overestimate the chance that start-ups (even with a difficult start) succeed and assume that this will also be the case for their start-up.
Framing effect: Drawing different conclusions based on the same information because it is presented differently.
The CEO expresses the statistics in such a way (95% do not get lung cancer) to paint the picture that the health risks of smoking are not that bad. Saying that 1 in 20 smokers will get lung cancer (which of course amounts to the same thing) would sound much more alarming and therefore might convince more people to stop smoking.
Endowment effect: We accord more value to something simply because we own it.
Before you had the ticket, you thought it was worth a maximum of 30 EUR, now that you have it in your possession, it is suddenly worth more than 40 EUR to you.
Gambler’s fallacy: Expecting a statistical correction when that expectation is not justified.
The chance of having a girl or a boy is the same every time. The previous births of girls do not make it more likely that it will be a boy this time. There’s no justification for expecting a statistical correction.
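The independence claim in this answer can be checked with a short simulation. The sketch below (plain Python, assuming an even 50/50 chance for each birth, which simplifies real birth statistics) looks only at families whose first three children are girls and counts how often the fourth child is a boy.

```python
import random

random.seed(1)

# Simulate many four-child families and keep only those that start
# with three girls in a row - Sabine's situation.
fourth_child = []
for _ in range(200_000):
    children = [random.choice("GB") for _ in range(4)]
    if children[:3] == ["G", "G", "G"]:
        fourth_child.append(children[3])

# The history of three girls does not shift the odds for the fourth child.
boy_share = fourth_child.count("B") / len(fourth_child)
print(round(boy_share, 2))  # close to 0.5, not higher
```

Each birth is an independent event, so conditioning on the previous three outcomes leaves the probability unchanged; no "statistical correction" is coming.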
Statistical reasoning error: Intuitively we perform poorly at estimating probability (or at statistical reasoning).
They have forgotten to check how reliably the machine recognizes real money as real. You could build a machine that declares every note fake: it would detect 100% of counterfeit money, yet be completely useless.
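The point about one-sided accuracy can be made concrete with a toy example (plain Python; the note counts are invented for illustration): a "machine" that flags every note as fake detects 100% of counterfeits, yet accepts no genuine money at all.

```python
# Hypothetical batch of notes: genuine notes vastly outnumber fakes.
notes = ["fake"] * 10 + ["real"] * 9_990

def always_fake(note):
    """A useless detector that declares every note counterfeit."""
    return "fake"

verdicts = [always_fake(n) for n in notes]

fakes_caught = sum(v == "fake" for n, v in zip(notes, verdicts) if n == "fake")
reals_accepted = sum(v == "real" for n, v in zip(notes, verdicts) if n == "real")

print(fakes_caught / 10)       # 1.0 -> '100% of counterfeits detected'
print(reals_accepted / 9_990)  # 0.0 -> but every genuine note is rejected
```

A detection rate on counterfeits alone (the true-positive rate) says nothing about how often genuine notes are wrongly rejected, which is exactly the number the shops needed.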
Stereotyping and the confirmation bias
Stereotyping: Expecting an individual of a particular group to have certain characteristics (associated with that group) without having information about that person.
Jeremy erroneously expects every woman to meet his stereotype.
Confirmation bias: (see above)
He is more and more convinced that women are bad drivers because he is much more receptive to confirming instances (women driving badly) and remembers such instances much better than disconfirming instances (women driving well / men driving badly).
Confirmation bias: (see above)
Ann makes it clear to Kurt that he should not only be looking for confirmation of his belief (the two friends who feel better on a gluten-free diet), but also for possible counterevidence.
Bias blind spot: We detect reasoning errors much more easily in others than in ourselves.
Kurt is blind to his own reasoning error and thinks that others make more reasoning errors than he does himself.
Exponential reasoning error: We underestimate exponential growth because we are used to linear growth.
The Maharaja underestimates how much rice he would have to put on the chessboard because he underestimates exponential growth. On the 64th and final square alone, he would have to lay 2^63 grains of rice, roughly 9.2 quintillion (9,200,000,000,000,000,000) grains; the whole board holds 2^64 - 1, about 18.4 quintillion grains. That is more than 210 billion tons of rice. With that amount of rice you can cover the whole of India one meter deep and it is much more rice than has been produced throughout the history of the world!
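The doubling in the legend can be computed directly. This sketch (plain Python) tallies the exponential series square by square and contrasts it with the linear growth our intuition expects.

```python
# Square n (1-indexed) holds 2**(n - 1) grains: 1, 2, 4, 8, ...
grains_per_square = [2 ** (n - 1) for n in range(1, 65)]

last_square = grains_per_square[-1]  # 2**63
total = sum(grains_per_square)       # 2**64 - 1

print(last_square)  # 9223372036854775808  (about 9.2 quintillion)
print(total)        # 18446744073709551615 (about 18.4 quintillion)

# Linear growth (one extra grain per square), which intuition expects,
# would give only a handful of grains in total:
print(sum(range(1, 65)))  # 2080
```

Sixty-three doublings turn a single grain into quintillions, while the linear scheme never leaves the realm of a rice bowl; that gap is exactly what the Maharaja's intuition missed.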