Most people believe that their decisions are mostly based on logical and rational judgment. This section shows that in reality many factors influence the decision-making process, making it much less rational than we'd like to think. Understanding these processes helps improve one's own decisions while improving the ability to influence others. This section uses the terms rational and irrational to draw a distinction between two types of decision and judgment processes, characterised as follows:
- The rational decision and/or judgment processes: Conscious, logical, cold, controlled, slow, unbiased, …
- The irrational decision and/or judgment processes: Unconscious, illogical, emotional, automatic, fast, biased, …
We'll first give an overview of some reasons why irrational thinking takes place, then go through a few examples of cognitive shortcuts to show how irrational thinking operates, and finally take a look at the special case of thinking about statistical matters.
Part 1 - Limits of rationality, why we get irrational
Attention and effort
Rational thinking requires attention and effort, which are states that can't be held forever. Limits typically range from 10 minutes to, very rarely, a few hours. As concentration drops, so does rationality, leaving more room for biases.
It is possible to identify physiological symptoms of concentration (and lack of it), for example dilated pupils and deep breathing.
In a negotiation or debate, it is therefore good to realise that as time goes by, concentration will drop and so will rationality, and that irrational decisions with mostly emotive motivations are most likely to be made at times when concentration is low. Simply put: if you want people to think about what you're saying and make rational decisions, don't get them tired first.
This limitation of attention and effort is one of the main reasons why irrational thinking can take over: it simply isn't possible to stay concentrated and vigilant long enough to handle day-to-day judgement and decision processes.
Irrational thinking as input to rational
Even when rational thinking is achieved and maintained, it is important to realise that irrational processes still have great impact. This is because irrational processes are often very quick and take place before rationality kicks in, biasing the cognitive state which underlies the subsequent rational process.
An example of such a situation is the association and priming principle discussed below: automatic processes strongly shape subsequent controlled processes, because they create a cognitive state which influences the thinking that follows.
Part 2 - Cognitive shortcuts
This section explores the kinds of things that can go wrong during irrational thinking.
Association and priming
Associative mechanisms have great influence on the cognitive state (state of mind), which in turn strongly influences decisions and actions. This is true for both rational and irrational processes, because even during rational thinking, retrieval of information from memory relies on associative memory processes.
Priming is an example of associative mechanisms in action, where a given word fires ideas associated with it. For example, "bread" and "jam" might fire up the idea of breakfast or a snack. Or being in a school will raise your concern about education issues (voting in a school can, for example, influence results on school funding issues). Anchors are a specific case of priming where the answer to a question is formed by relatively small adjustments around a potentially irrelevant reference (often provided shortly before). For example: "is the biggest ant bigger or smaller than 8cm?", followed by "how big is the biggest ant?". The 8cm in the first question becomes the starting point for the reply to the second.
The confirmation bias is another example, where the process of trying to confirm an idea leads to the activation of associated elements (in particular compatible ideas) and the omission of others (in particular contradicting ones), which produces a biased representation in which the idea seems particularly valid.
Fluidity and comfort
Cognitive comfort (fluidity) leads to a decrease in rational thinking and an increase in intuition and biases. Cognitive comfort is a state in which things seem to fit together well and little contradiction exists between the current focus and the cognitive context. This in turn leads to an impression of validity.
The impression of familiarity (which can be created, for example, by repeated exposure to an idea) leads to comfort, and can lead to increased belief due to lack of vigilance and rational questioning.
Comfort is also linked to the quality of presentation: for example, a false statement written in a small, hard-to-read font will more often be evaluated as false than the same statement printed clearly.
Cognitive comfort is a pleasant state (more pleasant than effort, obviously), and the fact that comfort reduces attention and deep thinking is easily understood from an evolutionary point of view. Indeed, familiar situations which lead to no harm will induce a sense of comfort and reduced vigilance.
Cause / Effect
The tendency to create cause-and-effect sequences where they are not justified is strong, in particular when the two events are sequential in time.
For example, if a company's results improve after a new director arrives, that doesn't prove he has talent, because many other factors can explain such improvements.
It's important to note that in many cases random events are as likely to produce seemingly regular results as irregular ones. However, when a random sequence appears regular we tend to search for an explanation for this regularity, even though no such cause exists.
Generally speaking, whenever a causal explanation comes to mind it is likely that any statistical evidence will be dismissed, even if the causal explanation is known to be of little reliability.
The halo effect designates the tendency to generalise a partial impression into a global one. For example, when someone is described as strong and authoritative, we might also assume he is a good leader or good looking, whether or not this is true. This also explains why first impressions are so important: they condition how we form our subsequent impressions.
Due to confirmation bias and, more generally, association principles, this effect impacts rational thinking as well.
Limited awareness theory
Decisions are made based on whatever information is available at the time; in particular, it is very difficult when making a decision to integrate elements which aren't easily representable.
Project management systematically suffers from this, because people make project plans based on what they know about the project, without integrating "the unknown unknowns". At the very least it is important to acknowledge the existence of the "unknown unknowns" in order to add error margins to all estimations. It is better still to rely on statistical data if any is available.
Also, we build full representations out of the little information we have, and when doing so, coherence is the goal and is achieved (using association and confirmation bias in particular). This in turn makes us gullible: because the story we build is coherent, we believe the information behind it is true.
We pay much more attention to the pieces of information we have than to their reliability (the sources, the sample, ...).
Substitution and Intensity equivalence
Substitution is a mechanism by which a complex question gets replaced by a simpler one for fast evaluation purposes. For this to work, the answer to the simpler question must also fit as an answer to the more complex question. For example, the complex "do you think the president's economic policy is good?" gets replaced by the simpler "do you like the president?".
Intensity equivalence makes the two kinds of answers comparable, by mapping an answer on the source (simple) scale onto the target (complex) scale.
This leads to the overall idea of heuristics, where a complex question/problem gets replaced by a more manageable one.
Illusion of understanding
Retrospective (hindsight) bias occurs when an event becomes understandable a posteriori although it wasn't predictable at the time. In these situations it is very difficult to exclude currently available knowledge and to realise that the outcome wasn't predictable back then.
Invalid cause and effect occurs when cause-effect relations are built where none exists (many events are random or statistical, not causal). Remember that intuition likes simplicity and apparent logic, and cause and effect is simple and logical, even when wrong.
People will rely on a prediction even if they know it is barely more reliable than a guess, and this tendency to find simplistic explanations of past events reinforces our belief that we can predict future events.
In most cases simple formulas and "rules of thumb" give better predictions than expert intuition and opinion, but they are usually dismissed.
Experts and intuition
Intuition, which is an automatic and mostly irrational process, requires knowledge about similar situations; for intuition to develop and become reliable, a fast and harmless feedback loop is necessary. This means one should only trust intuitions if the context has allowed reliable intuitions to develop.
High-conviction intuitions only show that the person has excluded all doubt, and this is reinforced by intuitive and irrational thinking (associations, substitution, limited awareness, ...). High conviction isn't a sign of reliability, quite the contrary.
People tend to hold separate mental accounts for different categories, which can lead to irrational choices, for example:
People prefer to keep investing in a losing, low-potential project (to avoid accepting the loss) rather than keeping the money or investing the same amount in a new, higher-potential project.
Example of the lost concert ticket: someone who loses a pre-paid ticket tends to refuse to buy a replacement (the "ticket" account has doubled in cost), while someone who loses the equivalent amount in cash usually still buys the ticket (the cash loss isn't booked to the ticket account).
Part 3 - Statistics
Small samples produce statistical results which systematically over-represent extremes. A simple example is a box containing equal numbers of red and white balls: if you only pick four balls, you have a higher chance of getting 100% reds than if you pick 10 balls. For this reason, if you correlate success rates with school size (number of pupils), you'll find that the highest success rates are more frequent in small schools; this is an example of the law of small numbers. In day-to-day studies, samples are very often too small.
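The arithmetic behind the ball example can be checked directly; a minimal sketch, assuming for simplicity that balls are drawn with replacement from a 50/50 box:

```python
# Chance of an "all red" extreme result from a 50/50 box, for sample size n.
# Drawing with replacement keeps the arithmetic simple: 0.5 per ball.
def p_all_red(n: int) -> float:
    return 0.5 ** n

print(p_all_red(4))   # 0.0625: one sample in 16 looks like a 100%-red box
print(p_all_red(10))  # ~0.001: the extreme almost vanishes with a bigger sample
```

The larger sample is 64 times less likely to produce the extreme, which is why small schools (small samples) dominate both the best and the worst success rates.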
Estimation of the probability of an event tends to be based on how easy it is to recall (or imagine) examples of the event. The media have a large impact on this, as talking about unlikely events raises the impression of probability. Terrorism largely benefits from this effect, managing to create large terror effects with relatively small events greatly amplified by the media.
When estimating probabilities people tend to overestimate the value of the little information they have and ignore base rates (statistical facts about the prior probability). For example, when told someone is serious, meticulous, ... and then asked whether the person is more likely to be a librarian or a farmer, people reply librarian, despite the fact that there are 1000 times more farmers than librarians, and among those 1000 there are hundreds who in fact match the description.
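A worked Bayes' rule computation makes the point concrete. It uses the section's 1000:1 base rate; the likelihoods (how well the description fits each group) are made-up illustrative numbers:

```python
# Base rate from the text: 1 librarian for every 1000 farmers.
p_librarian = 1 / 1001
p_farmer = 1000 / 1001

# Assumed likelihoods: the "meticulous" description fits 90% of
# librarians but also 10% of farmers (hypothetical figures).
p_desc_given_librarian = 0.9
p_desc_given_farmer = 0.1

# Bayes' rule: P(librarian | description)
p_desc = (p_desc_given_librarian * p_librarian
          + p_desc_given_farmer * p_farmer)
p_librarian_given_desc = p_desc_given_librarian * p_librarian / p_desc

print(round(p_librarian_given_desc, 4))  # ~0.0089: still almost surely a farmer
```

Even a description that is nine times more typical of librarians leaves the posterior probability below 1%, because the base rate dominates.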
Details and credibility
A detailed assessment tends to be judged more likely than a less detailed one, provided the details match stereotypes. This is logically wrong, as each added detail can only reduce the probability of a match (the conjunction fallacy).
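The conjunction rule can be demonstrated on any toy data set; the attributes and probabilities below are arbitrary choices for illustration:

```python
import random

random.seed(0)  # reproducible toy population
# Each person: (is_meticulous, reads_a_lot) - two arbitrary traits.
people = [(random.random() < 0.3, random.random() < 0.5)
          for _ in range(10_000)]

p_a = sum(a for a, _ in people) / len(people)              # P(meticulous)
p_a_and_b = sum(a and b for a, b in people) / len(people)  # P(meticulous AND reads a lot)

# The conjunction can never be more probable than either detail alone.
assert p_a_and_b <= p_a
print(p_a, p_a_and_b)
```

Whatever the traits and whatever their correlation, the combined description is always at most as frequent as each single detail, even though it "sounds" more plausible.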
An exceptional situation usually doesn't last and tends to drift back towards the average (regression to the mean). For example, when an exceptionally good performance is achieved, it is predictable that the following one will be worse; the same goes for a bad performance. This has an adverse effect on reward/punishment systems: good performances are rewarded and then often followed by worse ones, giving the impression that the effect of the reward is to impair future performance (when this isn't the case).
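Regression to the mean falls out of any model where performance mixes stable skill with random luck; a small simulation sketch (all distributions and sizes are arbitrary choices):

```python
import random

random.seed(1)
n = 10_000
# Performance = fixed skill + fresh luck on every trial.
skill = [random.gauss(0, 1) for _ in range(n)]
trial1 = [s + random.gauss(0, 1) for s in skill]
trial2 = [s + random.gauss(0, 1) for s in skill]

# Pick the 100 best performers of trial 1 and re-measure them on trial 2.
top = sorted(range(n), key=lambda i: trial1[i], reverse=True)[:100]
avg1 = sum(trial1[i] for i in top) / len(top)
avg2 = sum(trial2[i] for i in top) / len(top)
print(avg1, avg2)  # the second average drifts back toward the mean
```

The top performers' second-trial average drops without any reward or punishment being applied: their skill is genuinely above average, but the exceptional luck that put them on top does not repeat.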
Intuition vs Facts
There are two types of intuition: experience (the situation matches a past one and the future can be predicted) and substitution (the situation is replaced by another, known one and the outcome is extrapolated).
In both cases it's important to first evaluate statistical facts and only allow intuition to perform minor adjustments.
Asymmetry of gains and losses
Losses carry more weight than equivalent gains when evaluating choices. This basic principle has many effects in everyday choices. Some examples:
- People will evaluate the value of something to be higher if they are selling it than if they are buying it
- People will only accept a mixed win/lose outcome if the gains are significantly higher than the losses
- Owning something increases its perceived value
From an evolutionary perspective, this is explained by the fact that threats are treated with higher priority than opportunities, defence has priority over attack, and this happens directly at a neurological level. This is referred to as "loss aversion".
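Loss aversion is commonly modelled with Kahneman and Tversky's prospect-theory value function; a minimal sketch using parameter values often cited in the literature (α = 0.88, λ = 2.25 — these figures come from that literature, not from this text):

```python
def value(x: float, alpha: float = 0.88, lam: float = 2.25) -> float:
    """Subjective value of a gain/loss x: losses loom larger than gains."""
    if x >= 0:
        return x ** alpha
    return -lam * (-x) ** alpha  # losses are scaled up by lam > 1

gain, loss = value(100), value(-100)
print(gain, loss)  # the loss outweighs the equal-sized gain
assert abs(loss) > gain  # so a 50/50 bet to win or lose 100 feels like a bad deal
```

With λ ≈ 2.25, losing 100€ hurts roughly twice as much as winning 100€ pleases, which is why people demand lopsided odds before accepting a mixed gamble.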
Extremes and probabilities
Low probabilities are overweighted: people overestimate the chances that low-probability things will happen, whether these are desired or not. People make an enormous difference between 0% and 2%. Many examples illustrate this:
- People play the national lottery, where the chances of winning are ridiculously low
- People take out insurance against things that have a very low chance of happening
On the other hand, high probabilities are underweighted. People make a big difference between 98% and 100%.
- People will often prefer a 100% chance of winning 80€ over a 95% chance of winning 100€, even though the latter has the higher expected value (95€ vs 80€)
These two effects are referred to as the possibility effect (overweighting of very low probabilities) and the certainty effect (underweighting of very high probabilities), and they lead to many biases in judgement and decision making.
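Both effects can be captured by a probability weighting function; a sketch of the Tversky–Kahneman form with γ = 0.61 (a parameter value from the prospect-theory literature, not from this text):

```python
def w(p: float, gamma: float = 0.61) -> float:
    """Decision weight felt for probability p (Tversky-Kahneman form)."""
    num = p ** gamma
    return num / (num + (1 - p) ** gamma) ** (1 / gamma)

print(w(0.02))  # > 0.02: a 2% chance is overweighted (possibility effect)
print(w(0.98))  # < 0.98: near-certainty is underweighted (certainty effect)
```

The function is steep near 0 and near 1 and flat in the middle, matching the observation that the jumps from 0% to 2% and from 98% to 100% feel much bigger than a jump from 48% to 50%.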
However, it's important to add that these biases are stronger when the decision involves things that have salient representations. Moreover, some examples seem to contradict the theory (people underestimate the chances of an earthquake in Los Angeles), so it must be considered with caution.