It's all based on science
One of the biggest surprises from the last 40 years of neuroscientific research is that rationality is not the answer to everything human. Specifically, decision-making based solely on logic and rationality tends to be biased, and improves only when emotions and intuitions are duly taken into account.
For over two thousand years humans have believed in rationality as “the” superior form of knowledge.
- Plato was one of the earliest proponents of the idea that we form knowledge by logical thinking, not by following bodily sensations, because “the senses can fool us, but the mind does not”.
- Aristotle went as far as to hold that the mind starts every thought from a blank slate (tabula rasa): i.e. “logic is not influenced by any preconception”.
- Descartes grounded all certain knowledge in reason (“I think, therefore I am”), and the bulk of Enlightenment theories followed through, celebrating the primary role of logic in human cognition.
At the same time, emotions were often viewed as a lesser form of cognition. Even Hume, who went against the current and maintained that we do not choose what makes logical sense but what we care about, treated passions as a necessary evil, capable of spoiling an otherwise rational decision process.
Rationality was believed to be the superior form of human cognition; emotions, an evil to control or suppress.
Today we know more.
There are different forms of cognition: we learn through different processes and integrate them together. We have probably only scratched the surface of the matter, but here, in simple terms, is what we now know:
We have at least three forms of intelligence that run in parallel and interact: logical intelligence, emotional intelligence and intuitive intelligence.
Our brain collects far more signals and stimuli than we can translate into language and process in logical schemes; the rest is intuition, feelings, perceptions and instantaneous judgments.
Nobel Laureate Daniel Kahneman is undoubtedly the one who has changed the way we think about thinking.
In his book Thinking, Fast and Slow he gives a thorough and quite entertaining account of several experiments that support his revolutionary theories:
- Contrary to what we used to think, we DO NOT form our opinions solely on the basis of logic; quite the contrary.
- We start with an intuition, derived from the many memories our brain has collected before (innate or learned).
- Our intuition is a way to save energy: the brain takes a shortcut as often as possible, and approximations are helpful to get us by.
- If challenged, we then collect supporting data to explain our initial, approximate opinion.
- Interestingly enough, we stop collecting evidence as soon as we reach a sufficient degree of confidence in our opinion; we often forget, or are reluctant, to play devil’s advocate with ourselves.
Consider the following educational examples:
- Mental calculus: if we are asked to perform a sequence of arithmetic sums that starts with small numbers, we probably find it easy and identify the result quite quickly at the beginning (2+3 = 5; 5+4 = 9; 9+12 = 21; ...), but as we continue to add up and the running total gets bigger, we may need to slow down. At first we may be able to perform the calculation while walking or doing something else, but as the complexity increases we cannot concentrate on it unless we stop whatever else we are doing: the brain cannot handle complex tasks without dedicating all its attention to them. When the numbers get really large, we may not be able to keep them in mind and may need to write them down. However, although we may struggle to reach an accurate result, we retain a certain “feel” for an approximate one. Asked “what is 24 567 + 35 678?”, we may not come up with 60 245 right away, but we do “know” that the result is not going to be 1 000 or 1 000 000; we may even have a vague idea that it will be about 60 000. This is our brain making an estimation, an educated guess, to save energy. It is often good enough, but sometimes it can be really wrong.
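The rounding shortcut described above can be sketched in a few lines. Rounding to the nearest thousand is an illustrative choice, not a claim about how the brain actually computes:

```python
# A rough sketch of the "estimate by rounding" shortcut: approximate a sum
# by rounding each operand first (fast and cheap), instead of computing it
# exactly (slow and effortful).

def estimate_sum(a: int, b: int, granularity: int = 1000) -> int:
    """Approximate a + b by rounding each operand to the given granularity."""
    def round_to(x: int) -> int:
        return round(x / granularity) * granularity
    return round_to(a) + round_to(b)

exact = 24_567 + 35_678                 # 60 245 — the slow, exact answer
rough = estimate_sum(24_567, 35_678)    # 25 000 + 36 000 = 61 000

print(exact, rough)  # the estimate is close to the truth, but not exact
```

The estimate lands near 60 000, which is usually “good enough” in the sense the text describes; the coarser the granularity, the cheaper and the more wrong it can be.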
- An even more intriguing case arises when we have to resolve more complex problems that involve not only simple numbers, but context and personalities. One such example is the question: “John lives in Connecticut, likes to work alone, loves the open spaces and has a passion for field and agricultural machines; what is more probable, that John is a farmer or an accountant?” Most people, including statisticians, form their opinion based on clues like the open spaces and the agricultural machines and answer “farmer”, even though that is statistically incorrect: there are far more accountants than farmers in Connecticut, so the probability is much higher that John is an accountant who likes tractors. The most interesting part of this type of experiment is that even statisticians often start to find supporting data to justify their initial response, statistically incorrect as it is.
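The base-rate effect behind this puzzle can be made concrete with Bayes’ rule. The head-counts and likelihoods below are invented purely for illustration (they are not Connecticut statistics); the point is only that a large enough prior can outweigh a suggestive description:

```python
# Illustrative base-rate calculation (Bayes' rule) for the "John" question.
# All numbers are hypothetical: accountants are assumed to greatly
# outnumber farmers, while the description fits farmers much better.

farmers, accountants = 6_000, 60_000     # assumed head-counts (the priors)
p_desc_given_farmer = 0.8                # description fits most farmers...
p_desc_given_accountant = 0.1            # ...and only a few accountants

# Unnormalized posterior weights: prior count * likelihood of the description
w_farmer = farmers * p_desc_given_farmer              # 4 800
w_accountant = accountants * p_desc_given_accountant  # 6 000

p_farmer = w_farmer / (w_farmer + w_accountant)
print(f"P(farmer | description) = {p_farmer:.2f}")    # below 0.5
```

Even though the description is eight times more likely for a farmer than for an accountant, the posterior probability of “farmer” stays under one half, because accountants start out ten times more numerous. Our intuition weighs the vivid clues and ignores the base rate.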
So this is the basis of our model:
- we tend to form opinions based on intuition and hate to admit that they may be partial or approximate, so we collect just enough evidence to justify them and then try to “sell” them and “convince” others;
- meanwhile, others have looked at the same triggering event and have probably come up with an entirely different opinion that they, too, are trying to justify;
- each wants to win, each believes they are right, and no one is interested in hearing a different perspective, as long as they do not realize that their “truth” might be only a partial one.