Why What You Think Is Not True
“If you do not know yourself, your unconscious as well as your conscious states, all your inquiry will be twisted, given a bias. You will have no foundation for thinking which is rational, clear, logical, sane. Your thinking will be according to a certain pattern, formula, or set of ideas – but that is not really thinking. To think clearly, logically, without becoming neurotic, without being caught in any form of illusion, you have to know this whole process of your own consciousness.”
7th Public Talk in Saanen, 1963
A cognitive bias is a flaw in our thinking that distorts the way we perceive and act. The present estimate, according to Wikipedia, is that there are 140 of them. A cognitive bias is not classified as pathological but rather as a normal, perhaps even helpful, event in our daily lives. Most of them, however, seem distinctly unhelpful. They can be countered, so neurologists tell us, by our awareness of them, by logic, and by reflective thinking, though getting rid of habitual ones, they caution, can place a heavy energy demand on the neocortex.
140 is a daunting number, but the eminent physician and medical writer Jerome Groopman has helped us out by citing a few likely to interest everyone in his book How Doctors Think. In a chapter entitled ‘The Eye of the Beholder’, Groopman quotes Ehsan Samei of the Advanced Imaging Laboratories of Duke University reporting that ‘currently (2006) the average diagnostic error in interpreting medical images is in the twenty to thirty percent range … this has a significant impact on patient care.’
This is a chart of known cognitive biases:
How do these diagnostic errors come about? All of them can be said to involve a cognitive bias of some kind, and radiologists are naturally concerned to identify and learn from them, which has led to a considerable amount of research. But are there cognitive biases in the interpretation of x-rays that extend far beyond x-rays? I think there are.
For example, what is termed ‘affective error’ occurs because doctors tend to prefer happy outcomes to unhappy ones. At the first sign of a happy outcome, a doctor may value it too highly compared with an unhappy sign. ‘Availability’ is also a cognitive bias to be wary of. This refers to the ease with which relevant examples from the past come to mind as a guide. Diagnoses proved right in recent weeks predispose a doctor to see a repetitive pattern [and we know that the brain likes detecting patterns]. Probably the cognitive bias known [and lived] most widely by the general public is ‘confirmation bias’. This makes a doctor tend to see signs that confirm his initial diagnosis and overlook those that disprove it.
What soon becomes clear to the general reader of Jerome Groopman’s book is that the cognitive biases which doctors may have also act in all of us in everyday life. This raises a wider-ranging question: Why is cognitive bias not taught as a subject in the curriculum of our schools? Is thinking of the clear and logical kind so highly valued by the educational system that cognitive bias gets badly neglected? [Maybe this is cognitive bias 141?] After all, it is all very well for The Oxford Dictionary of Philosophy to say that ‘the most evident display of our rationality is our capacity to think.’ It is also the most evident display of our irrationality. A balanced education should reflect that.
Note: This video provides an explanation of some cognitive biases.
The conventional public view is that our thinking is our supreme, specifically human achievement, the driving force of our rationality, what distinguishes us from the lower animals. Yet any education that fails to point out fully to students the irrational component of much human thought is clearly unbalanced, and risks conditioning them as adults to rely unduly on what thought can do, particularly in the area of human relationships, whether personal or international. It is here that a course in cognitive bias, even a very short one, could play a useful role by promoting young people’s awareness of objectively established and repetitive flaws in our thinking.
When you observe thought… how do you observe thought? Is there a thinker observing thought? When there is an observer observing thought, the observer is thought, one thought looking at another thought – right? One fragment of thought looking at another fragment of thought, and saying, ‘I must be aware of that thought’ or, ‘I must control that thought, I must suppress that thought, I must overcome that thought’. But the observer, the thinker is the thought. If you see that, not abstract, if you see that then you will see the place of thought, the necessity of clear thinking. And what place has thinking in relationship? You understand? What place has knowledge in relationship?
2nd Public Talk at Brockwood Park, 1973
One wonders what Krishnamurti, who saw the scientific mind as part of the religious mind, would have said of a proposal to teach students about cognitive bias in schools. Perhaps he would have welcomed what well-founded neuroscience has to say about the brain’s mistakes. Perhaps he would also have welcomed what the Yale neurologist Steven Novella, in his fine Great Courses lectures A Scientific Guide to Critical Thinking, says of the heavy consequence of our cognitive biases and allied errors of perception: ‘Our very sense of self and what we perceive of reality is an illusion created by the brain.’ That is, of course, a subject on which Krishnamurti has much to say.
By David Skitt
Book Editor for Krishnamurti Publications