“If the doors of perception were cleansed everything would appear to man as it is, infinite”
– William Blake
In my quest for rationalism, I always try to guard myself against logical errors that arise from noise in perception. Whenever a problem presents itself, one first needs to know the facts, as completely as possible and as unbiased as humanly possible. This journey inevitably leads one to the writing on what are called cognitive biases. A great deal of research exists on these systematic deviations from sound logic, which seem to occur naturally in all of us, and the subject is a rabbit hole of seemingly infinite depth once you truly grasp the chains that bind human potential. A cognitive bias is often a shortcut the brain takes to avoid information overload, and I have found it a great habit to try to identify whether a decision is being influenced by one.
Let’s take the common idiom of a glass filled halfway with water. The idea is that the description of the situation depends on the perception and worldview of the observer. An optimist would say that the glass is half full, whereas a pessimist would argue that the glass is half empty. Each focuses on a different aspect of the situation based on their predisposition, and each is both right and wrong, since they only have half of the story. A rational person, in turn, would assert that the glass is always entirely full: half of the glass contains water, or H2O – together with minerals, bacteria and other molecules – and the other half contains a mixture of gaseous substances which we simply refer to as ‘air’.
The error both the optimist and the pessimist make here is a cognitive bias known as selective perception: they only see half of the story. Like all cognitive biases, it is a behavior all people are naturally inclined towards; it cannot be eliminated – only identified and controlled – and it arises from a person’s particular frame of reference. All humans are imperfect, and none can see the entire story, but one can at least be aware of seeing only part of the story when perceiving a situation, object or problem.
A high degree of selective perception is called perceptual defence and can lead to problems such as optimism bias – or ‘wishful thinking’ – where risks are ignored: think of the smoker who believes he or she has a lower chance of illness than other smokers, or the gambler who thinks they are naturally luckier than others. It can also have the opposite effect, where people fear a higher risk of failure than they actually face.
Hostile Media Effect
Another great example of perceptual defence is the ‘Hostile Media Effect’, which refers to the tendency of individuals with a strong opinion on a subject to perceive objective media coverage as biased against their point of view. In 1982, a major study showed the same news footage to a number of pro-Israeli students and a number of pro-Palestinian students. On a number of objective measurements, both groups of students reported that the footage presented a more negative image of their side, and that a neutral observer watching it would be more likely to blame their side for the conflict. Clearly, the same information in the same video was perceived differently by each group purely through selective perception.
A low degree of selective perception, where more information is objectively taken into account, is called perceptual vigilance, and it is this behavior that a rational person would want to train, practice and drill every day.
After practicing this behavior myself for a long time, I find it leads to very interesting effects in daily life. I am much less likely to judge a person harshly on his or her behavior, because I am keenly aware that I am missing a lot of information – about this person, their background, even the way their day is going – needed to judge their behavior and actions accurately. It leads to a greater understanding of why people hold certain opinions, and I find that I can more easily excuse their irrational, natural behavior; they are simply programmed to ignore certain information. As for my own behavior, when solving a problem I always try to take into account that I might be falling into the trap of selective perception.