What if Common Sense is Not All That Common?


Artwork: Gabriel Lemonnier

People don’t just receive external information; they also process it and become architects of their own social environment.


– Markus & Zajonc (1985)

Imagine yourself in a new environment full of people you’ve never met.

For most of us, this is a stressful situation. Interacting with people without knowing how they see and interpret the world carries a fair degree of uncertainty, so we become vigilant.

When we encounter new information, we do a few things. We interpret that information, construct theories around it, and eventually make judgments based on prior experience as well as our own opinions of how others are likely to think.

Most of this happens unconsciously and automatically. This process has become the foundation of the study of social cognition.

When thinking about the world around us, we are highly influenced by our heuristics and the way in which we are socialised. One example of this is the looking-glass self, a theory popular in psychology since the early 1900s. It holds that each of us derives our identity from how we perceive others to see us. This is made possible by Theory of Mind, a cognitive process often argued to be unique to humans.

Simply speaking: we change how we behave based on the people and environment around us.

Researchers Yeung and Martin present this theory as a three-stage process:

  1. Thinking about how we appear to others
  2. Reacting to their judgment of our appearance
  3. Modifying our behaviour in response to those judgments

The second stage is the most interesting and least understood, partly because we can never be certain of how others perceive us. We simply make best guesses based on the context.

However, these guesses are often less than accurate. Humans are notorious cognitive misers, biased towards choosing the path of least resistance when it comes to forming judgments and opinions.

In classical psychology, the model of the naïve scientist was one way of examining these cognitive shortcomings. The naïve scientist, when thinking about the world, seeks consistency and a sense of control over the environment.

Any new situation is interpreted through the lens of a scientist (albeit a naïve one) based on observations and common sense.

But ‘common sense’ is actually pretty subjective. If a situation does not fit our internal model of the world, it creates dissonance, and as recent research has shown, we engage in all manner of cognitive leapfrogging in order to reduce it.

Research into the naïve scientist theory has revealed a wealth of biases and heuristics that we employ daily, and often unconsciously, to circumvent unfamiliar situations.

The argument over how many of these biases are evolutionary is ongoing, but there is solid evidence that many developed to reduce ‘cognitive load’ and generally make our lives easier. While they may have served us well in our hunter-gatherer days, some are less than beneficial in the modern world.

Think of pattern-seeking, or categorisation heuristics. We have evolved over millennia to see patterns in the environment, as recognising them has proved far more beneficial than not over our evolutionary lifespan.

Even today, it does less harm to see a threat where there is none than to miss a threat that is really there.

For instance, when you hear a loud sound behind you, you instinctively turn to see what it is. The more it repeats, however, the more you become desensitised to it, having judged it non-threatening. Without this heuristic, we would be constantly on guard even in a harmless environment.

But categorisation heuristics also mean we see patterns that are not there, or group things into narrow categories even when broader ones would be more logical. We do this for ease of recall: the more categories we attach a concept to, the more cues we have for retrieving it later. This is what’s known as the nodal network model of memory.
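To make that idea concrete, here is a toy sketch of such a network in Python. The concepts, categories, and the `recall` helper are all hypothetical, invented purely to illustrate how attaching a concept to more category nodes gives it more retrieval cues; it is not a model of how memory is actually implemented.

```python
# Toy sketch of an associative ("nodal") network of memory: the more
# category nodes a concept is linked to, the more cues can trigger its
# recall. All concepts and categories below are made up for illustration.

from collections import defaultdict

# Map each concept to the category nodes it is attached to.
concept_links = {
    "espresso": {"drink", "caffeine", "morning routine", "Italy"},
    "decaf":    {"drink"},
}

# Build the reverse index: which concepts does each cue activate?
cue_index = defaultdict(set)
for concept, categories in concept_links.items():
    for category in categories:
        cue_index[category].add(concept)

def recall(cue: str) -> set[str]:
    """Return every concept this cue can activate."""
    return cue_index.get(cue, set())

# "espresso" is linked to four category nodes, so four different cues
# can bring it to mind; "decaf" is linked to only one, so only one can.
for cue in ["drink", "Italy", "morning routine"]:
    print(cue, "->", recall(cue))
```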

The conjunction fallacy is a great example of a categorisation heuristic. Consider the following problem, put forward by psychologists Tversky & Kahneman:

“Linda is 31 years old, single, outspoken, and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in anti-nuclear demonstrations. Which is more probable?

  1. Linda is a bank teller.
  2. Linda is a bank teller and is active in the feminist movement.”

Because of our need to categorise, option 2 seems to make sense. She is interested in social justice, so why wouldn’t she be a feminist?

Option 1 is actually correct, simply because there are more bank tellers in the world than there are bank tellers who are also feminists: the conjunction of two conditions can never be more probable than either condition on its own. Here, our preconception of how ‘representative’ Linda is of each option interferes with our ability to categorise rationally.
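A minimal sketch in Python makes the subset relationship explicit. The population size and proportions below are invented for illustration only; they are not figures from Tversky & Kahneman’s study, and any numbers would give the same ordering.

```python
# Hypothetical numbers illustrating the conjunction rule behind the
# Linda problem: a conjunction can never be more probable than either
# of its parts.

population = 100_000

bank_tellers = 1_000                  # hypothetical count of bank tellers
feminist_rate_among_tellers = 0.30    # hypothetical share who are also feminists

feminist_bank_tellers = int(bank_tellers * feminist_rate_among_tellers)

p_teller = bank_tellers / population
p_teller_and_feminist = feminist_bank_tellers / population

print(f"P(bank teller)              = {p_teller:.4f}")
print(f"P(bank teller AND feminist) = {p_teller_and_feminist:.4f}")

# Every feminist bank teller is, by definition, also a bank teller, so
# the conjunction describes a subset of the single category and its
# probability can never be larger.
assert p_teller_and_feminist <= p_teller
```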

Every interpretation we make serves a common goal: to support what we already believe in a way that makes sense to us. If we didn’t do this, new information would quickly become overwhelming and mentally fatiguing.

This is why we actively seek to reduce the dissonance that occurs when things don’t make sense.

What makes sense to one person will not necessarily make sense to the next. There is no strategic guidebook on common sense, which is why the study of social cognition is so important.

Only universally understood concepts and language can break through the subjectivity.

Thoughts?

By Adam