Cognitive Biases: Bias Blind Spot

In a Nutshell

Michael Smith introduces the topic of cognitive bias, particularly our own blind spot for this phenomenon.

At conferences I like to introduce my topics with a short story, usually involving this little character.

He’s a pipeline operator and he’s had a number of pipeline problems over the last few years. Fortunately, I always solve his latest problem by the end of my 20-minute presentation.

To make up for his distinct lack of facial features, I do occasionally give him some props. At IPC 2016, for example, he was lying in his bed worrying about pipelines. Then two years later at IPC 2018 he was standing by his desk – still worrying about pipelines. IPC 2020 is still some way off, but I plan to give him a new desk to stand by while he worries about pipelines.

So which one do you prefer? The long, thin desk A, or the shorter, fatter desk B?

[Image: desk A vs. desk B]

Well… it may surprise you to hear that both of these tabletops are drawn exactly the same size. If you do not believe me, grab a ruler and measure the sides.

This little trick is known as the “Shepard Tables” illusion, and it is a good example of cognitive bias – a systematic error in judgment that arises in human cognition. Cognitive biases can make you see, hear, think or feel things that are not quite right. That can result in bad decisions.

Perhaps the most alarming trait of cognitive biases is that you cannot shake the faulty perception, even when you know that you are wrong. The bias is a built-in feature of your brain.

More specifically, the tables illusion is a side effect of your three-dimensional perception. When we are moving through the three-dimensional world around us (like when we are driving a car or running away from a bear), this quick visual processing system is immensely useful.

The only problem is that the system can be tricked, even by something very simple like a two-dimensional drawing of a three-dimensional object. You cannot “unlearn” the way you see the world (unless you’re happy to crash your car or get mauled by the bear), so you just have to live with the possibility of being tricked.


Great question.

The pipeline industry, like many others, is undergoing a digital transformation. That means we are transitioning towards an era in which digital technologies will underpin all of our business activities. Whether it is data analysis or engineering, recruitment or marketing, more and more of what we do is facilitated by machines.

And is it any surprise? Today’s commercial computers operate at processing speeds in the GHz range. That means billions of operations every second, without ever getting bored, growing tired or asking for money. You cannot compete with that.

To rub even more salt into the wound, computers do not have cognitive biases. So when a well-programmed machine observes the Shepard Tables, it sees two identical parallelograms – and thinks you are stupid.
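For illustration, here is a minimal sketch of how a program might compare the two tabletops as plain 2-D parallelograms. The coordinates are entirely made up (the second desk is simply the first one rotated by 90 degrees), but the point stands: a machine just compares edge lengths and is never fooled by the illusion.

```python
import math

def side_lengths(vertices):
    """Edge lengths of a polygon given as an ordered list of (x, y) points."""
    n = len(vertices)
    return sorted(
        math.dist(vertices[i], vertices[(i + 1) % n])
        for i in range(n)
    )

# Hypothetical coordinates: the same parallelogram drawn twice,
# desk B rotated 90 degrees relative to desk A.
desk_a = [(0, 0), (60, 0), (80, 25), (20, 25)]
desk_b = [(0, 0), (0, 60), (-25, 80), (-25, 20)]

print(side_lengths(desk_a) == side_lengths(desk_b))  # the desks are the same size
```

No three-dimensional interpretation, no bias: identical side lengths mean identical tabletops.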

In conclusion, we should all be replaced by machines. Right?

desk question

Wrong.

Humans have plenty of qualities that today’s machines cannot reproduce: individual and collective experience, ethics, intuition, and an empathic understanding of human behavior, to name a few. We are also the ones who think creatively, design the hardware, write the algorithms and keep everything running smoothly.

So let us not give up just yet.

What I would say is that the onus is on the human workforce to maintain its relevance. That means developing competency with computing, data and statistics. But it also means fighting our cognitive biases.


OK…

To start off the proceedings, here is a quick-fire questionnaire. For each question you answer with a “yes,” you score 1 point. If the answer is “no,” you score 0 points. Answer as honestly as you can.

  • Do you pay more attention to evidence that confirms your preconceptions rather than evidence against your argument?
  • Do you see patterns in data where there are none and try to make coherent stories out of small or random datasets?
  • Do you give too much weight to your most recent memories and experiences?
  • Do you rely too heavily on a single source of information (often the first one you look at)?
  • Do you find it difficult to accept information that predicts an impending disaster?


If, like our faceless friend, you scored 0 out of 5 (or indeed anything below 5), you may be suffering from “bias blind spot.” This is the cognitive bias of underestimating the extent of your own cognitive biases! Overcoming this is the first step on the road to recovery.

In a Nutshell

What do cats that survive falling from high-rise buildings have to do with cognitive bias? Find out in this next part of our series, in which our author Michael Smith focuses specifically on survivorship bias and then links the phenomenon to pipelines and reaching zero incidents. In the previous part of this series, we introduced the illusion of causality.

ROSEN - Cognitive Biases #4

In this article, I was going to talk about survivorship bias. But in these strange and difficult times of lockdowns, social distancing and state-approved daily exercise, there’s only one possible topic of conversation.

You guessed it: it’s co…

Cognitive biases!

Oh, alright then. Let’s talk about cognitive biases.

Survivorship Bias

In 1987, the Journal of the American Veterinary Medical Association published a rather unusual article about cats falling from high-rise buildings.

Cats, the authors observed, suffered fewer injuries when they fell from higher floors. Cats falling from lower floors seemed to be less fortunate, with broken bones aplenty. Very counterintuitive.

One tragic (but hilarious) explanation was that cats relax once they reach terminal velocity, thereby absorbing the impact better. The unfortunate cat falling from a lower height would still be accelerating as it hit the ground, tensed up in a state of feline panic.

Crunch.

While this is a perfectly coherent hypothesis, the more realistic explanation proposed was survivorship bias.

Ask yourself this: Did the authors collect their data in a controlled experimental environment? Presumably not. Throwing kittens out of windows is ill-advised in a civilized society.

Instead, their data came from veterinary practices, where anxious owners had brought their ailing pets for treatment after a fall.

And what linked all of the cats that were brought to the vet?

They were ginger?

No.

Well, maybe. I don’t know.

The important link is that they all survived the fall. Deceased kitties never made it to the vet and hence were omitted from the dataset. Unsurprisingly, these were the ones that fell from the greatest heights. The dataset was therefore biased, and the conclusion was faulty.
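A quick simulation shows how this selection effect can flip a trend. Everything below is invented purely for illustration – the injury scores, the survival odds and the “relaxed cat” mechanics are toy numbers, not the 1987 data. In the full population, injuries rise with height; in the survivors-only sample that reaches the vet, high falls appear safer.

```python
import random
from statistics import mean

random.seed(1)

def simulate_fall(floor):
    """One cat falls from `floor`. Returns (injury_score, survived).

    Toy model: above floor 7 a cat reaches terminal velocity; a cat
    that relaxes lands softly, while a cat that panics is badly hurt
    and usually does not survive.
    """
    relaxed = random.random() < 0.3
    if floor <= 7:
        injury = floor * 2 + random.uniform(0, 4)   # still accelerating, tensed up
        survived = True
    elif relaxed:
        injury = random.uniform(2, 6)               # soft landing
        survived = True
    else:
        injury = 25 + random.uniform(0, 10)         # severe impact
        survived = random.random() < 0.1            # rarely survives
    return injury, survived

falls = [(floor, *simulate_fall(floor))
         for floor in range(2, 33) for _ in range(200)]
survivors = [(f, i, s) for f, i, s in falls if s]

def mean_injury(data, low, high):
    return mean(i for f, i, _ in data if low <= f <= high)

print("all cats,  floors 20-32:", round(mean_injury(falls, 20, 32), 1))
print("survivors, floors 20-32:", round(mean_injury(survivors, 20, 32), 1))
print("survivors, floors  2-7 :", round(mean_injury(survivors, 2, 7), 1))
```

The survivors-only numbers tell the vet’s story: high falls look gentler than low ones. The full dataset tells the truth, because it still contains the cats that never made it to the clinic.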

But Michael, what does this have to do with pipelines?

A lot, as usual.

As pipeline professionals, our ultimate goal is to prevent incidents, and – much like the authors of the 1987 study – we pore over historical data in an attempt to understand why they happen. So, is there anything we can learn from the cat story?

Well, in this context, the survivorship bias would be a tendency to limit the scope of our studies to pipelines that have survived (i.e. never failed). I don’t think we can be accused of that. Indeed, a huge amount of effort goes into analyzing pipeline failures, and the resulting failure statistics are regularly used as a basis for risk assessments.

There is no evidence that we overlook failed pipelines as if they were dead cats.

But here’s some food for thought: What if we do the opposite? What if we focus too much on pipeline failures and not enough on their absence?

The reverse survivorship bias.

When learning about the causes of pipeline failures, we need to pay close attention to all pipelines, whether they be pipelines that narrowly avoided failure in the past, pipelines that are likely to fail in the future, or even pipelines that are entirely healthy. The data describing and explaining their condition is all valuable for decision support. That’s why ROSEN is developing predictive models that leverage condition data from tens of thousands of in-service pipelines as well as failure statistics.

After all, what if the best way to reach zero incidents is by learning from pipelines that have had zero incidents?

Profound

I know. You heard it here first.

Until next time, take care everyone.

Last modified: Wednesday, June 17, 2020, 6:54 PM