7 Cognitive Biases That Create Hazardous Work

Have you ever assumed, “Oh, that will never happen to me,” even though the statistics say it probably will?

Do you always take the same route home from work, even though better alternatives may exist?

Have you skipped the sunscreen because “a few hours in the sun won’t do any harm”?

If so, you have been operating under the influence of unconscious cognitive biases (specifically, ignoring the baseline, the default bias, and underestimating cumulative risk).

When we make decisions, we rely on a variety of assumptions based on previous experience, what others have told us, and what we see directly in front of us. Without these inherent biases we could not function at all, because we would be incapable of making any decisions.

Sometimes, though, our biases endanger us by leading us to act in risky ways. To be safer at work, we must be aware of our biases and act against them when necessary. This applies to management and employees alike.

Let’s consider seven cognitive biases in greater detail and look at how each one can contribute to hazardous work.

#1: Overconfidence Bias

Nobody believes they are overconfident, but almost everyone is. Simply put, most people believe they are more agile, more intelligent, and more capable at most tasks than they actually are.

For instance, do you consider yourself an above-average or a below-average driver? Fully 93 percent of respondents say they are above average, and by definition nearly half of them must be wrong.

In the workplace, overconfidence shows up in a variety of ways: skipping safety procedures, assuming we can work as safely at 4 p.m. as we did at 10 a.m., ignoring the fact that the floor is understaffed that day, failing to ask for help when we need it, and assuming we understand how a machine behaves under all conditions.

In short, overconfidence causes us to disregard best practices that we know we should follow.

The worst part about overconfidence is that you keep receiving positive reinforcement right up until disaster strikes. If you perform an unsafe act 500 times without being injured, you begin to forget that it is unsafe. Then comes number 501.
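
A quick back-of-the-envelope illustration (the injury rate here is hypothetical, chosen only to show the arithmetic): if a shortcut carries a 0.5 percent chance of injury each time, the odds of getting away with it 500 times in a row are 0.995^500, or roughly 8 percent. That long streak of positive reinforcement is not evidence of safety; statistically, it is a run of luck that is overdue to end.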

#2: Blind Spots

A worker trims the surplus material from an extruded plastic part using a knife. Her focus is on her knife, the part she is holding, and the line’s speed. She reaches down to grasp a clamp without looking, unaware that another worker has left an open knife on the table.

When she cuts herself, she has fallen victim to a cognitive bias: the assumption that nothing in her environment has changed to endanger her. No worker has ever left a knife there before, and her familiarity with the area has created a blind spot.

Blind spots leave us exposed, and they can be created by a variety of causes. For instance, prominent threats can create blind spots for less obvious dangers. Suppose a factory has a metal-stamping machine that has crushed workers’ hands in the past, causing horrific injuries. When working with this machine, everyone keeps a close eye on the stamper. Meanwhile, an automatic arm clears the path for the next piece of metal to be stamped. It has exposed gears that could catch and mangle a finger, although no one has ever been hurt by them. This automatic arm is a likely blind spot, a risk that goes unnoticed because it is overshadowed by the obvious danger.

#3: Confirmation Bias

As defined by Scott Plous in The Psychology of Judgment and Decision Making, confirmation bias refers to the inherent human inclination to “search for, interpret, favour, and recall information that confirms our pre-existing beliefs or hypotheses.”

In other words, we frequently see only what we anticipate.

Safety expert Thomas Krause recounts a case in which a thirty-year-old miner died when the roof of a tunnel collapsed. The collapse occurred immediately after he and a similarly experienced foreman had assessed the tunnel for structural problems and found none. The investigation that followed discovered 137 missing bolts, as well as clearly weakened roof boards.

How could the foreman and the miner have missed these issues? The reason was straightforward: they had previously inspected other tunnels in this mine without finding any problems, so they did not expect to find any in this one. With confirmation bias, not only do we see what we expect to see; our brains actively filter out anything that contradicts our preconceptions.

#4: Disregard for the Baseline

We all tend to believe that our individual circumstances are unique, and we overlook the baseline statistics that govern our activities, such as collision rates.

This cognitive bias is known as “ignoring the baseline.” Often we do not merely disregard the baseline; we actively deny it.

Even when workers learn that their coworkers have developed serious ailments as a result of, say, handling fibreglass insulation without gloves, they continue to do so, trusting that it will never happen to them.

Anyone, including you and your employees, is susceptible to accidents and injuries.

#5: Default Bias

When given a choice, we tend to choose the defaults: not only is it easier and faster, but we also assume that the defaults are the safest bet.

That is, when a worker approaches a task or a machine, they will look for the defaults, regardless of whether someone has explained the alternatives.

Because of this bias, we must take great care in establishing what the default should be. For instance, if your workplace stocks five different types of safety gloves, you must make it very obvious which ones are the default for each type of task. That might mean a large picture of the default gloves next to a particular machine, or even a rack holding those gloves mounted beside it.

#6: Neglecting to Consider Cumulative Risk

Humans tend to grossly underestimate cumulative risk: the exposures that hurt them gradually, over time.

Everyone is familiar with the legend of the frog in boiling water. Drop a frog into a saucepan of boiling water and it will safely hop out. But place the unsuspecting amphibian in cold water and heat it slowly, and it will fail to perceive the danger and die. Regrettably, this parable applies again and again to workplace safety.

Every day, people come into contact with “only a tiny bit” of a hazardous substance, and over time they develop skin disorders, neurological problems, or cancer.

Workers use vibrating tools that cause neurological harm over time, without realising it until it is too late.

Gloves wear out and develop holes, but people keep using them, even though gloves in that condition would have been rejected outright if they had been issued on day one.
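
The arithmetic of “only a tiny bit” is worth spelling out (with purely illustrative numbers): fifteen minutes a day with a vibrating tool sounds trivial, but at five days a week it adds up to roughly 65 hours of exposure a year, and about 650 hours over a decade. Small doses accumulate into large ones.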

#7: Recency and Availability Bias

The recency bias, a close cousin of the “availability bias,” is a characteristic of the human mind that causes us to prioritise recent, heavily publicised events, giving them more weight than they deserve.

The recency and availability biases mean that we are constantly hunting for answers in the immediate past rather than anticipating what might happen next.

If someone loses a finger while operating a harvester on a farm, everyone will be extra cautious for a while. New gloves, guardrails, and protocols will be implemented around harvesters. Meanwhile, workers apply pesticides with their bare hands, fiddle with open tractor engines while the tractors are running, and operate power takeoff flywheels without training.

While the lost finger is tragic, it may be an isolated incident rather than a threat that many workers face daily. Indeed, the intense focus on this one recent event can obscure other hazards, because no comprehensive risk assessment is being conducted.

IMPROVE YOUR ABILITY TO CONTROL YOUR BIASES

The majority of these cognitive biases are unconscious. They occur spontaneously and without thought. However, this does not mean we are defenceless against them.

With deliberate thought, we can overcome our biases. Becoming conscious of these pervasive tendencies is the first step in combating them.

When it comes to safety, we must pause, consider our biases, and determine whether those biases are contributing to harmful behaviour. We can then take appropriate measures to provide a safer work environment.

