Is it not astonishing that some safety leaders still refer to “common sense” as a part of safety compliance as though it exists?
To assume that any level of safety compliance hinges on common sense is an irresponsible way to control risk. I often hear this absurd terminology in safety discourses, especially when people say, “safety is common sense” or “common sense should tell you not to put yourself in the line of fire.”
And guess what? If it existed, you really wouldn’t need to ask for it. Imagine how absurd this argument would sound to a jury if a person lost his life for being in the line of fire. Can an employer escape liability by blaming the injured party’s lack of common sense?
Your guess is as good as mine. The employer will still be charged with negligence, or at best with contributory negligence if both parties were culpable, and the burden of proof will be to show beyond reasonable doubt that risk was not controlled to a level that is as low as reasonably practicable (ALARP).
In this post, I will dive into why we cannot rely on common sense as a criterion for influencing behavior and driving safety compliance, especially since common sense is not common to everyone and, for many reasons, does not exist as a standard criterion in the safety business.
The 3 Main Types of Risk
Before we dive into that, let us first understand the 3 main types of risk:
- Directly perceived risks. These are risks that are easy to identify and perceive. For example, when a child attempts to cross a street, every adult knows it is dangerous and will intervene to help the child cross safely.
- Risks perceived through science. These include risks from exposure to physical, chemical, biological, mechanical and non-mechanical hazards. They are proven by science, with established methodologies for risk identification, assessment and implementation of controls.
- Virtual risks. These relate to everything imaginary and not yet proven by science, and hence better explained by theological speculation. They are products of the imagination, guided by belief, conviction, and superstition. For example, in Ghana, fishermen don’t go to sea on Tuesdays because they believe Tuesdays are sacred and fishing on such days could cause their boats to capsize.
It is tempting to assume that easily perceived risks are common sense, or common to all, but it gets dicey when human factors come into play.
Studies into human factors show that we have cognitive biases and behavioral attributes that make us vulnerable to human errors.
A person who is expected to easily perceive risk could do otherwise because of some of these cognitive biases and behavioral attributes.
Have you ever found yourself in a situation where you acted completely foolishly? That is the effect of a cognitive bias.
The Cognitive Biases that Dispel the Common Sense Argument
Here are a few cognitive biases that show why we cannot assume common sense to drive safety compliance or to justify negligence:
- Overconfidence – this is when people believe that nothing bad can happen to them because they trust too much in their personal ability. They believe they are superhuman and tend to forget the dynamic hazards and risks from their work equipment and work environment. This false sense of safety in their own ability prevents them from easily perceiving risks.
- Complacency – this is when we build blind spots to risks due to the nature of our brain. Whenever we focus on a task or a goal, our peripheral vision naturally builds blind spots to the things we are not focused on. That is simply how the human brain works, and it is the risk of being focused; in extreme scenarios, it becomes tunnel vision.
- Automatism – this is when we work on autopilot without adjusting our behavior to changes at the workplace, due to the repetitive nature of a job. This comes as no surprise given the nature of the human mind. We have the conscious mind and the subconscious mind. The conscious mind is limited in capacity and used for complex tasks we deliberately want to solve, while the subconscious mind, the far larger part, handles reflexes and automatism. When we forget to engage our minds, the activity we are carrying out is pushed to the subconscious mind, and that is dangerous because we are not alert.
- Optimism Bias – this is when a person believes that things will always go right, without thinking about what can go wrong or planning for it. Such persons are highly optimistic about work activities, do not plan for emergencies, and are not prepared to adjust their behavior when things do not go according to plan. This cognitive bias, also known as wishful thinking, affects how they perceive and treat risks.
- Negative Group Thinking – this happens when people subconsciously conform to a negative group belief, assuming a group identity until the person is isolated from the group. For example, a person in a group may conform to the shared belief that his PPE is too uncomfortable to wear for a critical job, but when isolated and questioned about the same unsafe act, he accepts the right approach and complies individually, away from the group. This goes to show how important a shared commitment to safety is in building the right safety culture in an organization.
- Risk Homeostasis – this is a misleading sense of safety at the workplace, where a few good practices make us lose sight of the impact of the bad ones. For example, housekeeping at the work area, the most visible sign of a safe work environment, may be good, yet planned maintenance of equipment is not being followed and everybody loses sight of it, or the secondary retention on fixings is worn out and nobody is aware. There could be a mountain of unsafe conditions hidden under the cover.
- Halo Effect – this is following other people’s unsafe practices, especially by people who are less experienced, lack safety awareness, or are new to the workplace or to a piece of equipment. Such persons learn the bad practices of others by default, especially those of senior colleagues, without knowing they are wrong.
- Law of Least Effort – this is when we take the easiest approach to a job, which puts us in a comfort zone whether it is safe or unsafe. In reality, the easiest approach is rarely the safest. The human brain is lazy by default and likes to be comfortable; it takes the path of least resistance when faced with a challenge. This is why we like to take shortcuts: it is the easy thing to do.
- Bystander Effect – this is observing unsafe acts and conditions without any sense of responsibility to intervene and correct them. When people conform to negative group thinking, are carefree, or have normalized risks, they tend to become bystanders to unsafe acts and conditions. This behavioral attribute influences how people perceive risks.
- Obedience to Authority – this is when a person is afraid to speak up against a superior who is acting unsafely, whether by pushing the job and his crew, requesting his crew to use uncertified equipment, or taking a shortcut to do the job. The ability to speak up and exercise stop-work authority is hampered by this bias.
- Decision Fatigue – this is the brain becoming overloaded with information or stressed from a lack of recovery time, leading to a high chance of making mistakes. When we do not rest properly to recover from the previous day’s work, the brain gets tired, we forget easily, and we cannot stay alert. We cannot appeal to common sense when a person in this state commits an unsafe act.
- Priming Effect – this is when exposure to one stimulus influences how we respond to a subsequent stimulus. For example, witnessing a traumatic event can affect how you respond to the signs and symptoms associated with that event in another setting. This can drive extreme positive or negative behaviors which common sense cannot explain.
- Cognitive Dissonance – this is when some people naturally do not identify with the goals, beliefs, and values that an organization promotes. They simply do not subscribe, because of their different views, opinions, goals, and values; they are in dissonance with the organization’s set goals. We cannot use common sense to explain why they will not subscribe.
So before you assume that safety is common sense, take note that the biases above are simply what make us human and prone to human error. As a safety leader, you cannot expect perfection from others by assuming that common sense is common to all and will drive good safety behaviors. You need to implement a robust management system, including systems that are not 100% reliant on humans, to reduce risk to ALARP.
Do you now agree that it is absolutely absurd to say safety is common sense? Share your thoughts with Sel.