Chernobyl is an enduring lesson for risk managers on the damning effect of a poor risk culture. It would be easy to say ‘that would never happen again,’ but in the grubby aftermath of the Hayne report, can we honestly say we have learned our lesson, asks our editor, Lauren Gow?

Everything you need to know about what could go wrong in business with a poor risk culture can be found in the final episode of Chernobyl, an HBO miniseries based on the generation-defining event in the Soviet Union in 1986.

It is considered the worst nuclear disaster in history and is one of only two nuclear energy disasters rated at seven—the maximum severity—on the International Nuclear Event Scale; the other being the Fukushima Daiichi disaster following the 2011 earthquake in Japan.

What makes the final episode so compelling is the seemingly simple ease with which this catastrophic failure occurred. Despite a number of risk mitigation strategies being in place, and despite active human intervention attempting to disrupt the course of events, the disaster still occurred, and it occurred as a direct result of a failure of risk culture.

“We will have our villains, we will have our hero, we will have our truth. But first, the trial. Perhaps then we can deal with our reactors.”

Dialogue following this line says that the state (in this case the Soviet Union, but feel free to substitute any company at this point) “will have to be forced” to make a change through the actions of a whistleblower. (Your risk hackles should be rising.)

A post-event trial ensues. We see a replica model of the three-year-old nuclear power plant. We find out fairly quickly that the three men in charge of its construction were awarded state-endorsed commendations, despite the fact the work was not complete. Worse still, the work left incomplete was a safety test on the nuclear reactor. “A test is only as good as the men carrying it out,” we are told.

Allow me to take you back to Saturday, April 26, 1986. A safety test on nuclear reactor number four has been attempted, and has failed, three times. But a fourth test must now be completed regardless, with one manager telling the team resignedly: “Someone is pushing down from above. Not that we will ever know who.” It is the first inkling that something may be wrong with the risk culture.

The safety manager is told to wait 10 hours before beginning the testing due to power restrictions in the Soviet Union at that time. “A competent manager would have cancelled the test,” we are told, but instead he goes home to sleep, vowing to come back in later and complete the testing “once and for all”. The overarching supervisor says he is also going home. “Call when it is done,” he barks. (Your risk hackles should be rising rapidly at this point.)

A shift change happens at midnight and a new, unprepared team arrives at the facility for the turbine rundown test. A junior member of staff raises concerns and tries to say: “We don’t know what we are doing,” but he is quickly silenced by a manager. That junior was only 25 years old and had been in the position for only four months. (More hackles should be rising at this point.) The junior is then informed he must perform a task he has never performed before, with senior management watching him, so, naturally, he turns to the manual for instructions.

Many of the instructions are crossed out, so his immediate supervisor calls for clarification. He is fatefully told to follow all the instructions, regardless of the corrections. The junior questions why the corrections are there if the instructions are to be followed anyway. (More hackles.)

The overarching, bullying supervisor storms in. More members of staff raise concerns and are quickly told: “Do what I tell you. Even you, as stupid as you are, can manage that.” The junior once again implores his supervisor to check but he is told: “Shut up and do your job.” (You can see where this is going.)

Testing commences. The team works through the steps slowly but their supervisor hurries them along, meaning mistakes are made and opportunities to recover are missed. The team feels threatened and bullied, the mistakes begin to pile up and panic ensues. The team wants to stop the test but the supervisor insists they continue: “Don’t tell me about rules! There are no rules.” The team refuses any further demands and is told by the supervisor that they will “never work anywhere again”.

“I knew what Dyatlov said was wrong but I knew if I didn’t do what he said, I would be fired.” 

Testing continues. The catalogue of failures begins to bite; more concerns are raised and quickly dismissed. So begins the catastrophic nuclear event that, by conservative estimates, killed 31 people, though other figures for long-term casualties from radiation exposure put the toll closer to 4,000. The disaster continues to have consequences to this day.

In the TV trial, which critics argue closely resembles actual events, Valery Legasov, a Soviet inorganic chemist and member of the Academy of Sciences of the USSR, remarks: “Our secrets and our lies are practically what define us. Every lie we tell incurs a debt to the truth,” indicating that the risk culture at the plant, and within the government echelons, was one of secrecy.

Sadly, it was only after Legasov’s death by suicide, and after audiotapes of his observations on Chernobyl were circulated amongst the scientific community, that the Soviet Union acknowledged there were fatal design flaws in the RBMK nuclear reactors.

The Chernobyl story is an enduring lesson for risk managers on the damning effect a poor risk culture can have. It would be easy to say “that would never happen again,” but in the continuing grubby aftermath of the Hayne report on banking, can we honestly say we have learned our lessons? Look around you: what do you see? I see BP Deepwater Horizon, Wells Fargo, Volkswagen, Kobe Steel, Commonwealth Bank of Australia and, most recently, NAB.

Risk culture is one of those ephemeral terms that is hard to define categorically and even harder to monitor; and yet, it is arguably the single biggest indicator of risk health in a firm. Whilst it is easy to put guidelines in place (even Chernobyl had those), what is difficult to control is what happens when push comes to shove. Are guidelines followed or ignored? Are mistakes owned up to or buried?

“Where I would once fear the cost of truth, now I only ask: ‘What is the cost of lies?’” – Legasov

Do yourself a favour and watch Chernobyl’s final episode without delay.