How do you manage a tragedy? Former NASA risk manager Mike Lutomski spoke to StrategicRISK about the lessons he learnt from the Space Shuttle Columbia disaster in 2003

Space is an unforgiving environment and losses are inevitable: faced with myriad risks and the limits of human imagination, there are limits to what gets prioritised and tested.

During his career at NASA, Mike Lutomski was the risk manager for the International Space Station (ISS) for a decade, from 2003 to 2013.

Lutomski’s current role is as the director of system safety and reliability for SpaceX.

He talked to StrategicRISK in May while in Singapore to address the Risk Forum APAC 2018 event.

Lutomski spoke about lessons learned from one of NASA’s most tragic losses in space: the loss of the Space Shuttle Columbia in 2003.

At NASA, the Challenger and Columbia shuttles both suffered catastrophic failures that cost lives. The disasters offer management and organisational lessons as well as technical ones.

“Both had cultural and organisational lessons. Hindsight is 20:20, but they were both preventable,” said Lutomski.

The Columbia tragedy in 2003 was largely the result of technical failings, he explained, but also due to a failure of imagination.

The thermal protection system (TPS) of the Columbia shuttle had been struck by foam debris shed from the insulated external tank during launch, but the risk had been underestimated.

“We just didn’t understand the risk,” said Lutomski.

“What had been happening was for decades the shuttle had been hit by foam shedding debris from the external tanks and the shuttle would lose a few tiles,” he explained.

“We were treating the damage to the shuttle tiles as an operational concern of how many tiles to replace between flights, five or 25, as an inconvenience.”

During the Columbia mission, foam debris from the external tank was suspected to have hit the shuttle during launch, but any damage was thought to be insignificant.

“Even after it happened, and the engineers studied the videos of the orange foam being shed, we never thought it would be able to penetrate the solid reinforced carbon-carbon leading edge of the Orbiter’s wing,” said Lutomski.

The shuttle disintegrated on re-entry into the Earth’s atmosphere on 1 February 2003, with the loss of all seven crew.

“It had been a successful mission, and we didn’t know the damage was already done,” he said. “They went on with their mission, and nobody knew that they were going to plummet to their deaths. It was another sobering and terrible moment for the NASA family and the nation.”

[Image: Earth from space. Source: WikiCommons]

Always question your assumptions

“All serious accidents in our history are due to the inability to recognise the risks. Our knowledge is always imperfect – you can’t know everything,” said Lutomski.

In the case of the Columbia accident, which occurred while he was working on the ISS, Lutomski said NASA’s engineers had requested imagery of the TPS to assess damage from the debris hitting it.

The engineers were denied the request twice – the first through the normal management chain, and a second time when they tried to circumvent it.

That was despite the engineers running mathematical calculations suggesting that a few pounds of foam would strike the shuttle at hundreds of miles per hour.

“Management denied that request because it was seen as a distracting waste of time and money. They didn’t think it was possible for a light piece of foam to cause significant damage,” he said.

“It just wasn’t part of the normal thinking. Far-out ideas tend to be outside the cultural norm. On the one hand you have groupthink, and on the other you have infinite risks,” he added.

When implementing risk management in an organisation, it is important to embed the process in the existing organisational structures, he stressed; otherwise it gets seen as an unwanted cost or overhead rather than an essential component.

“It can’t be an add-on to the existing organisation. You can only get people participating when they see value,” he said.

The burden of proof should always be on proving that something is safe or an acceptable risk, he noted.

Disasters often happen because of the limits of human imagination, which makes continuously keeping an open mind a critical quality of good risk management.

In the case of the Columbia shuttle disaster, NASA was more worried about risks to the belly of the shuttle, rather than the leading edge of its wings – which was where the fatal damage occurred.

“The space shuttle flying up and down has gotten routine to us, but it was always a high-risk operation, and we didn’t understand it as well as we thought we did,” Lutomski said.

“As engineers and as risk managers we have to preserve ignorance, and our openness to new information,” he continued.

“The lesson is to be vigilant, to stay humble, and to always question your assumptions,” Lutomski added.
