
The danger of blindly trusting the data

There’s no shortage of data these days: technology allows us to measure and track just about everything. But data without insight can be dangerous.

It was just after midnight on 26 September 1983. Stanislav Petrov poured himself another cup of coffee and sat back in his chair in front of a series of bulky computer screens. He was preparing for another night on duty in the Serpukhov-15 bunker just outside Moscow. His responsibilities included monitoring the early warning satellite system that would alert the Soviet Union to an impending nuclear missile attack.

Relations between the US and the Soviet Union were tense. Earlier that month, the Soviet military had shot down a South Korean passenger jet that had strayed into Soviet airspace. All 269 people on board were killed, including US Congressman Larry McDonald and many other Americans.

That year, the Americans had continued to probe Soviet defences. NATO and the US had conducted naval exercises in the Barents, Norwegian, Black and Baltic seas, as well as the Greenland-Iceland-UK gap. US bombers would fly directly towards Soviet airspace and pull away at the last moment.

Dr William Schneider, former undersecretary of state for military assistance and technology, said: ‘It really got to them… they didn’t know what it all meant. A squadron would fly straight at Soviet airspace, and other radars would light up and units would go on alert. Then at the last minute the squadron would peel off and return home.’

Back in his bunker, Stanislav Petrov glanced up at the screens in front of him. Suddenly, the early warning system went off, triggering alarms throughout the control room. Petrov’s team looked nervously at him. Was this it? Was this the first strike being launched by the Americans? Petrov investigated the alarm and found that it had been triggered by a single missile launched from the US.

His orders were to pass this information up to his superior officers in Moscow, but he didn’t; he just waited. The other airmen and officers looked at Petrov. Why wasn’t he following his orders? This wasn’t how things were done in the Soviet Union at the time. Not following orders could mean a one-way ticket to Siberia. What the hell was he doing?

Shortly afterwards, another four missiles triggered the system. The men in the bunker waited anxiously for the first one to arrive… it never came, and neither did the four that followed. It was a false alarm, an innocuous computer error.

Why hadn’t Petrov told his superiors what was going on?

When explaining his decision not to alert his chain of command, Petrov stated that if the US had chosen to strike first, they would have fired everything in one coordinated wave. One missile, followed by four, didn’t seem logical based on what he knew about their capability and likely strategy. Petrov also knew that the system was relatively new, and he didn’t fully trust it yet. Ground radar had also failed to pick up any corroborating evidence, even after the first alarm.

Petrov’s decision was the right one, but only with the clarity of hindsight. He would have been under tremendous pressure to inform his superiors of what was going on, but he knew that if he did, there was a significant chance of starting a nuclear war. Petrov chose not to follow the data but to trust his intuition, which told him that this wasn’t a US first strike. In doing so, he arguably steered the world away from nuclear war and WW3.

What can we learn from this story?

The internet and the computer age have given us the ability to gather enormous amounts of data, and it’s often said that better data leads to better decisions, which is largely true. But it’s important to think about what data you’re gathering, how it is being used and the decisions that are being made as a result.

Petrov considered the data in front of him but, instead of blindly following orders, assessed it in the context of the situation. He decided that, based on everything else he knew about the US, this was unlikely to be a first strike. Five missiles would not have destroyed the Soviet Union; they would have destroyed several cities and killed many people, but they would have left the Soviets with the ability to respond. In a nuclear war, you have to obliterate your enemy quickly and completely or you risk them responding and destroying you in return.

The quality of someone’s life can often come down to the quality of their decision-making. That is true for individuals and organisations. A track record of good decision-making leads to better outcomes over the long term. Even bad decisions can lead to good outcomes, provided the appropriate lessons are learnt from them.

Petrov’s story reminds us that data needs to be placed into context with the other information we know and understand. It’s a reminder that our intuition, that feeling we have in our gut, is not something to be ignored. 

When he was the Commanding Officer of 42 Commando, Charlie Stickland told me to ’trust my spider sense. If something feels wrong, it is wrong’. 

Sometimes in life we are forced to make decisions without data. For example, we have no reason not to trust the person who knocks on the door late at night, but something tells us it doesn’t feel quite right, so we put the chain on the door. We have no evidence to suggest that this individual presents a threat, and they probably don’t, but we err on the side of caution and adjust our behaviour accordingly.

Data is valuable, but good decision-making requires you to be aware of its limitations and to learn to listen to your intuition. If Petrov hadn’t listened to his gut, you might not be able to read this now.