During WW2, London was heavily bombed by the Germans.
One of their weapons was the V-1 ‘flying bomb’ — not a rocket, strictly speaking, but an early cruise missile powered by a pulse-jet engine.
Work on it began for the Luftwaffe in 1939, but the weapon didn’t enter service until June 1944, near the end of the war. The British nicknamed it the ‘doodlebug’ because of the distinctive buzzing sound of its pulse-jet engine as it approached.
At its peak, more than one hundred V-1 bombs were launched at Britain every day. Many of them targeted the country’s capital and caused devastation and loss of life on the ground.
Terrified Londoners began to plot the strikes on a map and soon discovered what they believed to be a distinct pattern. This, in turn, gave rise to theories about which parts of the city were the safest and which were at the most risk.
After the war had ended, statistical analysis revealed a very different picture. In a well-known 1946 study, the actuary R. D. Clarke showed that the impact sites were consistent with a purely random (Poisson) distribution.
The V-1’s gyrocompass guidance system was extremely rudimentary and therefore not particularly accurate. Where the bombs were planned to land and where they actually landed were two different things.
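The post-war finding is easy to reproduce with a quick simulation. The sketch below (my own illustration, with cell and impact counts loosely modelled on Clarke’s published figures of 537 impacts over 576 map squares) scatters impacts uniformly at random over a grid and compares the number of cells receiving 0, 1, 2… hits against what a Poisson model predicts. Even under pure randomness, some cells collect several hits while neighbouring cells get none — exactly the ‘clusters’ and ‘safe zones’ Londoners thought they saw.

```python
import random
from collections import Counter
from math import exp, factorial

random.seed(42)

# Illustrative numbers, loosely based on Clarke's 1946 analysis:
# 537 impacts scattered over a 24 x 24 grid of 576 cells.
N_IMPACTS = 537
GRID = 24
N_CELLS = GRID * GRID

# Drop each impact into a uniformly random cell.
cells = [random.randrange(N_CELLS) for _ in range(N_IMPACTS)]
hits_per_cell = Counter(cells)

# Observed: how many cells received exactly k impacts?
observed = Counter(hits_per_cell.values())
observed[0] = N_CELLS - len(hits_per_cell)  # cells never hit

# Expected cell counts under a Poisson model with rate lam = impacts per cell.
lam = N_IMPACTS / N_CELLS

def poisson_expected(k):
    """Expected number of cells with exactly k hits under the Poisson model."""
    return N_CELLS * exp(-lam) * lam ** k / factorial(k)

for k in range(5):
    print(f"{k} hits: observed {observed[k]:3d}  expected {poisson_expected(k):6.1f}")
```

The observed and expected columns track each other closely, which is precisely what Clarke found for the real impact map: the data looked ‘clustered’ to the eye, yet matched a random process almost perfectly.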
The frightened city dwellers’ attempts to infer meaning from the impact sites are a classic example of ‘narrative fallacy’ in action: the human tendency to find patterns or meaning in things even when they aren’t really there.
Why do we do this?
In basic terms, we have evolved to favour order and predictability. This makes sense: order and predictability mean safety, and therefore a better chance of survival.
As a result, we will do almost anything to avoid uncertainty, even if that means making up a story to explain away events that are entirely random. As the social psychologist Timothy D. Wilson puts it,
“People are masterful spin doctors, rationalisers, and justifiers of threatening information and go to great lengths to maintain a sense of well-being.”
This evolutionary quirk makes us highly susceptible to persuasive stories regardless of their authenticity, which explains how seemingly smart individuals can be co-opted into cults. It also explains why some people are superstitious and why we love things like weather forecasts, despite their frequent inaccuracy.
This readiness to explain away random occurrences as having a ‘story’ behind them stems from minds that are primed to see more ‘cause and effect’ in the world than actually exists.