Narrative Fallacy

During WW2, London was heavily bombed by the Germans.

One of their methods of attack involved something called the V-1, or ‘flying bomb.’

The Luftwaffe first started work on this early type of cruise missile in 1939, but it didn’t enter service until near the end of the war, in 1944. The British nicknamed it the ‘doodlebug’ because of the buzzing sound of its pulsejet engine as it approached.

At its peak, more than one hundred V-1 bombs were launched at Britain every day. Many of them targeted the country’s capital and caused devastation and loss of life on the ground.

Terrified Londoners began to plot the strikes on a map and soon discovered what they believed to be a distinct pattern. This, in turn, gave rise to theories about which parts of the city were the safest and which were at the most risk.

After the war had ended, statistical analysis revealed a very different picture. The impact sites were distributed completely at random.

The reason?

The V-1’s gyrocompass guidance system was extremely rudimentary and therefore not particularly accurate. Where the bombs were planned to land and where they actually landed were two different things.
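
The postwar check is easy to recreate: statisticians famously compared the number of hits per map square against a Poisson distribution, which describes purely random scatter. Here is a minimal sketch of that idea (the grid size and hit count below are illustrative, not the historical figures):

```python
import random
from collections import Counter
from math import exp, factorial

random.seed(42)

GRID = 24   # a 24 x 24 grid of equal squares (illustrative, not the real map)
HITS = 537  # number of simulated impacts (illustrative)

# Scatter impacts uniformly at random and count how many land in each square.
cells = Counter((random.randrange(GRID), random.randrange(GRID)) for _ in range(HITS))
counts = [cells.get((x, y), 0) for x in range(GRID) for y in range(GRID)]

# Observed: how many squares received 0, 1, 2, ... hits.
observed = Counter(counts)

# Expected under pure randomness: a Poisson distribution whose mean is the
# average number of hits per square.
mean = HITS / GRID ** 2
expected = {k: GRID ** 2 * exp(-mean) * mean ** k / factorial(k) for k in range(6)}

for k in range(6):
    print(f"{k} hits: observed {observed.get(k, 0):3d}, expected {expected[k]:6.1f}")

# Random scatter still leaves some squares with several hits next to squares
# with none -- exactly the "clusters" the Londoners thought they saw.
print("max hits in a single square:", max(counts))
```

Even though every point is placed independently at random, some squares collect multiple hits while their neighbours get none, which is all the ‘pattern’ there ever was.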

The frightened city dwellers’ attempts to infer information from the impact sites are a classic example of ‘narrative fallacy’ in action. This is the human tendency to seek patterns or meaning in things even when they aren’t really there.

Why do we do this?

In basic terms, we have evolved to favour order and predictability. This makes sense: order and predictability equal safety, which in turn increases our likelihood of survival.

As a result, we will do anything to avoid uncertainty. Even if this means making up a story to explain away events that are entirely random. As the social psychologist Timothy D. Wilson says,

“People are masterful spin doctors, rationalisers, and justifiers of threatening information and go to great lengths to maintain a sense of well-being.”

This evolutionary quirk makes us highly susceptible to persuasive stories regardless of their authenticity, which explains how seemingly smart individuals can be co-opted into cults. It also explains why some people are superstitious and why we love things like weather forecasts, despite their frequent inaccuracy.

We are so ready to explain away random occurrences with a ‘story’ because our minds are primed to create more ‘cause and effect’ scenarios than actually exist in reality.

For a more detailed look at narrative fallacies, we recommend the excellent books by Nassim Taleb, The Black Swan and Fooled by Randomness.


Original Source

Our Top 5 Books On Problem Solving

Problems. Let’s face it, we all have them.

And we’d all like fewer of them.

Yet, we are never taught how to solve them effectively at school.

Bizarre.

As a result, problem solving is a skill that is in high demand. Whether it’s in your current job or at home, your life is guaranteed to become a lot easier if you can get better at it.

The good news is it isn’t a talent limited to the lucky few.

It’s actually a skill and habit you can learn and here are five books to set you off on your merry way:

1. Problem Solving 101 by Ken Watanabe

Originally written to help Japanese schoolchildren learn how to be better problem solvers, this book ended up as the country’s best-selling business book of 2007.

Watanabe uses three fun and simple-to-follow case studies to illustrate various practical tools and methods you can start using straight away. As its name implies, Problem Solving 101 is a short, easy read that offers a good introduction to the craft.

2. Seeking Wisdom: From Darwin to Munger by Peter Bevelin

Peter Bevelin has done us a great favour by gathering together the practical wisdom of some of the world’s greatest minds and putting it all in one book. It covers everything from the way our minds evolved to the psychology of misjudgement and how we can become better thinkers.

As well as trawling the history books for timeless insights from distinguished thinkers like Confucius, Richard Feynman and Michel de Montaigne, Bevelin has also consulted contemporary minds like the billionaire investor Warren Buffett and his partner Charlie Munger.

It’s the type of book that you should probably return to every once in a while to keep your problem solving skills razor sharp.

3. The Art of Thinking Clearly by Rolf Dobelli

One of the greatest challenges we face when solving problems is our own mind. We are prone to many cognitive biases; researchers have catalogued more than 180 of them. These ‘impediments’ of thought lead us to think irrationally or illogically, which makes us less effective.

Dobelli’s book references a number of fascinating real world examples of how the most common biases we suffer from impact our thinking. Just being aware of them will make you a better problem solver by helping you to recognise and avoid your own blind spots.  

4. One Step Ahead: Notes from the Problem Solving Unit by Stevyn Colgan

Stevyn Colgan spent thirty years in the Metropolitan Police. Twelve of those were spent as part of Scotland Yard’s award-winning Problem Solving Unit, a specialist team with an extraordinary brief: to solve problems of crime and disorder that were unresponsive to traditional policing methods. His book shares some amazing true stories of problem solving in action and is a joy to read.

It’s one of the most interesting books we’ve read in recent memory and will equip you with endless fodder for dinner party conversations.

5. Mastermind: How to Think Like Sherlock Holmes by Maria Konnikova

Konnikova has written a brilliant and very thorough book that examines the mind and methods of Sir Arthur Conan Doyle’s most famous character. It’s a little dense at times, but it’s worth persevering with for all the brilliant nuggets contained within.

Holmes is one of the world’s most proficient problem solvers and Konnikova highlights the key characteristics that make him so effective. Highly recommended.


Can’t wait for the books to arrive? Then check out our ‘Think Like Sherlock’ online master class taught by a former Scotland Yard detective. You won’t regret it. 

Original Source

Snake care when setting a bounty

When problem solving it’s important to treat the cause and not the effect. 

The Indian cobra is a highly venomous species that often lives close to urban areas.

During British rule, the Government became concerned by the number of these deadly snakes roaming the streets of Delhi, so they devised a cunning plan to resolve the situation.

A fixed bounty was offered for each snake caught and killed.

Initially, this proposal worked a treat and the number of reptiles quickly began to fall.

Cue lots of patting on backs in the corridors of power. 

But then something odd happened.

The number of bounties being paid out started to increase at a rapid pace. Alarmed at this turnaround, the officials investigated.

They discovered that local entrepreneurs had taken to breeding cobras and killing them for the express purpose of collecting a payout.

Furious, the Government immediately cancelled the programme, whereupon the disgruntled businessmen (who now found themselves with a large number of worthless snakes) released them all onto the streets.

With more cobras on the loose than ever before, the not-so-clever Brits had paid a handsome sum to make the original problem worse.

Whoops.

This is now known as the ‘Cobra Effect’ and is a wonderful illustration of the Law of Unintended Consequences. These situations occur because, when presented with a problem, we often jump straight to a solution.

This sometimes results in us treating the symptom not the cause, which can end up making the problem worse.  

———————————————————————————-

If you enjoyed this story you might also enjoy the related posts ‘When incentives lead to bad behaviour’ and ‘Hot tin hat’. 

Oh and for an excellent online course on problem solving check out ‘Think Like Sherlock: How to Solve Problems from a Scotland Yard Detective.’ 

Original Source

3 common biases that impede effective problem solving

It’s amazing how your own mind can be your worst enemy when it comes to solving problems. Simply being aware of the shortcuts it takes when thinking can put you in a better position. 

How easy would life be without any problems? Smooth sailing, right?

However, life wouldn’t be life without them. Also, many of the greatest problems are potential opportunities waiting to be discovered.

Fortunately, we can all learn how to be better at solving them.

Aside from learning problem solving techniques, one of the most helpful things to be aware of when solving problems is your own cognitive biases.

These are distortions in your own thinking that are easy to overlook and that lead you to make poor decisions. The CIA describes them as being “… mental errors caused by our simplified information processing strategies.”

Here are three of the most common ones to be aware of:  

1) Confirmation bias

This occurs when we favour information that confirms our existing beliefs. For example, during an election, people tend to seek out positive information that puts their favoured candidate in a good light. The media use this to their advantage all the time: they provide compelling points to encourage us to form an opinion, while any evidence that might contradict it is usually downplayed or not reported.

2) Recency bias

This is when we place greater importance on information that we’ve recently acquired. A classic example of this is financial traders looking at only the most recent events whilst disregarding older pieces of information which are equally important (and sometimes more important).

3) Framing bias

This concerns how we are influenced by the way information is presented, as opposed to the information itself. For example, a yoghurt could be labelled as 90% fat free or, alternatively, as containing 10% fat. Similarly, a burger could be ‘framed’ as being 75% fat free as opposed to being labelled as containing 25% fat. Which of those options sounds the most appealing?

If you found the above interesting then you might like to check out four more examples of cognitive bias we wrote about previously. 

1. ‘Neglect of the Absent’ 

2. ‘The Bandwagon Fallacy’ 

3. ‘The Planning Fallacy’ 

4. ‘Gambler’s Fallacy’ 

Original Source

The value of a ‘pre-mortem’ when problem solving

A good problem solver recognises that the implementation of a remedy is not the final step in the problem solving process.

In fact, the last stage should be a review of both the process (how you went about implementing your chosen solution) and its impact (how successful it was in addressing the issue).

This is often referred to as a ‘post-mortem’ because it occurs after the event.

However, there is a less frequently practised and less well-known technique called the ‘pre-mortem’, devised by the psychologist Gary Klein.

This takes place before you have implemented your chosen solution.

Rather than wait until afterwards to ask why something failed or wasn’t as effective as desired, a ‘pre-mortem’ imagines a bleak moment in the future in which your solution has already failed and asks: “Why did it fail?”

Problem solving often occurs under pressured circumstances and the temptation exists to rush the implementation process to ‘get it done.’ This exercise forces you to pause and reflect on your chosen antidote. 

Try to encourage yourself and your team to write down as many reasons as possible why the solution you’ve chosen might have failed.

Doing so might uncover an unexpected flaw or it may encourage you to further refine your idea and make it more effective.

Want to learn more about techniques like these and how to become an expert problem solver? Join our ‘Think Like Sherlock’ course co-created with a Scotland Yard detective with 30 years of problem solving experience.

Original Source

The quick-thinking Polish warship captain who saved Cowes

The ability to think laterally is a key problem solving skill. In this true story, it became a life saving one. 

WWII offers up a treasure trove of interesting stories.

Many of them showcase the brilliant, spur-of-the-moment thinking of individuals faced with a life-threatening situation.

On the 4th of May 1942, a squadron of 160 German bombers flew across the English Channel to attack the town of Cowes on the Isle of Wight.

The small island was woefully equipped for such an event, and it seemed almost certain that the invading aircraft would destroy the town and kill hundreds of people.

What the Germans hadn’t counted on was the response from an unlikely source: the Polish Navy.

Built in 1935, the ORP Blyskawica (meaning ‘lightning’) was a destroyer that was in the port of Cowes for repairs.

Although the vessel was in port, she was technically still armed. Moreover, her guns were by far the most serious armaments available anywhere on the island.

As the planes bore down on the island, the captain, undaunted and thinking on his feet, ordered a number of his men back on deck and instructed them to ready the ship’s guns.

As soon as they were ready, the captain gave the command and his team of gunners fired a heavy volley of shells at the attacking forces.

It was a masterstroke of thinking under extremely pressured conditions. Firing the ship’s weapons when docked was not normal procedure.  

The counter attack was enough to persuade the Germans to turn around and head back.

This unorthodox decision by ORP Blyskawica’s commander undoubtedly saved Cowes and the lives of many of its inhabitants. 

It is difficult to think laterally at the best of times. Often your first response to an unusual solution is going to be “we can’t do that!” or “that’ll never work.”

Try to stop yourself at that moment in time and ask yourself “why not?” Develop a habit of challenging your own assumptions and writing down your ideas no matter how fanciful they may sound initially.

A helpful approach to encourage lateral thinking is known as the ‘Six Thinking Hats’ exercise. 

Original Source

Six Thinking Hats

Kevin May owns the Sticks advertising agency in Seattle.

He holds ‘brainstorm salons’ where he invites groups of smart people along who have nothing to do with the ad campaigns they will be talking about.

He says it’s amazing what an engineer has to say about lingerie or an artist has to say about accountancy problems. You can get amazingly different insights, reframed approaches and other ways of looking at problems that people immersed in the problem would never come up with because they’re blinkered by being too close.

As individuals, we are each prone to our own habitual patterns of thinking.

So when we’re faced with a challenge that requires us to be resourceful and think differently, this can leave us a bit hamstrung: effective problem solving demands that we break out of those patterns.

Dr. Edward de Bono has been described as one of the world’s greatest thinkers.

He has dedicated most of his life to teaching people ‘how’ to think rather than ‘what’ to think.

He devised the ‘Six Thinking Hats’ technique, a role-playing method whereby a group of people debate a problem by adopting different ‘hats’.

Each ‘hat’ requires the person to respond to the problem from a certain perspective. They are as follows:

1. White Hat – Facts

2. Red Hat – Emotions

3. Yellow Hat – Benefits

4. Green Hat – Ideas

5. Blue Hat – Planning

6. Black Hat – Judgement

Each person adopts the mentality of their given hat and applies its ‘lens’ to the problem at hand. As you go around the table, you will gather six very different perspectives on the same problem. What you’ll find at the end of this exercise is that you’ve magically created a whole new list of possible approaches or solutions to your original problem – you’ve learned to think differently.
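
If it helps to make the mechanics concrete, the exercise can be sketched as a simple loop over the six perspectives. A minimal illustration (the hat list comes from above; the prompts, function names and example problem are our own invention):

```python
# The six hats and the perspective each one imposes (prompts are illustrative).
HATS = {
    "White":  "What are the facts and data?",
    "Red":    "What does your gut feeling say?",
    "Yellow": "What are the benefits if this works?",
    "Green":  "What new ideas or alternatives exist?",
    "Blue":   "How should we organise and plan the next steps?",
    "Black":  "What could go wrong? What are the risks?",
}

def run_six_hats(problem, respond):
    """Walk the group through each hat in turn, collecting one response per hat.

    `respond(hat, prompt, problem)` stands in for a real participant's answer.
    """
    return {hat: respond(hat, prompt, problem) for hat, prompt in HATS.items()}

# Example: a stub respondent that simply echoes the perspective being applied.
notes = run_six_hats(
    "Sales of product X are falling",
    lambda hat, prompt, problem: f"[{hat} hat] {prompt}",
)
for note in notes.values():
    print(note)
```

In a real session the `respond` function would be a person (or a recorded answer), but the structure is the point: the same problem is forced through six deliberately different lenses.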

Sometimes to generate a breakthrough you need to adopt a different mindset. The ‘Six Hats’ exercise is a useful tool to help you adopt multiple points of view on the same problem. 

Interested in becoming a better problem solver? Take our ‘Think Like Sherlock’ problem solving course now co-created with an ex-Scotland Yard detective and former member of the Metropolitan Police’s elite ‘Problem Solving Unit.’ 

Original Source

How Captain Cook solved the problem of scurvy on British ships

When making decisions, we can be subtly influenced to make a certain choice depending on how it is presented to us.

For seamen in the 1700s one of the biggest threats was scurvy.

It is a rather unpleasant disease that rots your gums and then, if left untreated, will eventually kill you.

The famous British explorer Captain James Cook travelled on long voyages where scurvy posed a real threat to his crews’ survival.

At the time, there was a poor understanding of what actually caused it. (Hint: it’s a lack of Vitamin C.) Cook noticed that Dutch sailors seemed to suffer far less than their British counterparts, so he enquired into what they might be doing differently.

After a period of observation, he found that they all carried barrels of sauerkraut onboard.

So Cook ordered his ships to follow suit.

However, having the sauerkraut (which contains small amounts of Vitamin C) onboard and getting his sailors to actually eat the stuff were two different things.

The British sailors hated the foreign ‘kraut’ and wanted to stick to their own food.

So how did he get his cantankerous crew to eat sauerkraut?

Well…for a while he served it only to his officers whilst making sure that they ate it in front of the crew. It wasn’t long before envy set in.  

Then, one day, he said “Well, I suppose the men can have it one day a week.”  

In one stroke, he had his whole crew eating the stuff, saving many lives and ensuring the success of his many overseas voyages.

This is a great example of the technique known as ‘framing’.

It deals with how we make different decisions depending on how information is presented to us. Because context is so important when we make decisions, we can be subtly influenced towards a certain choice if the context or ‘frame’ of something is altered.

For more great stories like these and to learn about the behavioural psychology behind them check out our course on Behavioural Economics.

Original Source

Why people need just the slightest of excuses to behave badly

Sometimes in solving one problem we can inadvertently create a new one.

Abandoned cars are a nuisance.

Their selfish owners make them a problem for someone else to deal with by dumping them at random. And it’s a problem that seems to be on the rise. 

Earlier this year, The Telegraph reported that the UK was ‘becoming a scrapyard’ because the number of abandoned vehicles had trebled in less than five years.

To distinguish between cars that have simply been parked for a while and those that have been abandoned, the police will often place a ‘police aware’ sticker on the vehicle to let the public know that they know.

Interestingly, this has the unintended consequence of making them more likely to be vandalised and stripped of their parts.

Why is this?

It’s really rather simple even if it is a bit of a tragic commentary on human behaviour. The presence of a sticker is a signal that the owner has abandoned their vehicle and therefore it’s no longer someone’s possession. This is an invitation for opportunists to behave in a criminal fashion.

This is a great example of the law of unintended consequences, where an action taken to solve one problem ends up creating a new, unanticipated outcome.

To learn more about the art of problem solving register for our upcoming course ‘How to think like Sherlock’ which is packed full of useful insights like the one above. 

Original Source

The Art of the Checklist

When it comes to solving problems or making difficult decisions do not underestimate the power of a simple checklist. 

The chances of you dying in an aeroplane crash are extremely slim.

In fact, if you live in the US you are more likely to die strangling yourself in your own bedsheets.

This is because modern aircraft, if properly serviced, are extremely reliable.

The majority of accidents are, in fact, caused by pilot error: around 80%, according to the aircraft manufacturer Boeing. (Interestingly, this was the reverse in the early days of aviation, when roughly 80% of accidents were attributable to mechanical failure.)

To help mitigate mistakes made by pilots, Cockpit Resource Management was introduced in 1979 by NASA psychologist John Lauber as part of a study to increase overall flight safety. A big part of this was to introduce simple pre-flight checklists.  

The surgeon and TED speaker Atul Gawande wrote a great book about the role of checklists in massively improving aviation safety, and how he adapted this approach to reduce the incidence of surgical errors.

His compelling book The Checklist Manifesto makes the case that experts (anyone experienced at their job regardless of their field) often tend to overlook the obvious or trivial.

Unfortunately, it is precisely these ‘small’ errors that in aviation and surgery can lead to the loss of human life.

Checklists can also be a handy tool for solving problems in your own life. They can help you better understand the problem and its potential solutions. 

One tip is to make yourself a checklist of impartial questions to ask as a way of examining your own thinking and biases. Here are some suggestions to get you started:

  1. Have I looked at both sides to this story in equal measure?
  2. Have any existing prejudices impacted my decision making?
  3. Am I approaching this problem in the same way as other ‘similar’ problems in the past?
  4. Have I done sufficient research to see if someone has solved this problem previously?
  5. How would ‘X’ go about making this decision? 
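
A checklist like this is also easy to turn into a small reusable tool. A minimal sketch (the questions are from the list above; the review logic, names and example answers are our own illustration):

```python
CHECKLIST = [
    "Have I looked at both sides of this story in equal measure?",
    "Have any existing prejudices impacted my decision making?",
    "Am I approaching this problem in the same way as 'similar' problems in the past?",
    "Have I done sufficient research to see if someone has solved this problem previously?",
    "How would 'X' go about making this decision?",
]

def review_decision(answers):
    """Pair each checklist question with its answer and flag any left blank."""
    if len(answers) != len(CHECKLIST):
        raise ValueError("one answer required per checklist item")
    report = list(zip(CHECKLIST, answers))
    unanswered = [question for question, answer in report if not answer.strip()]
    return report, unanswered

# Example run with illustrative answers; the third question is left blank.
report, unanswered = review_decision([
    "Yes - I read the arguments for and against.",
    "Possibly - I favour the vendor I already know.",
    "",
    "Yes - found two prior case studies.",
    "A cautious CFO would ask for a small pilot first.",
])
print(f"{len(unanswered)} item(s) still to answer")
```

The value is less in the code than in the discipline: the check fails loudly when a question has been skipped, which is exactly the kind of ‘small’ omission Gawande warns experts about.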

To learn more about the art of problem solving register for our upcoming course ‘How to think like Sherlock’ which is packed full of useful techniques and approaches like the one above. 

Original Source

How to get rid of nuisance park drinkers

In the summertime the parks of London come alive.

The locals are so used to spending the majority of the year in the dark and the rain that as soon as the sun makes an appearance the hordes descend on the closest patches of grass.

Whilst great for them, it does cause a few problems for the local park authorities.

One particular park had an issue with a rather enthusiastic group who would start drinking very early in the morning when the park was empty.

The park authority reps would ask them to move on but, because it was the only open and comfortable space in the local area, they would soon return.

How then to remedy this problem? Specifically, how to solve the problem cheaply and efficiently? 

Extra park reps on patrol were a no-go. Too much money.

The solution in the end was delightfully simple.

They just turned on the sprinklers an hour earlier.

Because no one wants a wet bum whilst they drink their beer.

We tend to assume that most problems require additional resources or a totally new solution to solve them.

However, it is often best to simply examine more closely the ‘tools’ already available.

The solution can often be found from using them in a novel way.  

Original Source