
(The following blog post has been written using information and direct quotes held on public record.)



I want you to imagine that you have a well-established, rock solid business. You are a leader in your industry with a prestigious reputation. You have deep pockets for research and development and you have some of the brightest minds available. You need to create an innovative product that customers will buy.


"You would have every reason to be confident that you’re going to knock it out of the park."

And yet, within a short time of the product going on sale, you need to withdraw it from the market, and your exceptionally intelligent designers will be described as "clowns who are supervised by monkeys."


It might be humorous if it wasn't for the fact that I'm alluding to the Boeing Max aeroplane, two of which crashed, killing 346 people.


What went wrong technically?


A basic rule in human factors when designing equipment, machinery and products is to provide an obvious way of reversing an action or event when things go wrong. In other words, you need to know what to do to put things right. However, it seems the pilots lacked information about how to override an aspect of the plane's auto-pilot. That lack of information proved critical when investigators found that the auto-pilot had malfunctioned because of what is thought to have been faulty sensor readings.


Even if we set aside the potentially poor design of a sensor system that relied on only one sensor, there was nothing in the design that made it obvious to the pilots how to regain control of the plane. Furthermore, no training was provided to prepare pilots for that situation.


What were Boeing employees saying behind the scenes?


So now it's time to step back and try to understand how Boeing got into such a mess. How do you get into a situation where Boeing employees say things like "this is a joke, this airplane's ridiculous", or, even before the first crash: "Would you put your family on a Max…? I wouldn't."


Simulator Training


Lion Air, the airline that operated the first plane to crash, contacted Boeing before putting the plane into service to ask about pilot simulator training. It later emerged that one Boeing employee told another that the request was down to Lion Air's "own stupidity" and called the airline "idiots." Following the catastrophic events, Boeing have stated that before the grounded plane returns to the sky, pilots will undertake simulator training.



Why didn’t Boeing Want Pilots Trained?


A Boeing employee who was developing computer-based training for the new plane suggested that the pilot manual should contain more detail on handling possible emergencies, but he was told that the company couldn't add more information because doing so might lead regulators to dig deeper. The regulator might then insist on extensive training for pilots flying the new planes. Boeing's thinking was that training would take time, be expensive and could put off airlines that were considering buying the plane, at around $100 million apiece.


The same employee who wanted more detail in the manual said he thought training would be important. Responding to that concern, a senior Boeing official said it was "probably true, but it's the box we're painted into", adding that it was "a bad excuse, but (it's) what I'm being pressured into complying with."






An investigation also uncovered an email trail in which workers congratulated each other on using "Jedi mind tricks" to persuade regulators that simulator training wasn't necessary. In further emails, one worker said: "I honestly don't trust many people at Boeing."


And another worker said after both crashes: "I still haven't been forgiven by god for the covering up I did last year. Can't do it one more time. The Pearly gates will be closed."


All this shows that ethical standards at Boeing had slipped way below what they should have been.


It all points to a lack of psychological safety: the condition in which people feel safe and comfortable airing their concerns, confident that they can make things better.

But there was a feeling of powerlessness. As one employee said: "I don't know how to fix these things … it's systemic … Sometimes you have to let big things fail … maybe that's what needs to happen."


And, as we know, it did fail; but it really shouldn't have taken two deadly crashes to get Boeing to put things right.



How do you promote psychological safety?


  • Don't compromise your values.

  • Let people know you will support them in doing the right thing.

  • Ask people if they have concerns.

  • Celebrate diligent dissidents: in other words, people who do the right thing even when it's not easy.



Dr. Jared Dempsey is the Principal Psychologist at Kognivate.


This blog is for educational purposes and may be freely shared with attribution.



Lonely tree seeks attentive lover


The Tree of Ténéré had stood for 300 years in the Sahara Desert. It was 250 miles from the nearest tree and was known as the world's remotest tree. It was hardly beautiful, but it was a testament to hardiness and resilience, a landmark and an icon. That was, until 1973, when a drunken truck driver ploughed into it and destroyed a national treasure.


According to Murphy's Law: if something can go wrong, it will. The Tree of Ténéré seems to offer some evidence for the law. Further supporting evidence includes your stunning ability to choose the slowest queue in the supermarket. Or how, when you're running late and take the single-track back road as a shortcut, you get stuck behind a convoy of 25 tractors on their way to the village tractor show. And it's absolutely, positively guaranteed that when you are on your way to an important interview the trains will be cancelled, whether due to too many leaves on the track, global warming causing the tracks to buckle, or Paul and Bill the clippies having had a heavy night and both phoning in sick. You get the picture: the world conspires against you in cruel and devious ways. In many ways, the Tree of Ténéré had it good for 300 years; you, on the other hand, have fate trying to cause your demise at least weekly. Is it too much to ask for one pair of good matching socks for today's big meeting? Is it? Is it really?



"Just because you’re paranoid, that doesn’t mean the universe isn’t out to get you"

But in reality, assuming there are two lines at the supermarket checkout you can join, there really is an equal chance you will be in the quicker line. The reason we think we end up in the slower line more often is that we take more notice of the times we do: those occasions become more firmly imprinted in our memories and are more available for us to recall, a psychological quirk called the availability heuristic (a heuristic is a rule of thumb). It's the same reason why, after 40 years of not batting an eyelid at a friendly arachnid in the corner of a room, I'm suddenly reaching for a newspaper after seeing pictures of spider bites in the news. To any of our American or Australian friends: in the UK we've generally managed to weed out any dangerous creatures; it is a remarkably safe island, with the exception of the terra bastardous, otherwise known as the Jack Russell Terrier.
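If you like, you can check the fifty-fifty claim for yourself with a quick back-of-the-envelope simulation. The sketch below is purely illustrative: the queue lengths and the per-customer service times are made-up numbers, and the only thing that matters is that both queues are drawn from the same distribution.

```python
import random

def slower_queue_fraction(n_trials=100_000, people_per_queue=5, seed=42):
    """Pick one of two identical supermarket queues at random and count
    how often the chosen queue turns out to be the slower one."""
    rng = random.Random(seed)
    slower = 0
    for _ in range(n_trials):
        # Total service time for each queue: sum of random per-customer times
        # (minutes, drawn from the same distribution for both queues).
        time_a = sum(rng.uniform(0.5, 3.0) for _ in range(people_per_queue))
        time_b = sum(rng.uniform(0.5, 3.0) for _ in range(people_per_queue))
        # Choose a queue at random, as a shopper with no extra information would.
        chosen, other = (time_a, time_b) if rng.random() < 0.5 else (time_b, time_a)
        if chosen > other:
            slower += 1
    return slower / n_trials

print(f"Fraction of trips spent in the slower queue: {slower_queue_fraction():.3f}")
```

Run it and the fraction hovers around a half: you land in the slower queue about as often as the quicker one, exactly as symmetry predicts.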


So events that are anxiety-provoking, or just plain piss you off, create strong memories that can skew our thinking and make something seem more frequent or threatening than it really is.

Why Does the Toast always Land Upside Down?



Despite you being no less lucky than anyone else, you might be interested to know that when toast falls off the table it really is likely to land butter side down. Physicist Robert Matthews claims that when toast falls from an average-sized table it will usually make a half turn, going from butter side up to butter side down on the floor; to avert this disaster your table would need to be around 10 feet high.


What does this mean for performance? As ten-foot tables would require ten-foot-high chairs, which would be an added safety hazard, I think we should concentrate on other lessons from Murphy's Law and our skewed sense of probability. High-performance teams in the world of sport feed off confidence: they play well, in part, because they expect to play well. You can see the opposite effect when a team starts to lose; it can have a crisis of confidence. Players become scared of making errors and don't want the ball, and when they are on the ball they act nervously and with indecision. The same nervousness and self-doubt can plague work teams after one or two relatively high-profile mishaps. One of the biggest things I try to do with work teams that have lost their edge is to help them restore their self-belief: to see themselves as capable, professional and winning. This is best done by leaders positively reinforcing small and big wins, as well as treating mistakes as learning opportunities.

Who was Murphy Anyway?


The Murphy in question, a certain Captain Ed Murphy of the US Air Force, was an aviation engineer. He had developed sensors to be fitted to the harness of a sled that would hurtle a person along a straight track at great speed, to test how many Gs of force a person could withstand. On the first trial with the new sensors fitted, a chimpanzee was securely fastened into the sled and thrust forward at speed, all in the name of science. However, when the sled came to a stop, the new sensors recorded a G reading of exactly zero, a measurement that was wholly inaccurate. When the fitting of the sensors was checked after the failed test, it turned out that all 16 had been installed the wrong way around by an assistant. Depending on who you believe, Captain Murphy was a little cranky that day and went into full-on throw-the-junior-teammate-under-the-bus mode. He supposedly said:

"If that guy has any way of making a mistake, he will."

The first lesson here is that unless you want to be remembered as the cranky guy who rolls over other people in the team, it's probably best to take a more even-handed approach when things don't go to plan.






A Positive Approach to Using Murphy’s Law


Colonel Stapp quickly adopted and adapted Murphy's observation, which he termed a law, and it became the catchier "if it can go wrong, it probably will." The second lesson we can learn is much more positive. Stapp was asked at a press conference why no one had been severely injured during the cutting-edge tests; the sleds were reaching G forces that, up to then, people had thought impossible to survive. Stapp said that they took Murphy's Law into consideration: they actively explored every opportunity for something to go wrong and took steps to counter them. Which is exactly what sound risk assessment is about.


We can find one final lesson around Murphy's Law. Many years before Captain Murphy berated his assistant, Alfred Holt said in 1877:


"It is found that anything that can go wrong at sea generally does go wrong sooner or later, so it is not to be wondered that owners prefer the safe to the scientific …. sufficient stress can hardly be laid on the advantages of simplicity. The human factor cannot be safely neglected in planning machinery. If attention is to be obtained, the engine must be such that the engineer will be disposed to attend to it."

So the final lesson is to ensure that the equipment people use is simple to operate, provides clear and engaging feedback to the operator and has an obvious means of reversing errors.

In conclusion:


Lesson one: When things do go wrong, express confidence in your team and be even-handed when individuals slip up.


Lesson two: Explore all possibilities for process failure and mitigate them using sound risk assessment.


Lesson three: Make sure equipment is designed to engage the end user.







In a recent news article, Richard Cockerill, head coach of Edinburgh rugby club, said:


"I hate losing more than I love winning. If I win, I enjoy it for an hour and go home and watch the game and start working for next week. If I lose, it sits with me until about Wednesday.”


Richard might be more like the rest of us than he realises. One study asked 32,000 people how they were feeling at any given moment by randomly pinging an app on their mobile phones.

The researchers found that when people's football teams lost, the pain they felt was twice as strong as the boost they felt when their team won. That we don't like losing is perhaps not much of a surprise. And as long as our drive to win isn't a drive to win at all costs, drive and motivation are generally good things to have in life.


But there are times when being driven brings about less-than-ideal outcomes. I have a friend who, one Christmas Eve, witnessed two shoppers tussling over the last turkey in the supermarket, with the result that one of them ended up bouncing a frozen turkey off the other's head.



And thus it is that, in the heat of the moment, once committed, we keep on going, dismissing rational thought and pressing on blindly until we have achieved our objective. But what does this aversion to loss, and this insistence on carrying on regardless, have to do with human factors?




Once we have a plan, we can keep going even when warning signs appear that tell us we should stop and rethink what we are doing.

In aviation, pilots are taught about plan continuation bias: the urge to continue with a plan even when the pilot should step back and do something different. For example, rather than abandoning a landing when the approach is unbalanced (in other words, the plane is too wobbly), the pilot presses on to 'get her down'.


It’s thought that plan continuation bias might play a part in up to 40% of aeroplane incidents.*

Of course, the bias to keep on going is not restricted to pilots. All too often, investigations after accidents on platforms, oil rigs and elsewhere have shown that people got 'caught up in the job' and kept going even as risks started to emerge.


So, what do you do about it?


  • Talk about it! Let your team know that plan continuation bias and loss aversion are real threats.

  • Simplify the language, so that if a situation changes and risks do start to emerge, team members can call it out and say: "hold on, is this a case of press-on-itis?" (Press-on-itis is what some pilots call plan continuation bias, as it's not such a mouthful to say.)

  • But above all else, don’t go shopping for turkeys on Christmas Eve.








* (Berman & Dismukes, 2006).

https://www.niesr.ac.uk/blog/are-football-fans-irrational
