Occasionally a book comes along that you struggle to put down once you’ve picked it up, that dominates your book recommendations to friends for months on end, that you end up buying for people you think would benefit from reading it, and that fundamentally changes your outlook on specific aspects of your life, forcing you to take positive action. “Black Box Thinking”, by Matthew Syed, has made a significant impact on how I view making mistakes, failing and learning, and has prompted me to examine how I reflect on my own behaviour.
Over the next two blog posts, I will review "Black Box Thinking" and look in more detail at the five key points I took from reading it.
Growth Mindsets vs Fixed Mindsets
The book takes its name from the in-flight recording systems that air accident investigators use to understand what happened in cases of failure, accidents or near misses, and Syed appropriately opens by looking at the approach the aviation industry takes to learning from error. On board every aircraft is an orange container housing two data recorders, one capturing electronic flight data and the other documenting in-cockpit communications. Historically these were two black boxes (hence the name), but whatever their colour, the storage devices are virtually indestructible and can be recovered even after catastrophic crashes.
In the event of an air disaster or near miss, pilots voluntarily submit to an independent investigation panel a detailed report of their actions, along with the information, circumstances and decisions that informed them. The panel examines and interrogates all available evidence, interprets it into a digestible report, and publishes the findings for open access across the industry. There is no blame culture; instead, a climate has been created that encourages open disclosure, which facilitates maximum learning from each incident and, in turn, the introduction of any reforms, systems, processes or measures that can reduce or eliminate the risk of a similar event happening again. The industry recognises that “human error” is often the result of poorly designed systems and seeks to improve them accordingly.
Syed illustrates this approach to learning with an example from the 1940s, when several Boeing B-17 bombers were involved in a series of runway accidents. A psychologist was employed to investigate the crashes and study the characteristics of the events. The report concluded that the switches controlling the flaps in the B-17 were identical to those controlling the landing gear (i.e. the wheels), and that the two sets of switches were positioned right next to each other. Under the pressure of difficult landings, pilots were found to have pulled the wrong switch: instead of retracting the flaps to reduce speed, they were retracting the wheels, causing the aircraft to “belly flop” onto the runway, with disastrous results. As a result of the report, the shapes of the switches were changed to represent the equipment they operated, and accidents of that nature disappeared overnight.
“Everything we know in aviation, every rule in the rule book, every procedure we have, we know because someone somewhere died…We have purchased at great cost, lessons literally bought with blood that we have to preserve as institutional knowledge and pass on to succeeding generations. We cannot have the moral failure of forgetting these lessons and have to relearn them.”
Captain Chesley Sullenberger
This is a perfect example of a growth mindset, which rests on the philosophy that to achieve success in a complex world, talent or ability has to be supplemented with hard work, practice, persistence, resilience, perseverance and collaboration in order to learn and improve. The outcome of this approach, harnessing the benefits of learning from failure and engaging with incidents precisely when they are most threatening to the ego, is that in 2014 there was only one accident for every 8.3 million flights. This is an incredible statistic, driven by the open, empowering culture that has evolved in what was historically a dangerous profession.
Contrast the growth mindset with the fixed mindset, in which the belief that talent reigns over hard work and practice holds strong. Syed compares the aviation industry’s growth mindset with the fixed mindset that characterises the healthcare industry. He highlights the long and expensive education of doctors, where the culture fosters the belief that talent is enough. Senior practitioners are often put on pedestals and believe themselves to be infallible; as a result, when an error occurs, it is perceived as a threat to the individual.
Consequently, instead of replicating the aviation industry’s response to failure, the medical default is to become defensive, to cover up the error to avoid looking untalented, or to become self-justifying. How often do you read in the press, or hear, the lines “it was just a complication of the procedure”, “there was nothing I could have done, it was a complication of the patient’s presentation” or even “it’s just one of those things”? As a result, the motivational impetus to investigate cases of error and potentially identify patterns of practice, which might lead to reforms that reduce such occurrences, is eliminated.
Furthermore, the culture in the medical profession is one of high blame. If practitioners believe that openness about their practice will expose them to litigation, the drive to be open is withdrawn. Consequently, the information available to improve practice is extremely limited, and the implications of such behaviour are statistically frightening.
In 2013, a study published in the Journal of Patient Safety attributed the deaths of 400,000 people each year in hospitals in the United States alone to preventable medical error. These are deaths caused by, amongst other avoidable causes, misdiagnosis, dispensing the wrong medication, operating on the incorrect body part and post-operative complications. Syed reports on the testimony delivered to the Senate by Johns Hopkins medical professor Peter Pronovost, in which he described this figure as the equivalent of “two jumbo jets falling out of the sky every twenty-four hours”, a degree of preventable harm that makes hospital stays the third greatest cause of death in the USA. Such a statistic would simply not be tolerated in any other forum. What’s more, these figures don’t even account for avoidable deaths occurring in outpatient clinics or nursing homes, which, when factored into the equation, have been estimated to raise the total to over 500,000 deaths per year in that one country alone.
Yet further analysis of the statistics provides even more cause for concern. Syed cites the testimony of Joanne Disch, a clinical professor at the University of Minnesota School of Nursing, taken from the same Senate hearing. Disch highlighted the incidence of non-lethal harm caused by medical error, citing a study that found the number of patients experiencing severely disabling complications to be ten times greater than the number of deaths from the same causes. That works out at “10,000 preventable serious complications per day”.
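As a quick back-of-envelope check of these comparisons (my own arithmetic, not figures from the book): 400,000 deaths per year divided by 365 days is roughly 1,100 deaths per day, which is about the capacity of two fully loaded jumbo jets; and ten times the annual death toll, spread across a year, comes to roughly 11,000 serious complications per day, consistent with Disch’s figure.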
These statistics don’t suggest for one minute that doctors are homicidal or incompetent, but they do suggest a tendency to put up the defences when professionalism is called into question, and an overwhelming desire not to appear inept and thereby undermine one’s credibility in the eyes of colleagues. After all, if one has spent that much time and money on one’s education, what does it imply if a mistake is made? Indeed, studies show that for senior doctors, being open about mistakes can be a traumatic experience; as a result, cover-ups are encouraged and the vital information the profession needs in order to learn is destroyed.
Is this really the case? I’ll have to be honest here and agree; I’ve witnessed such behaviours first-hand. Misdiagnoses covered up, blame apportioned to other parties in the multi-disciplinary team, cloak-and-dagger texts bouncing around trying to coordinate satisfactory responses to management, classic lines being fed to players and very little in the way of reflective practice. Whilst this is not unique to doctors, nor unique to the USA, I have found that medical professionals in this part of the world are put on a pedestal, rarely questioned and blindly afforded the trust to act without consequence, far more than elsewhere in the world. This isn’t to say that this position is callously abused, but it does make it incredibly difficult for someone in that position to voluntarily show fallibility, for fear of jeopardising the faith placed in them or the positions bestowed upon them. Such situations are only exacerbated in the high-stakes world of sport, and as a result I have seen some clinicians surround themselves with “yes men” to insulate themselves from accountability, and aggressively oppose those who question their perceived knowledge, in order to survive and thrive.
On the flip side, I have had the pleasure of working and spending time learning with some incredible medical and surgical staff in the USA who are very open to collaboration, demonstrate incredible humility, aren’t dominated by their egos and subsequently achieve some exceptional results. These clinicians remain at the cutting edge of their craft, with athletes travelling from around the world to seek their advice and expertise. Such professionals openly adopt the growth mindset that the aviation industry displays, and it rubs off on those around them.
Reflecting on the growth mindset versus fixed mindset approach, I decided to conduct my own “Black Box” experiment. I wrote to a cohort of colleagues from my last job, spanning medical services, performance science and coaching, and asked them two questions:
- During the time we worked together, identify two aspects of my practice that you believe I executed poorly, describe how this affected your life or work negatively, and recommend how I should have acted to achieve more favourable results.
- During the time we worked together, identify two aspects of my practice that you believe I executed well, describe how this affected your life or work positively, and recommend how I could have acted even better to achieve even more favourable results.
The feedback I received was of incredible value. It enabled me to look at areas of my work that I needed to concentrate on improving, at decisions I had made for specific reasons but which were perceived differently by others not party to all the information (highlighting the need to improve aspects of my communication), and at elements of my practice that I could rightly be proud of. I subsequently put strategies in place to develop the skills that would have helped me perform better and to guide the personal development plan I am now working on. It wasn’t necessarily a comfortable process, but it was unbelievably worthwhile and will enable me to do my job better in the future.
Cognitive Dissonance
Syed introduces the concept of cognitive dissonance with the statement “We cannot learn if we close our eyes to inconvenient truths, but we will see that this is precisely what the human mind is wired up to do, often in astonishing ways”.
Leon Festinger, a researcher from the University of Minnesota, coined the term “cognitive dissonance” to describe the inner tension that is experienced when our beliefs are challenged by evidence. Instead of accepting that our initial judgements were flawed when evidence challenges their foundations, research shows we are more likely to reframe the evidence, inventing new reasons, new justifications or new explanations in order to deny the facts… and if that fails, the next option is to ignore the evidence altogether. Simply put, challenging our dearly held beliefs can feel too threatening, and the deeply ingrained human trait of cognitive dissonance becomes more dominant the more we have riding on our judgement.
The profession Syed uses to illustrate this point most powerfully is the justice system, where DNA testing became available to challenge the convictions of thousands of inmates. In case after case in the USA, solid DNA evidence was presented that justified overturning the conviction; yet in countless instances the prosecutors vehemently defended those convictions, often with laughable reasoning.
As we saw when looking at how some doctors respond to failure or error, many prosecutors’ professional competence is tightly bound to their self-esteem, partly because of the length of their training. Furthermore, the nature of their work sees them spending long, emotionally intense hours of investigation with bereaved families. When confronted with evidence suggesting they have ruined the life of an innocent person by sending them to jail, and that the person responsible for inflicting great pain on a victim’s family remains free, the level of threat is significant and a state of denial is adopted all too readily.
Beyond individual prosecutors, the systems put in place by justice departments have even sought to protect convictions from being overturned. In the USA, for example, only New York and Illinois permitted DNA tests after conviction until 1999. Even today, whilst all states now allow post-conviction DNA testing, some retain a “finality doctrine”, which bars DNA evidence from being used after a certain time limit in old cases, preventing powerful information from being used to prove a wrongful conviction. Others only permit DNA testing if the suspect did not originally confess, and the US Supreme Court has admitted that it will only retry cases where a procedural error, as opposed to a factual error, can be proven. Unlike the aviation industry, where the findings of inquiries into cases of error are published for all to learn from, the American judicial system keeps no records, beyond the one-line order filed by the judge, when a conviction is overturned in light of new evidence. As such, there is no analysis of where the system failed.
Syed further points to the dangerous prevalence of cognitive dissonance in the medical profession, in economics and in politics, where one of the most high-profile examples is the one that played out during the second Gulf War.
Both Tony Blair and George Bush sanctioned strikes on Saddam Hussein’s regime on the basis of evidence that Saddam’s “WMD programme is active, detailed and growing”. Months and years passed without a single weapon of mass destruction being found, and yet both Blair and Bush stuck rigidly to their belief that they were right to invade. Blair initially suggested the weapons inspectors had not looked hard enough, before suggesting that Saddam had destroyed the weapons and finally admitting the weapons did not exist. Bush justified his decision by the need to get rid of a “very bad guy”, fight terrorists, promote peace in the Middle East and increase American security.
Syed then appropriately points to the findings of Sydney Finkelstein, who investigated major failures at over 50 corporate institutions and concluded that “error-denial increases as you go up the pecking order”; it appears those at the top simply have more to lose when things go wrong. Returning to the healthcare industry, one US study reviewed error reporting in 26 acute care hospitals and found that nearly half of all errors were reported by registered nurses, while physicians contributed less than two per cent of reports.
Personally, the safety net I have put in place to manage my own cognitive dissonance has always been to recruit a team of people from different backgrounds, philosophies and opinions, which ensures that the decisions I make are both more widely informed and questioned by those with a different perspective from my own. The process of reflection and assessment of the results achieved is therefore contributed to by people who aren’t vested in staunchly defending one approach at the expense of another and who are open to testing different strategies. Encouraging the testing of various approaches ensures that an open-loop learning system is adopted, rather than the closed-loop system Syed describes as prevailing in the US justice system.
In the next post, I will consider the concepts of "Getting Started vs Launching with Perfection", "Marginal Gains" and "Redefining Failure". In the meantime, if this review has piqued your curiosity, go and buy the book... I can assure you, you'll not regret it!