Episode 13: Lessons from a Netflix Film
What happens when safety takes a backseat to profit? How can lessons from an airline tragedy guide efforts to make health care safer? Join host Karen Wolk Feinstein and aviation journalist Andy Pasztor, formerly of the Wall Street Journal, and transportation safety expert Chris Hart, founder of Hart Solutions LLC and former chair of the NTSB, for a fascinating conversation on how a “culture of speed” at Boeing, depicted in the recent Netflix documentary “Downfall: The Case Against Boeing,” led to tragic consequences, and how to protect safety in our revenue-driven approach to health care.
Listen to this episode on: Apple Podcasts | Spotify
Featured Speakers
- Karen Wolk Feinstein, PhD, President & CEO, Jewish Healthcare Foundation & Pittsburgh Regional Health Initiative
- Chris Hart, JD, MS, Founder, Hart Solutions LLC, pilot, attorney, and former chair of the National Transportation Safety Board
- Andy Pasztor, aviation journalist formerly with the Wall Street Journal
Referenced Resources (in order of appearance)
- Downfall: The Case Against Boeing (Netflix, 2022)
- In 737 MAX Crashes, Boeing Pointed to Pilot Error – Despite a Fatal Design Flaw (PBS, 2021)
- Up Next for Patient Safety – Episode 03: Paying for Safety
- A nurse made a fatal error. Why was she charged with a crime? (Vox, 2022)
- What Happened to TWA Flight 800? (History, 2021)
- What Really Happened to Malaysia’s Missing Airplane (The Atlantic, 2019)
- What is Human Factors and Ergonomics? (Human Factors and Ergonomics Society)
- Crew Resource Management (CRM) (SKYbrary)
- Probable Cause: It took 28 seconds for USAir Flight 427 to plummet from the sky. It took the National Transportation Safety Board five years to figure out why (Smithsonian Magazine, 2002)
- David Hinson
- FAA failed to properly review 737 MAX jet’s anti-stall system: JATR findings (Reuters, 2019)
- Out Front on Airline Safety: Two Decades of Continuous Evolution (FAA, 2018)
- What Really Felled the Hindenburg? (Smithsonian Magazine, 2017)
- The Long-Forgotten Flight That Sent Boeing Off Course (The Atlantic, 2019)
- Boeing Charged with 737 Max Fraud Conspiracy and Agrees to Pay over $2.5 Billion (U.S. Dept. of Justice, 2021)
- Jury Finds Former Boeing Pilot Not Guilty of Fraud in 737 Max Case (New York Times, 2022)
- Error traps: Let’s raise awareness (Patient Safety Learning Hub, 2020)
- Paul O’Neill
- In Remembrance of Paul O’Neill (Health Affairs, 2020)
- Authority Gradients (SKYbrary)
- Profiles in Patient Safety: Authority Gradients in Medical Error (Academic Emergency Medicine, 2004)
- Medical error – the third leading cause of death in the US (BMJ, 2016)
- To Err is Human: Building a Safer Health System (Institute of Medicine, 2000)
Episode Transcript
[00:00:00] Andy Pasztor: Boeing changed from an engineering-driven company to a much more financially-driven company, more concerned about returns and Wall Street’s view of the company…
[00:00:15] Chris Hart: There are a lot of reports on what went wrong when some adverse event occurred, but there are not many reports on why it went wrong….
[00:00:22] Andy Pasztor: I think we need some outside, reliable source of data to be able to put it into the proper perspective…
[00:00:31] Chris Hart: Safety to me is not an appropriate variable to compete on because you want everybody to be as safe as possible and not be afraid to share safety information because there’s a disparity between their safety goals….
[00:00:45] Karen Wolk Feinstein: Welcome back to Up Next for Patient Safety where we envision a world where medical errors, adverse events and preventable harms are avoided, and where we examine the most promising paths to prevent these tragedies before they occur.
I’m your host, Karen Feinstein, CEO and president of the Jewish Healthcare Foundation and the Pittsburgh Regional Health Initiative, which is a multi-stakeholder quality collaborative. We’ve been working to reduce medical error for over 20 years, mostly unsuccessfully, but we can’t give up because there’s too much at stake: the loss of approximately 250,000 lives a year, and long-term injuries for many more.
I recently watched the new Netflix documentary, Downfall: The Case Against Boeing. The documentary explores the events that led to two major crashes of Boeing 737 MAX airplanes in 2018 and 2019, as well as the multi-year investigation into the factors that led to the crashes. While watching, I couldn’t help but think of the analogies to our revenue-driven, pay-for-procedures healthcare system in the U.S. I’m interested in how safety can take a back seat to income maximization, which can lead to cutting corners – what happens when making money and saving money become the north star. We’ve also heard on previous podcasts how the current healthcare payment system in the United States actually disincentivizes safety.
I’m also aware of the detrimental effects of what I’m calling the culture of speed. Of course, speed equals money, so we may be reiterating the same thing. Too often, institutions prize unnecessary speed over that stitch in time that allows precision, accuracy, and mindfulness. This concern is especially timely right now, given the recent prosecution of a nurse, RaDonda Vaught. For those unfamiliar with the case, RaDonda was a nurse at Vanderbilt University Medical Center who accidentally administered the wrong medication to a patient, leading to their death. She was under unusual and unnecessary pressure to act quickly, which caused her to skip several procedural steps that would have prevented the tragedy. We will be covering this case specifically in more depth on a future podcast.
So today we’re bringing together two important voices in aviation safety for a conversation on what healthcare systems can learn from the Boeing case. Andy Pasztor is a journalist, formerly with the Wall Street Journal, who has over 20 years of experience writing on the topic of aviation safety. He’s the author of the 1995 book “When the Pentagon Was for Sale: Inside America’s Biggest Defense Scandal,” and he’s currently writing a book about the history of air safety. Mr. Pasztor was featured extensively in the recent Netflix film Downfall: The Case Against Boeing.
Chris Hart is the founder of Hart Solutions, which specializes in improving safety in a variety of contexts. Mr. Hart has 15 years of experience as a transportation safety regulator at the Federal Aviation Administration (FAA) and the National Highway Traffic Safety Administration (NHTSA), and 12 years at the National Transportation Safety Board, including three years as acting chairman and chairman. Mr. Hart holds a BS and MS in aeronautical engineering from Princeton University and a JD from Harvard Law School.
So with these two excellent guests, let’s dive into it. Let me begin with you, Andy. Can you tell us about some of the major events and developments in aviation safety you’ve covered over the decades?
[00:04:29] Andy Pasztor: Since the early nineties, of course, there’ve been a variety of developments and crashes. I covered TWA 800 and the Malaysian aircraft that disappeared in the Pacific, among many other dramatic events. But I would say that the changes since the early nineties have been absolutely unexpected and, frankly, very dramatic. First of all, the change from a reactive system that dealt with looking at what happened to aircraft after they crashed to a much more predictive, proactive system that tries to develop safety programs and safeguards to prevent crashes. This was quite controversial in the beginning – we can talk about that a little bit – but as all of the parties gained trust and these proactive systems gained traction, it became a tremendous boost to safety.
Secondly – and Chris will talk about this, I’m sure – much more focus on human factors: how cockpits are designed, how pilots interact with automation, and the safety boost that automation provides, but also the downsides and potential pitfalls it can cause crews. And third of all, I would say something the aviation world calls crew resource management. That means how pilots work together in emergencies. I think this is particularly important as we talk about medical settings and how nurses and physicians and surgeons and other players in the system interact. So those are some of the big changes that have taken place and have made aviation much safer than anybody ever expected decades ago.
[00:06:16] Karen Wolk Feinstein: Andy, I don’t know if you covered USAir Flight 427…
[00:06:20] Andy Pasztor: I did, though that was at the beginning of my stint as an aviation safety reporter. But yes, I covered it in its last iterations, but since then, I’ve done a fair amount of reporting on that crash. It affected so many aircraft and it was such a difficult investigation. So it’s really one of the landmark crash investigations that the NTSB conducted.
[00:06:47] Karen Wolk Feinstein: Well, it was a landmark process because at first, as you may remember, the airline said, “pilot error, pilot error,” but the NTSB kept on it and eventually discovered the real failure, so that crashes like this would not happen again. So, thank you, NTSB, for your diligence. It also brings to mind the case we mentioned of the nurse in Nashville. How sad it is when the system is designed to allow errors and one person gets punished.
So, anyway, Chris, let me ask you a question. What led you to the chairmanship of the NTSB that I just referenced? What background prepared you for that leadership role?
[00:07:34] Chris Hart: Well, actually, what started it was my mother told me the first thing she ever saw me draw was an airplane. So I guess I’m an airplane nut, guilty as charged, and plus I’m a pilot, so that makes this whole safety thing kind of up close and personal. But I think the thing that really made me fascinated with being not only at the regulator – I was at the FAA, as you mentioned, for quite a few years – but also at the NTSB, was David Hinson. The FAA administrator at the time I went to the FAA, he was such a big picture thinker, and he really inspired me to think in big picture ways.
And a lot of the things that Andy mentioned describe some of the big picture issues that go beyond just an individual making a mistake. They go into whole-system issues – the system having what are often known as error traps, where, in the worst case, it’s just a matter of time before people in that system, doing whatever they’re supposed to do, make a mistake because the system was not well-designed for the people. It wasn’t user-friendly for the end users of the system. So, my interest in the NTSB was just because I’m a big picture thinker and I’m fascinated with safety, and plus, being a pilot myself, I’m very interested in keeping that safe too.
[00:08:41] Karen Wolk Feinstein: A pilot, a lawyer, an aeronautical engineer. I’d say you are well suited for almost any leadership role. Andy, I do have to say, I thought the movie Downfall was excellent. Can you give us a brief overview of the two crashes featured in the film? What created the early confusion about the cause of the 737 MAX-8 tragedies? And what, after three years of investigation, do experts believe caused the crashes?
[00:09:12] Andy Pasztor: So, I will try to make it brief, because it can be complicated and sometimes a little bit difficult for a layman to follow. Essentially, the crashes occurred in 2018 and 2019, and it was determined that there was a faulty design of a system that controls a movable panel on the back of the plane. Essentially it was a single point of failure: if a single sensor failed, then the automation and the software were designed to push the nose of the plane down strongly and repeatedly. So, in both of these crashes, the sensor wasn’t operating, the plane went into a dive, and the pilots, for a variety of reasons in the two crashes, couldn’t pull the nose of the plane up in time.
The significance of the crashes, of course, had to do with the design of the aircraft, the way that Boeing communicated internally about the design, and the way they communicated with regulators, with the FAA, about the design. The crashes also created a huge issue about the FAA’s relationship with other regulators, and the investigation eventually determined that indeed the design was not proper. It was a faulty design based on incorrect assumptions, improper communication and, in fact, improper assumptions about how pilots should be trained and whether they should know about such systems. And so, it roiled the industry and created a huge number of changes, which are now working their way through the system, both in the US and internationally.
Tragically, 346 people lost their lives. And I have to say – and we discuss it a little bit more in the film – through the early and middle part of the investigation, Boeing insisted, as you said happened with USAir Flight 427, that the pilots were at fault, that the system was in fact adequate… and that turned out not to be the case.
[00:11:18] Karen Wolk Feinstein: So that leads us, Chris, to understanding a little better the role of the NTSB in investigating crashes like Boeing’s. What kind of expertise would be assembled, and what other organizations would assist the NTSB in their investigation?
[00:11:38] Chris Hart: Well, thank you for the question. The NTSB has a major role, obviously, in commercial accidents in the US, but in international accidents it’s somewhat different. So, in this answer, I’m going to chime in on some of the things that Andy just mentioned in answer to your previous question. When it’s an international accident like that, the country where the accident occurred, as a general matter, is in charge of the investigation. So the NTSB just plays a technical support role as an accredited advisor, and that’s by international law. And the reason we are enabled as an accredited advisor is because of the American interest in it being an American-built airplane. So, the NTSB plays a much lesser role in international accidents than it does in domestic accidents.
Now, separately from the NTSB: the FAA asked me to lead a process that they call the Joint Authorities Technical Review, which brought together people from airplane approval agencies around the world to look at the FAA’s main question, which is, how did a system this defective get through the approval process – get its certification to fly around the world? So that’s where the failure occurred, and basically these two tragic crashes were a wake-up call that the approval process needs to be modernized. A lot of people said the process is completely broken, but it’s not completely broken. In fact, quite the contrary: the US had almost 10 years of carrying almost 10 billion – with a “B” – passengers with only one fatality.
And that’s an indication of a very safe airplane, and a very safe airplane is an indication that the approval process is working very well too. And it was, for all that time, but this time it did not work well. There are lots of reasons why, and one of those reasons, actually, is that immense safety that I’m talking about. When things get so safe, people get complacent. They say, “oh, we’ve taken care of that problem. We don’t need to worry about it anymore.” So that causes them to miss weak signals about things going wrong. One weak signal was: what are pilots going to do if this thing goes wrong? Boeing assumed in the moment, based on what they knew at the time, that if pilots saw this error, they would apply a procedure that all pilots are trained on, called stabilizer runaway training. But they didn’t actually put pilots in the simulator to see what they would really do. And that was a huge mistake, because you can’t just assume human behavior… human factors is the toughest challenge of all. You have to actually put the pilots in the simulator and have them fly it and see what happens. That’s where this process failed: in allowing the system to get the approval at the FAA in the first place. But there are so many aspects of that; it’s a very complex approval process.
Boeing has a lot of things being approved, so the company is delegated the authority to approve a lot of things itself. And there was a question about how that occurred… is that the fox guarding the henhouse kind of thing? So, there were lots of questions around the certification process. And one of the things that certification process failed to do was to require Boeing to A) tell pilots about the system in the airplane and B) train pilots on how to use that system in the airplane. And it’s not only the pilots; there were some maintenance issues involved in this too. So one of the parts of bringing an airplane into the real world is to make sure that the people who maintain it understand how that system works as well.
So there were lots of places where that system failed and needs to be upgraded, and that’s what the FAA asked me to do: to lead that process with those international experts from nine different approving authorities, plus NASA, to look at this one and make recommendations on how to make this approval process more robust. Something this weak – with, as Andy said, “a single point of failure” – should never, never happen in approving an airplane, but it happened here, and there are lots of reasons why. That’s one of the things the FAA wanted to find out: how did this airplane ever get through the process, and what can we do to make sure it never happens again?
[00:15:39] Karen Wolk Feinstein: Well, I was struck when you talked about complacency, because most of us do not look at the airline industry and think complacency. The commitment that airlines make to zero crashes – zero; they’re only going to be content when there are zero crashes – I think is so important. I saw a program last night; they’re still investigating the Hindenburg explosion. That was in 1937 and it was a blimp – most of us will never ride in a blimp – but, I mean, you know, that kind of dedication… And those of us who’ve observed safety in the healthcare industry know how many times we’ve heard physicians and others say to us, “look, medical errors are an inevitable part of healthcare. You’ll never get to zero.” But I worry that if you don’t strive for zero… complacency is built into that.
So, Andy, here’s something interesting. As the film noted, Boeing historically had a great reputation for safety. What was the expression… “if it ain’t Boeing, I’m not going!” What factors led to a shift in culture that changed the prioritization of safety? And in your estimation, which of these shifts caused the 737 MAX-8 failures?
[00:17:02] Andy Pasztor: Those of us who worked on the film tried to put this into a broader context and tried to use Boeing as an example, or a microcosm, really, of larger trends affecting all sorts of businesses and industries: the striving for short-term results, the financial pressures as there were mergers and fewer, larger firms competing against each other. I think that in Boeing’s case, an important part of the shift, and a part of the reason for the tragedies that occurred, really goes back to the summer of 1997. That’s when the company merged with McDonnell Douglas, a weaker competitor, but a competitor with a different kind of history and culture. And there’s ample evidence over the years that Boeing changed from an engineering-driven company to a much more financially-driven company, more concerned about returns and Wall Street’s view of the company. And so, I think that percolated through the hierarchy and through the engineering ranks. Of course, that’s not to say that Boeing lacks fantastic engineers or that it’s lost some of its expertise.
I don’t think that’s the case. But the message from the top, as the film shows and as lots of other reporting indicates, was more focused on returns, more focused on financial issues and how well the stock was doing, and that permeated many of the meetings, many of the discussions. Particularly for the 737 MAX, the issue was how much training would be required of pilots who started to fly it, and some of the decisions that were made reflected the financial impact of assuring airlines that they wouldn’t require additional pilot training. And so, I think that really is a major reason that Boeing seems to have lost its way, certainly with the MAX.
[00:19:12] Chris Hart: So, can I chime in on that one? Because I think that may be an overstatement. When people say they abandoned safety in favor of profits, I think that’s an overstatement. There’s a difference between abandoning safety in favor of profits and making safety improvements as cost-effective as possible. Because the reality of it is that, as much as safety experts hate to admit it, if a safety improvement is not cost-effective, if it hurts the bottom line, it’s probably not sustainable. So, you’re probably wasting your time creating a safety improvement that’s not going to be sustainable.
And the safety improvement that Boeing was trying to create here was to allow the purchasers of this airplane to require less training of their pilots, because they were trying to make the airplane behave exactly the way the predecessor airplane, the 737 NG, behaved. That would minimize simulator training for the new airplane, so that’s basically trying to make safety more cost-effective. And of course, we know in hindsight that it was a terrible decision, but based on what they knew at the time, I’m not so convinced that it was such a terrible decision to try to minimize the simulator cost to the purchasers of the airplane.
[00:20:21] Andy Pasztor: So I would just say two things. Number one, since the film has been streaming, we’ve been surprised by how many folks inside and outside Boeing and in the industry have said that the film underplayed – didn’t give enough credence and emphasis to – the impact of the merger with McDonnell Douglas. So, I mean, people who worked for Boeing and still work for Boeing really do think that cultural shift mattered. And, as Chris was saying, no one’s suggesting that Boeing completely turned its back on safety, or purposely wanted to build planes that would crash or that had inherent faults. But the emphasis and the impetus for financial returns colored some of those decisions. These are difficult decisions.
Often they’re not black and white decisions. And so, I would say that the merger colored some of those decisions. Secondly, I understand Chris’s point, which many people have made: to have a good safety program and a good safety feature, you have to think about the cost. That’s absolutely true. But what the investigations by Congress and by the Justice Department and by the media have shown is that Boeing officials – certainly the pilots who were in charge of training programs – purposely hid information from airlines, from regulators, from everyone they could. And the only possible reason to do that was to minimize the training costs for this new airplane. The Justice Department has said they don’t believe that senior Boeing executives actively participated in deceiving regulators, but it’s absolutely irrefutable that lower-level Boeing officials, driven by financial concerns, did mislead – purposely mislead – customers and regulators about the system that we’re talking about on this aircraft, the defective, poorly designed system.
[00:22:31] Karen Wolk Feinstein: You know the old expression, “time is money,” and the feeling now is “speed equals success”: I type in something I need, and Amazon delivers it the next day on my doorstep. I understand that. The problem is, though, it also encourages people to cut corners. You sort of can’t have it all. And in the case in Nashville – from what I know of it, and it’s possible this isn’t what happened – the nurse was under pressure to be fast, but it wasn’t life-or-death. In healthcare we have life-or-death emergencies, but here the radiology department didn’t want to change its scheduling for one of its diagnostics. So she had to hurry, hurry, hurry… she was on a floor – she wasn’t a regular nurse on that floor – had to grab the medication, run down to radiology… but it was a scheduling issue. The woman had been shopping the day before. So, you know, I worry a little bit that we just cut corners. There’s this belief that as a nation, everything has to hurry up and be instant.
Anyway, let me go back to another question, Andy – again, something that fascinates us. How much trouble did you have to go through to get the information that you needed when you were looking into the two plane crashes? I mean, how critical and how accessible is data transparency in aviation? Aviation is famous for that – we all know about the black box – but how does it really work?
[00:24:03] Andy Pasztor: So, in the real system, among crash investigators and airlines and the airplane manufacturers, there’s a huge amount of data that’s shared, even about specific accidents. The airline or the manufacturer can put out an emergency directive or message saying “there’s a problem we discovered after a plane crash.” The FAA can do that; they can mandate fixes, they can even ground aircraft. The National Transportation Safety Board certainly can put out warnings before it finishes its investigation – in fact, it’s part of its charter to warn people about problems. So, there are lots of ways to get information. But in the aftermath of a crash, for outsiders, including the press, it’s exceedingly difficult to get accurate, correct information. And I think that’s not going to change; there are good reasons for people not wanting to talk prematurely.
We found it very difficult to get even the rudimentary facts about some of the systems on the plane. But of course Congress has a role, and unlike in the medical setting, congressional committees stepped in very quickly and started gathering information. Even though they had trouble getting some documents from Boeing, they started delving into it, and of course, so did the press. But to answer your question directly: it’s exceedingly difficult for outsiders to find out what happened in the aftermath of a crash, unless the regulators or the NTSB or the plane manufacturer wants to provide that information. And too often, I would say, they don’t provide enough preliminary information to help people understand what happened. There could be more transparency without hurting the probes that are underway.
[00:25:59] Karen Wolk Feinstein: So, I know Chris is going to want to say something!
[00:26:01] Chris Hart: Just from a different perspective: our investigation, the Joint Authorities Technical Review, was created not to find out what went wrong but to find out how to make the approval process more robust. And in our search, we found nobody who intentionally lied or withheld information from anybody. In fact, when we were doing our Joint Authorities Technical Review, Boeing was extremely open with us on the information they gave us, and they even reconfigured simulators – while they were trying hard to use all their simulators and resources to get their airplane back in the air again, they set aside some for us so that we could do our study better. Because nobody, nobody, nobody ever wants an airplane to crash, especially the manufacturer. But again, whether anybody deceived anybody wasn’t the scope of our review; from what we saw, we didn’t see any intentional withholding of information from the regulator or from the carriers.
[00:26:58] Andy Pasztor: Now of course – I mean, we don’t want to dwell too much on the MAX crashes – but in fact, the Justice Department released scans of emails showing that Boeing officials purposely withheld information and misled regulators and airlines before the crashes. Chris is talking about the effort to understand what happened after the crashes. I’m talking about what occurred before the plane was certified as safe to fly, and I think we should keep that in mind.
[00:27:33] Chris Hart: What we found was a huge failure to communicate because of a very complex delegation process, but we did not find any intentional hiding of information from anybody. So that’s just, I guess, a matter of a different perspective.
[00:27:45] Andy Pasztor: Well, the Justice Department got Boeing to acknowledge that their employees purposely withheld information and essentially misled the airlines and prevented the FAA – part of the FAA – from finding out about the need for training, or the issues surrounding training. I think we should recognize that Boeing has acknowledged that and taken responsibility for some of its employees – one of them was charged criminally and was acquitted in a trial – but Boeing signed documents saying, we agree that our employees did commit criminal acts in not properly providing information to regulators.
[00:28:36] Karen Wolk Feinstein: I’m going to ask Chris a tough one. Perfect or imperfect, the transparency and collaboration among airlines in the interest of public safety looks pretty good from the perspective of healthcare, where we have yet to see the major stakeholders come together. I know you serve on the Joint Commission on Accreditation of Healthcare Organizations. How do we transfer this collaboration and transparency in the interest of public safety to healthcare?
[00:29:14] Chris Hart: Well, that’s a very good question. A lot of people ask me, are people in the aviation industry more safety-oriented than people in other industries? And it’s not necessarily that they’re more safety-oriented; it’s that there’s so much attention placed on aviation safety. And that’s because a large percentage of the population is afraid of flying, so there’s immense political and media pressure on anything that goes wrong in the aviation system. And, footnote, all of aviation is governed by federal law, and all of the federal legislators are frequent flyers because they have to go from DC to their district and back. So they’ve got a lot at stake, because they’re on those airplanes a lot themselves.
So, there’s no question that there’s an intense interest, because an airplane can go off the end of the runway, people slide down the slides and sprain ankles, and it’s national news for three days. Versus a hundred people every day dying on the highways, and you don’t hear anything about it. So there is a lot to be said for that intense media and political pressure; I think it is largely responsible for the amazing amount of collaboration that goes on in the aviation world that improves safety. And collaboration means including everybody who’s got a dog in the fight. If a problem affects you, then you need to be involved in the solution to that problem. That’s what collaboration does: it makes sure those systems are end-user friendly, so that the end-users of those systems don’t get caught in what are called error traps that are created by the system. I don’t see any of that human factors emphasis in healthcare.
And since the whole industry is so human-labor focused – it all depends so much on human nature, on basic human behavior – making those procedures and systems and equipment friendly to the end-users is so crucial. But I don’t see it happening in healthcare, and I think it’s going to have to happen in healthcare. That’s one of the big lessons healthcare can learn from aviation: being collaborative and being respectful of the needs of the end-users is crucial to making sure that you don’t generate error traps that will, very foreseeably, create errors.
[00:31:22] Karen Wolk Feinstein: Let’s all of us think a little bit about leadership and the role leaders play in creating a culture of safety. Paul O’Neill, chair of Alcoa, which was the safest corporation in the world, helped me set up the Pittsburgh Regional Health Initiative. And one thing that struck me: Paul would walk through the plants – and as you know, Alcoa had plants all over the world, over a hundred thousand workers – and he would hand everyone a card with his home phone number. People had home phone numbers in the nineties. He said, “please call me if you reported a safety hazard or a safety incident and nobody did anything about it.” So I said, “oh my God, how many calls were you getting?” He said, “oh, I only needed one. My managers knew I was serious, and they would act right away and report within 24 hours if there were a safety incident.”
So, I’m thinking about healthcare and the problems we have. I mean, we call people who report something a whistleblower, which really has a lot of negative connotations. And when we assess and accredit and license healthcare facilities, when we reward them with reimbursement, we don’t assess whether there’s leadership with a priority on safety, whether there’s a culture of safety. For both of you: how would you look at an organization and say, this leader is respected by the outside world because safety is very important here – you can tell that there’s leadership for safety, you can tell that there’s a culture of safety? How do you look at an organization and determine that?
[00:33:11] Andy Pasztor: Maybe I can quickly start, and Chris can chime in. You’re now talking about the secret sauce of why the aviation industry is so safe… we’ve been talking about failures and failings in the MAX design, but it’s an incredibly safe system. And I would say that a big reason – maybe the most important reason – is voluntary reporting about incidents. When people make mistakes, when things get screwed up, when there are near accidents or close calls in the air or landing issues, people are told and ordered from the top that the pilot’s job and the mechanic’s job and the air traffic controller’s job entails letting others know about the close calls and the mistakes and the precursors to potential bigger problems.
And that has been inculcated; it’s ingrained in the aviation industry. In fact, when this whole effort started, the focus was: you may get fired if you don’t report some mistakes, but you will not get fired if you report mistakes that were not intentional – that didn’t involve alcohol or drugs – because other people could get caught in those traps, as Chris said. For me, the aviation industry represents a top-down, absolute commitment to those voluntary reporting programs. And it’s not because the airline managers care less about making profits than other industries, or that they’re more magnanimous, or even that they’re more forward-looking. They’ve made a clear-cut business decision that it’s bad for the bottom line to have even one crash. A jumbo jet crash can entail a billion dollars in liability and all sorts of other problems, and boy, the MAX issues have cost Boeing multiple billions of dollars and terribly tarnished its reputation.
So, I would say the aviation industry is a very good example where the top-down commands and expectations are absolutely clear-cut. And I think it’s very relevant to medicine, because many critics of the current reimbursement and medical safety system would say that management at too many hospitals has decided the current rate of unnecessary deaths and medical errors is an acceptable cost of doing business: it’s horrible, it’s very bad, we’re going to do what we can about it, but in fact it’s a sustainable cost of doing business. The airline industry twenty years ago decided the status quo was not a sustainable situation, that the cost of crashes was not acceptable. The financial and economic cost to the companies, to the airlines and the manufacturers, was simply not acceptable. I think that’s a major turning point, and when we talk about the medical system, it’s really important to keep that in mind.
[00:36:17] Chris Hart: And I think Andy has really put his finger on one of the main foundational differences between aviation and healthcare, and that is the fuel for process improvement. If you assume that the people you have are, on the whole, competent, proud professionals who are highly trained and who are living by the credo of “do no harm,” and yet you’re still having things go wrong, then when things go wrong it’s probably not because you had, quote, “bad people.” It’s probably because the systems and processes and procedures they’re following aren’t working the way they’re supposed to work, and they create error traps. And that’s what information about errors and near misses reveals: every time you have an error or a near miss, it shows that there’s a defect in the system, and that’s what needs to be targeted – not the person, but the defects in the system. That’s what needs to be fixed. And if you sweep that valuable information about errors and near misses under the rug, you lose all of those opportunities to make your system safer.
So, I think that is one of the fundamental, foundational differences between healthcare and aviation: what do we do with this information about errors and near misses? Do we use it as a learning opportunity, or do we hide it as much as possible – and to the extent we don’t hide it, punish people for it?
[00:37:30] Karen Wolk Feinstein: It is so critical also to understand how little gestures say everything. I think of a hospital where they started ordering a cheaper surgical tape, and the surgical tape was giving patients big blisters, which were not only uncomfortable but a pathway for infection. When the nurses reported it, everyone ignored it, and they made every case they could to replace the surgical tape. And finally they were told, “well, we ordered a large batch. Let’s wait until it runs out, so, I’m sorry.” That’s one little gesture… the nurses disclosed that there was a problem, but, you know, were we really going to throw away all the surgical tape? So, I always think of how leadership says so much by sometimes the smallest response to something that could cause harm.
[00:38:28] Chris Hart: That’s also a great example of a process that’s obviously not friendly to the end-user.
[00:38:33] Karen Wolk Feinstein: I would say! And I would like to think it was rare, but I’m afraid it’s not.
[00:38:39] Andy Pasztor: And you can look at this from a macro perspective, as we have, and also from kind of the micro perspective that you raised. In aviation – mostly two pilots, sometimes three – the copilot these days is obligated and trained and expected to tell the captain, “We have a big problem here, captain; we know we shouldn’t be doing this,” and after the fact to report to the airline: this is what happened. Basically, to report on the captain. How many nurses feel empowered to do that in an operating room, or after the fact when something happens? I would say not nearly as many as copilots in airline cockpits. And that authority gradient, as it’s sometimes called, is quite a big deal. Aviation has found a way, for the most part, even around the world, to minimize the authority differences and encourage copilots to report. And that’s just a small example of how the systems differ, but I think an important one.
[00:39:54] Chris Hart: So, let me tell you, that was a huge uphill battle, because aviation comes from maritime, and the maritime tradition for hundreds of years was: the captain is God and everybody else carries the bags and shuts up. Well, aviation was that way too, for a long time, until they realized, you know what? We’ve got two competent people here who can contribute to the safety of this effort. Let’s use that knowledge and expertise that we’re paying for in this other seat and take good advantage of it. That’s a great example of an evolution that was difficult in aviation. And I think the reason they were able to make that evolution, which healthcare is having trouble with, is very, very simple: the pilot is the first one to arrive at the scene of the crash.
Doctors and nurses rarely get physically hurt by their mistakes. That doesn’t mean they’re any more willing to make mistakes – they don’t want to make mistakes either – but the point is, when you have so much at stake, being the first one to arrive at the scene of the crash, you’re much more willing to participate in reporting problems that are occurring, and you’re much more willing to have your copilot tell you what to do to keep you from dying. So I think that’s a huge difference, and one of the reasons that one size doesn’t fit all. In healthcare you typically don’t get hurt by your mistakes, whereas in aviation, often you do.
[00:41:09] Karen Wolk Feinstein: And another problem that we have in healthcare: if you are a patient – or let’s say a customer – and you, for any reason, think that the airline industry isn’t safe, you have options. Well, unless you’re going to South Africa, but you have options. Look at what happened during the pandemic, when people were afraid of getting COVID because they were in close quarters: families like ours started driving on crazy vacations, any place we could drive. You have an option. But the problem for the public is they don’t have a lot of options in healthcare. When you’re really sick, you need to go to a hospital, you need to find a doctor, or you need to find a primary care office.
How do we get the public to care as much about safety – their safety? The numbers are extraordinary. We talk about how many people died in the 737 MAX crashes compared to the quarter of a million people who may die from preventable medical error in a normal year – and multiply that in a pandemic. Why is the public so complacent about healthcare, where they have to go for care, and yet they’ll respond immediately to a threat in air travel? How do we get them activated?
[00:42:31] Chris Hart: That’s a very good question, because as I mentioned, it’s that political and media attention on safety mishaps that lights that fire under the safety professionals, under all of our professionals, in aviation. You don’t have that in healthcare, and my theory is it’s because these deaths happen in onesies and twosies at most. So they don’t get much media attention, don’t get much political attention. It wasn’t until the Institute of Medicine published their report, To Err is Human, in the late nineties that people were aware of large numbers like what you’re talking about, in terms of how many people are killed every year in US hospitals due to medical error. But even then, if you go to Joe Public and say, “how safe is healthcare compared to aviation?” there’s really no way to compare them, because in aviation you get all that media attention. You don’t get that in healthcare.
[00:43:19] Andy Pasztor: I agree with Chris, the media is important. But also, looking down the road a little bit: reliable, trusted, third-party sources of information – not just the media, but some healthcare organizations, or some new organizations that need to be created. Without some sort of trusted, third-party, outside data source, there’s a dearth of information, of benchmarking systems against other systems, or of just letting people know about the extent of the safety issues that hospitals face. There are some organizations that do that, of course, but maybe not enough.
The media can help, but I don’t think there will be a really big step change until there is something – and of course, Karen, you’ve talked about this – something comparable to the NTSB, some sort of national organization able to release the information, even on a large scale, to let people know what the safety issues are. I don’t think the media alone is enough. I think you need some outside, reliable source of data to be able to put it into the proper perspective.
[00:44:40] Karen Wolk Feinstein: Andy, I agree with you very much. We would love to have an NTSB – a National Patient Safety Board. We need a home for that kind of research: for understanding the preconditions for harm and finding solutions. And particularly, one thing I like… I love all the autonomous solutions that take the burden off the operators, the things that have become a part of air travel, train travel, travel on the highway, that protect passengers. A lot of them just happen autonomously, and I think that’s really critical. We are very primitive in healthcare – just look at the case of the nurse in Nashville.
So, Chris, until we have an NTSB-like National Patient Safety Board, I’m pretty excited about your company, which does focus on improving safety, particularly from the perspective of autonomous safety technologies. I do want to tell you that we hope your solutions will get applied to healthcare as soon as possible, because we’re so far behind. I want an airbag when I travel through my healthcare providers. I want autopilot. I want those automated safety dispensers to not let my nurse take out a deadly drug, at least not without a rescue drug. So we’re cheering you on, and I hope you bring your engineering and aviation expertise to autonomous solutions in healthcare.
[00:46:28] Chris Hart: Well, one of the things that would help – because I am pushing, as you know, for an NTSB-type of organization in healthcare – my impression is, correct me if I’m wrong, that there are a lot of reports on what went wrong when some adverse event occurred, but there are not many reports on why it went wrong. And that’s what the NTSB is really good at. Back in the old days, their reports used to end with “pilot error,” but as their processes evolved and they became smarter at this, they realized, “you know what, that pilot didn’t make an error on purpose” – and plus, by the way, footnote, he killed himself – so there must be something going on in the systems around the pilot. So let’s figure out why this pilot error occurred. Was it because the equipment was not friendly to him and didn’t give him the information he needed in the moment? Or because the pilot was not well-trained? Or because the pilot was fatigued or impaired? What was the reason this error occurred?
And that’s one of the things the National Patient Safety Board could help do: identify not just what went wrong but why it went wrong, because unless you know why it went wrong, you don’t know what to do to keep it from happening over and over again. Having said that, if I’m out picking hospitals, I would love to have a table that says this hospital has this level of safety and that hospital has that level of safety. But here’s one of the conundrums of that approach: you notice airlines don’t compete on safety. You don’t see any airline ads that say, “we are the safest airline out there.” There is no competition on safety, and it’s precisely because they don’t compete on safety that they are willing to share safety information so freely – because it’s not used against them, so to speak.
So I think, ideally, hospitals shouldn’t be competing on safety either. They should all be world-class safe, and they should be competing on other factors besides safety. Safety to me is not an appropriate variable to compete on, because you want everybody to be as safe as possible and not be afraid to share safety information because there’s a disparity between their safety goals.
[00:48:27] Karen Wolk Feinstein: Well, there’s an expression, “from your lips to God’s ears!” Andy, did you want to say something?
[00:48:32] Andy Pasztor: Just quickly piggybacking on Chris’s point, which I completely agree with – and to sort of go back to where we started our conversation – if you look at why the 737 MAX crashes created such a maelstrom of concern and effort and investigation, really worldwide, it’s because two of the same planes, relatively new planes, both of them a new design, crashed so quickly one after the other. That just never happens in aviation – two crashes for the same basic problem. And if you had a much more transparent medical system, where people could see problems occurring and could read about problems occurring, you would have a better way to prevent the same mistake from occurring in this hospital and the next hospital and the next state. Aviation has been very good, certainly in the modern jet era, in the last 20 or 30 years, at not repeating the same mistakes.
I think the medical system is not as good at not repeating the same mistakes. Part of the answer is just to make sure that information about mistakes, even tragic mistakes, gets widely disseminated so that people can deal with it, react to it, and prevent it.
[00:50:00] Karen Wolk Feinstein: Well, sharing information would be wonderful. How about more sharing of solutions? Imagine if everyone that had a solution – every health system, hospital, or practice that came up with a really good solution – had a central home to share it. I like to say we’d be building a better airplane, not screaming for better pilots. But anyway, this has been a great conversation. Andy and Chris, I can’t thank you enough. I am very inspired when I talk to people like you, who look at our industry from the outside, because there’s a lot of wisdom there. So, thank you so much for sharing your ideas today.
[00:50:44] Chris Hart: Thanks for having us on to do that, it was a pleasure.
[00:50:47] Karen Wolk Feinstein: To learn more about the effort to establish a National Patient Safety Board, please visit npsb.org. We welcome your comments and suggestions. If you found today’s conversation enlightening or helpful, please share today’s podcast or any of our other podcasts with your friends. We can’t improve the effectiveness of our health system without your help.
You, our listeners, friends, and supporters are an essential part of the solution. If you want a transcript or the show notes, with references to related articles and resources, they can be found on our website at npsb.org/podcast/. Up Next for Patient Safety is a production of the National Patient Safety Board Advocacy Coalition in partnership with the Pittsburgh Regional Health Initiative and Jewish Healthcare Foundation.
It is executive produced and hosted by me, Karen Wolk Feinstein. Megan Butler and Scotland Huber are my associate producers. This episode was edited and engineered by Jonathan Kersting and the Pittsburgh Technology Council. Thank you, Tech Council! Our theme music is from shutterstock.com. Social media and design are by Lisa George and Scotland Huber. Special thanks to Robert Ferguson and Steven Guo. Thank you all for listening.