Episode 23: Where Is the System?

How can health care address the systemic challenges that have impeded progress on safety for decades? What lessons can be learned from other industries that have made safety a top priority? Join host Karen Wolk Feinstein and guests Nancy Leveson, professor, engineer, and systems safety expert, and Dr. Michael Shabot, physician, former healthcare executive, and expert in high-reliability healthcare safety and quality, for a thought-provoking discussion on how we can build a safer future for both patients and workers.

Listen to this episode on: Apple Podcasts | Spotify | Health Podcast Network

Referenced Resources (in order of appearance)

How Paul O’Neill Fought for Safety at ALCOA (David Burkus, 2024)

How Safe Are Power Plants? (The New Yorker, August 2022)

Why Nuclear Power Is Safer Than Ever (GIS Technology Breakthroughs, 2022)

Out Front on Airline Safety: Two Decades of Continuous Evolution (Federal Aviation Administration, 2018)

Hospital CEO turnover is on the rise (Health Exec, February 2023)

NQF to Update, Harmonize Serious Adverse Event Reporting Criteria (Healthcare Innovation, April 2024)

Thinking Out Loud: Electronic Health Records (American Council on Science and Health News, May 2024)

One in Four Hospital Patients Suffers Adverse Events, and Some can Be Avoided (Chief Healthcare Executive, January 2023)

Nearly 1 in 4 US hospitalized patients experience harmful events (NBC News, January 2023)

Do You Trust The Medical Profession? (The New York Times, January 2018)

Patient Safety Indicators Continue to Lag Post-Pandemic (Association of Health Care Journalists, March 2024)

Memorial Hermann Earns Straight A’s in Patient Safety (The Leapfrog Group, April 2018)

What Happened During the Deepwater Horizon Spill? (National Ocean Service)

Resistance to patient safety initiatives (PSNet, 2025)

Texas Surgeon is Accused of Secretly Denying Liver Transplants (The New York Times, April 2024)

What’s the Best Way to Stop Tragic Accidents? (The New York Times, October 2021)

Heartbroken (Tampa Bay Times, 2019)

40% of Doctors Eye Exits. What Can Organizations Do to Keep Them? (AMA, November 2023)

America’s Workplaces Are Still Too Dangerous (The New York Times, April 2021)

Improving Patient Safety Using Principles of Aerospace Engineering (MIT News, 2024)

Health Costs, Medical Bills Are Top Economic Concern Among Voters (The Hill, February 2024)

Episode Transcript

[00:00:00] Nancy Leveson: Good engineering has to take into account the people who are using the systems, the social factors, and the motivational factors. Safety is much, much beyond just a few equations or some physical design features.

[00:00:20] Michael Shabot: The change in anesthesia wasn’t by an organization telling anesthesiologists to be safer doctors. It was by implementing technology that allowed them to safely administer anesthetics, making it 10 to 100 times safer than it was before.

[00:00:42] Nancy Leveson: Aviation, nuclear power, the industries that have had the very highest success, have lots of information. They collect information continually because you can’t manage something if you don’t know what’s going wrong.

[00:01:01] Karen Wolk Feinstein: Welcome to Up Next for Patient Safety. We’re thrilled to continue sharing compelling conversations with top experts doing important work to make healthcare safer. I’m your host, Karen Feinstein, president and CEO of the Jewish Healthcare Foundation and the Pittsburgh Regional Health Initiative, a multi-stakeholder quality collaborative located in my hometown but connecting to the nation and the world.

We’ve been working to reduce medical errors for over 25 years, and I might add, we never believed progress would be so slow. However, I know revolutions come from hope and not despair. These conversations are intended to inspire all of us with hope as we work for a safer future.

We’ve spoken often about how other industries have managed to address problems with safety while the healthcare industry has continued to be plagued by errors despite decades of effort to address the problem. My guests today have spent their careers studying what it takes to improve safety in an array of industries and how these learnings could be applied to health care.

Nancy Leveson is a professor of aeronautics and astronautics at MIT. She’s an elected member of the National Academy of Engineering. Nancy conducts research on the topics of system safety, software safety, software and system engineering, and human-computer interaction. She’s received many awards for her work, including the IEEE Medal for Environmental and Safety Technologies, the ACM Allen Newell Award for Outstanding Computer Science Research, and an AIAA Information Systems Award for developing the field of software safety and for promoting responsible software and system engineering practices where life and property are at stake. She’s published over 400 research papers and is the author of three books, including the most recent, An Introduction to Systems Safety Engineering, which just about covers everything you need to know in this field. She consults extensively in many industries on ways to prevent accidents.

Nancy gained her degrees in computer science, mathematics, and management with a sidebar in cognitive psychology from UCLA. Nancy has also published a couple other books, Safeware: System Safety and Computers, and Engineering a Safer World. We are delighted to have you here, Nancy.

And your companion is Michael Shabot, who has written and presented extensively on safety culture and high-reliability organizations. We may get into some really interesting, fertile ground, because Nancy does not think reliability is necessarily the answer to safety; we may come back to that later. Michael is the founding partner of Relia Healthcare Advisors and former executive vice president and system chief clinical officer at Memorial Hermann Health System in Houston, Texas, where Memorial Hermann received the John M. Eisenberg Award for patient safety and quality. He serves as an adjunct professor at the University of Texas Health School of Biomedical Informatics and at the University of Alabama School of Health Professions. He also serves on the Board Quality Committee of the Mercy System in St. Louis, Missouri. So welcome, Nancy and Michael. You’re both very busy and well-informed people.

So let me start with my first, most basic question. As both of you know well, and have informed many of us, other industries have demonstrated that it’s possible to make dramatic improvements in safety. Can you each share the building blocks that you think have helped other industries, and compare that to health care? You can go in whatever order.

[00:05:10] Michael Shabot: I think there are two major building blocks. One is culture and the other is leadership. Without those, you really can’t get very far in terms of safety. And I think Professor Leveson would probably agree with that.

With regard to culture, I had the opportunity to be flown out to the USS Nimitz, a nuclear carrier about 150 miles off of San Diego. I made an arrested landing and was immediately taken up to the bridge, which was very busy; they were in war games, real war games. I spent 24 hours aboard the ship, which was my first exposure to high reliability. I did not have that in health care, and that’s why we’re talking about it today. The average age of the sailors on the ship was 19 and a half years, including a smaller but older officer crew. Every crew member that we interviewed (I was with a couple of other medical staff officers) told us their safety job first, which had been signed off by their next in command, and then they told us their Navy job. We spent a lot of time on the flight deck, which you would think would be very dangerous. It was as safe as you could possibly imagine. There we were literally enveloped in a culture of safety that we had never, never seen before.

The second thing is leadership. I point to the story of Paul O’Neill when he took over Alcoa in 1987. The company was a mess. Profitability was very, very poor. Inventories were bulging. He met with the financial analysts, his shareholders, the big banks, and the rating agencies for the first time in 1987, and he started with, “I want to talk to you about worker safety.” He went on for a bit and was interrupted by one of the analysts, who asked what he was going to do about Alcoa’s wasted inventories.

And he answered, “You might not have heard me correctly. I believe that worker safety is the key to revitalizing this organization.” He closed with, “I intend to go for zero injuries.” He was very serious about that. And that, as you may know, turned the company around to record profits within 12 months.

[00:07:37] Karen Wolk Feinstein: Well of course I like your reference to Paul O’Neill, who was my co-founder here of the Pittsburgh Regional Health Initiative and brought us the kind of credibility that we could never have had without Paul’s partnership. Paul really believed that safety was the fundamental platform in any organization for high performance, that it said everything about your values. He believed if you could master an environment that was as safe as possible, you could probably accomplish whatever your organization set as its mission.

It’s been very interesting because I have to say, I haven’t found a lot of people after Paul, a lot of leaders who would lead with safety, but Nancy, I know you have looked at many industries. What are they doing? Name one—what have they done that we haven’t done in health care?

[00:08:37] Nancy Leveson: I absolutely agree that culture and leadership are important. And Paul O’Neill’s lessons are wonderful. There’s a mistaken idea that if it’s safe, it’s going to be less efficient, less reliable, less profitable. And that’s absolutely untrue. In fact, those things go hand in hand. The biggest problem I see with respect to safety in health care is that the safety management systems in healthcare facilities and in government oversight agencies are very weak and aren’t designed very well.

For example, there’s almost no feedback, or very little feedback, about the problems that exist to those who can do something about them. A basic principle of management is that without information about the process being managed, it’s not possible to make good decisions. The industries that have made dramatic improvements in safety, such as nuclear power and aviation, all collect extensive information, and they use that information to design improvements. Otherwise, you’re working in the dark. You’re doing things, thinking “I think this was the problem, I think this will happen,” and then you end up not necessarily doing very effective things.

[00:09:59] Karen Wolk Feinstein: I think, Nancy, tell me if I’m absolutely going in the wrong direction, but after reading your book, and even though you are sitting at ground zero for engineering, mathematics, and computer science, I love the fact that you say safety is also a political, social, motivational, attitudinal, and communications challenge. It’s not something you just engineer in. Am I misquoting you?

[00:10:29] Nancy Leveson: Well, you know, I consider engineering very broadly. As I think you said, my degree is in computer science. I am not an aeronautical engineer, although I’ve worked in safety and aerospace for a very long time. Engineering is just design. And so many people think it’s just a bunch of differential equations and numbers and that’s not it at all.

Good engineering must take into account the people who will be using the systems, the social factors, as you said, and the motivational factors. Safety is much, much beyond just a few equations or some physical design features.

[00:11:15] Karen Wolk Feinstein: I’ve heard Michael talk about the fact that mining, and I love his chart, one graph says it all, right? Mining, which is basically considered the most hazardous industry in the world, gets safer and safer and safer. What is holding health care back? Our curve is going in the wrong direction. Their accidents are going down; our adverse events are going up. What is holding us back in health care? I know you both have an opinion about this.

[00:11:52] Michael Shabot: Well, I think there are two major factors. And you’re right, Karen, careful measurements of harm done in the last year show literally the same measurements, the same percentages of harm, as were measured 20 or 25 years ago, and that’s depressing.

Two major factors. Number one, leadership. The average tenure of a hospital or health system CEO in this country is 4 to 5 years. That isn’t really long enough to change the culture of an organization, which is what needs to happen. And secondly, very importantly, most clinicians, most people who work in health care, trained and worked in an environment where harm was relatively common.

They believe it’s normal and natural, and that zero harm in health care really can’t be achieved. That is something that we’ve proved to be untrue, but generally across the healthcare industry, that’s the feeling: zero is just not possible.

[00:13:07] Nancy Leveson: I agree with those, but I would also add some others. One of the most important is the focus on blame and human error in health care. There is this belief that there are just bad apples, and if you get rid of the bad doctors, you’ll get rid of all the problems. In fact, I had a paper submitted to a prestigious healthcare journal, and the editor refused to send it out for review because, she said, all we have to do is get rid of the bad apple doctors and then we don’t need all these other things that you’re suggesting we do. And it shows up in most root cause analyses: most adverse events are believed to be caused by human error.

I believe a lot in systems thinking and systems theory. Human error is a product of the environment in which humans are operating: stress, incentive structures, bad equipment. For example, electronic health records (EHRs) are causing lots of problems, but they’re never blamed. The EHR systems aren’t blamed; the humans in whom the EHR system induces an error are the ones who are blamed. And there’s talk about “just” culture. What that does is say, well, we won’t blame the human, but then we don’t look deeply at what really is the problem. And you can’t fix something if you don’t understand it.

I think that’s part of the problem. I think there’s also another problem. Having worked in healthcare safety since some people were massively overdosed in 1985, I’ve been in it for a long time, and there’s this deep distrust of anyone outside of health care. The belief is that only doctors and other clinicians understand medical safety; other people don’t.

You certainly don’t want an engineer in here, because all they do is make machines. And so you’re not necessarily getting the right leadership in the patient safety community that Michael was speaking about.

[00:15:38] Michael Shabot: Let me just comment on that. First, I would have to say that physicians are not trained in safety and quality. They should be, but the training in medical school and in residency is to jam in as many facts and techniques as possible in the limited time available, which of course is 7 to 10 years. It’s a long time.

That’s the focus. We didn’t really get any training in quality and safety, except in the last few years. Where I’m on staff at the University of Alabama, there are now graduate programs in quality and safety, but there certainly weren’t when most physicians went through.

The point you made about blame: one of the things we’ll talk about, and the key to high-reliability safety at Memorial Hermann, was the system-wide implementation of “just culture” throughout the entire organization. Separate from any individual care-provider issue that might go to traditional peer review, the event itself was analyzed and brought forward, and those involved in the event, even a close call where there was no harm, were honored for bringing it forward. Those individuals were brought to the board room from across the system to explain what they did to the board quality committee or the board itself.

They were honored at their own institution and at the system level. And then the problem was dissected from the top down, meaning from management, from initiatives, from supplies, all the way down to the individual unit and care-provider level, to determine, from the event rather than the individual, what could be done to prevent that event from ever occurring again.

There were literally thousands of those kinds of solutions that came out of just culture. Without just culture, without psychological safety, you have clinicians and employees hiding adverse events rather than disclosing them.

I’ve consulted for two healthcare organizations in different parts of the country whose boards found out there was a safety problem when they read it in the daily newspaper. That’s the wrong way to learn about a safety problem.

[00:18:17] Nancy Leveson: I don’t disagree. I actually believe in just culture. It’s just not enough to say we’re going to have a just culture; you have to do more. I’m going to say something a little controversial. I’ve been involved in a lot of important accident investigations, many that you’ve heard of: Deepwater Horizon, the Columbia Space Shuttle, and Texas City, a large refinery explosion. I’ve also investigated hundreds of others that you wouldn’t have heard of. One of the themes is that the worst accidents are occurring in the companies that are adopting HRO theory. Just culture isn’t HRO; it existed long before. I think people are putting a lot into HRO theory that isn’t there.

If you look at the papers and what these companies did, the accidents were caused exactly by doing what HRO theory suggests one do. Most of those practices are things we’ve known for 100 years in safety engineering to be bad things to do, but the theory doesn’t come out of safety engineers or the safety engineering community; it comes from some management schools. I think people should be very wary about adopting high-reliability organization theory, because many of the practices it promotes, and I mean, its proponents mean well, are actually causing the most serious accidents, including Deepwater Horizon, where we polluted the whole Gulf of Mexico.

[00:20:09] Karen Wolk Feinstein: Nancy, I know, I’ve been through the book. You don’t say this without a certain depth of knowledge. It’s something we all have to take seriously. In what way are high reliability and compliance important but not the answer? Not that we want to be unreliable and non-compliant, but we need to go beyond that.

You both mentioned some things that are very interesting. You cited Paul O’Neill, and he always said, you can’t fix what you don’t understand. Medical education has been so resistant to incorporating instruction in quality engineering, safety science, and systems theory.

I am amazed that most of our health professionals know less than a community college graduate in business about how to set up a safe organization. This has now been talked about for two and a half decades, and I am very frustrated. I see very little change. I still don’t think we are graduating people in the health sciences who understand how to design a safe system. Hopefully, with some leadership, that will change.

The other thing you mentioned that I thought was very interesting is about leadership. I can name leaders who set an astonishing example and got a lot of positive press: Gary Kaplan at Virginia Mason in Seattle, Patty Gabow at Denver Health.

I could also name, and I won’t, leaders who followed that example and got pushed out. Making safety a major platform did not buy them time at their institutions. So, it’s a tricky issue in health care, often because the physicians have been resistant, if we’re being open about this. And the last thing I just want to quickly mention is, yes, we can’t blame this on bad docs, but we do have an issue that we can’t skirt, and that is that it is very hard to remove a bad doctor. I can point to one transplant surgeon who put inferior organs in people who didn’t qualify. He was passed along to two major academic medical centers while CORE knew about the problem and had raised the alarm. The state licensing board knew about the problem, but they said they were helpless.

So yes, I agree with you. This isn’t just about bad docs, but sometime or other, we’re going to have to deal with this issue. We don’t have a way of removing some of these bad docs.

[00:22:57] Nancy Leveson: Exactly. I wrote an editorial about the system problem, which is that nobody wants anyone else to come in and do something about bad doctors, but the system, the design of how to remove bad doctors or retrain them or do something about them, is broken.

[00:23:20] Michael Shabot: Well, all right, I won’t disagree. However, let me tell you that there are examples of health systems that have addressed and solved those problems objectively, by looking at events that have occurred and at individual clinicians, physicians and others, who have recurrently gotten into trouble. I can tell you that at the health system I was most recently at, physicians who had ongoing problems were let go by their fellow physicians through peer review.

The high-volume surgeon at one of the hospitals was let go. The highest-volume medical admitter was let go. It is possible when the whole organization is oriented toward a safety culture and believes in it; then not only the administration and the staff but also the physicians want to continue the high reliability. It’s the best thing for their patients, and it keeps them out of trouble, meaning it literally prevents peer review. Let me just say that when I left Memorial Hermann, one of the hospitals (there were 17 at the time) was going into its ninth year without a retained foreign body, anywhere in the hospital, not just the operating room.

That meant that no physician went to peer review for a retained foreign body. Now, that wasn’t achieved by telling the doctors and nurses to count better. As you know, Nancy, that just doesn’t work. But I want to make the point that processes were changed across the system to ensure that there were no retained foreign bodies. By that I mean policies and procedures that were agreed upon across the system by the physicians and nurses in those areas, and a technical change was made: RFID tags were put in sponges, and patients were wanded before they were closed, whether the sponge count was correct or not. Because, as you probably know, the Association of OR Nurses has measured and published that 57 percent of retained sponges occur in operations with a correct sponge count.

So, the point I want to make about high reliability is that, number one, it is culture based; it doesn’t work unless there’s that culture. Number two, it requires consistent leadership over years. Third, many individual processes are studied based on events that have occurred, even close calls where no one was harmed, so that from the top of the organization down to the individual care provider, the entire process is made highly reliable. The problem that Nancy addresses in her writings, which are excellent, is that we do not have a healthcare system in this country. Even healthcare systems, groups of hospitals and ambulatory facilities, don’t have a system. There is no overall system, frankly, to be engineered. But what we did do was engineer literally thousands of individual processes to make them highly reliable, so that adverse events did not occur in these hospitals and clinics for years and years at a time.

[00:27:15] Nancy Leveson: Reliability and safety are different properties. You can have a very highly reliable system where every component works exactly the way it’s supposed to and still have horrible accidents, because the problem lies in the interactions of the components. You’re talking about redesigning safety systems, and I absolutely agree with that; we have to redesign the processes. But we can’t redesign the people and simply say they should be highly reliable, or that they should comply with procedures. One of the biggest problems is insisting on pure compliance with procedures, because the world is changing. The world is always adapting. Medical care adapts.

And we humans are great at adapting to these kinds of situations, for example, COVID and all the chaos it brought to the hospitals. But rigidly following procedures, or insisting that people believe doing so will solve our problems, is quite dangerous.

In fact, one of the most successful ways to strike in industries where workers aren’t allowed to strike, such as air traffic control, where the government makes striking illegal, is something called “work to rule.” In other words, they follow exactly all the procedures they’ve been given, and it will bring a system to its knees. I mean, it causes chaos very quickly.

We have to be careful. People are very, very creative. They’re very adaptable. They’re flexible, and they adapt to the situation that they’re given. And they adapt procedures when the procedures turn out not to be working well. Now, we should change the procedures, of course, eventually, figure out why they’re not working well and change them. But we shouldn’t simply talk about reliability and safety as the same thing; they’re totally different, and they can conflict. I’ve worked on systems that were made more reliable, which made them less safe, and on systems that had to be made less reliable in order to make them safer. That’s especially true when we have humans and culture and other things involved that are not physical devices. You know, reliability is an engineering concept, and it’s good for physical devices, but not necessarily for humans.

[00:29:53] Karen Wolk Feinstein: The one thing I think we can agree on, and Michael, I’m going to go across the gulf from Memorial Hermann and look at one of our most renowned medical centers, which had a children’s hospital in Florida where a cardiac surgeon had gone rogue and the pediatricians knew it. They wouldn’t refer to this cardiac surgeon, but he went on and on causing havoc. For both of you: what system, in any industry, what industry became safer when it buried errors, when people didn’t take action and disclose errors? And there really is an industry within health care that recodes away errors, which is somewhat shameful. How many other industries have companies that will come in and look at your data and rearrange it so that you look less error-prone? What could we do to change that? That is part of the culture that cannot lead to progress.

[00:31:05] Michael Shabot: In health care, if you bury errors, you’re going to be burying people. I was fortunate enough to work in two different systems with the kind of environment that I’ve described, so I can’t be told that it’s impossible, because I’ve seen what’s possible by changing processes. And one thing I would say to Nancy is that many times the processes that I’ve talked about perfecting are very straightforward. They just need to be agreed upon across the system, from the top of the organization and the executive ranks all the way down, and then followed. For example, to avoid an infection with a central line, which can be deadly, there are about 20 steps that need to be taken during insertion of that catheter.

I started this at Cedars-Sinai in Los Angeles, and I continued it system-wide at Memorial Hermann. Every one of those steps is measured and recorded in the electronic health record, and the results are reviewed. When we started doing that, taking all the best evidence, the infection rate literally went to zero. It was zero almost every month in almost every hospital. These infections are measured very carefully by independent infection preventionists and with blood culture lab tests, et cetera. So, no recoding there; patients really aren’t being made sick or dying from those infections. Perfecting those processes and making them reliable, I’ll use those terms, protected patients in a healthcare system that is in many ways not a system in the engineering sense.

And let me just add that one of the challenges of health care is that patient care is so stochastic. It’s not predictable. It’s one thing, in a clinic, to perfect the process of accurately giving a vaccine. A procedure can be outlined for that that everyone agrees on. It can be relatively simple. It’s followed. No one gets the wrong vaccine. In that same clinic, a patient may come in with abdominal pain of uncertain etiology and be worked up as an outpatient, given orders for lab tests and imaging. The patients are on their own to get those tests, and they may find it challenging to schedule appointments for them.

Their symptoms may evolve over time, they may see another doctor or a specialist, so there’s another set of ideas about the etiology of abdominal pain that has to be taken into account. There may be family or care provider issues or cognitive issues. There isn’t an existing process for that.

Now, could there be? I think there could. I think if we truly had healthcare systems, there would be some entity or person that would guide the patient through all that. We just don’t have that. As you know, Karen and Nancy, you are your own advocate for your health care. You are your own systematizer. No one’s doing it for you.

There are special challenges that we have, and that’s why we focused on making countless individual processes reliable across the system. Before I left, we’d given out nearly 400 of what we call Certified Zero Awards. These are for hospitals that went a year or longer without a measurable adverse event in any one of many different categories. We had given over 1.5 million blood transfusions without a transfusion reaction. That’s better than six sigma reliability. In 2013, the South Carolina Hospital Association saw what we had published and started awarding Zero awards to its hospitals, and by the end of last year, it had awarded over 1,500 of them.

So, it is possible. It’s challenging. It’s difficult. We’re not operating within a true system, but it’s possible to make it safe.

[00:35:47] Nancy Leveson: I absolutely agree that it’s possible to make it safe. And, you know, my whole career has been about providing ways to design procedures. That’s what we do: design procedures that eliminate hazards.

I don’t think we disagree about that. The problem is focusing purely on compliance with procedures. For example, we could build airplanes without any pilots today; we could be flying them tomorrow. I’m not getting on one, but the airlines would love it, because pilots are expensive.

The reason we leave humans in systems is because they can adapt to situations that are different, that no one thought about at the time. And that’s why I want a doctor who doesn’t just follow a recipe, not a computer that diagnoses me. I’d like a human who can adapt and can think for themselves.

So, yes, I absolutely believe that the procedures can be improved. In fact, that’s what we’ve done with laboratory data recently: in that study, we showed how you could change the procedures, or change the management system, to reduce adverse events.

But again, the reason we have human doctors, and can’t just have computers diagnose medical problems from the symptoms that are typed in, is that reliability, repetition of the same thing over and over again, is important only in some situations. It’s important that you always disinfect, for example, or that you go through procedures that make you aware that you might be leaving a sponge in or operating on the wrong part of the person. Those are important, but they’re not the only solution to our safety problems. We need procedures, but we need people who know when to rigidly follow the procedures and when not to.

[00:38:25] Michael Shabot: I agree. I agree. And I think part of our safety problem in health care is that we don’t really have systems. As you pointed out in Engineering a Safer World, there’s drift in procedures. Humans adapt. They may find a quicker or easier way of doing something, and trust me, we found any number of those in the high-reliability procedures that we had developed. In some cases, we had to modify or change them to take into account special circumstances that weren’t perceived in advance.

I do want to get back to Karen’s point about wayward physicians being tolerated. I will just tell you from experience that in an organization with a very strong culture of safety, that behavior isn’t tolerated at any level, whether the administrative and executive level or the physician and care-provider level.

I knew about the external facts of the situation that you described, Karen, and I can tell you that it could not have occurred in an organization with a strong culture of safety. In a culture of safety, people hold each other accountable and take whatever measures are required to keep the environment safe, even though those measures can be very difficult and painful.

[00:40:09] Karen Wolk Feinstein: So, let me take you to a place where I think we’ll all fundamentally agree. It’s about leadership and buy-in. You’ve got to have buy-in. And because health systems are odd, where the power lies is a little complicated, and buy-in is missing. We know we have some shining examples, and Memorial Hermann has won awards and accolades. Unfortunately, we don’t all have systems where there’s widespread leadership and buy-in.

Let me look at a shining example, one specialty where you got buy-in, and that was anesthesia. I’m going back to 1985, right in the vicinity of Boston. The Anesthesia Patient Safety Foundation was formed, and there was widespread buy-in and acknowledgment that their specialty had problems. No more burying. Acknowledgment of problems, and an interdisciplinary approach. They looked for engineers, scientists, human factors experts, anyone who they thought could help them as a specialty become safer; they didn’t have to be medical. As we know, anesthesia became dramatically safer. Not only that, but they also weren’t afraid of technology. It enabled, but it wasn’t the answer. It was a human-technology partnership that eventually led to the whole simulation movement and the development of simulators, involving both MDs and nurse anesthetists. But what I love now, when I travel the world and find leaders in safety, I want to tell you, whether it’s the Pacific Rim or England or Scandinavia, they’re anesthesiologists.

Anesthesia got excited. It was a source of pride for them, and it was an intellectual engagement that hasn’t stopped. How do I get that chain reaction in health care, where we would be able to say to the people leading our systems, this is good for your workforce, it’s going to be worthy and supported by your board? Do a Paul O’Neill: own this pursuit of safety as a reflection of what you know your core mission is.

How do we light that spark? I don’t think the spark has been ignited.

[00:42:46] Michael Shabot: I think you’re right. Let me comment first. You know, I was a practicing trauma and general surgeon with a surgical service, in an ICU, when that change occurred in anesthesia. Anesthesiologists were, frankly, flying blind in administering anesthetic gases. You know that. There were no measurements that were readily available. They just sort of experientially, on a patient-by-patient basis, gave more or less of one gas or another to put the patient to sleep. What changed, and it was required in hospitals and very, very expensive at the time, was putting mass spectrometry measurement devices in every operating room.

At that time, there were no in-room devices, so gas lines had to be run from every operating room back to a central mass spec where the gases were analyzed, and the results were displayed electronically in each room. Then anesthesiologists knew what they were giving, they could see the patient’s response, and they could tailor it. The change in anesthesia wasn’t by an organization telling anesthesiologists to be safer doctors. It was by implementing technology that allowed them to safely administer anesthetics, making it 10 to 100 times safer than it was before. It was a remarkable change. That’s the kind of change that I aimed for in remaking the health systems that I was in. You’re right, that’s the model, but it was a change in process. I want to make one point, and of course, Nancy has made this point as well.

We cannot ever rely on the expertise of a physician, a nurse, a technician, et cetera, in health care to always do the right thing. We’re all human. We all make mistakes every single day. And if you’re a doctor or a nurse or whatever, you’re going to make mistakes at work as well. We have to have processes in place that protect patients separate from my remembering to do the right thing for each patient every day.

That’s what makes anesthesiologists safer providers than they used to be, and we want to do what they did for all health care.

[00:45:20] Karen Wolk Feinstein: Nancy, from your wisdom looking at other industries: one thing we know right now, by looking at the satisfaction surveys, is that our doctors and nurses and other health professionals are not happy. In fact, in some of the systems where I’ve looked at this, half the doctors said they wish they’d chosen another profession. How do we make the pursuit of safety morally, intellectually, legally invigorating? How can we not ignore the fact that one of the number one reasons nurses say they want to leave the profession is that they can’t practice safely?

This is not getting attended to. But my feeling, looking at anesthesia, is that this was invigorating. People say with pride that they’re part of a specialty that takes on safety, still takes it on, and they own safety. How do you instill this? It is a worthy pursuit, but it’s also a practical pursuit.

Our workforce is not happy. They do not want to practice, as Michael said, in a system, or really a non-system, that isn’t designed for safety. How do we spark that pursuit?

[00:46:39] Nancy Leveson: Well, I’m certainly no expert on workforce issues; I’m an engineer. But let me say that it is not just a matter of introducing technology. Introducing technology has made some healthcare workers very unhappy and has introduced lots of problems.

We have to be careful how we introduce the technology, and, you know, clearly the anesthesiology group has done an excellent job of engineering their processes, their equipment, and everything else. Let me just say that there are difficulties in doing that at a higher level and in a wider framework. Again, I think that the safety management systems are broken in health care. We just did a large study of adverse events related to laboratory data, and what we found is that nobody is managing this.

Now, there are some groups in the government who manage parts of it really well, and other parts aren’t managed at all; there’s nobody managing them. There’s very little feedback of information. Again, aviation and nuclear power, the industries that have had the very highest success, have lots of information. They collect information continually, because you can’t manage something if you don’t know what’s going wrong.

And we can sometimes get the information at a local hospital. But when you look at something like laboratory data, which comes from outside of hospitals, some of it over the counter, some of it point of care, and laboratories that may be physically separate from the hospital facilities, at that larger level we just don’t have anyone managing it well enough. One of the amazing things is that this latest study has gotten more attention than anything I’ve done in 30 years of healthcare safety, and we’ve been talking to all the government agencies. It’s amazing. Virtually everyone, every group in HHS, has wanted to hear about this. And what we did was just look at the management systems that are in place, the lack of information, the lack of controls, and the lack of ability, the authority, to do anything about a problem.

You know, management is authority, responsibility, and accountability. In some parts of these systems, there’s nobody held responsible, or too many people held responsible, so everybody thinks somebody else is going to be doing it. One of the problems with reliability is depending on redundancy; it doesn’t work, even in physical systems. And often the people who see the problems don’t have the authority to do something about them. Until we fix that, we can’t get a fix larger than just the small pieces where people have come in. It’s wonderful to see where people have come in and redesigned their systems, figured out what they needed in terms of technology and what they didn’t need, and what kinds of training, all of the things they changed. But we need that globally.

[00:50:24] Karen Wolk Feinstein: You’ve given me an opening for our closing. And that is, as Michael knows, we’ve been working hard on trying to get a National Patient Safety Board. I love it that you said you were speaking to all the different organizations at the federal level who touch on safety and health care.

And that is part of our problem. There are quite a few bits and pieces scattered around the federal landscape, but we don’t have a home for safety. We don’t have a federal locus where, every day, all day, the focus is on how to ignite that spark, which is somewhat invigorating, to take on the problems within health care and start solving them, using what’s available from technology without relying on a technology fix. So, you have given me a wonderful opportunity to close on that note. I love the fact, and Michael didn’t do this prompted by me at all, that Paul O’Neill’s success at Alcoa was widely attributed to his core passion for safety. You gave us everything to think about. We need to instill that in health care. We need every healthcare system, as you did at Memorial Hermann, to get excited about its core mission and use safety as a platform. We’re not there yet, but hopefully through this podcast and the wisdom generated here, we can instill that enthusiasm.

Nancy, it’s somewhat of an existential crisis right now, when you think of the fact that patients are more distrustful than ever of their health system. Employers are very frustrated and disgruntled. One of the number one issues for the American public, when polled, is the cost of their health care and their fear of it unraveling their family savings. We have a lot to build on. Michael, we’re going to have to rely on your leadership. We’ve got to get the healthcare industry to understand, with the help of both of you. Thank you so much for today. This was a very, very vigorous and invigorating conversation. Thank you, Nancy. Thank you, Michael. And I know we could go on for another hour.

To learn more about the effort to establish a National Patient Safety Board and any of the topics we’ve talked about today, please visit npsb.org. Also, we welcome your comments and suggestions.

If you found today’s conversation enlightening or helpful, please share this podcast with your friends and colleagues. We can’t improve the effectiveness of our healthcare system without your help. You, our listeners, friends, and supporters, are an essential part of the solution. A transcript of this episode and references to related articles and resources can be found at npsb.org/podcast. Up Next for Patient Safety is a production of the National Patient Safety Board Coalition in partnership with the Pittsburgh Regional Health Initiative and the Jewish Healthcare Foundation. It’s produced and hosted by me with enormous support from Scotland Huber and Lisa George. This episode was edited and engineered by Jonathan Kersting and the Pittsburgh Technology Council. Special thanks to Teresa Thomas, Carolyn Byrnes, and Robert Ferguson from our staff. Thank you all for listening, and please take action, whatever you can do.