Episode 15: Proposing New Partnerships

How has our fragmented approach to safety hindered real progress on preventing medical errors? Can we reshape the healthcare landscape through directed collaboration? Join host Karen Wolk Feinstein and guests Leah Binder, president & CEO of The Leapfrog Group; former NTSB chair Chris Hart, founder of Hart Solutions; Dr. Najmedin Meshkati, professor of Civil/Environmental Engineering, Industrial & Systems Engineering, and International Relations at the University of Southern California; and healthcare executive and patient safety leader Julie Morath, member of the IHI Lucian Leape Institute and PCAST Patient Safety Work Group, for a deep dive into the potential to create meaningful partnerships among patient safety-related organizations and finally make substantial progress on preventing harms before they occur.

Listen to this episode on: Apple Podcasts | Google Podcasts | Spotify | Health Podcast Network

Episode Transcript

[00:00:00] Christopher Hart: Collaboration was really the foundational element that made this work as well as it’s been working…

[00:00:10] Najmedin Meshkati: However, if you look at the Joint Commission's literature, only a very small percentage of mishaps and sentinel events are reported to the Joint Commission…

[00:00:22] Leah Binder: First of all, they have to see the problem and when they look at their claims, they can’t see the problem. You can’t see patient safety through claims…

[00:00:32] Julie Morath: So, a perfect partner would be the NPSB feeding in information and data so that it can be disseminated in such a way that is a clarion call to action.

[00:00:47] Karen Wolk Feinstein: Welcome back to Up Next for Patient Safety, where we envision a world where medical errors, adverse events, and preventable harms are avoided, and where we examine the most promising paths to prevent these tragedies before they occur.

I’m your host, Karen Feinstein, CEO and president of the Jewish Healthcare Foundation and the Pittsburgh Regional Health Initiative, which is a multi-stakeholder quality collaborative. We’ve been working to reduce medical error for over 20 years – mostly unsuccessfully – but we can’t give up because there’s too much at stake. And that is the loss of approximately 250,000 lives a year and long-term injuries for many more.

Throughout this podcast, we've talked at length about solutions to the problem of medical error, ranging from innovative technologies to human factors engineering to changing physician behavior. In this episode, we're focusing on a concept that's been discussed in previous episodes: establishing an independent, non-punitive federal agency, a National Patient Safety Board, to pull together the data and expertise that already exist. In the patient safety landscape right now, there are many players but no central home, so no single entity has the responsibility to ensure safety. Without a home, it has proven impossible to share learnings broadly, to scale solutions, and to realize the potential to prevent harms before they occur.

But a National Patient Safety Board can’t do the job alone, nor should it replace existing organizations. In this episode, we’ll be exploring avenues for establishing meaningful partnerships with organizations that already have a hand in patient safety. We’ll also be looking at other industries that have made incredible strides in improving safety through collaborative approaches.

So, let's jump into our discussion with this incredible, well-informed, and experienced panel. Chris Hart is a member of the Joint Commission's Board of Commissioners and a member of the President's Council of Advisors on Science and Technology's (PCAST) Patient Safety Work Group. He's a former chair of the National Transportation Safety Board, as well as a lawyer and pilot. He's going to share examples of public-private partnerships from the aviation industry.

Dr. Najmedin Meshkati is a professor of civil/environmental engineering, industrial and systems engineering, and international relations – believe me, he is that multifaceted – at the University of Southern California. He's also a commissioner of the Joint Commission and is on the board of the Patient Safety Movement Foundation. For over 35 years, he's researched risk reduction, reliability, the culture of safety, and human factors in industries including nuclear power, aviation, petrochemicals, and transportation. He's also inspected nuclear power plants around the world, including not only Chernobyl but also, here in Pennsylvania, the site of our Three Mile Island accident near Harrisburg.

Julie Morath is a member of the IHI Lucian Leape Institute, a member of PCAST's Patient Safety Work Group, and a former member of the Joint Commission's Board of Commissioners. Julie has held so many leadership roles that there's almost no role she hasn't played in healthcare, from the front line to the highest level of executive responsibility. And she holds the John Eisenberg Award for Individual Lifetime Achievement – the first person to receive that award, as she just told us.

And, last but never least, Leah Binder is president and CEO of The Leapfrog Group, representing employers and other purchasers of healthcare. Leapfrog collects data on safety and quality in hospitals and issues annual ratings. She's a champion in the broadest sense, an advocate for quality, and a frequent contributor to major publications. She's routinely listed among the 50 most powerful and influential people in healthcare.

So I'm going to start with you, Chris. Can you talk about the public-private partners who serve on the Commercial Aviation Safety Team (CAST) and why these partners collaborate?

[00:05:10] Christopher Hart: Thank you, Karen. And you put your finger right on the essential point, which is collaboration. Anytime you have a system that's this complex, in order to figure out how it works, you need to have everybody who's involved in the process, or who's touched by the process, even the maintenance people for the process – they all have to be involved, or there's a good likelihood that your process will not be suitable for all the people in the collaboration, and therefore it won't work.

So just to give you an example: we see a lot of information about medication errors. Well, even without knowing the medication error procedures, which of course are different everywhere you go, I would be very surprised if they couldn't be made much better by a collaborative effort that includes everybody that's affected by the process, so that you significantly reduce errors. If you talk about equipment design… equipment design is inadequate in so many ways because everybody's competing, but nobody does it the way that's best for human factors.

So that results in, A, equipment that's not that friendly for humans in the first place, and B, equipment in the same hospital that serves the same function but comes from two different vendors, so the two work differently. On one, the knob turns clockwise for more; on the other, the knob turns counterclockwise for more. Imagine if one of your cars had the gas pedal on the right and the brake on the left, and it was the other way around in your other car. Imagine what a mess that would be. And that's exactly what we're looking at – again, the failure to collaborate.

Another example, which I think we need, is collaboration and communication between doctors – not just doctors, but all staff – and patients regarding several things. One is their diagnosis. Two is their treatment while they're in the hospital. Three is their post-hospital treatment at home. I think that needs to be collaboratively developed in order to be more effective. So there are enormous opportunities for collaboration, and that's really the glue that holds complex systems together. And when they started collaboration in aviation – after an accident rate that had been coming down for decades started to get stuck on a plateau – they started a collaborative effort to fix it, because the regulator at the time said, "in order to get off this plateau, I, the regulator, don't need more money. I don't need a bigger stick. I don't need more enforcement. What we need is all of us working together collaboratively to understand this system and make it work better." And the result of their collaboration was that they took that flat, stuck rate, which a lot of safety experts at the time thought was unimprovable, and reduced it by more than 80% in less than 10 years.

So collaboration was the foundation for that amazing process, and I see much, much room for that in healthcare in a number of ways. So the bottom-line answer to your question is that collaboration was really the foundational element that made this work as well as it's been working.

[00:07:54] Karen Wolk Feinstein: We say amen – we need more of that in healthcare. So just give us a sense of how this really works. How does CAST get data from ASIAS, the Aviation Safety Information Analysis and Sharing system at MITRE, and then what happens? How do they identify risks and adopt safety enhancements?

[00:08:15] Christopher Hart: Well, for starters, it is entirely voluntary. None of this is mandatory, so that means people are doing it because it's working, and they keep doing it – it's been going for more than 20 years because it's worked so well. So let me tell you what it's done. The conventional wisdom is that if you improve safety, you're likely to hurt productivity, and if you improve productivity, you're likely to hurt safety. And that unfortunately is all too often true. But the airline industry showed that if you do this collaboratively, you can improve safety while improving productivity at the same time.

Another thing that happens frequently in complex systems like healthcare is that when you make a change to one subsystem that's interconnected with all the other subsystems, you get unintended consequences. The difference is that the people in the collaborative effort are thinking and talking safety, but they're also thinking about the unintended consequences for themselves, and about their own productivity. That's why they were able to improve productivity: nothing exits that collective consensus pipeline if it hurts anybody's productivity. So when everybody to whom these unintended consequences could occur is at the table, you minimize unintended consequences.

So, in so many ways, the collaboration proved to be much more successful than ever anticipated. And the icing on the cake, as I say, was that it improved productivity while it was improving safety.

[00:09:41] Karen Wolk Feinstein: Talk a little bit then about… we have CAST, we have ASIAS. How do they work with the NTSB?

[00:09:47] Christopher Hart: ASIAS is the information source. What ASIAS does is gather the information – these days it gathers it from most of the flight data recorders from most US airlines – and puts it together in a single, massive, system-wide database. So ASIAS is the database, the source of the information upon which the Commercial Aviation Safety Team acts to see: what are the tall poles in the tent? What do we need to address? And not only that, but after the fact, are we actually addressing it? Is what we're doing effective at what we're trying to do? Because it quickly brings back the feedback to say this is working or this is not working. So ASIAS is the information source, the fuel, and CAST is the machine that uses the fuel.

[00:10:29] Karen Wolk Feinstein: Spoken as a pilot – you've done this very well. So I'm going to ask some questions of Naj. Naj, you've looked at several high-risk, complex industries, and you're also a proponent of having an independent agency, a home. Could you talk a little bit about why – whether it's the petrochemical industry or nuclear power – why is independence so important?

[00:10:56] Najmedin Meshkati: Thank you very much, Karen, and good afternoon from Los Angeles. It's really a pleasure for me to be on this panel, particularly with my friend of many years, the Honorable Chris Hart. It's really an honor for me to be on the same list with you, Chris. As you mentioned in my background, I have been working with multiple – what I call – safety-critical industries over the last 35 years of my academic research career: nuclear power plants and chemical processing, and in the last 20 years, healthcare, maritime, mining, oil and gas drilling and refining, and others. Since we are concentrating on the National Transportation Safety Board, I will tell you one story about the NTSB, which is a federal independent safety investigative agency. There are two others that I've had the privilege of working with: one of them is the US Chemical Safety Board; the other is a smaller organization called the Defense Nuclear Facilities Safety Board, which has jurisdiction only over Department of Energy nuclear weapons labs like Pantex, Hanford, and others like that.

The importance of an independent safety board is the following. First, they are not considered part of the regulator. In the case of the NTSB, they are not considered part of the Department of Transportation, and this independence really gives them a lot of latitude. Some of their studies that I have seen have been amazing. For example, the NTSB investigated the accident at Fort Totten, which happened in Washington, D.C. in June 2009 – if I remember – when a Washington Metropolitan Area Transit Authority (Metro) train was involved in a collision. That accident investigation by the NTSB has become almost a quintessential accident investigation. The part of that report about safety culture, I've used in my classes as a handout.

That Fort Totten accident investigation by the NTSB has been used by the Nuclear Regulatory Commission as an example of safety culture – they relate the findings of the NTSB to the nine traits of nuclear safety culture. This, to me, is quite amazing. The other thing is that after the Three Mile Island accident of March 28, 1979, and particularly after the Chernobyl accident of April 26, 1986, there was a lot of interest in the United States in an independent safety board. In fact, our own President Biden, who was a senator back then from Delaware – as I may have shared with you, Karen – proposed a piece of legislation for an independent nuclear safety board, and he kept proposing it from 1987 until 1992. But unfortunately, it didn't pass. At that time, Senator Biden had realized that the Nuclear Regulatory Commission is a regulatory agency, but for doing a good incident and accident investigation, we need an independent institution.

[00:14:41] Karen Wolk Feinstein: I have read a lot of your comments on what went wrong at Three Mile Island. You understand, those of us in Pennsylvania, every time we drive to our capital, we see the two towers. We’re reminded constantly. But it is interesting. It was a catastrophe… we haven’t had anything like that since. What has prevented another Three Mile Island?

[00:15:08] Najmedin Meshkati: That's an excellent question. A nuclear accident at the scale of Three Mile Island is what we call a low-probability, high-consequence event, and I couldn't be happier that we haven't had anything like Three Mile Island – which was a reactor meltdown – in this country since. However, we cannot be complacent. Part of the reason is that some of the recommendations were made not by the Nuclear Regulatory Commission but by two other commissions, Rogovin and Kemeny. One of them was a presidential commission; the other was from the US Congress. They investigated the Three Mile Island accident and made a lot of recommendations to President Jimmy Carter back then that were incorporated.

Another reason that we have a good system in the United States, in the case of nuclear power, is that after Three Mile Island – because of Three Mile Island – the industry got together and created an entity called the Institute of Nuclear Power Operations (INPO). INPO is really an industry-generated and industry-created entity that shares information. They do peer reviews of performance indicators and other things. Unfortunately – and I say that: unfortunately – they don't share that with the public; their argument is that this is confidential information. But INPO has done a great job for the nuclear power industry in the United States: information sharing, analysis of incidents, and then sharing that information within their membership.

[00:16:55] Karen Wolk Feinstein: So you bring up something very important to us in healthcare, because the institute keeps the information confidential. Do you have confidence that near misses, mishaps, any adverse events actually get reported that might not otherwise? A lot of things go unreported – I know we have David Classen here, and he's an expert on digging down and telling you how many things go unreported. Do you think that in the nuclear power industry, because the information given to INPO is confidential, people really are sharing any adverse events, any near misses?

[00:17:39] Najmedin Meshkati: I think that's one of the reasons. In the case of INPO, many people criticize it as a secretive organization, and I understand that to some extent, because the information is confidential and some of it consists of performance-related indicators that are shared only within the membership. However, you mentioned that I'm on the board of the Joint Commission – by the way, I'm not representing the Joint Commission in this meeting, I'm only speaking on behalf of myself – there is a database at the Joint Commission called the Sentinel Event Database, and it is based on voluntary reporting of mishaps or near misses from different healthcare organizations.

However, if you look at the Joint Commission's literature, only a very small percentage of mishaps and sentinel events are reported to the Joint Commission. I have used that database to look at things like wrong-site surgery, and one of my former doctoral students and a fantastic colleague, Dr. Maryam Tabibzadeh, has just recently published a paper using that sentinel event database on foreign object retention during surgery. So we have something in the Joint Commission's Sentinel Event Database, but we need to expand upon that voluntary reporting.

[00:19:05] Christopher Hart: What we found at the NTSB was that many times when we investigated accidents, things that the regulator did or did not do were links in the chain to the mishap. If the regulator does the report, those things are not likely to appear in it. But when we did the report – in fact, in every industry we looked at – there were more recommendations to the regulator than to any other entity in that industry. If the regulator investigates, it's not likely to report those things; fish don't see the water, so to speak – it may not even see the problem. But we see it from an outsider's viewpoint, and that's crucial to having an independent investigation. Thank you.

[00:19:38] Karen Wolk Feinstein: Thank you, Chris. And for those of you who weren't alive during Three Mile Island, there's a really good Netflix replay, and you can also look up Dr. Meshkati's comments after the incident, which I would say are characterized by total candor. And then the last question, which is probably pretty obvious, Naj: autonomous technologies play critical roles. We have a nuclear reactor here, not too far away… how have autonomous safety technologies kept us safe?

[00:20:13] Najmedin Meshkati: This is a double-edged sword. I remember, particularly after a series of situations that we have had, referring to a quote by my mentor, the late professor Jens Rasmussen. He said the only reason that we are keeping the human operators in the system is to plug the holes in the mind of the system designer. And this is exactly what has happened: autonomous technologies have been very helpful in basically eliminating several types of errors – those related to skill-based or knowledge-based errors at those levels of cognitive control, as we use the SRK (skill-rule-knowledge) framework.

However, we really need to make sure that if and when these autonomous technologies go down, what's going to happen? Who is going to save the day? We have a published paper about the role of operators' improvisation in "saving the day" – in averting disaster. One of the cases that you are very familiar with is Captain "Sully" Sullenberger and, of course, the Miracle on the Hudson. The other case, which is a little less known, is Fukushima Daini – not Fukushima Daiichi. Fukushima Daini was saved because of improvisation, led by Superintendent Masuda and his operators. That's why, when it comes to autonomous technology, I usually feel a little bit nervous. Yes, it could save us and protect us from several types of error. However, it may cause other types of unintended consequences, as we saw in the case of the Maneuvering Characteristics Augmentation System, MCAS, of the Boeing 737. See, this is very important: whenever we bring in this autonomous technology, we really need to make sure that we have done a great job.

We have to have done our due diligence in looking at the technology readiness level and its alignment with the human readiness level. Human readiness level is something that our Human Factors and Ergonomics Society has developed a document on – an ANSI standard, with the American National Standards Institute. And this is very important when using more autonomous systems, more automation in the system, so that we don't repeat the debacle that we had with MCAS and the 737.

[00:22:59] Karen Wolk Feinstein: So it's tech-enabled, but inseparable from a human role – human skills and human awareness of how to avert disasters – and from a great deal of testing and experimentation before anything gets deployed. So let me move on to Julie, Julie Morath. Julie, you have served on the Joint Commission and been involved with us at the NPSB for quite a while. What kind of synergies would be possible between the Joint Commission and an NPSB?

[00:23:36] Julie Morath: Karen, I'd like to break that down into three parts: comment on the Joint Commission, comment on the potential of the NPSB, and then what I see could happen jointly. So in terms of synergies, both organizations have values and goals to protect the public through safer care. That's a great start. However, the Joint Commission works in real time and at a local level with the organizations that they accredit and review, and their job is to assess and make judgments about performance against standards. That results in rankings, comparisons, and consequences… the unintended downside is that it often inspires protective and defensive mechanisms and maneuvers by the hospitals being looked at, because it does affect reputation and finances.

So the success of the Joint Commission, I think, really depends on the local cultural climate of patient safety and transparency – on whether organizations embrace what the Joint Commission and the learnings from an assessment have to offer in order to get better. What we don't want is an impoverished view based on protectionism. What the NPSB, I believe, is set up to do is, instead of assess and judge… to study, learn, diffuse innovation, and disseminate recommendations. And that comes at it in an entirely different way, which is very liberating. It's a safe place – there's no judgment, no policing. It also brings together experts in the relevant sciences, so that creative imagination can be used for system solutions that are proactive and long term. And that gives us the opportunity to look at, and be inspired by, what will be and can be, versus what is and fixing what exists today.

Together, I think the organizations can work to sort out duplication and to remove and simplify things that burden organizations. If you just look at the maps that you had in your slides – imagine being the end user of all that advice and expertise and all those measures. For our families, our patients, our providers, and our organizations, it can be very onerous. So the idea that this complexity can be tamed and coordinated through collaboration is a very powerful way to approach this: removing and simplifying things that burden, and beginning to build a more robust infrastructure of collaboration, so that we can really tease out what's important from what might merely be urgent and begin to build solutions that are sustainable.

[00:26:53] Karen Wolk Feinstein: And, you know, a sort of dream scenario could build on what you heard Chris say about misaligned equipment used for the same purpose – turn the knob one way on one device, the other way on another. The dream I have is that the NPSB would actually work with the vendors and the health professionals, who say: now that we have so many travel nurses, for instance, who go from unit to unit and from hospital to hospital, they encounter all these different defibrillators with actually contrary operating mechanisms. And then, if the recommendation is for standardization and the vendors start to standardize, maybe the Joint Commission would say, "look, we want the standardized version of defibrillators – it doesn't make sense to us that your system wouldn't be deploying that." Some way that, once we realized there was a joint solution that would save many lives, it would become part of the accreditation process. But anyway, that's my dream going forward.

So, you've had a lot of experience with nursing and with hospital and health system associations. It would seem to me that whether it's your professional specialty, or the fact that your industry is relatively unsafe and the public is experiencing a lot of distrust, you would have an interest in making the industry or the specialty safer – let's hope. How might the NPSB work with associations and specialty groups that share a common interest in making the public feel more confident that they will have a safe encounter?

[00:28:55] Julie Morath: You know, we have somehow gotten ourselves to a place where we're reducing things that are really important, like patient safety, to a checklist and a paper chase, and we have not fully exploited what we know will work. I used to take a human factors researcher from the university on rounds with me, who would just be astonished by the workarounds, the lack of standardization, and the inherent risk in everything that goes on – and the ICU is a perfect example of that. We need to be able – and it's a different level of analysis – to rise above the independent interests of the organizations, the vendors, and the manufacturers to look at what we need to deliver care for our patients. We have the science, we have the technology, and it's all disaggregated. So I would hope, and I will work for, solutions in standardization, simplification, and appropriate utilization testing before new technologies are introduced into the environment.

[00:30:24] Karen Wolk Feinstein: And I will say that we can't discount what this pandemic has revealed about a certain public distrust of healthcare now. I often say to Chris: I get on an airplane, I nap, I sip wine, I read papers – I am about as relaxed as I can be. But I go in for a simple procedure, and I'm just a nervous wreck; I have 50 things in my head on what could go wrong. I'd like to hope that the industry and the associations that play a key role would say, "look, we've got to make the public as confident that they will be cared for safely as I feel when I get on an airplane."

So there are some private organizations that have played a really strong role, that have been very active in patient safety. And then of course we think of Lucian Leape, who for me at least started the whole inquiry into "how safe are we?" So talk about independent organizations like IHI's Lucian Leape Institute. How could they partner with an NPSB?

[00:31:40] Julie Morath: The Lucian Leape Institute is set up to be a think tank: to envision the future, look at what can be, and catalyze action around those areas. So a perfect partner would be the NPSB feeding in information and data so that it can be disseminated in such a way that it is a clarion call to action. There are other organizations too – IHI is well known for its improvement efforts. So we should focus on what those need to be and direct all resources to that, instead of water-bugging over the top of so many things that don't have the impact we're looking for, so that we really make a difference in the experience of patients and their families and the wellbeing of those who are caring for them.

[00:32:35] Karen Wolk Feinstein: I'm going to ask Leah some questions; we'll cover some different ground. Leah, you work with corporations and our big business leaders all the time. How do we get them to make patient safety a number one item when they're negotiating their health plan premiums? I mean, in many of their own industries, they'd go out of business if they weren't safe. How can they be a player in making healthcare safer?

[00:33:10] Leah Binder: Well, I think first of all, they have to see the problem, and when they look at their claims, they can't see the problem. You can't see patient safety through claims. Now, you can dig in, and some people – if they spend a lot of time on it – will find some errors, some problems, but they won't see it as a ubiquitous problem from looking at claims. Claims are bills, and most of the time they don't bill directly for "we gave them the wrong medicine, we had to treat them because that caused some complications, then we gave them the right medication, and then they fell…" That's not in the claim. All the costs of that are going to be in the claim, but not the underlying reason, which is the really significant problem of safety.

That's why they created my organization, Leapfrog: to try to measure it, to find out where these things are happening, and to publicly report that – so that they had it for themselves and could look at it, and also so the public had it and could make decisions about their own healthcare in light of the problems with safety. That would sort of drive a market for better, safer healthcare. So the key to getting the purchasers engaged around safety – and many are; they're involved with what we do, but not enough, not all of them – is for them to see the issue, and not just in a general sense. We can all recite the statistics; they're horrible nationally. But they need to know that the hospital down the street has an infection rate three times higher than the one in the next town over. That motivates – I've seen it motivate many, many times – and that's what we need to be able to do, and that's why we exist.

We love the NPSB. We love being part of this coalition; I think it's a fantastic idea. We also think that when we have these great minds coming up with these fantastic interventions that will really work to improve safety – and they will – we also have to have pressure from the outside, to your point, Karen: from purchasers, but also from the public. Not just pressure in a negative sense, but in a positive way, saying, "we are watching. This matters to us. How you do is something we are going to take into account in our own decision making, and if you do a great job, we're going to recognize that. And if you do a bad job, we're also going to recognize that." It's just human nature, and we have to honor human nature: any effort, no matter how great, to improve safety in one institution is not necessarily going to be sustained unless somebody cares. And we're the ones whose job that is. We feel like our job at Leapfrog is to care, and we hold hospitals accountable, and ambulatory surgery centers – there are a lot of other settings, by the way, where patient safety is necessary, obviously.

So we have to be the ones to look at that, to measure the performance, and we do that with a huge number of collaborators – like the National Quality Forum and AHRQ and others – that help us with very good scientific measures. We have to look in a very responsible way, we have to look at performance, and we have to try to stoke this flame when good things are happening.

[00:36:36] Julie Morath: Leah, one thing I’ve watched with your organization and embraced is that you, through your mechanisms, get the attention of boards of governors and CEOs. And leadership matters in this work. Leadership really matters. And so my drive is how do we keep these leaders awake at night worrying about what we’re worrying about and actually demanding the changes that need to be made within our organizations.

[00:37:07] Leah Binder: I appreciate you saying that, Julie. I think that is a really important point. We ask hospitals in our survey whether patient safety and quality performance from Leapfrog is incorporated into CEO salary reviews, and the answer for 55% of hospitals is "yes." That's progress – and I don't mean just Leapfrog per se – it's progress when C-suite compensation is tied to safety. That's something we kind of dreamed about 20 years ago, and now it's happening. So that's a very good thing. But I totally agree with you, and I think one of the reasons that our letter grade, the hospital safety grade, has had some effectiveness is that it's on the front page of the local paper, and that's read by the board of directors and by the CEO, and suddenly it becomes relevant to them too – regardless of the extent to which the payment formula rewards them for it. There's just public opinion. And I think you made the point about local culture, about whether there's a local community culture that cares about safety. I think that's what's important about what we're trying to do, and what others try to do, in transparency.

So I think embracing what's done by the NPSB, together with this culture of transparency that we're trying to promote, using highly responsible metrics like we're getting now from measurement science, will have a big impact. I think it's critical that they work together.

[00:38:42] Karen Wolk Feinstein: It would also be good to give a lot of thought to something beyond metrics for governing boards to chew on. A lot of the business community obviously sit on governing boards, and they look at data, but there are a lot of other questions that would tell you something – I mean, the Joint Commission, as you and Julie and Chris and Naj know well, has an accreditation process – there are a lot of things a governing board should be looking at routinely. Walking the floors. Talking to staff at the front line. Knowing what to look for, beyond the report card data. I don't think that's being done as well as it could be, but it would be great if boards felt more confident because they were looking at a variety of safeguards that should be there.

[00:39:36] Leah Binder: Can I just say something about that too, Karen? Because one of the things that we've always put a premium on at Leapfrog is structural measures of safety. David Classen was one of the founders of a measure on our survey that we're really proud of, which is about the use of technology for preventing medication errors – and about testing that technology to make sure it actually works, which is not done unless somebody takes the Leapfrog survey, which is ridiculous. There should be lots of mechanisms to test whether these systems work to the benefit of patients. But regardless, we want to know: do systems actually put in place the mechanisms that sustain safety? We talked earlier about the importance of automation in some way, whether it's human automation or technological automation – we have to automate safety.

It has to be just what we do, how we act. And one way we've tried to get at that is by looking at technology and other kinds of structures. So I think that's another way we'll be able to work together in the future with the NPSB, and with others – everyone here today – to think about what the best practices are that should be in place. Not naming 500 of them, but really: what are the fundamental best practices that every health system, every health provider, should have in place to maintain safety over the long run? Because we did see with the pandemic that we're not there yet at all. We've deteriorated on safety in very significant ways during the pandemic. So we need to be able to make it through the next public health crisis without sacrificing every patient to some kind of extreme risk.

[00:41:17] Karen Wolk Feinstein: So Leah, let’s talk a little bit about data – something that you’re all too familiar with. Do you see measurement organizations like Leapfrog sharing data with NPSB?

[00:41:30] Leah Binder: Absolutely. In fact, we hope that you will look at it – you, everybody, all of us! I guess, collectively, we're going to own this. But absolutely, we'll share the data, and we hope that it's used. That's the thing about data. Everyone complains about measurement: "oh, we have so many measures to collect," and I understand it – there's a lot of burden around measurement, believe me, I see it. But measurement becomes particularly frustrating and burdensome when it's not used. There's so much data collected right now in hospitals that's never used. Someone just puts it, I guess, in a file somewhere. So what is the point? People get really frustrated by that. We have to get to a point where we're collecting data for measures that are then used, and we can all see the impact.

I think anyone who's taking the time to record something, or to make sure it gets into a measurement framework, wants to know that it's going to do some good in the future. And we need to show that. I think the NPSB is going to be able to help us do that and make sure we do that – not just show it to each other within a health system or a community, but show it to the country. And let me say one other reason we are big champions of this idea: we have to put patient safety on the map for Congress. We have to shine a light on it as a separate issue, not as something that's folded in as a kind of sideline venture within a whole variety of other healthcare initiatives – oh yeah, we're going to think about safety, and we're going to think about this, and we're going to think about that.

Safety in and of itself, as the third leading cause of death in this country, ought to have its own center of gravity. So we think that this will have a big impact nationally, at the very least shining a light, putting focused attention on this problem that we all know how to solve. We're going to come together and make sure we do solve it. And I see our role as providing whatever help we can to that process, advocating for it, and shining a light on progress so that we keep it going.

[00:43:31] Julie Morath: I've spent a lot of time thinking about measurement, as many do, and I'm sure David Classen would weigh in. But to me, measurement is only of use when it has meaning, and when it is interpreted and used to inform change. And I think one of our opportunities coming up – and we're spending a lot of time in the PCAST work group talking about it – is taking all the data and having some kind of decision-making engineering available, so that collectively we are looking at the issues simultaneously, giving them meaning, and gaining the ability to apply them for effective change.

[00:44:16] Karen Wolk Feinstein: Well, Leah and Julie, you've both said the right thing, so now I'm going to throw a curveball. Those of us in the field know that a lot of health systems hire coding, recoding, and upcoding companies to literally change the data, whether to avoid penalties or to look better and get higher ratings. I've heard health systems say they hired "X" – "X" being a person who was supposed to come in and help them get better ratings. A person doesn't help you get better ratings; a system gets structured to be safe and to deliver high-quality, reliable care. How are we going to get around that? Because once I literally heard one system say, "well, Leah, to your comment – this other system two hours away – they don't actually have better outcomes. They're just better at manipulating the data." At which point most of us want to scream.

[00:45:12] Leah Binder: That's what they'll say. But there's a whole variety of ways of measuring, and we have to look at all of them. Hospitals, let's say, are extraordinarily complex institutions, so we have to have different ways of measuring different aspects of what they do. And I think we're starting to get there. We have endorsed measures for clinical outcomes – that's good – and for some processes; that's okay. We also need measures of patient experience that are related to actual patient outcomes, and we're definitely getting there… I think the HCAHPS process has been just a breakthrough in the last decade. It's been fantastic.

And here's a big one: we have to have ways of collecting data electronically, automatically. The way your supermarket knows that you bought Froot Loops, and that they have one fewer box on the shelf when you buy it – they ought to be able to do the same thing in hospitals, so that when there's an infection, they automatically know their infection rate just went up. We don't have anything close to that yet, but we should! So we have to work on a lot of different approaches, and the key is going to be motivation for doing it. We have to have somebody interested enough.

Now, let me just add one thing to what Julie said, though: I don't think it's going to be just one "here's the blueprint of what's good and important," because the public is the most important beneficiary of measurement – or should be. Right now they're not, necessarily, because we don't always do measurement just for the public, nor should we; there are other reasons to do it. But the most important measures have to be the ones that the public wants and needs, and there are going to have to be a thousand voices on that. What we can do is make sure the measures are reasonable, and measure public use of measures as a gauge for whether a measure is important.

[00:47:04] Najmedin Meshkati: If I may, I'll just add one more thing about the NTSB, which I don't know has been mentioned yet or not. After their accident investigations, the NTSB makes recommendations to the union, to the industry, and to the regulator – for example, to the FAA in the case of aviation, or to the FRA, the Federal Railroad Administration, in the case of railroads – and then they put those in something called the NTSB's Most Wanted List. I have found that Most Wanted List to be a really useful document. I can refer to one example: for many years, the NTSB was recommending that the railroad industry install and use positive train control to prevent head-on collisions, and it was on the Most Wanted List. And I remember, in the old version of the Most Wanted List, with this recommendation to the FRA and others, the NTSB had a color code: red meaning it is not implemented yet, and orange meaning yes, it's implemented, but not fully. And every year they were repeating this Most Wanted List, with red for the Federal Railroad Administration, until the Chatsworth accident of Metrolink happened – I think it was September 2008.

A Metrolink train had a head-on collision with a Union Pacific train that killed 25 people and injured many more, here north of Los Angeles. After that, the NTSB's Most Wanted List became very important, and it was one of the impetuses for Senator Feinstein and, at that time, Senator Barbara Boxer to push for the Rail Safety Improvement Act of 2008, because they referred to the NTSB Most Wanted List. And I think this is very important: another cross-cutting issue that the NTSB has had on its Most Wanted List – because it affects all five modes of transportation – is fatigue, and cumulative fatigue. This, I think, is very important for the healthcare industry, particularly if you look at it in the context of staffing and clinician burnout. I think we need an organization to look at this issue of fatigue, staffing levels, and clinician burnout, and to make recommendations for the whole industry.

[00:49:38] Christopher Hart: And here's one thing that the NPSB could do with data that isn't currently being done. Another example: we hear a lot about too many false alarms. What the NPSB would do is dig really, really deep, and it would say: this step was missed in this operating procedure, which made that operating procedure more challenging. And the reason the step was missed is that the nurse was responding to this false alarm, and in the course of being distracted by responding to it, they missed a step. That's an example where the digging-deep part is very crucial to really getting to the root cause of what went wrong – and to going back and asking, why can't we get rid of all these false alarms?

[00:50:17] Karen Wolk Feinstein: I was going to throw the last question to Leah, but now I'm just going to make it a rhetorical question for anybody, and that is: how do we get the health systems to expand the circle of safety expertise everywhere – within their units, within their practices, and at the C-suite? Too few human factors engineers. Too few data experts in AI/ML. Just too few people who come from disciplines other than business school or healthcare. I'm throwing that out rhetorically, but the industry is behind not only in technology – and I hear you, Julie, it has to be tested, but we're still far behind in safety technology – but also in engaging all these other disciplines. In the industries Naj and Chris work in, multidisciplinary expertise is taken for granted, and likewise for Leah's business people.

But hopefully that will happen; I'm throwing it out as a rhetorical question because no one probably has any magic. The magic, I hope, will come. Thank you to all of you, the panelists. I'll just say that the kind of partnerships we could form, if there were an NPSB, would show we're aligning as an industry – that we really are putting aside our individual competition. Our competitive instincts are secondary now in an industry struggling with what we call the twin crisis – or triple crisis, if you add the crisis of mistrust: the resignations, burnout, and everything that goes with that; the medical errors, inseparable from the resignations and burnout – one creates the other – and the public's distrust. We're not going to be able to hide this forever; it's out there. So hopefully the industry will align, will form these partnerships, will share honest data – I don't care if they do it confidentially – and we'll start to have solutions.

To learn more about the effort to establish a National Patient Safety Board, please visit npsb.org. We welcome your comments and suggestions. If you found today's conversation enlightening or helpful, please share this podcast or any of our other podcasts with your friends and colleagues. We can't improve the effectiveness of our health system without your help. You, our listeners, friends, and supporters are an essential part of the solution.

If you want a transcript or the show notes, with references to related articles and resources, they can be found on our website at npsb.org/podcast. Up Next for Patient Safety is a production of the National Patient Safety Board Coalition, in partnership with the Pittsburgh Regional Health Initiative and the Jewish Healthcare Foundation. It's executive produced and hosted by me, Karen Wolk Feinstein. Lisa George and Scotland Huber are associate producers.

This podcast was edited and engineered by Jonathan Kersting and the Pittsburgh Technology Council. Special thanks go to Robert Ferguson and Steven Guo. Thank you for listening.