Episode 04: Factoring Humans + Machines

Join special guests Dr. Jonathan L. Gleason, executive vice president, chief clinical officer, and endowed James D. and Mary Jo Danella chief quality officer at Jefferson Health, and Dr. Raj Ratwani, vice president of scientific affairs at the MedStar Health Research Institute, director of the MedStar Health National Center for Human Factors in Healthcare, and associate professor at the Georgetown University School of Medicine, as they explore how human factors engineering can be applied within health care to help humans and machines work better together.

Listen to this episode on: Apple Podcasts | Google Podcasts | Spotify

Episode Transcript

[00:00:00] Karen Wolk Feinstein: Welcome back to Up Next for Patient Safety, where we untangle the web of causes for our high medical error rate and discuss promising solutions with experts. I’m your host, Karen Feinstein, CEO and president of the Jewish Healthcare Foundation and its three operating arms, which include the Pittsburgh Regional Health Initiative.

PRHI is a multi-specialty quality improvement collaborative. We’ve been working to reduce medical error for over 20 years, mostly, I’d say, unsuccessfully, which is probably why we’re here today. But I’m not giving up; there’s too much at stake: a loss of approximately 250,000 lives a year, and long-term disability for many more.

So I’m always looking for new approaches. And when I look at other sectors and industries that have made impressive gains, I find inspiration. On today’s episode, we’ll be talking about human factors engineering, a field that’s fundamental to making other industries safe, and we’ll consider how its core principles can make healthcare safer. We’ll examine its application to healthcare and think about whether we can leverage autonomous systems to reduce the role that human fallibility plays in patient outcomes.

So we’ve talked about a National Patient Safety Board, and I often wonder what an NPSB would do. I figure it would look around the nation for the best problem-solving experts and then engage them in solving the major patient safety harms. I like to think that’s what we’re doing in this podcast series. Today, I bring you two of those national experts, people we would want to engage if we were the NPSB.

First is Dr. Jonathan Gleason, who serves as executive vice president, chief clinical officer and endowed James and Mary Jo Danella chief quality officer for Jefferson Health. Under Dr. Gleason’s leadership, Jefferson Health designed and implemented the OnPoint program for advancing care excellence at Jefferson, resulting in major improvements in clinical performance and safety. Dr. Gleason also served as principal in collaboration with Aramark to build and commercialize the EverSafe™ OS safety management platform to enable businesses to manage safety. Previously, he served as chief quality officer at Carilion Clinic, where he developed and led the department of clinical advancement and patient safety.

He’s an advocate for people with disabilities, and his landmark publication in the New England Journal of Medicine Catalyst about the impact of COVID-19 on individuals with intellectual disabilities resulted in global changes to vaccine prioritization. He serves on numerous boards, editorial boards, and national committees related to quality, safety, health disparities, and high-value healthcare. What you might not expect is that Dr. Gleason is also a practicing pelvic floor reconstructive surgeon. So obviously we found a person here with many facets and many talents.

I’m also going to introduce Dr. Raj Ratwani, vice president of scientific affairs for the MedStar Health Research Institute, director of the MedStar Health National Center for Human Factors in Healthcare, and an associate professor at the Georgetown University School of Medicine. He’s also an active applied researcher, serving as principal investigator on numerous grants and contracts, including three research project grants (R01 awards) from the US Department of Health and Human Services, which are among its most prestigious grants.

Raj has extensive expertise in health information technology usability and safety, interruptions and workflow, data visualization, and data science. His research has been funded by the Agency for Healthcare Research and Quality (AHRQ), the National Institutes of Health, the Pew Charitable Trusts, and many industry partners. His work has been published in such high-impact journals as the Journal of the American Medical Association and Health Affairs. His research has been featured by Politico, Fortune, Kaiser Health News, NPR, and many other media outlets. He holds a doctorate in human factors and applied cognition and was a National Research Council postdoctoral fellow at the U.S. Naval Research Laboratory. So welcome, Jon and Raj. We’re interested to hear your perspective.

So let’s begin with some of our questions. Jon, Dr. Gleason, can you tell us where we are today in terms of technology and clinical practice, and can you give us some high-level context?

[00:05:10] Jonathan Gleason: Karen, it’s a pleasure to be with you all; I’m really honored you’re talking about this important topic, and with Raj as well. To contextualize the conversation for human factors engineering and the moment that we’re in in healthcare, I think one of the most important points to make is that healthcare over the last 20 years has really been at an inflection point with technology. There has been a massive digitization and technological transformation in healthcare over the last 20 years. A good example of that would be electronic health records. When the patient safety movement began 20 years ago, about 15 or 20% of healthcare environments were using electronic health records.

Today almost all are using electronic health records, or some form of electronic health record. So there’s been a massive digitization in healthcare, and it’s that implementation of complex operating systems into clinical environments that really gives us the opportunity to take full advantage of the science of human factors engineering in how we design and continually redesign healthcare.

So, you know, the way that I see healthcare right now is that modern healthcare has become a complex human-machine interaction. The humans and the machines are partners; the machines are no longer our tools, they are our partners in the delivery of care. And as that partnership between humans and machines continues to grow and mature, we will begin to expect, and already are expecting, the same level of performance that we expect of ourselves as humans.

And part of that is, for clinicians… we have really amazing people who work in healthcare, and they’re expected not only to perform at a high level, but also to be constantly surveilling their environments for risks and threats to patients; they have an obligation to identify those risks and threats and to mitigate them. And as technology matures in healthcare, we are going to begin to expect that same level of vigilance from our technology. So, to put this into context for patient safety (there are many implications of human factors engineering in healthcare beyond patient safety, but with regard to patient safety), the modern patient safety movement did begin in earnest prior to the massive digitization.

And therefore, necessarily, the modern patient safety movement really began with a focus on social systems, human behavior, team behavior, and accountability. It’s widely recognized that those social systems have had a significant impact on safety, but not the kind of impact that we want to have. That’s why we’re having this conversation. And so, you know, social approaches to safety have left us wanting, and the implementation of human factors engineering into these socio-technical environments does need to take a new and different approach.

And I’ll also note that social approaches to safety, from my perspective, face additional headwinds in the coming years because of pandemic-induced staffing shortages, turnover, burnout, and fatigue. All of these issues mean that social systems approaches to safety will maybe be less effective than they are today. And so, again, there’s an even greater opportunity for us to take a full systems approach to the design of our clinical environments so that our environments of care and our technology are facilitating outstanding care and preventing harm. So, you know, human factors engineering, our topic here, is one of the most important ways that we can accomplish that goal: designing operating systems that account for human capabilities and limitations. I’m glad that you all are focusing on it.

[00:09:07] Karen Wolk Feinstein: Well, I will say this: I love the fact that an OBGYN surgeon who does a lot of micro work will step forward to take a macro view, a whole-systems approach, to the issue of safety. So thank you. And also, I think you make it clear that if we don’t even know the preconditions for major harms… how are we going to prevent them before they occur? So thank you very much, Dr. Gleason.

And Raj, you lead the largest human factors center embedded in a healthcare system in the United States. So for people who might be listening, can you tell us in the most basic way, what is human factors and what do you do?

[00:09:53] Raj Ratwani: Thanks, Karen. It’s really nice to be here today and also to join Dr. Gleason. It’s a great question, and it’s one that I constantly have to focus on. Both my children, nine and five, ask me almost daily what I do for a living, and I still don’t think I’ve gotten the human factors concepts across to them. So this definition is a work in progress, but let me start at the most basic level and talk about what human factors is. It’s a multidisciplinary science focused on understanding human capabilities and designing tools, technologies, equipment, processes, you name it, to meet those capabilities.

And probably everybody listening today has had an experience where they’ve interacted with a piece of technology, you name it, your microwave, an ATM, your computer, a smartphone, and had an incredibly easy experience doing it. That’s great usability and great human factors. That’s a company that has really understood the user and designed things to meet that user’s needs. And all of us have had an interaction that just wasn’t so pleasant. Perhaps we got frustrated, we may have even used curse words, or maybe we threw the device out the window because it was unbelievably frustrating.

That’s an example of where human factors has not really been embraced and they haven’t really thought about user needs. Now, it’s fine to talk about that when we’re considering consumer products; where this becomes critically important is when we start talking about high-risk industries. If we look at things like aviation, defense (which is where my original career started), transportation, and healthcare, human factors is absolutely critical. And all of those industries, except for healthcare, have widely adopted human factors and have been doing so for the last 30-plus years. They’ve embraced this thinking that we need to really understand how humans think, how they reason with information, how they interact with things in their environment.

And we have to design for those capabilities. We can’t take the mindset of blaming an individual because they couldn’t use a particular product or couldn’t interact with a particular machine effectively. And I think Dr. Gleason nicely characterized this as a human-machine team; that’s really what it is today. Now, if we look at human factors in health care, it’s really in its infancy. We’re going to talk a lot later on about why that’s the case, but it really is in its infancy, and we need to more widely adopt human factors to improve safety, quality, and efficiency.

Now, going to what our center does, we really focus on two broad areas. The first I’ll call proactive work. We spend a lot of time thinking about different kinds of algorithms to identify safety issues before they actually result in patient harm; we typically call these near misses. So we really focus on where something almost went wrong, and we make sure that we’re doing appropriate risk assessments and redesigning so that it doesn’t actually go wrong. The same thing is true with things like clinical decision support. Dr. Gleason mentioned electronic health records; healthcare is now digitized, and we have the opportunity to provide new information to our clinicians and guide their thinking and reasoning so that they’re making effective, clinically safe decisions.

And then, of course, there are things like proactive risk assessments of different devices, technology, and equipment, making sure that before something actually gets used by our patients, by our physicians, by our nurses, it’s well designed and safe. So that’s all in the proactive bucket, lots of activities that our center engages in. And then, unfortunately, there’s still reactive work, a lot of reactive work. When I say reactive work, what I mean is that, unfortunately, something has gone wrong. Perhaps it was a device malfunction, perhaps it was a poor design of a piece of technology that led to a patient being harmed. Our team will very quickly get involved in that process and work to identify the different contributing factors to that particular safety event.

So that’s the work of the center. Dr. Gleason also did a really good job describing these different system components, socio-technical systems, so I want to take a minute to talk through how human factors relates to socio-technical systems and the systems-based approach. Healthcare, in particular, is an incredibly complex environment, and you’ll hear a lot of people comparing healthcare to aviation. Certainly there are a lot of similarities; there are also a lot of differences, and I think a lot of people would argue that there are some really unique complexities to the healthcare environment.

Taking a systems approach really means we’re looking at different system factors. And what I mean by system factors is things like the types of technologies being used, the types of people in the environment, the roles of those people, the responsibilities of those people, the different kinds of processes, the policies that govern and shape how we do our work, and the culture of our work environment. Those are all considered different system factors. So when you combine human factors with a systems-based approach, what you’re really doing is considering all those different system factors and understanding how they impact human behavior and the outcomes that we’re seeing.

And by taking that lens on the world and thinking about how you can adjust those different system factors to get the behavior that we want, that’s truly taking the human factors approach. It’s not blaming an individual, it’s looking at the system factors and understanding how those need to be enhanced, optimized, and modified to really improve overall performance and drive outcomes.

[00:15:29] Karen Wolk Feinstein: So I understand much better now that if the critical factors in a system, which could be the devices, the equipment, the roles to which people are assigned, the culture on a unit or in a hospital or in a practice… if they’re not built around a good understanding of the user, they’re going to cause more frustration and even, potentially, missteps and harm. So, Jon, I’ve often heard you say that you, the human factors person with some skill and interest in it, are there to help the doctors by building them a better airplane. So tell us about the actual work that you’re doing. What do you do right now with human factors within your health system?

[00:16:15] Jonathan Gleason: Raj laid it out really nicely. Maybe I’ll just give two recent examples, one of a reactive application and one of a proactive application of human factors expertise. And before I do, I’ll just point out that human factors engineering is really applied in design moments, where you are looking at a system and you’re creating a design moment, which is what I look for all the time. Let’s create a design moment here, and let’s apply the principles of human factors engineering, the science of human factors engineering, to that design opportunity.

So an example of a reactive application that recently occurred: a medication formulation was available to be ordered by someone who was working in a neonatal intensive care unit. That should not be the case. That option should be unavailable, similar to how an airplane pilot should not have a big green button on the dashboard that says “crash the plane.” They shouldn’t have that option in front of them. So, you know, a very simple application of human factors in that scenario is that we removed that option for people who were working in the neonatal intensive care unit.

So that’s not a culture approach to safety; it’s a system improvement. It’s creating a system that accounts for human capabilities and limitations, and for the fact that someone at some point would have made an error in that circumstance. So that’s an example of a reactive application. An example of a proactive application is a significant improvement effort that’s currently going on at Jefferson to improve our care of sepsis patients. This is something that all health systems have been working on for a long time, and we’re trying to go from great to elite in our care of the sepsis patient.

And so that’s created for us a design moment in all of our systems, approaches, and tools for identification, and in the workflows for the treatment of the sepsis patient. In designing those protocols and workflows, we’ve incorporated human factors engineers to make sure that all of the technology and the workflows are a terrific partner to people, making it very, very easy for them to do the right thing and very, very hard for them to do the wrong thing.

[00:18:31] Karen Wolk Feinstein: When I think about what you’re talking about, I know that I’ve been called to hospitals when they had a number of defibrillator deaths. And so, as someone who’s not a medical doctor or a nurse, I say, “Why don’t you standardize your defibrillators?” I mean, this seems rather obvious, right? Because there are these recurring defibrillator deaths because people can’t use the equipment that happens to be available when someone has, I guess, died and needs to be resuscitated.

So I’m sort of curious, what are the headwinds? What are the challenges that people with a human factors background face in trying to bring these basic principles to healthcare? Raj, you want to start with this?

[00:19:22] Raj Ratwani: Yeah, happy to. You know, we face these all the time. For me, what I’ve seen time and again is that people don’t fully understand human factors and systems-based thinking, and incentives to adopt this thinking and practice are not always in place. Let me break a few of those things down. In terms of understanding human factors and systems-based thinking, we’re naturally linear thinkers; that’s the way we like to think about things. And we also naturally love to assign blame to a single individual or thing.

Just turn on the news at any moment and you’ll see that there’s usually a very simplistic response when an accident occurs, and we typically look for an individual or thing to blame for it. The reality is that’s just not the case; the world is complex, and healthcare is incredibly complex. And so there’s some friction there when we try to think through this human factors systems lens. And until leadership at every level, across all the different stakeholders, really embraces this kind of thinking, we’re just not going to be able to fully adopt human factors.

And related to that is the incentives piece. What I mean by incentives is that, for this to really transform healthcare in the way that we all want, to really make big leaps and bounds in terms of safety improvements, we need all stakeholders, healthcare facilities, industry partners, policy makers, to have shared, aligned incentives for improving safety, and that’s just not always the case. Now I’ll give you a concrete example. Dr. Gleason talked about the widespread adoption of electronic health records. It really kicked off in 2009 with the HITECH Act, which moved adoption of EHRs to something like 90-plus percent of healthcare facilities across the United States. Embedded in nearly every single one of those contracts between the electronic health record vendor and the healthcare facility is something called a hold harmless clause.

And what that hold harmless clause essentially says is that if there’s a safety issue related to the electronic health record, the EHR vendor will not be held liable for that particular issue. So if you think that through, what incentive does the EHR vendor really have for improving the safety of their product, adopting a human factors approach, and working with other stakeholders to make those improvements? It’s certainly not a financial one. And so until we are able to address some of those incentive issues, it’s going to be really hard to make these kinds of improvements.

And when we look at other high-risk industries, take aviation, for example, they were able to overcome those. The way aviation was able to make big strides in safety is that they brought all stakeholders to the table to have good, deep conversations about safety, and there were incentives in place to make sure that everybody was working toward the same goal. And that’s exactly what we need to do in the healthcare space to make sure we have widespread adoption of human factors and also get the safety improvements that we want.

[00:22:29] Karen Wolk Feinstein: Jon, Dr. Gleason, do you also want to respond to the challenges you face within a system when bringing what would seem like a very practical, rational way of approaching harm?

[00:22:42] Jonathan Gleason: Sure, I’ll add, I think, two things to what Raj said. One is that, as I said, the patient safety movement is a very spirited movement; there are many who are highly motivated, in particular caregivers and patients, to improve safety. Safety events are devastating to everyone. But the approach of the modern patient safety movement really has been one that’s focused on human performance, team performance, accountability, and culture, such that people don’t feel that they’re blamed when they speak up about potential safety concerns, which is very, very important. But what is equally important is what we do with the system. What do we do to improve the system such that there isn’t a need to speak up in the future, because you’ve reduced the actual risk in the environment through the design of better socio-technical systems, better clinical operating systems? So I do think that, generally, this is something that many in healthcare can’t see, because we’ve spent a lot of time focusing on other things.

So I think that’s one issue. And then the second is, if we decided today that every hospital in America needed to have a human factors engineer, it would take us a very long time to achieve that goal. There might be 50 clinically informed human factors engineers working in healthcare in our country today. That is a significant need, a significant gap, and it’s something we need a lot of people to come together around to find solutions for. I think that is a solvable problem, but it’s one that will require quite a bit of focus and attention and strategy to be able to create the human factors engineers that are needed.

[00:24:36] Karen Wolk Feinstein: I wish we could get rid of the term whistleblower forever, you know, call them safety heroes, you know, “see something, say something.” That’s how things get fixed. I do know the frustration in one of our hospitals when several people died in a short period of time from a failure with the same equipment. Just trying to understand what happened was not an easy thing. The equipment manufacturer said, well, you know, we have an alarm that goes off. The staff said, “We couldn’t distinguish that alarm, the one warning that a serious incident was about to happen, one that could lead to death, from the many beepers and buzzers and everything else on our floor that go off regularly all day.” And we just never got to the bottom of that; we gave up.

So I think, maybe, in my mind, a National Patient Safety Board would get to the bottom of these things and would help us sort out how to keep them from happening. We couldn’t even find out whether other people, in other parts of the country, at other hospitals, have had the same experience with this equipment. None of this is available. So I think of a National Patient Safety Board as one way to start studying some of these things and finding out: do we actually have a problem? And is it fixable, or was this unique experience a one-off… just a series of coincidences in one hospital? Right now, you know, even if that one hospital looks at the issue, it might be idiosyncratic, but it might not. It might be something applicable broadly across the country.

So in our effort to create a National Patient Safety Board, we wanted to ask you: how could such a board embed human factors engineering into its process of studying problems and coming up with solutions? And when harms happen in one place, or seem to be random, how can it recognize what is really a fairly broad national problem requiring an assembly of experts to come up with solutions? I’ll leave this to either of you, in whatever order, to help me with that.

[00:26:54] Raj Ratwani: I’m certainly happy to jump in first. In many ways, a lot of the work that we’re doing within the National Center for Human Factors in Healthcare could be scaled up; these are things the NPSB could adopt, and they would make a significant, significant contribution to improving safety. So I would start with the data. As you just described, Karen, there’s a lot of safety data out there. There are patient safety event reports that healthcare facilities across the country are collecting. The FDA has several safety databases: the MAUDE database for medical devices, the FAERS database for medications, the VAERS database for vaccines. It goes on and on. And in fact, we have been taking the opportunity to catalog these data. If you look at all of the different data points just from the federal agency databases, it’s something like over 20 million data points that are out there. And that’s great, data’s fantastic… if you can actually turn it into insight and information.

And so I think one of the first functions and roles here is engaging human factors experts to help develop tools that can analyze all these data, so we have clear direction in terms of the insights from these data. Some of this is going to be proactive: we’re going to be identifying safety issues through near misses and be able to tackle those issues. And, unfortunately, some of it is going to be reactive. But I think part one is making sure human factors engineers are at the table to design those tools, to make it easier to process these data. Once we know where we need to focus our energy, we then want to make sure human factors experts are part of the process of designing and developing solutions, and that’s got to be an iterative process. That’s absolutely where we need human factors and systems-based thinking, to ensure that the solutions being developed are robust, sustainable solutions. And then the final piece is making sure that these learnings and the solutions actually get disseminated out to all the different healthcare facilities.
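
As a rough, hypothetical illustration of the kind of analysis tooling described above: the sketch below flags candidate near-miss reports in a file of free-text safety event reports and tallies them by device. It is only a minimal sketch; the file name, column names, and keyword list are assumptions made for this example, not part of any tooling described in the episode, and real near-miss detection would use far more sophisticated methods.

import csv
from collections import Counter

# Hypothetical phrases that often signal a near miss in free-text reports.
NEAR_MISS_TERMS = ("almost", "nearly", "caught before", "did not reach the patient")

def flag_near_misses(path: str):
    """Return report IDs whose free text suggests a near miss, plus counts by device."""
    flagged = []
    device_counts = Counter()
    with open(path, newline="", encoding="utf-8") as f:
        # Assumes a CSV with columns: report_id, device, text
        for row in csv.DictReader(f):
            text = row["text"].lower()
            if any(term in text for term in NEAR_MISS_TERMS):
                flagged.append(row["report_id"])
                device_counts[row["device"]] += 1
    return flagged, device_counts

if __name__ == "__main__":
    ids, by_device = flag_near_misses("safety_events.csv")  # hypothetical file
    print(f"{len(ids)} candidate near-miss reports")
    for device, n in by_device.most_common(5):
        print(f"{device}: {n}")

Even a simple triage like this only surfaces candidates for review; the human factors work, risk assessment, redesign, and dissemination, is what turns the flagged reports into safer systems.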

If we look at things now, nationally, there are some incredible pockets of excellence where people are using systems-based approaches and human factors and developing effective solutions. The problem is people often don’t know about them. So you might have a hospital in California that experiences the exact same safety issue that a hospital in New York experienced three years ago. The hospital in New York developed a great, robust solution and has addressed it, but the hospital in California knows nothing about it. And so we have to get at that dissemination point.

And if we take this approach, I think it actually tackles one of the headwinds that Dr. Gleason brought up. He really nicely articulated that there simply aren’t enough (and I love this term) clinically informed human factors experts to be employed at every healthcare facility or hospital. Well, if we centralize this resource in a National Patient Safety Board and we elevate the safety issues, so the NPSB has some human factors engineers and is tackling this, we’ve partially tackled that problem. So, from my perspective, that would be a great direction for us to go.

[00:29:52] Karen Wolk Feinstein: You’ve answered that question so well. I know the first time I spoke with you on Zoom and you described the MedStar Health National Center for Human Factors in Healthcare, I said to you somewhat spontaneously, “That sounds like what I want a National Patient Safety Board to do; that’s exactly what I want an NPSB to do on a national level.” So thank you.

So, Jon, would you like to have the last word?

[00:30:20] Jonathan Gleason: Sure. I will say, from my perspective, if you look at the future of healthcare, this issue of human-machine teaming is becoming more and more significant. This is where we’re heading, and staffing challenges and all the issues that we face are going to push that even faster and faster. So I think the integration of human factors engineering into the design of clinical operating systems is an inevitability. We are talking about something that will happen. The question is, how quickly will it happen? And I do think the NPSB has an opportunity to accelerate that and, in the meanwhile, in that acceleration, save a lot of lives and have a significant impact in accelerating the adoption of human factors engineering in healthcare.

So, you know, any credible safety approach at this level will involve human factors engineering. And I think the NPSB, with human factors engineering, will also have the ability to pull all of the stakeholders into the investigation; the EHR, as Raj said, is one example. If you go to the airline industry, when there’s an airplane crash, it’s not just Delta who is there (and I’m not picking on Delta, I just chose an airline), it’s also Boeing (and I’m not picking on Boeing either, right?). They’re bringing in not just the airline company; they’re also bringing in the companies that build the technology. And to me, that’s one of the key components of the NPSB and its ability to have impact: bringing the technology groups in together with the clinical organizations to design a better solution.

[00:32:08] Karen Wolk Feinstein: Even though the Boeing engine didn’t fall off… it’s a wonderful way to end this conversation. I do want to say many thanks to Dr. Jonathan Gleason and Dr. Raj Ratwani. I love doing these podcasts. It’s a chance for me to just step back, listen to people like both of you, and learn in the process. So thank you.

And in conclusion, we know people make mistakes. But when the stakes are as high as they are in healthcare, you have to develop systems that can autonomously prevent errors, ideally before they ever occur. So human factors engineers, and I think you’re as convinced as I am, have a key role to play in developing these autonomous systems and making sure they work for people as they work in real settings. We’ve seen it in so many other industries, from air traffic control to manufacturing to space travel, and in a few hospitals. So hopefully this will spread, and hopefully an NPSB will be even stronger building on the foundation of human factors engineering.

So if you’re interested in this podcast or others, you can always find them on the npsb.org site. Please go there; there’s also a link to other podcasts. You can look at the coalition to see who’s involved in this nationally; you may want to join. We’d also love for you to give us your reactions to what you’ve heard and to tell us about your own experiences with patient safety and any solutions that might have occurred to you. So we welcome you to join us on future podcasts or on our website. Thank you so much.

Subscribe on your favorite podcast app: Apple Podcasts | Google Podcasts | Spotify | Pocket Casts