Episode 12: Systemic Solutions
Where have efforts to improve patient safety fallen short, and what opportunities exist to remake the landscape of patient safety? How can advances in data technologies and systems thinking help us revolutionize patient safety over the next decade? Join host Karen Wolk Feinstein and esteemed guests Dr. Kathleen Sutcliffe, Bloomberg Distinguished Professor at Johns Hopkins University, and Dr. Vivian Lee, president of health platforms at Verily Life Sciences, for a glimpse into the potential for a collaborative, tech-enabled transformation of healthcare safety.
Listen to this episode on: Apple Podcasts | Spotify
Featured Speakers
- Karen Wolk Feinstein, PhD, President & CEO, Jewish Healthcare Foundation & Pittsburgh Regional Health Initiative
- Vivian Lee, MD, PhD, MBA, president of Health Platforms, Verily Life Sciences; senior lecturer at Harvard Medical School and Massachusetts General Hospital; and a senior fellow at the Institute for Healthcare Improvement
- Kathleen Sutcliffe, PhD, Bloomberg Distinguished Professor at Johns Hopkins University with appointments in the Carey Business School, the School of Medicine (Anesthesia and Critical Care Medicine), the School of Nursing, the Bloomberg School of Public Health, and the Armstrong Institute for Patient Safety and Quality
Referenced Resources (in order of appearance)
- Introducing national healthcare safety investigation bodies (British Journal of Surgery, 2018)
- Still Not Safe: Patient Safety and the Middle Managing of American Medicine (Oxford University Press, 2019)
- The Long Fix: Solving America’s Health Care Crisis with Strategies that Work for Everyone (W. W. Norton, 2020)
- Alphabet
- Onduo
- Granular
- Healthy at Work
- Institute of Medicine To Err Is Human report (1999)
- A Brief History of the FAA
- Do Black and White Patients Experience Similar Rates of Adverse Safety Events at the Same Hospital? (Urban Institute, 2021)
- Medical Insurance Feasibility Study: A Technical Summary (Western Journal of Medicine, 1978)
- The Healthcare Professional Workforce: Understanding Human Capital in a Changing Industry (Oxford University Press, 2016)
- Paul O’Neill
- John Senders
- Institute for Healthcare Improvement
- The Design of Everyday Things (Basic Books, 2013)
- University of Utah
- W. Edwards Deming
- Verily Value Suite
- An Algorithmic Approach to Reducing Unexplained Pain Disparities in Underserved Populations (Nature Medicine, 2021)
- Kaiser Family Foundation
Episode Transcript
[00:00:00] Kathleen Sutcliffe: My view is that safety is an activity. It’s not necessarily a property of systems. We build safety together, moment to moment, through our daily interactions with each other and our interactions with technology.
[00:00:20] Vivian Lee: And so that’s where I think this AI piece and data and creating more intelligent, personalized pathways is the way forward.
[00:00:32] Kathleen Sutcliffe: I think it’s important to remember that aviation didn’t improve by staffing its safety departments with former flight crews. It was a systems approach.
[00:00:43] Vivian Lee: And so, I think there’s a lot of opportunities for building in safer processes into our actual designs.
[00:00:53] Karen Wolk Feinstein: Welcome back to Up Next for Patient Safety, where we envision a world where medical errors, adverse events, and preventable harms are avoided and where we examine the most promising paths to prevent these tragedies before they occur.
I’m your host, Karen Feinstein, CEO and President of the Jewish Healthcare Foundation and the Pittsburgh Regional Health Initiative, which is a multi-stakeholder quality collaborative. We’ve been working to reduce medical error for over 20 years, mostly unsuccessfully. But we can’t give up because there’s too much at stake—and that is the loss of approximately 250,000 lives a year, and long-term injuries for many more.
Today, we’re taking an unusual journey back in time and into the future. We’re joined by two very interesting women: one is a leading historian, and one is a visionary in patient safety and healthcare transformation. We’ll examine some of the major historical milestones in the patient safety movement over the past three decades while envisioning what the next decade could bring when medical safety is enabled by data and technology.
Dr. Kathleen Sutcliffe is a researcher and author whose work has appeared in many management and healthcare journals. She’s the author of a recent, highly lauded book, Still Not Safe: Patient Safety and the Middle Managing of American Medicine. Dr. Sutcliffe is the Bloomberg Distinguished Professor at Johns Hopkins University with appointments in the Carey Business School, the School of Medicine, the School of Nursing, the Bloomberg School of Public Health, and the Armstrong Institute for Patient Safety and Quality. Her research has been devoted to investigating how organizations and their members cope with uncertainty and how they can be designed to be safer, more reliable, and more resilient. Beyond health care, she’s investigated practices in oil and gas exploration and production, chemical processing, steel production, and wildland firefighting. That is quite a mix, Kathleen. She serves on the editorial boards of several journals and has served as a proposal reviewer for the National Academies of Sciences, Engineering, and Medicine. She’s consulted with leadership teams of numerous companies, including Goldman Sachs, Georgia Pacific, Marathon Oil, and ThyssenKrupp.
Dr. Sutcliffe received a PhD in organizational theory and organizational behavior from the University of Texas at Austin, a master of nursing from the University of Washington, a bachelor of science from the University of Alaska, and a bachelor of arts from the University of Michigan. Kathleen is not only well educated, but she has explored many different industries, which would be very helpful to all of us in understanding the path forward for health care.
Our other guest is Dr. Vivian Lee, the author of the acclaimed book The Long Fix: Solving America’s Health Care Crisis with Strategies that Work for Everyone.
Dr. Lee is president of Health Platforms at Verily Life Sciences, an Alphabet Company, whose mission is to apply digital solutions that enable people to enjoy healthier lives. She’s a passionate champion of improving health in the United States and worldwide, and she works closely with Verily’s clinical and engineering teams to support the successful transformation of health systems to value and advance the co-production of health with patients, their caregivers, and communities.
She’s a founding leader of Verily Health Platforms, and she chairs the boards of Onduo, a virtual care digital health platform, and Granular, an innovative stop-loss insurance company. She also leads the company’s COVID-19 employer and university product, Healthy at Work. We have two very versatile women with us today.
Dr. Lee serves as a senior lecturer at Harvard Medical School and Massachusetts General Hospital and a senior fellow at the Institute for Healthcare Improvement. In 2019, she was ranked number 11 among the Most Influential People in Healthcare. She’s a graduate of Harvard-Radcliffe Colleges; she received a doctorate in medical engineering from Oxford University as a Rhodes Scholar, earned her MD with honors from Harvard Medical School, and was valedictorian of her Executive MBA program at NYU’s Stern School of Business.
We have two very distinguished guests today. Welcome, Kathleen and Vivian. Okay. Let’s get to the heart of the matter.
Kathleen, in 2020, you published your book Still Not Safe with the late Robert Wears. The book details the patient safety movement in the United States. So, let’s cut to the chase. Why, in your opinion, has progress on patient safety been so slow, particularly when we compare it to safety advances in other high-risk, complex industries? Why haven’t we made more progress in the last 30 years?
[00:05:49] Kathleen Sutcliffe: Well, Karen, I think that you perfectly phrased the question with your emphasis on weak progress, because we have made progress, and I don’t think anybody would want to roll back time. I think it’s really important to remind ourselves that gains have been made over the past several decades. Just think about surgery: there’s clear evidence that multiple waves of innovation over the past few decades have significantly improved surgical safety. But as you said, Bob and I discussed and talked about the patient safety movement as a “movement becalmed.” And in fact, I think we went as far as to suggest that patient safety may be dead as a reform movement, a reform effort.
So, let me get to your question about what’s contributed to that weak progress relative to other industries, such as commercial aviation, which over 20 years, as we know, reduced fatalities by 95%. I wish there were a simple answer, but there’s not, you know, it’s a complex story, but two things really stood out for us.
The first is the framing of the problem as one of “medical error” rather than “patient harm.” If you recall, a key message of the IOM’s To Err Is Human report was to focus on the system rather than individuals as the locus of the problem. But I think that we have not really done that over the last two decades.
So, that hasn’t really happened. But second, progress has been weak because health care has pretty much ignored and given up on the call for a multidisciplinary approach, working together with the safety sciences, such as the organizational and social sciences, cognitive psychology, human factors, and systems engineering.
Instead, I think health care has opted for a scientific, bureaucratic medicine model under the control of clinician administrators. And you know, I’m not trying to be negative or crass here, but I think it’s important to remember that aviation didn’t improve by staffing its safety departments with former flight crews. It was a systems approach. There were multiple and still are multiple entities cooperating together over time, and I do not think that that’s happened in health care.
[00:08:21] Karen Wolk Feinstein: So, an interesting question you raised: when we called it “medical error,” we figured we had to fix the medical professionals, and that it was something to be fixed within that profession.
So just thinking “what if”: what would have been different if we had engaged some of those other specialists, whether human factors engineers, industrial quality improvement experts, or machine learning experts? NASA didn’t send astronauts safely to the moon just by asking the aeronautical engineers, “How do you move a spaceship forward?” What could have been different if we had been interdisciplinary?
[00:09:08] Kathleen Sutcliffe: I think one of the things that could have been different, and needs to be different, and I have hope that it will be different, is that we really understand the nature of the work. I think we need to understand some of the basic science about how health care is done: the ways in which people interrelate, the kinds of practices that are critical for day-to-day alertness and awareness, how we can ensure more perceptual accuracy and diagnostic accuracy. None of those things are individual-level. Individuals have to interact with other people, with other technologies, et cetera. So, it’s a problem that has to be resolved by looking at the system, and I don’t think that we’ve really focused on that. And I think it goes to understanding the basic nature of work processes and how the work gets done.
[00:10:12] Karen Wolk Feinstein: And some people can’t even imagine, those outside of health care, some of our barriers. I think of a problem in one of our cancer units, when two hospitals merged and then shared one unit, and a lot of things were not to the liking of the nurses or doctors. So, we asked them to meet and give us their suggestions, and it turned out they couldn’t meet together because the nurses had the nurses’ meeting room, and the physicians had a physician meeting room. So, instead of ironing out their problems together, they met separately. You have to really understand the culture, even just the design of our hospitals, to understand why progress is so slow.
So, you said that one of the most difficult aspects of advancing safety in health care is being able to manage the unexpected. Human beings, and we’re reminded of this, are not airplanes; people have to act in real time, and many of us as individuals react differently to exactly the same procedures or medications or health conditions. Can you talk a little bit about some promising interventions for people who have to manage the unexpected?
[00:11:36] Kathleen Sutcliffe: Sure. I think bringing up the issue of the unexpected is really important because, from my research, I know that the unexpected is going to happen, and it’s going to happen when we least expect it. And the unexpected is more likely in complex, dynamic environments like health care, where work is sometimes tightly coupled and interactively complex.
You know, you ask, What’s the remedy? And as an organizational scientist, I think about, How do we create systems to achieve complex goals, such as safety and quality, productivity and efficiency? How do we organize for adaptability? And I believe that we organize systematically and that there are two major logics that we need to keep in mind. One is a logic of anticipation and prevention, and the other is a logic of resilience and containment. And I think we need to be thinking about how we consciously design our systems around those two logics. In fact, going back to the issue of a system solution and trying to understand the work system, I would keep my eye on four key elements. The first is that I think we have to enact bundles of high-performance human resource practices. At the height of the quality movement, everybody learned from a lot of really important research done 20 years ago or so that high-performance human resource practice bundles are critical. And there are a number of things that go into them: the way we select, the way we train, the way we give people opportunities to use their discretion. All of these things are critical. So, that’s the first thing.
The second thing is that we need to build vigilant upstream and downstream coordination. It can’t just be nurses doing their jobs or the techs doing their jobs, or the labs doing their jobs and docs doing other jobs. I mean, we’ve got to understand the coordination among all of these entities, and that’s hard work, I will tell you.
I think the third is really where the daily action is at, and that is to build sets of daily work practices. Think about pre-work briefings, huddles, handoffs, rapid response teams, all of these practices that go into the day-to-day work really enable people to be alert and aware of the unfolding situation so they can adapt. And that’s just how it happens.
And then the fourth thing, which I think you’ve already raised a couple of times in your questions so far, is that we have to actively think about building a culture and a climate of trust and respect. That is really the bedrock, I think, of high-quality, safe care, and I think it’s the bedrock of a high-performing organization.
[00:14:43] Karen Wolk Feinstein: Of course, as machine learning takes us to new heights in precision medicine, we might be able to narrow the range of the unexpected, but as you said, we’re not there yet. So, I’ve been interested in the recent Urban Institute report that talked about racial disparities in adverse safety events in hospitals and, unfortunately, the disproportionate impact on people of various races. In your work, you’ve studied research that dates back to the seventies on inequalities in patient safety for people of lower socioeconomic status.
So, you know, I know, probably, if you had your druthers, we’d have sociologists and anthropologists and psychologists engaged in solutions. Can you just talk a little bit more about how we have studied this problem for a long time and how we might get to some productive interventions?
[00:15:49] Kathleen Sutcliffe: Yeah, I think I can answer the first part. I’m not sure about the productive interventions, but I think we can talk about that. There was a little-discussed and overlooked finding, in a study conducted in the early 1970s by Don Harper Mills in California, that certain social groups were more likely to experience harm from health care. That study was called the California Medical Insurance Feasibility Study, and it was really one of the first large-sample studies that measured adverse outcomes to patients in the course of receiving health care. In fact, the MIFS, as it was called, was the first to provide a statistical estimate of the rate of medical harm on a population basis.
But when I say, “little discussed and overlooked,” I really mean it because I think it was a finding that was reported in several publications, but it was not discussed. And that finding was that patients who received health care at government expense were harmed at significantly higher rates than others. But we haven’t discussed it. And I think really it’s only now that we’re starting to understand that certain groups are more likely to experience harm from health care than others. And I think, you know, what’s the answer to that? I mean, I think the answer is that we need to start paying attention to it. That’s the first thing, you know—alert ourselves to the fact that this is happening, and then we need to take a systems approach to trying to solve the problem. So, we probably do need to be bringing in other disciplines.
[00:17:37] Karen Wolk Feinstein: Well, I look forward to that day, and that will also help us understand how you build trust and respect.
So, one interesting thing: we know that medical error, once it was named as such, attracted the attention of the medical profession, who kind of owned medical error, for better or worse. But I noticed in your book you’ve also spoken about the feminization of safety, which is really interesting: how most of the staff deployed in safety management within hospitals, not everyone, but the majority of frontline quality improvement specialists, are women, and often not medical doctors. Does that say anything about the perspectives on safety at the highest levels and the prioritization of safety among other considerations?
[00:18:34] Kathleen Sutcliffe: Yeah, I mean, I think this is, again, similar to the issue related to racial disparities, a really interesting issue that people aren’t talking about. As for the argument that safety’s been feminized, I don’t know whether there’s been a shift in roles or whether it’s always been that way, but what Bob and I observed and experienced throughout our professional careers, attending various conferences, patient safety events, and forums, is that many participants are female. When we wrote the book, we did a more systematic analysis of people who were certified by the Certification Board for Professionals in Patient Safety, and at the time we did our analysis, only a few years ago, more than 86% of the certificants were female.
You know, certainly we can imagine that this is probably partly structural, given the composition of the non-physician healthcare professional workforce. I published a book with colleagues on the healthcare professional workforce a couple of years ago, and we know that the nursing, physician assistant, and pharmacist professions are largely female. So, I think it’s unclear what this says about the prioritization of safety, but I think we’ve got to keep a couple of things in mind. First of all: What do we know about occupations and professions that have a large percentage of female workers? One of the things we know is that they typically pay less than those with a predominantly male workforce, even when the jobs require similar skill sets or education. The second thing we know is that they’re assigned lower wages because women are considered to deserve lower earnings than men, which means that women’s work is often valued less than men’s.
So, if we take this together, I think it’s plausible to think that a feminized workforce responsible for a particular role or set of activities might be dismissed, disregarded, or even discredited. I mean, I think we’ve got to put it on the table to think about it.
[00:21:02] Karen Wolk Feinstein: I am so glad you’re bringing it forward. I have spoken at various times to associations of infection control specialists, and the majority, certainly not all, but the majority, are women. And we’ll go to another country so we don’t indict anyone here: we were working in another country, highly developed, highly technological, with absolutely astonishing rates of hospital-acquired infections, and the infectious disease doctors, as well as the nurses, said to us, “We have the most frustrating job in the hospital. Nobody pays any attention to us.” And that was to a person. These were wonderful, committed people, but they can’t bring infection under control if they don’t have the cooperation of other physicians and nurses. So, it was sort of sad for me to hear that.
[00:21:57] Kathleen Sutcliffe: Yeah. I mean, I totally agree with you. And I think that goes down to the issues related to trust and respect, but I think safety is everybody’s problem. It’s not just the safety department’s problem or the safety nurse’s problem or whatever—it’s everyone’s problem. And it’s serious.
[00:22:16] Karen Wolk Feinstein: So, when I started the Pittsburgh Regional Health Initiative back in 1998, my partner was Paul O’Neill, who was the CEO of Alcoa, and I recall that at the time it was the safest corporation in the world. Paul had not only a no-nonsense perspective on safety; he believed that, as a leader, it was his primary concern. He had over a hundred thousand workers then, moving vats of molten metal and sheets of razor-thin aluminum all over the world, and in all of his plants, very few people got injured. He believed that, of all the things he was responsible for as a leader, above the bottom line and keeping shareholders happy, keeping his workforce safe came first.
And that really impressed me. But when I go back to the healthcare industry, I sometimes see a lot of resistance to safety measures, even to things as basic as the most common example, hand-washing. Even with people on the floor observing, we’re probably not even at 70% compliance yet, and yet hand-washing is one of the most basic ways of preventing infection. Can we talk a little bit about why there’s so much resistance in health care, even to things that are so obvious? What generates that pushback in areas where we know it is absolutely proven that some very basic precautions and protocols will save lives?
[00:24:01] Kathleen Sutcliffe: Yeah, I think that’s a good question, and there are a number of points that I want to make on this, because I’ve done a lot of writing on change over the last few years, so I’m really pretty familiar with the literature. One issue, to get at your point about hand-washing, is that people really need to develop routines and habits. Habits are one way people enact change: adopt it and do it over and over and over. And it is really pretty hard for people to change their routines, so one contributing factor to what you’re seeing as resistance is simply that it’s difficult to change routines.
But I want to go back to the change literature, where I know there are many myths. One myth is that change is unquestionably good, that the vast majority of change efforts crash and burn, that this is bad, and that this failure is a consequence of resistance, okay? But when you dig more deeply into the change literature and change examples, you find that the story’s really much more complex. Change and resistance, I think, aren’t either good or bad. It depends. It depends on whose perspective about change is being privileged. It depends on what element of change is being judged. And it depends on what point in time that judgment is being made, because what I know is that people might resist change because they’re fearful. Absolutely. They’re worried they’re going to lose something, or it’s going to be too hard, or they won’t understand it. They could be tired. And as I just said, new routines take a lot of effort. But they might also resist change because they have better information about why it doesn’t make sense in this particular situation or why it’s not going to work. So, I think resistance can be an impediment or it can be a resource. In a way, resistance can be an important form of resilience, because people are doing what they need to do to make sure they get the same outcomes, even when they’re supposed to change. I don’t know if you understand what I’m trying to say there, but resistance isn’t always bad. Getting back to your question about future interventions, though, if you can let me go on, and if I’m not going on too long, I think there are three of them.
Number one, I think that we definitely need more research to better understand how things go right in the course of everyday work. No, I’m not suggesting that we shouldn’t look at how things go wrong, but I think narrowly focusing on preventing things from going wrong is fundamentally limited for improving safety.
The fact is, things go awry in the course of everyday, ordinary work. And we’re constantly making adjustments, catching and correcting things, often without even knowing it. So, what elements are contributing to our abilities to make things go right, to adapt, to cope, and to recover? So, that’s one.
The second thing is, we really do need more diversity in tackling these problems. I think we’ve got to take seriously psychologist John Senders‘ warning that medical mishaps and the adverse events that follow are problems for psychology, organization theory, and engineering, not just medicine. Engaging expertise from a broad variety of disciplines, creating shared partnerships, not just one-time interactions, will add to our understanding of how people in complex systems make sense of and update their understanding of unfolding events in the context of everyday work. It’s going to help us really understand adaptability.
My view is that safety is an activity. It’s not necessarily a property of systems. We build safety together, moment to moment, through our daily interactions with each other and our interactions with technology. And so, we really need to understand what goes into that. The final thing is that health care has privileged small system changes. That’s good; I think it’s really important that we look at activities at the clinical level, but we need to be looking at more than that, and at more than just single organizations. There is a large class of hazards that are going to require structural changes in health care as a whole, far beyond the multi-hospital systems, at the higher sociopolitical level: for example, the issue of look-alike or sound-alike drugs, or technological issues such as the little plastic connectors that attach healthcare devices to patients or to other devices, et cetera.
I think there’s a lot to do, and I think those three things are a good start.
[00:29:20] Karen Wolk Feinstein: I love what you say about collective activity, and I’d like it to be something in which everybody who is part of any healthcare setting takes collective pride, engagement, and ownership in safety.
I think of the saddest thing: once, visiting a neonatal unit in a hospital, all of the hand sanitizer receptacles were empty. What better way to tell your workers, “We don’t really care about safety here. We’re not even going to fill your hand sanitizers.”
But one of my favorite moments: I like to bike in the mountains and the wilderness, and I was on top of a ridge outside of Friendsville, Maryland, not a very busy destination. There’s a big plant up there with a big sign in front of it. You can change the month and the number, but it said, “Zero worker injuries in,” say, June. And I thought, Who’s going to see this? No one passes by; this is so remote. And then I realized I had it all wrong. This is for the people who work there; this is collective pride, and this is a sign of leadership. Their leadership thought enough to put this gigantic billboard in front of their plant in the middle of nowhere to salute their efforts in safety. It said a lot to me; I have a lot of pictures of it. So, I know what we’re working toward. I don’t mean to keep evoking Paul O’Neill, because obviously he was somewhat singular in his passion for safety, sadly, but leadership matters so much, and he did have ways of engaging everyone. It was Alcoa’s pride that they were the safest corporation in the world. So, thank you so much for your insights and wisdom. You’ve studied the issue of patient safety in the United States as well as anyone, and your book provided a wonderful guide. For us it was also inspirational: okay, we see the problem more clearly; maybe we have new ideas of what to do. So, thank you so much, Dr. Sutcliffe.
[00:31:42] Kathleen Sutcliffe: You’re welcome, Karen. Thank you very much for having me.
[00:31:45] Karen Wolk Feinstein: Vivian, you also published a book in 2020, The Long Fix, which focused on strategies, including the use of data and technology, to improve health care moving forward. Could you name some of the most promising innovations, including some that use data and technology capable of transforming health care in the future?
[00:32:05] Vivian Lee: Sure, Karen, happy to talk about it. And there are so many, so I’ll maybe just skim lightly over a bunch of topics, and then we can dive more deeply into the ones that seem interesting to you and to your listeners.
So, I’d say that one basic theme, a big theme, is the use of data and technologies to enable real personalized care, precision health, whatever phrase you want to use. Instead of thinking about health care as one-size-fits-all, as we used to, how do we actually use data about an individual’s biology and their individual genetics, all the way to novel sensors that measure their blood sugars or the carbon monoxide levels in their blood if they’re a smoker, to gather more information, more data about people, so that we can deliver truly personalized care? That is one major area. Back when I was in medical school, we were all taught about the “standard patient.” We talked about everybody as a standard patient, and that standard patient, Karen, was a 70 kilogram white male. That’s it. 70 kilograms. We dosed everybody and thought about everybody as if they were a 70 kilogram white male, and of course that’s not what our patients look like today. So how do we think about their biological differences, their psychological differences, their social and economic differences, and create a personalized journey? I think that’s one big theme. Another big one is the ways in which we can use technology to help people access care. Whether it’s rural populations, seniors, people who don’t speak English, or people who can’t afford care, how we increase access is a huge issue.
And there’s an incredible opportunity to use technology here, whether it’s cell phones and the ability to text people, whether it’s voice recognition, just being able to talk to Siri or Hey Google or Alexa, whoever you’re talking to, and engaging that way, or whether it’s video conferencing, so you don’t have to get in a car and drive a couple hundred miles to get to your doctor. There are many ways in which technology is being used now to increase engagement.
And then maybe the third area I would share is, just more broadly, helping health care get better. We talk a lot about a learning healthcare system, and one way in which we learn is by collecting data about our performance: how are we actually doing in terms of safety, in terms of cost, in terms of equity? By measuring that, by sharing the information with everyone, and by identifying where the opportunities are for improvement, we can use data and technology to create truly a learning health system, where we get better over and over again and consistently, continuously demonstrate improvement.
And so those are three areas.
[00:35:05] Karen Wolk Feinstein: I love all three. Related to the first, personalized precision medicine, I’ve heard some interesting presentations on digital twins and how they can help patients make decisions among the options for their care and get a better handle on what the various treatment options are and what their advantages and disadvantages are. Do you think that’s part of the future, an important part of the future?
[00:35:33] Vivian Lee: Well, you know, I think the way in which we engage in our health is going to be completely transformed in the next decade or so. And it’s easy to jump to that conclusion, because you can look at any other aspect of our lives, for example how we think about our financial health. In the old days, we’d drive to the branch and meet with the teller, and in order to deposit a check, we’d have to actually go there and hand that check over in person to somebody. Remember that? And now, with the digital revolution, we really are able to manage our own financial health.
I actually borrow a phrase from the IHI, the Institute for Healthcare Improvement teams, where they talk about co-producing health. And I like the idea that banks have evolved to help us co-produce our financial health. We have a lot more autonomy; we’re able to do a lot on our own now, but of course we can get support when we need it.
And I think it’s time for health care to move into this idea of co-producing our physical and mental health, and I think about the role of technologies in enabling that. I’ll give you a very specific example. One of the businesses in Health Platforms at Verily that I’m responsible for is called Onduo. It was designed to help people with chronic conditions like diabetes, people who might want to lose weight or who have high blood pressure, depression, those kinds of conditions, and really just help them manage better.
So, what happens with Onduo is, first, we have an app, so you can use it on your phone, your laptop, or your tablet, whatever it is, so it’s going to be with you. That’s one: it’s accessible. The second is there’s a novel sensor, so you can actually collect new information about yourself, rather than assuming everybody with diabetes has the same kind of blood sugar reaction to every meal, which they don’t; there are very, very different reactions to the same meal across different people.
Instead, we put the sensor on, and we measure your blood sugars. You take pictures of your meals and snacks. So, if you have a diet that’s vegan super organic-y versus, you know, all fast food, whatever your diet is, we look at your blood sugars and then we can use AI to make recommendations about what works for you and what doesn’t work for you.
And then there’s the whole piece of the human touch. You can chat with a coach, you can do video conferencing with a physician, so you have that whole human element as well. So, it creates a personalized experience. You and I may have very different microbiomes, so soy milk might be better for me, and skim milk might be better for you. All of that can be really personalized at scale and create this totally different, very engaging experience of co-producing your health. To me, that’s really the way of health care today and in the future.
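To make the idea of personalized metabolic response concrete, here is a minimal Python sketch; it is not Onduo's actual system or algorithm, and all readings, meals, timings, and function names below are made up for illustration. It shows one simple way continuous glucose monitor (CGM) readings and a meal log could be combined to summarize how one individual, rather than a population average, responds to different meals.

```python
# Illustrative sketch only: NOT Onduo's algorithm. Summarizes how a person's
# blood sugar responds to different logged meals, using hypothetical CGM data.

from datetime import datetime, timedelta
from statistics import mean

# (timestamp, glucose mg/dL) readings from a hypothetical CGM
cgm = [
    (datetime(2021, 6, 1, 7, 0), 95), (datetime(2021, 6, 1, 7, 15), 98),
    (datetime(2021, 6, 1, 7, 30), 130), (datetime(2021, 6, 1, 7, 45), 155),
    (datetime(2021, 6, 1, 8, 0), 160), (datetime(2021, 6, 1, 8, 15), 140),
    (datetime(2021, 6, 1, 12, 0), 100), (datetime(2021, 6, 1, 12, 30), 112),
    (datetime(2021, 6, 1, 13, 0), 118), (datetime(2021, 6, 1, 13, 30), 108),
]

# Meal log: (timestamp, description) -- in a real product this might come from meal photos
meals = [
    (datetime(2021, 6, 1, 7, 0), "bagel with jam"),
    (datetime(2021, 6, 1, 12, 0), "salad with grilled chicken"),
]

def post_meal_rise(meal_time, window_hours=2):
    """Peak glucose rise above the pre-meal baseline within the window after a meal."""
    baseline = [g for t, g in cgm if meal_time - timedelta(minutes=30) <= t <= meal_time]
    after = [g for t, g in cgm if meal_time < t <= meal_time + timedelta(hours=window_hours)]
    if not baseline or not after:
        return None
    return max(after) - mean(baseline)

for when, description in meals:
    rise = post_meal_rise(when)
    print(f"{description}: peak rise of {rise:.0f} mg/dL in the 2 hours after eating")
```

In a real product, a summary like this would feed a recommendation engine and a human coach; the point of the sketch is only that personalization starts from per-person measurements rather than from a "standard patient."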
[00:38:23] Karen Wolk Feinstein: But I also think someday, the new frontier, we’ll have explorers exploring the intestine. I can’t believe we have so many feet of intestine and that every place on that journey does the same thing. I always say to my kids, you should become a GI doc, because I’m really fascinated with what each part of that intestine does. So let me ask you a question. You also talked about quality and safety moving into a new era, which of course resonates with me. I think about a time when a very enterprising surgeon who was also an engineer wired a camera in his patients’ rooms. He wanted to know how they got infected. The camera wasn’t on the patient; it was just on what everybody else in the room was doing. And there was a huge stink. The nurses were furious. He had to take the camera down, even though it gave us a lot of very interesting information. But then flash forward: now we have sensors and monitors everywhere, a lot of it to keep the beds full, we understand that, but now it’s just routine that there are sensors and monitors throughout a lot of our systems. I see potential there to use the command centers where these monitors and sensors are themselves being watched, to understand the preconditions for harm and to intervene before harm occurs. Do you have any comment on that? I mean, some of the ways we could autonomously prevent harm before it occurs using the technology that’s available now.
[00:40:04] Vivian Lee: Oh, absolutely. I think there are a few different ways to think about that question that you’ve asked.
I think the first one, in terms of creating a safer environment, is to prevent mistakes. Most of the people that I work with in user experience design and user experience research, when they walk into a healthcare system, are really pretty consistently shocked by the lack of user-centered design that’s been put into practice. I’ve been reading this book, The Design of Everyday Things. It’s a classic by Don Norman, sort of the father of user-centered design. And what he describes, when he talks about products that aren’t self-evident and don’t work so well, is that it’s easy to blame the user when really it’s a design or system problem.
And I think we see that all over the place in health care. And many of the national strategies around trying to address safety have been very focused on individual measures, individual outcomes, like, okay, let’s prevent these kinds of infections, let’s prevent these kinds of falls rather than looking at health care as a system that needs to be really redesigned to be much more user centered, to make it almost impossible to make a mistake if it’s really designed well.
So, for example, when I was at the University of Utah, we were working, like most health systems, on making sure that we could avoid bloodstream infections caused by central lines, you know, the long intravenous catheters that stay in patients for days and weeks and can sometimes cause infections. Our nursing team actually engaged an industrial psychologist to help design little packets for doing the dressing changes, which they had to do every day. Dressing changes are not super complicated, but if you forget one piece of it, you’ve got your sterile gloves on, you’ve got everything on, but you forgot the tape, and now you have to take it all off, go get a piece of tape, and come back again, sort of a hassle. And sometimes that would lead to contamination, which would lead to these infections. So instead, they created what they themselves called kind of an “idiot-proof kit,” just a little folded-up kit that had everything you needed: the gauze, the betadine, the iodine swab, the new catheter, all the little pieces. And as you unfolded the kit, the next piece that you needed was right there, right in front of you. Once we implemented that, we actually had a zero infection rate for months, if not years, afterward. So, I think there’s a lot of opportunity for building safer processes into our actual designs, and I think that’s a really, really important step forward.
[00:42:57] Karen Wolk Feinstein: Well, you struck a chord here that we’ve touched on in some of our previous broadcasts. I attended a conference on AI and healthcare safety, and it turned out to include a lot of distinguished researchers who were skeptical, or at least willing to challenge all the things that sound good but could go wrong: all the externalities, all the problems when we introduce technology. You’ve been head of a large integrated health system, and with that last example you gave me an opening to ask, and I know Verily has to think about this a lot: How do we overcome the resistance to some of these very technologies that could make the work of the frontline easier, but also, as you say, prevent these errors that never should have occurred?
[00:43:50] Vivian Lee: Well, I think one thing we have to realize is that, so often when we think about health care, the example people point to when they talk about safety is manufacturing. I talk about in my book how, in the past, Japanese manufacturing and U.S. manufacturing were deeply flawed, and with W. Edwards Deming and quality control, performance improved significantly. But sometimes that narrative, even though it’s true for very standard procedures, doesn’t always resonate with clinicians, because patients are not like cars. Every patient is different. Every patient has different biology.
They don’t just have one issue. They might need to get their hip replaced, but they also have diabetes, a touch of hyperthyroidism, and maybe occasional depression or something like that. So, you can’t treat individuals as if they’re all widgets. And so that’s where I think this AI piece, and data, and creating more intelligent, personalized pathways is the way forward. This is what we’ve been working on at Verily; we have a business called Verily Value Suite. We need to use more evidence-based pathways, but then we need to recognize that a pathway, say a standard pathway for how you take care of a patient who needs a coronary artery bypass graft, is not going to be strictly one-size-fits-all. We will do certain things that we know are standard, like making sure the person doesn’t get a blood clot or making sure the person doesn’t get an infection. Let’s follow those, but let’s also collect information, collect data, and create a learning system over time, so that the pathway can get more and more personalized to individuals as we gain more and more specificity about how they’re different.
So, we have a standard pathway for a person who has no diabetes, but now we actually can modify that pathway if we know they also have diabetes. Oh, we can modify the pathway if we know they also have diabetes and a touch of high blood pressure and maybe their thyroid is a little bit too active, for example.
And how do we, over time, create more personalized pathways for people? I think that’s actually a very important part of improving safety, but recognizing that safety doesn’t just mean treating everybody as if they’re like a car on a manufacturing line, because that’s really not what we are.
We can start with some standards, but then we need to use the data and use the artificial intelligence to create more and more personalized care pathways. And I think that’s an important step forward to getting to safer health systems.
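As a rough illustration of what "a standard pathway plus personalization" could look like as a data structure, here is a minimal Python sketch. It is not the Verily Value Suite or any actual clinical protocol; every step, condition, and modification rule in it is hypothetical.

```python
# Minimal sketch (hypothetical rules, not clinical guidance): a standard care
# pathway represented as data, with extra steps layered on per comorbidity.

from dataclasses import dataclass, field

@dataclass
class Patient:
    name: str
    conditions: set = field(default_factory=set)

# Standard, evidence-based steps that apply to every patient on this pathway
STANDARD_CABG_PATHWAY = [
    "pre-op risk assessment",
    "blood-clot (VTE) prophylaxis",
    "surgical-site infection prevention bundle",
    "post-op mobilization within 24 hours",
]

# Hypothetical pathway modifications keyed by comorbidity
MODIFICATIONS = {
    "diabetes": ["perioperative glucose monitoring and insulin protocol"],
    "hypertension": ["blood-pressure management plan before discharge"],
    "hyperthyroidism": ["endocrinology consult and thyroid labs pre-op"],
}

def personalized_pathway(patient: Patient) -> list[str]:
    """Start from the standard pathway, then append steps for each comorbidity."""
    steps = list(STANDARD_CABG_PATHWAY)
    for condition in sorted(patient.conditions):
        steps.extend(MODIFICATIONS.get(condition, []))
    return steps

patient = Patient("example patient", {"diabetes", "hypertension"})
for step in personalized_pathway(patient):
    print("-", step)
```

The design choice being illustrated is the one described above: the evidence-based core stays fixed, while condition-specific steps are layered on, so a learning system could keep adding more specific rules as data accumulate.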
[00:46:40] Karen Wolk Feinstein: Well, we’ve talked about two of the three priorities you mentioned at the beginning. Let’s talk about equity. I know you personally are very committed to that. I think of Verily’s state-of-the-art technology over here and equity over there. So tell me, Vivian, your vision of how they come together.
[00:47:03] Vivian Lee: Oh, I actually find them completely convergent. In fact, I feel that technology is the way we can leapfrog so many of the barriers to health equity that we have faced for decades. And I’ll give you three examples. The first one is along the lines of what we talked about at the very beginning, personalizing health care. When we used to talk about every standard patient as a 70 kilogram white male, we made a lot of assumptions about patients that, over time, have gradually become very obviously assumptions that don’t hold and don’t apply. The more we’re able to collect information, record differences in diet, and look at the socioeconomic challenges that people might face: Do you have transportation? Do you have a language barrier in terms of understanding instructions? Are you able to prepare for a hospitalization?
How do we actually address those issues and create a personalized health journey for you? That’s a hugely important step toward advancing health equity. The second example is the way in which we recognize an enormous amount of bias in our healthcare delivery systems. In fact, there was recently a very interesting publication looking at the degree of pain experienced by Black versus White patients in a study of knee x-rays. Using what radiologists would use as the standard criteria for evaluating whether somebody’s knee x-ray shows arthritis that would likely be causing pain, the study found that those interpretations by radiologists were significantly off for African American populations compared to Whites.
So, a lot of the training that we’ve undergone has been based on populations that don’t necessarily reflect the broader constituents that we’re trying to serve. And by layering some AI into that evaluation, the AI algorithm was actually significantly better at predicting pain in the African American knee x-rays than the radiologists were.
So, how do we actually eliminate bias in the actual practice of medicine? Well, we need better data sets on which to form our conclusions, from how you read any x-ray all the way to our AI algorithms; we need to include broader populations of people that represent those we’re trying to serve. And then we need to apply those pathways in a standard way. Just because you’re a woman coming in with chest pain doesn’t mean it must be indigestion because women don’t have heart attacks, right? We heard a lot about those kinds of biases in the past. We need to apply those standards in a very consistent fashion, and the data and modern AI can make that even better if we do it the right way.
And then the third is simply surfacing the data about health equity and disparities. When we look at our data, for example with Onduo, looking at people who have, let’s say, diabetes, we actually, from the beginning, look at the characteristics of the people who are eligible for Onduo and the characteristics of those who sign up, and ask whether they are different. We look at race and ethnicity, education level, income, whether they live in food deserts, their zip code, and we prove to ourselves that there is no significant difference. And then we look at, among those who signed up, whether there is a difference between those who do better and those who don’t, relative to those same socioeconomic factors, all the areas of health disparities. So, I think it’s very important for us to use the data, to surface that data, so that we as a health system can do better: we can address health disparities and achieve a more equitable delivery system. So, I think the technology and the data are the key to solving this problem, Karen.
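Here is a small, self-contained sketch, with made-up numbers, of the kind of "surfacing the data" check described above: comparing the demographic mix of people eligible for a program with the mix of those who enrolled. It is illustrative only, not an actual Verily or Onduo analysis, and the 3-point flag threshold is arbitrary; a real analysis would use appropriate statistical tests.

```python
# Illustrative equity check with hypothetical counts: does the enrolled
# population look like the eligible population?

eligible = {"Black": 400, "Hispanic": 350, "White": 900, "Asian": 150}
enrolled = {"Black": 120, "Hispanic": 100, "White": 280, "Asian": 45}

total_eligible = sum(eligible.values())
total_enrolled = sum(enrolled.values())

print(f"{'Group':<10}{'% of eligible':>15}{'% of enrolled':>15}{'gap (pts)':>12}")
for group in eligible:
    pct_eligible = 100 * eligible[group] / total_eligible
    pct_enrolled = 100 * enrolled[group] / total_enrolled
    gap = pct_enrolled - pct_eligible
    flag = "  <-- review" if abs(gap) > 3 else ""  # arbitrary illustrative threshold
    print(f"{group:<10}{pct_eligible:>14.1f}%{pct_enrolled:>14.1f}%{gap:>11.1f}{flag}")
```

The same comparison could then be repeated for outcomes among enrollees, as described in the conversation, to see whether benefits are distributed equitably.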
[00:51:21] Karen Wolk Feinstein: We are looking forward to that day. And I will say that with maternal mortality, various communities, states, and the nation realized the disparities in maternal mortality just by finally examining the data that were available. I do think people respond when the proof is out there, so this will be excellent. The last question that’s on everyone’s mind is, How many people died unnecessarily during the pandemic? They died of COVID, but they didn’t need to die of COVID. We know, of course, that if the vaccines had come earlier, or if people had taken the vaccines when they were available, that would have saved lives, but there are other things that could have saved lives and prevented a lot of disability. I’m sure Verily is thinking about this. The pandemic was such a game changer. I’ve seen estimates, I think from the Kaiser Family Foundation, that we’re talking about maybe a quarter of a million lives that might have been saved, depending on what you’re factoring in. How has the pandemic changed some of the work at Verily Health?
[00:52:40] Vivian Lee: Well, you know, I think the pandemic really has cast a really harsh light on our healthcare system. For my book, for The Long Fix, when the paperback edition was coming out, they had me write an epilogue just about the pandemic. And I had a chance to really reflect on the fact that the people who suffered the most are the people who were already suffering in our healthcare system.
You know, remembering the earliest days, the nursing homes, the frail seniors: they were already suffering before the pandemic. But the pandemic cast this bright light on a crisis that had already been brewing: on health disparities, on the huge socioeconomic gaps and the huge racial and ethnic gaps in outcomes, which already existed before the pandemic, and on the lack of a public health infrastructure, which, generally, already existed before as well. So, I think most of us feel like this is a moment where maybe the long fix doesn’t have to take as long, where we can marshal all of our resources and our energy. And, you know, in the words of Winston Churchill, “Never waste a crisis.”
So, let’s leverage some of this energy and focus on the health of our society and say, okay, you know what? It’s time to make a better healthcare system. One that can serve us not only in pandemics, but even in times of peace, make better health care and better health for everybody.
[00:54:02] Karen Wolk Feinstein: Well, Vivian, we’re so happy that Verily Health had the good judgment to find someone who really understands the broad system challenges and puts that to work with an understanding of what technology and advanced analytics can offer. You’ve run a large research institution while at the same time running a large clinical system, and I think we’re very fortunate that you took some time out, looked at the whole system perspective in writing The Long Fix, and now you’re going to help solve some of our problems by introducing the technologies available in our modern era. So, I can’t thank Verily enough for you being there, and thank you for taking this on. I think we have a bright new future. Thank you for being with us today.
[00:54:56] Vivian Lee: Thank you, Karen, and thank you for all of your leadership and all of your energy and passion and everything you do. You’re an inspiration to all of us.
[00:55:05] Karen Wolk Feinstein: To learn more about the effort to establish a National Patient Safety Board, please visit npsb.org. We welcome your comments and suggestions. If you found today’s conversation enlightening or helpful, please share today’s podcast or any of our other podcasts with your friends and colleagues. We can’t improve the effectiveness of our health system without your help. You, our listeners, friends, and supporters are an essential part of the solution.
If you want a transcript or the show notes with references to related articles and resources, that can be found on our website at npsb.org/podcast/. Up Next for Patient Safety is a production of the National Patient Safety Board Advocacy Coalition in partnership with the Pittsburgh Regional Health Initiative and Jewish Healthcare Foundation.
It is executive produced and hosted by me, Karen Wolk Feinstein. Megan Butler and Scotland Huber are my associate producers. This episode was edited and engineered by Jonathan Kersting and the Pittsburgh Technology Council. Thank you, Tech Council! Our theme music is from shutterstock.com. Social media and design are by Lisa George and Scotland Huber. Special thanks to Robert Ferguson and Steven Guo. Thank you all for listening.
—
Subscribe on your favorite podcast app: Apple Podcasts | Google Podcasts | Spotify | Pocket Casts