High-Consequence Success, with K. Scott Griffith – Episode 447 of The Action Catalyst Podcast
- Posted by Action Catalyst
- On December 26, 2023
- 0 Comments
- author, Business, collaboration, leadership, management, reliability, risk, Stephanie Maas, success, systems
K. Scott Griffith, founder and managing partner of SG Collaborative Solutions, LLC and author of the world’s first Collaborative High Reliability improvement programs, recalls averting catastrophe for the airline industry, working for NASA, and tackling risk A.S.A.P., and explains the sequence of reliability, the essential attributes of a highly reliable organization, socio-technical improvement, unlinking staff from systems, normalization of deviancy, doing risk assessment on white rice, and mistaking “The Big Bang Theory” for a documentary.
About Scott:
Scott Griffith is the founder and managing partner of SG Collaborative Solutions, LLC. Scott gained his reputation for world-class collaborative skills through success in working with high-consequence industries across the globe. He first came to prominence in aviation and is widely recognized as the father of the airline industry’s highly successful Aviation Safety Action Programs (ASAP), which led to a 95% reduction in the US fatal accident rate. His work has since made high-consequence industries more reliable across the globe. In 1998, he received the Admiral Luis de Florez Award from the Flight Safety Foundation for his outstanding contribution to aviation safety.
In 2000, the United States Surgeon General David Satcher invited Scott to advise a Department of Health and Human Services committee on Blood Safety and Availability. From there, Scott applied his unique approach to focus on improving outcomes across multiple values in healthcare organizations, collaborating with hundreds of hospitals across the country. In 2006, he retired from American Airlines and dedicated himself to Just Culture performance improvement integrations at numerous large healthcare systems, airlines, railroads, energy companies, emergency medical services, fire and law enforcement agencies, leading numerous joint state- and nation-wide collaborative Just Culture projects.
Scott and his business partner, Paul LeSage, are the principal architects of the Risk Reliability™ model of socio-technical improvement, bringing the science of reliability to healthcare organizations across the United States. Their collaborative engagements have produced dramatic healthcare results, supporting improved outcomes across a wide range of values, from patient safety and clinical outcomes to privacy, compassion, fiscal responsibility, customer satisfaction, and operational excellence. Scott has pioneered the development of multiple predictive risk management strategies, including socio-technical probabilistic risk assessment (STPRA) and both Learning and Safety Management Systems (LMS and SMS). He has worked extensively with management, labor, and government officials and is widely recognized for his ability to help organizations achieve consensus results in support of common goals.
In addition, Scott has 25 years of experience at American Airlines, first as an international captain, then as the Managing Director of Corporate Safety and Quality Evaluations. In recognition of his contributions to global aviation safety, Scott has received numerous awards and citations from both government agencies and the private sector. He is the three-time recipient of the Federal Aviation Administration’s Good Friend Award, at the air carrier, regional and national levels. Throughout his career, Scott has worked closely with government regulators in several industry sectors and is leading the national effort to implement the Collaborative Action Partnership improvement program in the healthcare and law enforcement industries.
Scott holds an MS in Physics from Texas A&M University and a BA in English and Physics from Texas Christian University. His master’s thesis contributed to the research and commercial development of the airborne windshear LIDAR project under a grant from the Federal Aviation Administration (FAA) and the National Aeronautics and Space Administration (NASA).
In The Leader’s Guide to Managing Risk, he brings the secrets of his success to any business, organization, or individual striving for sustainable results.
Learn more at SCGPartners.com and LeadersGuidetoManagingRisk.com.
The Action Catalyst is presented by the Southwestern Family of Companies. With each episode, the podcast features some of the nation’s top thought leaders and experts, sharing meaningful tips and advice. Learn more at TheActionCatalyst.com, subscribe below or wherever you listen to podcasts, and be sure to leave a rating and review!
LISTEN:
SUBSCRIBE TO OUR RSS FEED: https://feeds.captivate.fm/the-action-catalyst/
SUBSCRIBE ELSEWHERE: https://the-action-catalyst.captivate.fm/listen
__________________________________________________________________________
(Transcribed using A.I. / May include errors):
K. Scott Griffith
Hi Stephanie, how are you?
Stephanie Maas
Hey, doing great, Scott.
K. Scott Griffith
My pleasure.
Stephanie Maas
So I’m going to dive right in.
K. Scott Griffith
Let’s dive in. And we’ll swim in deep water.
Stephanie Maas
So in your background, you built this reputation for world class reliability in high consequence industries across the globe. So can you put some legs under that table for me?
K. Scott Griffith
Absolutely. So the term “high consequence industry” is a little bit of a misnomer. It originally was used for those industries where catastrophic failures could result in the blink of an eye. Think about a plane crash. Or, on a more human level, think about a police officer, where in the blink of an eye things could go catastrophically wrong, or a surgeon at an operating table, or a nurse administering a medication. Those are industries where the consequences of failure are sometimes immediate and catastrophic. My reputation started in aviation, where I was the Chief Safety Officer at the world’s largest airline, and I developed a program known as ASAP, which led to a 95% reduction in the industry fatal accident rate. From there, I was invited by Surgeon General David Satcher, back in 2000, to come to Washington and meet with a group of healthcare professionals under the Health and Human Services Department, and explore the potential for the aviation success to be migrated into the healthcare industry. So for about the last two decades, I’ve worked in multiple industries, including healthcare, aviation, law enforcement, emergency medical services, and nuclear power. I wanted to take the lessons I’ve learned and the strategies I’ve developed to any business, because at some level, any business or organization is high consequence to the people involved: the owners, the employees, the shareholders. So what has worked in those high consequence industries can absolutely work in any business.
Stephanie Maas
You said you achieved a 95% improvement, and that is significant. So tell me, what are some of these principles that you found really translated, regardless of the industry?
K. Scott Griffith
That’s right. So it was astonishing, in the sense that a group of industry professionals came together with a common goal: to keep the public safe, to keep planes from crashing. What we were successful in doing was bringing a regulator that had previously taken a very rules-based enforcement posture and helping it become a more risk-based oversight agency. That’s the Federal Aviation Administration; we helped move them from a position of “it’s all about the rules” to “it’s all about how we manage the risk.” The other parts of that collaboration were, clearly, the airline executives and leaders, and then the labor associations. So we worked with different unions across the country, from pilot unions to flight attendants, mechanics, and air traffic controllers. And I created a program that combined each of those entities into a collaborative endeavor, known as the Aviation Safety Action Program. One of the central images, or metaphors, of our success was the iceberg. What we typically see when a plane crashes, or a rule gets violated, is just the tip of the iceberg. It’s a cliche, but it turned out to be a very powerful metaphor in convincing the regulator that a crime-and-punishment style of enforcement was not giving them a full picture of the risk in the national aerospace system. So what we did was create this safe haven, this reporting program where employees could report into a program that was collaboratively managed by regulator, airline, and labor, and from that we changed the culture virtually overnight. We started getting reports not just of events and violations; we started to see risk below the waterline, in the everyday successful outcomes that posed significant risks. And that’s one of the central messages for any business: most businesses measure results, not the risk involved in those results. If all we’re doing is measuring outcomes, we’re restricting our visibility to what we see above the waterline. The real risk, and the real opportunity, lies in the everyday systems and the everyday activities of our people that are sometimes risky. But when we get positive results, we turn a blind eye to the risk-taking behaviors and the risky systems.
Stephanie Maas
So you just touched on this for a second, and I want to come back to it from a leadership perspective. You mentioned creating a culture where people were willing to come forward with concerns around risk. So I’d love to hear: how do you, culturally, from a leadership perspective, help your people embrace that?
K. Scott Griffith
You must build a trusted program. I had one leader (I won’t mention the name of the organization) where we were talking about the issue of employee burnout. It’s a significant challenge, particularly coming out of the pandemic; we see a high degree of burnout in a number of areas. But I had one senior leader say to me, “Oh, my employees come tell me when something’s wrong. Every year I have a holiday Christmas party, and they come up and they tell me.” And I challenged him: well, they may not be coming forward every day, and they may not be coming forward with their real concerns. So what we did, which was unique, was build that trusted system into a program. And that program was actually described with a set of rules and conditions, if you will, that laid out how the program would be managed. It offered employees a guarantee that if they came forward in good faith, rather than being punished, we would work proactively to address the risk, whether that risk was in the behavior of the employee, or in how that employee was trained, or in the system and the environment around the employee. And from that, to use another metaphor, we pulled back the curtain on risk that was taking place every day that we hadn’t previously been able to see. So I developed something, and I mention this in the book, called the sequence of reliability. The first step in that sequence is to see and understand risk. In most places, when things go wrong, the first place organizations look is the behavior of the individuals involved. And that’s really too late in the process. The risk has been there for a while, but we haven’t seen it, because we haven’t seen bad outcomes. Think about driving a car, just to give an example we can all relate to. Let’s play a little game. Do you have a car? Do you have car insurance for the car you drive? You do? Okay, so I’m going to pretend I’m your insurance agent. Would you agree that if we could find out how you drive every day, day in, day out, that would give us a better profile of the risk of you driving a car than if we just looked at your recent accident record? So the game I want to play is: would you do me a favor? Would you call me anytime you go over the speed limit? Would you tell me when you’re talking on the phone, or you get distracted, or maybe you even text while you’re at a red light? Would you just call and let me know, so I can build a profile on you to understand how risky you are? I’m going to say no, you’re going to say no. Most people aren’t going to come to their boss at the end of the day and say, let me give you a list of all the risky things I did. Because the way you’re going to measure me is on outcomes. If I get the job done, I get rewarded, sometimes for risk-taking behavior, because my boss can’t see what I do day in, day out; they only see the results I produce. So I’m incentivized, most people are incentivized, to get results. Now, we’re not saying results don’t matter. Absolutely, results matter. But we have to be careful that we’re not building excess risk into our system by rewarding people for their outcomes.
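To put rough numbers on that “outcomes hide risk” point, here is a minimal simulation sketch. The task counts and failure rates are illustrative assumptions, not figures from the episode; it shows how a worker who cuts corners can carry ten times the risk of a careful one while both post identical, zero-failure results in most quarters.

```python
import random

# Illustrative assumptions (not from the episode): two workers do the same
# number of tasks per quarter. One follows procedure; one cuts a corner that
# raises the per-task chance of a bad outcome tenfold.
TASKS_PER_QUARTER = 250
P_FAIL_CAREFUL = 0.0002  # per-task failure probability, procedure followed
P_FAIL_RISKY = 0.002     # per-task failure probability, corner cut

def quarter_failures(p_fail: float) -> int:
    """Count bad outcomes across one quarter of tasks."""
    return sum(random.random() < p_fail for _ in range(TASKS_PER_QUARTER))

random.seed(1)
trials = 10_000
both_look_perfect = sum(
    quarter_failures(P_FAIL_CAREFUL) == 0 and quarter_failures(P_FAIL_RISKY) == 0
    for _ in range(trials)
)

# Analytic check: probability the risky worker still posts a "clean" quarter.
p_clean_risky = (1 - P_FAIL_RISKY) ** TASKS_PER_QUARTER
print(f"Both workers look perfect in {both_look_perfect / trials:.0%} of quarters")
print(f"Risky worker alone posts a clean quarter {p_clean_risky:.0%} of the time")
```

With these assumed rates, the risky worker finishes a “clean” quarter about 60% of the time, so an outcomes-only scoreboard usually can’t tell the two workers apart; a measure of exposure (how often the corner gets cut) separates them immediately.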
Stephanie Maas
Very interesting. I think from a human perspective, we all want to be that way. But when push comes to shove and something goes wrong, those are usually the first couple of things that go out the window, especially compassion. So talk to me about that.
K. Scott Griffith
Yeah, so compassion is one of the attributes I mention in the book, particularly in fields like health care, but really in any service industry. The work that I do, Stephanie, really falls under the category of helping organizations become highly reliable. And if you say to me, well, what does high reliability mean, I can define it as consistent high performance over an extended period of time across a small number of attributes. By attributes, I mean things like this: the field where I started my career was as a Chief Safety Officer, and safety is a very important attribute for any service industry. Whether you’re Disneyland or an airline or a hospital, being safe is essential to being reliable. But that alone isn’t enough, because if you’re a patient in a hospital, and they are highly reliable at keeping you safe, but they treat you with disrespect, or they don’t treat you with the compassion you deserve, you won’t consider that to be a reliable organization. So there’s a really small set of what we call attributes of high reliability that are universal. You have to be safe. You have to care about people’s privacy. You have to pay attention to infrastructure: if you’re a safe hospital but you get shut down by a cyber attack, everything you do is going to be affected, so infrastructure is important. Equity, diversity, belonging: it’s not enough just to be reliable with one segment of society; if we’re open to the public, we have to be equitably reliable. So there’s a small set of attributes that we define, and compassion plays into that. When we start to work with organizations, which typically look for results like safety, we say, let’s broaden that perspective and look at the attributes you have to be good at in order to be considered reliable, and therefore to be a sustainable business.
Stephanie Maas
So is that this term, which I’d never heard before: socio-technical improvement?
K. Scott Griffith
So I have to tell you, I’m a geek by nature, so that’s a term that comes naturally to me. But when I saw it the first time, I didn’t understand it either. “Socio” means people, from the Latin, and “technical,” we would say, applies to systems and the environment that people work in. In today’s technologically advanced world, whatever business you’re in, you have people working inside and with systems. Now, when things go wrong, where do we typically turn? To the human, even though that human is working with technology, in an environment, in a culture. What organizations find challenging is how to unlink, or separate, the system contributors from the human contributors. And we often do that in the wrong way: we often just strike at the behavior instead of looking at the system we put in place to manage the risk and the opportunity. So “socio-technical” is a geeky word for people working inside systems, and you have to be good at both. If you put outstanding people in a poor system, you won’t get great results. You could take a great actor and give them a lousy script, or take a pro quarterback and put them in a system that’s not very well developed, and you won’t get great results. In contrast, if you take an average individual and put them in a very well designed system, you’ll get better results than others will get. And so the second step in what I have called the sequence of reliability, after seeing and understanding risk, is to build reliable systems. We do it in that order because once we have seen and understood the risk and built a reliable system, we can then focus our attention on making the human, the employee, reliable. We do that through something called performance management, where we train them and help develop their knowledge, skills, abilities, and proficiencies. Then we focus on the factors that influence their performance: the system, personnel factors, the environment, and the culture. And then, finally, we focus our attention on their behaviors. Behaviors come in two categories, errors and choices. And guess which one poses the greatest risk in our daily lives: the errors we make, or the choices we make? Which would you think is most consequential to the outcomes?
Stephanie Maas
Well, I’ve got two teenagers right now. So I’m going with choices.
K. Scott Griffith
Absolutely, choices. Now, people tend to think it’s the errors we make, but think about how many risky choices we make every day, especially teenagers. And most of the time, our risk-taking choices turn out to have good results. So what lesson do we learn when we drive over the speed limit? What lesson do we learn when we talk on the phone? “Mom, I don’t have to wear a helmet when I ride a skateboard, because I’ve been doing this for two years and I’ve never fallen.” By the way, here’s an interesting statistic we should all pay attention to. The way our society manages the risk of drunk driving is through the legal system. And as privileged as we are to work in a nation of laws, our legal system is not designed to manage risk, because in order to enter our legal system, either as a plaintiff or a defendant, there has to be evidence of harm. That’s a terrible way to manage your teenager: waiting for harm to happen before you step in. So the way we manage drunk driving is that a police officer will pull you over if they suspect you’re driving intoxicated, or there’s a car crash and we take your blood alcohol level. Well, the National Highway Traffic Safety Administration told me that, on average, for every drunk driver arrested, they have driven drunk 80 times previously without having been caught. So most of the time in our society, that risk is out there, interacting in the socio-technical world, and we don’t see it. That’s a stunning statistic, isn’t it? One thing we all have in common is that we all make mistakes, and we all make risky choices. The funny thing about the human brain, and this gets into neuroscience, is that we learn most from our most recent experiences. So when we do something, no matter how we were trained, when we’re by ourselves and no one’s watching and we cut a corner, and nothing bad happens, oftentimes we learn the wrong lesson from that successful outcome. We do these things repeatedly because we don’t see and understand the risks, and we learn the wrong lessons from our successful outcomes. And so we end up surprised when catastrophe occurs.
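That one-arrest-per-80-drives figure also shows, with a little arithmetic, why long runs of “nothing bad happened” are exactly what the risk-taker experiences before the first consequence. A minimal sketch; the per-drive probability below is an illustrative assumption derived only from the 1-in-80 figure quoted above:

```python
# Rough arithmetic behind "learning the wrong lesson from success".
# Assumption (illustrative): one arrest per ~80 impaired drives implies a
# per-drive chance of a consequence of about 1/80.
p_event = 1 / 80

# Chance of getting away with it n times in a row:
for n in (10, 50, 80, 160):
    p_all_clear = (1 - p_event) ** n
    print(f"{n:>4} drives, no consequence yet: {p_all_clear:.0%}")

# Expected number of drives until the first consequence (geometric distribution):
print(f"Expected drives until first consequence: {1 / p_event:.0f}")
```

Under that assumption, a driver has roughly even odds of 50 consequence-free drives in a row, so day-to-day experience keeps rewarding the risky choice even though the long-run bad outcome is close to certain.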
Stephanie Maas
So where did your passion for all of this come in?
K. Scott Griffith
I mentioned that I was a geek, and here’s an example of that. Have you ever heard of a TV show called The Big Bang Theory? Absolutely. I watched it for two years, and I thought it was a documentary. I didn’t know! Those were my people. So in addition to being a pilot, I had been in graduate school as a physicist, and in 1985, I was doing a walk-around inspection on an airplane. A walk-around inspection is a preflight activity pilots perform before they get in a plane to go fly; they literally walk around it to make sure there’s structural integrity on the airplane. I’m walking around this airplane, and I look up in the sky, and I see another plane, a wide-body jet, coming in to land, and it’s in distress. It gets so low that it hits the top of a car on Highway 114 and bounces, and as it’s coming in to land, the next thing I see, the wingtip strikes an above-ground water tank, and the plane cartwheels, and it just explodes. I was stunned by that. A few moments later, I was knocked down by a gust of wind. Several seconds after that, there was a torrent of rain, and I ran back up into the airplane, and the plane is rocking, and people are starting to panic. What happened was that plane encountered a deadly microburst, a downdraft of wind that the pilots couldn’t see, because it was separate from the clouds and separate from the rain. It was a clear-air threat. So, long story short, I took a leave from the airline and spent about a year working on a contract for NASA, as a physicist, to help build the first airborne predictive laser system, known as LIDAR, to scan in clear air what the wind field was doing. The reason that plane crashed was because the pilots couldn’t see, and didn’t understand, the risk. We could only manage what was in front of us that we could see. And all we could see was the tip of the iceberg.
Stephanie Maas
What a crazy intense thing to experience.
K. Scott Griffith
It was crazy. And in most businesses, most organizations, particularly those that are regulated, we hide our risk from the regulator, because we don’t want to be sanctioned. We don’t want to be fined. It’s like when we’re driving down the road, Stephanie: what do we do instinctively when we see the police car? Instinctively, we slow down. By the way, on any given day, studies have shown that only a minority of us are driving at the speed limit; the rest of us are driving over. That’s normalization of deviancy. It’s normal to deviate in our society. And I have found that to be generally true in every industry, in every business I’ve worked in. It’s not that we’re bad people. It’s that we’re living with competing priorities.
Stephanie Maas
So what would you say to encourage leaders who think, you know, look, quite frankly, ignorance is bliss? What I don’t know isn’t going to hurt me, and I’ll deal with whatever happens when it happens. What do you say to leaders who want to stay in ignorance-is-bliss mode?
K. Scott Griffith
Well, that’s probably the best question I’ve heard all year, Stephanie. And here’s my answer, talking to a leader, particularly a CEO. I understand CEOs have people who handle risk management; what CEOs are all about is opportunity and sustainable success. If you’re a CEO, you want to go out and capture the opportunity, you want to build a business, and once you build it and take advantage of those market opportunities, you want to be able to sustain it. The world is filled with successful businesses that crashed because they couldn’t see the risk ahead; we’ve seen businesses fail to adapt. And we’re not saying to the CEO, be risk averse. We’re saying, be risk intelligent: go grasp the opportunities in an intelligent way. And to do that, you have to be good at some of the things you’re not known for as a business. If you’re a company that depends on technology, and you’re out there providing some product or some service, and technology is not your specialty, you hire people to manage it. But if you don’t become reliable with that platform, everything you do is in jeopardy. You have to be risk aware, which involves situational awareness, positional awareness, cultural awareness; you have to see and understand the environment you’re in. Then there’s something called risk tolerance: you have to assess where your tolerance level is for risk. There’s a financial analogy here. When you’re young, you can afford to take risks, because you have a lifetime ahead of you to recoup them. When you get to be my age or older, you may be more risk averse in that respect. But in every business, there is risk in front of you that you may not be able to see. One of the things leaders face, which is often troubling, is that they are surrounded by people who tell them what they want to hear; they want to be praised, right? So you want to empower employees and managers who will see risk differently. A frontline manager has experience and expertise, but it’s the frontline employee who is seeing the risk on the assembly line, for example; it’s the frontline employee who is exposed to the risk. You want them to be able to share it and have it filter its way up to top leadership. But to do that, you have to have a culture that supports it. Most business books stress the importance of leadership and culture. Well, you can be a strong, charismatic leader and lead an organization in the wrong direction. And what do most CEOs, not all, but most CEOs want when they come in? They want to set a new tone, a new path; they want to make their mark, which may be different from the previous CEO’s. But what we all want, and particularly shareholders, is sustainable success over the long term. And to get that, you have to go beyond leadership and culture into building systems that become reliable, so that each manager, each CEO, inherits reliable systems. Again, put great people in a system that breaks down, and you won’t get great results. So there’s a sequence to it. Let me just summarize it. Step one of the sequence of reliability is: see and understand risk. You can see a risk and not necessarily understand it; or, like our drunk driving example, you can understand it but not necessarily see it when it happens. You have to do both. Seeing and understanding risk is step one. Building reliable systems is step two. Helping people perform reliably is step three. And the fourth and final step is hardwiring organizational reliability.
Most business books start with the last step first.
Stephanie Maas
It’s so cool. It’s almost like we’re seeing this evolution of leadership. And to your point, in the past we’ve always seen leaders hailed for their charisma, their ability to rally. But I bet if we went back and really studied super successful leaders, this risk intelligence, risk awareness, all these things: they were also super successful because they were able to manage that as well. Super interesting. Okay, in the spirit of time, I’m going to shift gears. I’m going to call you the risk guy. So you’re this risk guy: when does this side of you get on your nerves? Like, can’t I just shut my brain off for 10 minutes? Let me eat this two-day-old sushi. But I can’t, because I know the risk.
K. Scott Griffith
Oh my gosh, well, I was talking to my son last night, and we were talking about the foods we eat. That’s another example that I think anyone can understand, right? We’re trying to manage our health through our diet, and food is part of our well-being; the foods we eat have a dramatic effect on our health and how we feel and how we act. And I tend to focus, at my stage in life, on trying to encourage those around me to make smart choices. Now, that’s hard, because the science keeps evolving and changing. Remember when red wine was thought to be great for coronary artery disease? That one is really rattling around right now; the latest research is saying there may not be a safe level of alcohol consumption. So we were eating dinner last night, and the food came with rice, and I looked down and it was white rice instead of brown rice, which, as you know, is a whole grain. And my son said, “You’re not going to eat that, are you, Dad?” And I said, “Well, probably.” And he said, “You’ve got to lighten up. You’ve got to enjoy your life a little bit.” And so, okay: we see it, we understand it, and we’re going to do it anyway.
Stephanie Maas
So it’s the natural deviation, right?
K. Scott Griffith
That’s right. Teenagers are good for helping you keep perspective on that.
Stephanie Maas
I love it. Thank you so much for walking us through this.
K. Scott Griffith
Stephanie, I would just say thank you for the opportunity to speak to you and your audience today; it’s been a pleasure. The most important message I would take away from our conversation is that risk is all around us; it’s a part of life. And bad things don’t just happen randomly. The technical term for how bad things happen is that they’re probabilistic, meaning there are probabilities associated with the randomness in our lives. But with a little bit of effort, your life can be so much better balanced when you see and understand risk. And again, you’re not going to avoid eating white rice, but you’re going to understand the risk in everything you do. You’ll build systems, and you’ll manage people, and ultimately the organizations you build and work for will become more successful. With a little bit of effort, it’s almost like Maslow’s hierarchy of needs: when you understand this sequence, it can transform your life or your business in a positive way. And it applies to the big issues of our day, like climate change, and all the other risks we face. But if we work collaboratively, those challenges can be overcome, and we can live better lives.
Stephanie Maas
Thank you so much.
K. Scott Griffith
Oh, it was a pleasure, Stephanie. You’re a great interviewer, too. It was a lot of fun. Thank you.