The Human Behavior Podcast

Why High Functioning Teams Fail

In this week's podcast episode, we are joined by one of our Advisory Board members, Dr. Joan Johnston, to talk about why high-functioning teams fail. With over 30 years of experience working with the Department of Defense, Dr. Johnston is an expert on decision-making and simulation training. Throughout her career she has made a significant impact on advancing the science of learning, team training, decision-making under stress, performance measurement, and organizational development.

For this episode, we use the tragic incident involving the USS Vincennes, in which the guided missile cruiser accidentally shot down an Iranian passenger plane after mistaking it for an F-14 fighter jet, as the focal point of the discussion. Dr. Johnston walks us through the critical errors that were made during that incident and the role that stress plays in communication and decision-making, and she shares her insights on what it takes to build more resilient, high-performing teams.

This episode is a powerful reminder of the human factors that influence decision-making and how easily things can go wrong—even with the best of teams. But it also provides a hopeful message: with effective training, strong leadership, and a commitment to learning from past mistakes, we can build teams that are more resilient, adaptable, and prepared to handle the challenges they face.

Thank you so much for tuning in. We hope you enjoy the episode, and please check out our Patreon channel, where we have a lot more content as well as subscriber-only episodes of the show. If you enjoy the podcast, I kindly ask that you leave us a review and, more importantly, share it with a friend. Thank you for your time, and don't forget that Training Changes Behavior!

Episode Links
https://www.ahrq.gov/teamstepps-program/index.html

https://www.frontiersin.org/journals/psychology/articles/10.3389/fpsyg.2019.01480/full

Support the show

Website: https://thehumanbehaviorpodcast.buzzsprout.com/share

Facebook: https://www.facebook.com/TheHumanBehaviorPodcast

Instagram: https://www.instagram.com/thehumanbehaviorpodcast/

Patreon: https://www.patreon.com/ArcadiaCognerati

More about Greg and Brian: https://arcadiacognerati.com/arcadia-cognerati-leadership-team/

Speaker 1:

Hello everyone and welcome back to the Human Behavior Podcast. In this week's episode, we are joined by one of our advisory board members, Dr. Joan Johnston, to talk about why high-functioning teams fail. With over 30 years of experience working with the Department of Defense, Dr. Johnston is an expert on decision-making and simulation training. Throughout her career, she has made a significant impact on advancing the science of learning, team training, decision-making under stress, performance measurement and organizational development. This episode we use the tragic incident involving the USS Vincennes, where the guided missile cruiser accidentally shot down an Iranian passenger plane after mistaking it for an F-14 fighter jet, as a focal point for the discussion. Dr. Johnston walks us through the critical errors that were made during that incident and the role that stress plays in communication and decision-making, and she shares her insights on what it takes to build more resilient, high-performing teams.

Speaker 1:

This episode is a powerful reminder of the human factors that influence decision-making and how easily things can go wrong, even with the best of teams, but it also provides a hopeful message. With effective training, strong leadership and a commitment to learning from past mistakes, we can build teams that are more resilient, adaptable and prepared to handle the challenges they face. Thank you so much for tuning in. We hope you enjoy the episode and please check out our Patreon channel, where we have a lot more content as well as subscriber-only episodes of the show. If you enjoyed the podcast, I'll kindly ask that you leave us a review and, more importantly, please share it with a friend. Thank you for your time and don't forget that training changes behavior. All right, hello everyone, and thanks for tuning in this week, and welcome to our guest, Dr. Joan Johnston. Joan, thank you so much for coming on the show today.

Speaker 2:

You're welcome. I'm really excited about this.

Speaker 1:

Yeah, we're excited to have you on here, Greg and I both are. So today, obviously, the topic is why high-functioning teams fail, what occurs in different situations. You can have really highly trained folks who have a lot of experience, and then you put them together in a real situation and things can go catastrophically wrong. That's the big picture of what we're going to get into, and I'm going to give everyone some background on the actual incident we're going to talk about. But first, Joan, can you give all of our listeners and Greg's fans, as I call them, a little background about yourself and your work?

Speaker 2:

Sure. I received my master's and PhD from the University of South Florida in Tampa, and after that I started my career immediately with the US Navy at the Naval Air Warfare Center Training Systems Division, as it's called today. I was always interested in stress research, and I based my dissertation on the study of stress. When I got to the Navy, it turns out that there was this incident that occurred in the Arabian Gulf, which we'll describe a little bit later, that we call the Vincennes incident. That was really the reason for hiring me, and the problem we studied was to address stress and training. But up until that point I hadn't really thought about working for the military or having a military research psychologist job.

Speaker 2:

But many of my friends had come to work at NAVAIR, at NAWCTSD as we call it now, and they had really exciting opportunities for doing applied research, which is really what the focus of my education was about: being a scientist-practitioner, doing science but also applying it.

Speaker 1:

Something I talk about with a lot of folks is that a lot of people don't realize just how much research the Department of Defense does and how much money it spends on it. And like you said, when you get into the science of all this stuff, there's the academic side and there's the application side, and both are needed in a sense. But what happens in a lab, in theory, at a university is one thing, and what you're talking about here is an actual application of this stuff. How does it play out in the real world, in real situations? Because that's the most important part, right? So we'll jump in and I'll give the background on the incident you're talking about, because it seems like you were coming in at the right time and the right place, with your background in stress, to go into these big programs that came out of this.

Speaker 1:

But for those who don't know, back in 1988, Iran Air Flight 655 was mistakenly shot down by a United States Navy guided missile cruiser, the USS Vincennes. So this is the late 80s; the Iran-Iraq War was going on. There was a lot of disrupted air travel in the region, so there was an increased military presence by a bunch of nations, including us, not unlike what consistently happens in that area. The Vincennes was this guided missile cruiser, part of the Navy surface fleet operating in that area. And what happened on this day was that Iran Air Flight 655, a scheduled commercial flight from Tehran to Dubai via this place called Bandar Abbas, which is right at the Strait of Hormuz, was flying through a geographically significant area. The Strait of Hormuz links the Persian Gulf with the Arabian Sea, it comes out right by Dubai, and it's right by Iran, so it's an area that's prone to potential situations. So basically, this flight is approaching the airspace near the USS Vincennes.

Speaker 1:

They detected the aircraft on radar. There were heightened tensions in the area, and so the crew mistook the civilian airliner for an attacking Iranian F-14. They believed it was engaged in some sort of hostile activity, so they think they're under threat. They issue their warnings to the aircraft, as they're supposed to do, and their communication attempts are unsuccessful. So they concluded, all right, this is a hostile target, let's shoot it down. They launched two surface-to-air missiles and they struck the airliner. All 290 innocent people on board, passengers and crew, died.

Speaker 1:

So this is obviously a huge international incident, a huge embarrassment, heightening tensions at the time in the area, all that stuff. But we're not getting into the geopolitical aspect of this and what that led to. What I want to get into is what this led to within the Navy and the DOD as a whole, because that's what you're talking about. A whole bunch of programs stood up from there that set out to define and determine and look at it, just like any incident where there's going to be a major investigation. It's not just about what happened; it's, okay, let's identify everything and figure out how to mitigate this from ever happening again. So I'll pass it to you, Joan, to give the background and an understanding of the research that evolved from that incident.

Speaker 2:

Right, sure. So I was working with some really outstanding scientists at the time, who worked with the Office of Naval Research to fund a six-year program of research that would focus on improving the human factors design of combat systems for these kinds of ships, as well as improving the training and human performance of the sailors who were involved in the shoot-down. In the Combat Information Center on these combat ships there are at least 40 personnel who are working extremely hard to manage the day-to-day understanding of the common operational picture, of what's happening outside, around their battle group. And even today you can see that these warships are extremely important to defending freedom around the world. So it took a lot of work between the human factors scientists and the training research scientists on our end, in a collaborative effort, to identify what the research questions were. Today we've built on all of that research and have some really great programs, but back then, in 1990, we really didn't have much to go on.

Speaker 2:

I mean, there was some significant work that the Army had done on decision-making under stress and teamwork, but in terms of studying it in a situation where you have all these technology interfaces that the warfighters were using on the ship.

Speaker 2:

There really weren't any studies of that, and so we embarked on a very large program of research that ONR oversaw. Our specific piece of it was training and decision-making and developing performance under stress, specifically focused on team performance. We had Dr. Janis Cannon-Bowers and Dr. Eduardo Salas, who were the architects of this research program, and we had many academic scientists and small businesses who had been working on these problems for a number of years. It coalesced into a program and research design where we eventually worked with combat teams in training facilities, like at Surface Warfare Officers School in Rhode Island, to help us run scenarios similar to the kind encountered in the Vincennes incident, with a contact that might be an airliner or might possibly be a threat, and study their performance, develop training strategies to improve performance, improve leadership, and really focus on decision-making under stress for teams and improving team performance. So there was a huge effort going on in the 1990s to establish principles and guidelines for understanding human behavior and making changes to it.

Speaker 1:

Yeah, and like you said, there was some research there, but nothing comprehensive. This became a multi-year, I think you said six-year, study trying to figure all this stuff out. I want you to jump into some of the stuff that was found with this incident, what you guys found. But before that, can you quickly define what you mean when you say stress? You're studying stress, but what does that mean in the context of what we're talking about? Because that's a term, you know, "I feel stressed" or "this is stressful," and then there's decision-making under stress. So let's get a clear operational definition, if you can, of what you mean by stress.

Speaker 2:

Yeah. So stress is a combination of information overload, time pressure, and the demand to perform at a very high level. In this particular instance, we focused on the time pressure that the warfighters, the sailors, are under, because when you detect a threat outside, a surface threat or an air threat, you have less than a minute to respond if it becomes a threat to your battle group. So we had to really focus on that kind of problem, and we talked to subject matter experts about their experiences with it and how they reacted to it.

Speaker 2:

The typical reaction to stress is that you sort of funnel your attention: you become overly focused on what you're doing, and you kind of lose track of time and of what other people are doing. In particular, when you're working as a team, one of a number of really important team activities is monitoring what your team is doing, so you can anticipate what they need, provide backup, correct errors, and do those really good teamwork behaviors. If you're not able to do that because you're under stress, distracted, and not focused, that breaks down. Your good teamwork breaks down. Good decision-making breaks down. It kind of freezes and people don't know what to do. So that's what stress is.

Speaker 1:

Yeah, and so you're talking about a lot of influences and contributing factors. I mean, you're bringing in everything. Obviously, time is always huge. When you say 60 seconds, I don't care what it is; if you only have 60 seconds to make a decision, that's just not a lot of time, especially when the stakes are this high. But then you've got what your level of experience is, what your roles and responsibilities are, and the communication issues that all humans generally have.

Speaker 1:

Then you've got all these things, and what I see happen, because you guys obviously didn't come into this saying, hey, we're the scientists, this is how things work, is that you went to the SMEs, the subject matter experts, the people with the actual knowledge who've been doing this, and tried to unpack it. And I see that done well sometimes, and sometimes done poorly, where it's like, we've got to unpack this expertise. Because part of what you're talking about is my cognitive load when I'm in that situation, how much I can process at a time, how much I can attend to. It's actually a lot more finite and smaller than most humans think it is.

Speaker 1:

Driving is a perfect example. We've all been driving our whole lives, so we think we're operating at a level where we know what's going on, because we kind of do; the more experience we have, the more time we have to understand things. But if I'm doing that and talking on the phone or sending a text message or something like that, all that goes out the window, even on a completely sunny day.

Speaker 1:

I'm driving the speed limit, but I'm actually now outrunning my headlights; I'm outrunning what the human brain technically can process in its environment, and I don't even recognize that part. So one of the things I notice is that we're already starting at a point where people go, yeah, well, this is my job, I was trained to do it. And I say, yes, you were trained to do this and apply this skill set and whatever you have to do. However, you are already operating, even under normal circumstances, a little bit past the point of human cognition. Does that make sense? Is that kind of accurate?

Speaker 2:

Yeah. One of the things that I had firsthand experience with is that when the CIC operators are sitting at their consoles doing their work, they have headsets and they communicate through those, and they also use chat and other forms of communication at this point, I think. But there are channels of communication that are different in your right ear than in your left ear, and you have to monitor them, because important people are talking to you and you have to communicate to other people through your different channels.

Speaker 2:

This channel switching, and this requirement to listen to two things at the same time: everybody knows that you cannot really pay attention to two voices at the same time when they're saying different things to you. You either hear one thing or you're going to hear the other. But these guys and women have developed a skill to be able to manage that. With more time pressure and stress, though, it becomes even more difficult to monitor those channels. So just that one thing alone. I mean, I put those headsets on and I would listen to the voice traffic on board the real ships, and I could only take it for maybe five or ten minutes. It's just overwhelming.

Speaker 2:

So you have to build up this skill over time. That can be a real challenge and quite stressful.

Speaker 1:

So, with this overarching theme of today, of why high-functioning teams fail, what are the sort of behavioral problems that are typical, coming out of this research and everything you've done? What are the typical things that you saw in this specific incident, and then, over time, what always seems to be at the top of the list? You know, going into something, it's going to be a few of these things.

Speaker 2:

Yeah. Well, we divided our study into looking at decision-making, teamwork, and stress management. Those are the three legs of the stool that we eventually focused on; those are the categories of competency areas. And when I talk about decision-making, that involves the team. You've got the leaders making decisions, and really good teams will have a tempo to them. They understand the task, how they have to perform it, the information they require to do their job, and the information they have to pass. They call it the detect-to-engage sequence in combat. So there's this process of identifying a threat, prioritizing your threat, determining what you're going to do, making a plan, and then executing the plan. Poor performance winds up being a lot of just chatter: people really aren't talking about the problem, they're just sort of chit-chatting or not really focused on it, and they're not passing the right information, or not even detecting that something is wrong. And that's the part that you guys really focus on, which is this advanced situational awareness among your team members, what we call team SA, where you are observing your environment, you're looking for problems, you're identifying anomalies, and you're passing that information on to the proper chain of command so they can build a better operational picture. If you don't do that, well, the very first thing is that people might over-chatter, but sometimes nothing is being said. Sometimes I'll be listening to a more novice team and I'll think to myself, oh, they're not talking to each other. If you don't talk to each other, nothing's going to happen. So that's a really big problem: if you don't pass information at the right time to the proper people, the information is going to get lost, and your time window continues to shrink. So that's a really big one.

Speaker 2:

The communication around the decision-making process is key. There's also a planning process, where you prioritize what you're going to do and then plan what you're going to do. There's this advanced planning process that really good teams can do, and it means anticipating what might happen and making the right assumptions about what could happen. And there are a lot of alternative explanations for why there might be a threat, or whether it's even a threat. And this is where it came in with the Vincennes.

Speaker 2:

There were people who just had this bias toward saying it was a threat, because there were certain pieces of information they were looking at, but they weren't putting the picture together. The leadership on the Vincennes wasn't really in a position to take it all in and build the operational picture. Ships outside that were in the battle group were communicating with the Vincennes, and they were saying, hey, that's just an airliner, that's not a threat, that's not the fighter plane you think it is, so shut this down. This process of making the wrong assumptions can really create a problem for decision-making: assuming this is the way it is, people see the picture a certain way, and they're not taking in the information from other team members.

Speaker 2:

They weren't listening to knowledgeable, expert team members or questioning whether they should really be doing something that could be life-threatening to civilians. So that's another piece of the problem. And finally, there's making a decision and actually taking action without giving yourself enough time to make the right decision. Setting priorities, giving orders, and making recommendations: those kinds of things don't happen in a poor team. They simply let things pass by and don't think it's important. So those are some of the big ones.

Speaker 1:

And you're bringing up things that happen in all kinds of different situations. That one's powerful, right: someone in this bigger overarching team had the right answer, no, hey, this is a passenger jet. That's no different than two or three police officers on an intense scene where something's happening, and someone's going, oh, this guy is fine, he doesn't have a weapon, but from my perspective I'm already down and in, thinking this, right? And that can play out in a number of different ways. The military uses the OODA loop a lot, John Boyd's OODA loop, Observe, Orient, Decide, Act, and it's kind of become this oversimplified tagline. First of all, John Boyd was a genius; he was thinking on levels that I don't think most humans can think at. But it's become this oversimplified thing, because what you're talking about is that there are individual performance restrictions that I have, there are cognitive barriers, whether that's the information or my level of training or experience. And that can become exponentially worse when there's a team involved, because now everyone is suffering from those same things individually, and if we're not communicating across from each other, that's a big one.

Speaker 1:

Like you said, you've seen some novice teams where there are no comms, and then there are some higher-functioning teams that have really good comms. And I've seen even some of the tier-one military units doing specific things where they almost never talk, because they've trained and rehearsed so much together that they can look at the other teammate and know what he's thinking. So that's an elite level, but they're still limited by what they can and cannot do. They're still limited by information processing. They're still limited by their channel capacity. It's just that, for this specific thing, they've done it so many times and worked together so much that they don't need as much of that communication. There's less noise in the environment because there are only a few things that they need to focus on.

Speaker 1:

So it's not really that they have superpowered brains. It's that they've reduced all this cognitive load and gotten rid of so much noise over time that they're able to focus on those things. But that's really difficult to do. It takes a lot of time and a lot of resources and energy to get there. And so this sort of stress and these different cognitive and behavioral factors then affect my coordination and communication across those teams, right? It affects not just how I'm individually operating but how I'm operating as a team. And you said something right there, that it's going to lead to a wrong assumption, and that's the biggest thing that I've seen. Can you elaborate on your experience with that, or what you've seen? How does it get to the point of failure? What are the typical ways people make those wrong assumptions?

Speaker 2:

Well, people are really pre-programmed to see things a certain way. That's kind of how our brains work. There are different kinds of biases that people adhere to. There's confirmation bias, where, if you see something that confirms what you already think it is, then you're likely to move in that direction and think that's how you would explain something or make a decision.

Speaker 2:

And in some cases you could blame confirmation bias for what happened with the Vincennes. There were threats in the vicinity of the ship every day, and they were coming from a certain country, so it was just very easy to fall into this assumption when they looked at the information on their radar screens and mistook it for a fighter jet instead of an airliner. And it was as simple as this: the actual aircraft was ascending, leaving an airport and going up to a high altitude, just like any jetliner would. But in this case they had accidentally homed in on an aircraft that looked like it was at an Iranian military base, and they thought this aircraft was descending and coming toward them. That was old information; they hadn't really updated it. But the thing is, that information just sort of spread and became the focus. So there's the bias.

Speaker 2:

And it was even immune to people who were saying, that's not what it is, that's not what it is, we're telling you that's something completely different, don't do what you're doing. That information was ignored. So that's really problematic, and I mean it happens to all of us. Sometimes you'll be looking at something and you think you see something, and it's not really that thing, and later you're like, what was I thinking? It's very hard to break that, especially when you're in a group of people who are confirming that bias, providing information. You look for information to confirm the bias.

Speaker 1:

Yeah, and then, like you said, that spreads throughout the team, and it spreads throughout the operation.

Speaker 1:

And I'm thinking of so many law-enforcement-type examples of this, where it's so prevalent. Even what you're saying, it's like, well, we ended up shooting and killing this guy because he was going to attack us. But he was running from you for the last half an hour; he was trying to get away. And then the situation changed, and we've sort of created this inevitability and forced it into this binary situation. Everything you're talking about, these are the contributing factors that led to this incident, right? You can talk about the geopolitical climate and what was going on in the threat environment and how things work there; that's complex enough. But these are the ones, in a sense, that we're adding to it, sometimes needlessly complicating a situation. But I look at it as, these are also the things that we can control in some ways, and I think that was a big part of this research, if I'm not mistaken.

Speaker 2:

Yes, we had a very large program focused on what we were calling critical thinking, to counter these cognitive biases, which were in some ways forced on the sailors, because you're looking at equipment, you're looking at radar displays, and you have to make inferences about that information. It's not perfectly clear what it is. But when we watched really talented and skilled teams and listened to them talk, it's a common practice for really good teams to speak their critical thinking out loud. They see a problem. There would be an aircraft on radar, and immediately it would be reported, and the potential level of threat would be reported.

Speaker 2:

Another team member would pick that up and confirm that information, or they would come back and say there's not enough information about it: what do you think it is? And all the other team members would contribute to building a clear picture of what that potential threat or non-threat might be, because everybody can bring information to the table that's valid, that is drawn from their own radar systems and their own experience, and then make sure that information gets sent up the chain of command. So it's really wonderful hearing the best teams do this. Team members have to be assertive with the information they have. It has to be heard, and you have to know that it's been heard. So there's a lot of acknowledgement among the team members that they've heard it. They agree or maybe they disagree, but it's a very fluid and easy process once you allow that to happen in a high-performing team.

Speaker 1:

Yeah, and that reminded me of an experience when I was a young Lance Corporal in the Marine Corps, on my first deployment, working with a specialized unit in the military on a raid that was happening. We obviously had a smaller element and a responsibility level around security and things like that, and they're going over the plan. And I'm like, hey, what about this? They kind of hear me talking to someone and they stop, like, wait, what? And I said, well, I know this about that area. They knew they had the best stuff, the best training, the best assets, but there was this sort of mutual respect: you know the ground truth better than we do; we're not walking these streets every single day. And it was really powerful to me to see that as a young Marine, because normally it's like, hey, shut up, do what you're told. They were actively looking for that little piece of pocket lint, that piece of information, because they understood it could completely change the outcome or the trajectory of the situation. The information was there; they just didn't know it. And I see that a lot.

Speaker 1:

And now you're talking about different team dynamics, about, you know, am I creating a culture where we can talk about this stuff openly? But there are so many different, like we said, contributing factors to this. Can you sort of... because you just talked about critical thinking. That term is used a lot, right, and you kind of gave some examples of what it meant. But what does it mean in these situations where it's decision-making in extremis, lives are on the line, and there are minutes or seconds to make decisions? What is critical thinking in that moment? Because I've heard that all the time: hey, you've got to think critically. People just say, hey, you've got to think critically, and I'm like, what do I do? Just furrow my brow and think harder?

Speaker 2:

Like, what does that mean? So what do we mean by critical thinking in these situations? It's interesting. There's a scientist that we had on our team, Dr. Marvin Cohen, who is just so brilliant, and he and his colleagues set about doing what they called a cognitive task analysis to understand what expert decision makers in these kinds of situations do in terms of their critical thinking. In other words, it's time critical. You have maybe a minute to make a decision. How does your critical thinking process work?

Speaker 2:

I mean, they studied airline pilots and they studied, you know, battle commanders in the Army. And these people, human beings, have to use shortcuts. There's no way you can make a decision without having some kind of shortcut from your experience. But the expert decision makers do something more.

Speaker 2:

Actually, if they have a little bit of time, and they know they have a little bit of time, they can extract more information about the problem to get a more accurate understanding of what the conditions are, what the context is that would be driving a threat towards them. So they look at, you know, the geopolitical situation. They take good, validated information from their environment, from their teams, from what they're looking at, to build an accurate picture of what would be driving a threat. And anything that comes up goes into what they call a basis for assessment. The basis for assessment is weighing all of this information against that information. You know, there are things that would run against an assumption that maybe it's a threat.

Speaker 2:

And so what are those things? How weak the argument is against the threat, and how strong the argument is for a threat, is how they weigh that information to make their decision. And certainly if you're doing this under pressure, you have to be pretty quick at it, and these experts do get pretty quick at it. So that's a really important factor: they do this in their minds. This isn't something they have on a computer screen, although we worked on designing computer interfaces for the basis for assessment, and I think the Navy has adopted some of those ideas. But it's really the inner workings of your brain, trying to keep a balanced perspective, as opposed to just jumping to one of those biases that would make it so much easier to make a decision.
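The weighing Joan describes can be sketched as a toy calculation. To be clear, the cue names and weights below are invented purely for illustration; they are not drawn from the Vincennes incident or from Cohen's actual models.

```python
# Toy illustration of a "basis for assessment": each cue argues for or
# against the hypothesis "this contact is a threat", and the decision
# maker weighs the two sides. Cue names and weights are invented.
def assess(cues):
    """cues: list of (description, weight) pairs.
    Positive weights argue for a threat, negative weights against."""
    for_threat = sum(w for _, w in cues if w > 0)
    against = -sum(w for _, w in cues if w < 0)
    if for_threat > against:
        return "treat as threat"
    if against > for_threat:
        return "treat as non-threat"
    return "seek more information"

cues = [
    ("descending toward ship", +2),          # supports threat
    ("squawking civilian transponder", -3),  # argues against
    ("inside commercial air corridor", -2),  # argues against
    ("no response on military channel", +1), # supports threat
]
print(assess(cues))  # → "treat as non-threat"
```

The point of the sketch is the balance: a strong argument on either side can be overturned by the accumulated weight of the other, which is why filtering in valid information from the whole team matters so much.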

Speaker 1:

And that's the biggest thing, you know, the "easier to make a decision" part. That's kind of it right there, because that's, in a sense, how we're wired. Our brain is constantly anticipating. It wants to anticipate, it doesn't want to be surprised, right? So it's anticipating likely outcomes as I'm going through my environment, especially in these situations. And you brought this up: the way I look at it is, if I'm trying to make a decision on this stuff, I have to be able to conceptualize everything that I know, given the context that I'm in right now. I have to say, given what I know right now... and you even said it, a basis for assessment. We call that a baseline, right? What are we starting with?

Speaker 1:

We had a law enforcement officer in one of our courses, and he does a lot of interdiction work, so when he's out there, there's a very specific role, what he's looking for and what he's doing. And he came up, it was like the second day of training or something like that, and he had this great comment. He's like, I've been training my newer folks all wrong. And I'm like, what do you mean? He goes, I've been teaching them everything that I know, what to look for, all of these indicators that I've seen before in my past, and I'm really trying to give them these different examples, just like you guys give examples and tell stories: this is what I saw and this is what it keyed me into.

Speaker 1:

And he goes, I'm realizing now, I think that's the exact wrong way to do it. What I should be doing is teaching them to get really, really good at identifying what's normal, what's typical, what should I expect to see in a number of different environments, domains, and situations. Like, how do I get really, really good at normal? Because once I perfect that baseline, that basis for comparison, the incongruent signals, those anomalies, those indicators, whatever it is, they'll almost pop out automatically. I won't have to look for them; they will appear to me, because they're going to be different from what I was taught or trained on. Now, it's going to be similar in a lot of ways, but I can't take a photo, memorize it, and then say go find that out there, right? Because it's going to be different.

Speaker 2:

It's really nearly impossible to do anything like that by yourself. You really have to surround yourself with people who are providing you with accurate information. You can't see everything yourself. You know, and you can't always parse out what the right information is and what's wrong. I mean, sometimes you're on your own, like if you're a police officer. Yes, you could be on your own.

Speaker 2:

I'm talking about a military situation where a lot of combat is going on around you, and you really do need more people to help you figure out what's going on. So yeah, you can really get lost in the minutiae and not bring yourself out to look at the big picture because of the level of stress that you're under. Unfortunately, I think I've seen more examples of poor team performance than I have of really good team performance. But when you see really good team performance, it's impossible not to recognize it. And the thing about critical thinking and the basis for assessment is that, even with time pressure, good decision makers will be able to, as I said before, take a little bit of time. If you can free up enough time to make a better decision, everything's going to work out a lot better. And taking your time actually means having team members anticipate what information is needed, and they filter out the poor information for you as a decision maker. They send it up to you in the chain of command, and that information has been filtered in a way that is not going to throw you off or put you on the back foot.

Speaker 2:

What can really mess up that communication, that flow from the lower level up to the higher level, is if the decision maker, the leader, is basically trying to drag information out of people, like tell me what you're seeing, what does that look like?

Speaker 2:

You've got to let your team do their job, and you have to provide them with priorities and guidance. But the worst thing that you can do as a team leader is to try to pull information from your team members and prevent them from doing their work of getting information up to you. The flow needs to be from the lower to the higher levels, and team leaders can't be the bottleneck preventing that information from getting to them. You have to trust your team. That's the whole issue of trust. Trust means: I trust that you're going to give me good information because I've been training you how to do that, and you can trust me because I'm not going to push that information away.

Speaker 1:

You know, I'm going to let you give that information to me. Yeah, and that's a big one. That's what we like to call high-functioning teams that operate at the speed of trust, right? I'm taking you at your word that what you're saying is true and what you believe, and I can listen to you. I still have to put my own filters on things, but I'm not going to sit here going, well, why, why is that? And that seems like what you're getting into.

Speaker 1:

And, Greg, I do want to throw to you, because this is, I think, the longest we've gone on a podcast without you talking, and I know that's only because it's Joan. But what you're really talking about is almost, um, not focusing on what I need to look for or these certain things. It's asking the right questions. Because when it gets to taking this from, you know, we'll get into it in a minute here, taking what we've learned, taking what we know, and putting that into some sort of training, right? What are the takeaways and how do I get better at those things? Everyone wants to know what the answers are. Everyone wants the checklist, what I need to do. And we always approach it as: you've got to know what questions to ask.

Speaker 1:

You have to know the right questions to get you to a more reasonable assumption or conclusion, so that you can make a better decision. Because if I ask the right questions, then I get a better feel for what the actual problem is. And that ties into how humans communicate; we could talk forever just on that, what they say, how they say it, the language they use. It all frames how we process that information. So there's a bunch of complexity in there that we haven't even really addressed, but it is extremely powerful. But I do want to throw to you, Greg, because you've been quiet, and I get nervous when Greg gets quiet.

Speaker 3:

The reason I'm quiet is, if I had a microphone, I would body-slam it and drop the mic, because you guys have epitomized exactly what my comments would have been. Everybody that knows me knows that I'm fascinated with Joan's brain, the way that she thinks, and the research that she's done. So when you have an expert on, you let the expert talk. So I'll limit my remarks and make them real fast. Joan, you know I love you. The reason you were hearing similarities is because in the late seventies and early eighties I was inspired in my work by Dr. Marty Seligman, and Seligman led me into the world of Eduardo Salas, who led me to Bowers and Barab, and Lance Hannon and Cohen. All of these people were at the forefront of decision-making-under-stress research, and the problem was there was nobody in cop work doing that. And I'd tie this directly back to the podcast we just finished on the 21-foot rule, because you'll say, oh my gosh, that's what he meant by that. That's exactly what it is. It takes a catastrophic incident for us all to look down and in and then to conduct the research. So you're talking about stress, in extremis, ambiguity, and I'm saying for the street-level people that are still listening, and I know you are, because Joan's fascinating to listen to: we're talking about playing Jeopardy with the timer counting down loud, and in addition to that, instead of having two other opponents, you have 40, and in addition to that, you're riding a unicycle in a minefield, and in addition to that, you're trying to sing Row Your Boat with the audience. Okay, now that's the level of complexity that's in police work. That's being a school resource officer, that's being on a hostage scenario, whether you're HR or whether you're a copper on the scene. These things happen, and space-time is different. You train for those situations, but now you're in the situation, and it's just not exactly the same.
And what happens is, I want people to remember on the Vincennes why it's such an important study and why Joan's such an important guest.

Speaker 3:

The Vincennes is no different than what happened in Oxford. It's no different than what happened at Robb Elementary. You had high-functioning people. There was nobody on the Vincennes that didn't give a shit. There was nobody on the Vincennes that was drunk at their duty station. There was nobody that just said, you know, today I'm going to make an arbitrary rule and stick by it. What happened is you had high-functioning, experienced veterans who were put into a situation, and one or two of them saw it slightly differently and stood down and didn't comment.

Speaker 3:

And then what happens is that starts adding up. It's like carburetor icing, the kind of thing that brings down an airplane. A little bit of frozen area manifests itself greater and greater, and now you don't have lift and thrust anymore, and it can do that in a minute. And when you see police shootings, when you see a school shooting, when you see these situations where we have the highest level of risk, this is the type of critical decision-making that needs to happen. So going backwards, taking this giant step backwards to take a look at this situation, I think is uniquely important.

Speaker 3:

And if I could just say one more thing, Brian, and I'll shut up and get back to Joan, because I absolutely love listening to her. I was in a situation, I sent you a video, Joan, you might not have seen it. I was in a Southern airport. It's a huge airport, there were many people around, and the fire alarm went off. I started videoing as I was heading for the fire exit. Okay, I was the only one heading for a fire exit. And what happened right in front of me, and I got it on the video: the jet bridge, because of the fire alarm, locked where people enter the jet bridge, and it stopped all the people from getting on the plane. So you had a dozen people trapped in the jet bridge with this fire alarm going. And who was trapped there?

Speaker 3:

The first people to get on the plane, which are those that need more time, along with the Global Entry frequent flyers. So you had the highest-level trained people, who fly all the time, jammed in with the person in the wheelchair and the elderly person and everything else, and all I saw was fear and indecision. Why? Because nobody had anticipated that this was a likely outcome. Nobody looked at it and planned for it: well, what types of things may we encounter? And it's exactly like the Vincennes too. The stakes were different, not as many people died, right, but by the same token, the lessons learned are as cogent and as important today as they were then. Would you agree with that?

Speaker 2:

Absolutely, yeah. The translation of this, the findings from all of this research, is tremendous. They are generalizable to many other situations that we see today, and I just think there are really good training solutions for dealing with this.

Speaker 2:

One of the things that we focused on was something we called event-based approach to training, where we created realistic scenarios to train with, and those scenarios made sure that they had events in them that would elicit the behaviors that we wanted people to practice and train to.

Speaker 2:

And we created these scenarios so they addressed what they call black swan problems: situations that don't occur very often, but when they do, it's do or die. We didn't pick scenarios that would only happen once in a hundred years, but then, the Vincennes was never anticipated to be something that would happen either. It was important to focus on those kinds of problems, and we learned that those kinds of encounters were a lot more common. There were other incidents in the Persian Gulf: an American ship was hit by a mine that had been set, and there was the Stark incident, I think that one had been fired on. And we see it today, with ships being fired on in the Red Sea. So there are really good solutions for training, existing training strategies that can be used to improve teamwork and improve decision-making.

Speaker 1:

So that's sort of the "so what" I want to get out of all of this. And, you know, there's a lot you covered in your career, but what can we do with this information? You just talked about this event-based approach, and I want to know how I can use this stuff in training. These limits of cognitive performance have been studied and shown; I mean, the DOD has done so much research in all of these different areas, whether it's individual and team performance, in different domains. So what can I do? How do I use it? How can I mitigate barriers? What are some of the indicators I need to look out for? From your experience, what are those things? How do I use this?

Speaker 2:

So the most important thing about training for mitigating these problems is to have a really effective after-action review. It's a diagnostic process that teams use to go through what they did during a particular training scenario, whether it's a simulation on a computer system or a live exercise. That after-action review has to have a number of things happen for people to learn how to improve their performance. One of them is, when you have this event-based approach to training, you use those events as your after-action review baseline. You work your team through what happened, what they remember happening, and how to set goals to improve their performance on those specific types of behaviors. For training teams, if you're really focusing on team training, there are four dimensions of teamwork: information exchange, supporting behavior, communication, and what we call initiative/leadership. Each of those dimensions has specific team behaviors that are critical to good performance. And in the context of a scenario, whether it's a military scenario or a school shooting scenario, the critical incidents that occurred become the center of discussion among the team members: what they thought happened, what did happen, what went well, what didn't go well, and what they could improve on in terms of those four dimensions of teamwork. And that almost instantaneously improves your tactical performance.
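The event-based AAR structure described here can be sketched as a small data structure: scenario events are tagged with the teamwork dimension they were designed to elicit, and the review walks through them dimension by dimension. Only the four dimension names come from the discussion; the example events and observations are invented for illustration.

```python
# Sketch of an event-based AAR: each scripted scenario event targets one
# teamwork dimension, and the review groups observed behavior under the
# four dimensions named in the discussion. Events here are illustrative.
DIMENSIONS = ["information exchange", "supporting behavior",
              "communication", "initiative/leadership"]

events = [
    {"event": "ambiguous radar contact appears",
     "dimension": "information exchange",
     "observed": "contact reported and acknowledged"},
    {"event": "operator overloaded at console",
     "dimension": "supporting behavior",
     "observed": "no one offered backup"},
]

def aar(events):
    """Group observed behaviors under each teamwork dimension for discussion."""
    review = {d: [] for d in DIMENSIONS}
    for e in events:
        review[e["dimension"]].append((e["event"], e["observed"]))
    return review

for dim, notes in aar(events).items():
    if notes:
        print(dim, "->", notes)
```

The design point is that feedback is organized by teamwork dimension rather than by who made a mistake, which matches the blame-free diagnostic framing discussed later in the episode.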

Speaker 2:

We know from research that as teamwork processes improve, tactical performance improves. And at the end of the AAR, the idea is to set goals and continue running through critical incident scenarios. With these event-based situations, it's important to be able to observe what the teams are doing during their training exercises, to observe the kinds of behaviors that you're looking for. Those observations are collected by trainers, who can run through the after-action review and provide feedback to the team members while they're having these important discussions about what they want to improve on. So it's a nice back-and-forth dialogue between instructors and trainees, all because they're really focused on learning to do a better job. It's a proven process, and I would like to see it implemented more often. The military does use this approach, and they have, for many years, improved on their approaches based on research. So there's research-based science that informs the way the military does training, and I think it should be leveraged in other places.

Speaker 1:

Yeah, I agree with that. A lot of times the process for that isn't as formalized as it should be. In a lot of situations it's just, okay, let's sit around and talk about what happened. And a lot of times we focus on the wrong things in those AARs, especially when you're talking about law enforcement work or investigations into any of these big major incidents like the school shootings. It's basically, all right, who's to blame? And that's not helpful. That's not helping going forward. We're just assigning blame because it makes us feel better and we can point to this person and say they screwed it all up. It's like, no, no, no. There are organizational issues, policy issues, legal issues, communication issues, issues of resources and resource management. There's so much, and we don't want to get into the complexity of it because it's hard. It's really hard.

Speaker 1:

But everything you just talked about, especially that sort of AAR model and having those discussions, is actually very easy to do at a very low level. I mean, I do that with my family. Okay, here's why you freaked out over this. All right, and you can be mad, that's fine, but you're not allowed to be mean to Mommy. That part was where you went wrong. Being upset, I don't care if you're upset, that's fine. You're not in control of your emotions, so that's going to happen to you. But this part's acceptable and this part isn't. It seems informal, but it's really not, and that's where the learning occurs. And you're talking about things that any team anywhere can control and get better at. You don't need a million-dollar training budget to ask the right questions and modify your policies and procedures and your tactics. By doing those things and asking the questions, like you just said, it's proven that you're actually going to get better tactically. You're going to make better decisions. You're going to utilize the training that you already have at a higher level than you did before, because you're getting rid of a lot of the potential barriers to success. You're addressing the foundational issues that affect all humans in every situation, whether it's something as extreme as, I have 60 seconds to decide whether or not this aircraft is going to blow up my ship, or it's, I've got to figure out which kid I need to focus my resources on in the school because he's having the most difficulty. It doesn't matter what those situations are.

Speaker 1:

And you know, I'm kind of wondering, in all of your experience, are there certain times when you've walked in or you've seen something and you already know, right off the bat, here's what likely happened, or here's where the breakdowns were, because those are the consistent breakdowns? And what are those key takeaways out of this, those top things, the top three if you can think of them? You know I hate doing that; everyone's like, give me the one thing or give me the three things, and it's really difficult to do. But I'll give you an example.

Speaker 1:

When I go in to work with a team, a unit, whatever, law enforcement agency, private sector, whatever, I have a feeling right off the bat, sometimes even before we get there, just from how they communicate, whether this is going to be a tough one, or, hey, this is a high-functioning team, they're on the same page. I have my own indicators, sort of. But I'm curious what yours are, what you've seen.

Speaker 2:

Well, one thing was discovered in the study of pilots and copilots, which kind of predated the Vincennes incident but has been a parallel area of research for many years too. And that is: really good teams, whether they're just two people or more, take advantage of the downtime to plan and to provide guidance for what they're expecting to do. So plan, plan, plan. There's a lot of communication; good teams are always talking about what needs to be done, how they're going to manage it, and what the roles and responsibilities are going to be. So it's really critical to see that. If you don't see them talking to each other, if they just kind of hang out, look at their cell phones, and aren't doing any kind of critical planning, that's a sign. It could just be a sign of being a novice team, but sometimes there's the problem of leaders thinking that they're all-knowing and all-seeing and not really encouraging the people who report to them to be more open and honest and assertive, not reinforcing assertiveness in their team members. Because that's where you build trust, and you become willing to trust what people are saying to you. So that's part of it too, building that trust.

Speaker 2:

Really good leaders are great at providing guidance and setting priorities, whether they're going into a scenario or, of course, during training. Sometimes they'll be like, all right, stop, stop, stop. This is what I need you to do. I want you to do this, I want you to do that. Even if it's just positioning yourself in a certain place. Like when we were working with infantry squads, they would be making their way through a training village, and you'd see and hear the squad leader say, okay, I want you right over there, or I want you to go around that building. You know, physical placement, to get the best picture of what's going on. So guidance is really important.

Speaker 2:

Another really important one to look for is supporting behavior: people who are willing to take over doing something when it's not necessarily their primary job, because another team member can't do it. So even in the planning phases, good teams will say, well, if I can't get to this place by this time, I know that you can, so if you get there before me, that's okay. This planning process means figuring out where team members agree: I can help you do this, you can help me do that. And accepting support, accepting backup from somebody without getting upset about it, that's another one.

Speaker 3:

Yeah, Joan, you know, it goes right to what we talk about all the time, about educating for certainty and training for uncertainty. I mean, that speaks volumes. And I want to interject only because I have to step out. Brian and Joan, it was wonderful being on the call with you; please don't slow down because of me. Joan, one more thing about what you're talking about is research, and I'll make an admonition again. Out there I hear "evidence-based" all the time, but evidence can be anecdotal, and then we're all led to believe it. That's a form of bias. Research is the tool, because then you sometimes uncover stuff you didn't expect or didn't want to know, and then you can truly, to Brian's point, ask the right questions. Then you conduct the research. Now you have a clear path, or at least a way ahead. It might not be the clearest path, but at least it's a way ahead to confirm your suspicions. So thanks, Joan, I'll see you again.

Speaker 3:

Thanks, brian, for allowing me to jump out.

Speaker 1:

Yeah, I know you've got to jump out, but I've got a couple more things I want to run past you, Joan. Thank you, Greg, for being on the show.

Speaker 3:

Thank you so much.

Speaker 1:

Thanks, Greg. You know, I always try to put myself in the shoes of those folks that really enjoy learning about this stuff and want to implement some of these things. I'm listening to this podcast episode, listening to this conversation: how do I really do this at my level? What are some of those important things? I mean, you're giving us planning and prioritizing tasks, roles and responsibilities, and obviously there's a huge leadership component. Are there any other general takeaways that you try to get across to the folks that you've worked with or done research on, things that continue to pop up every single time in these situations, those low-calorie things that don't take a lot to fix?

Speaker 1:

I'm trying to get some more, because this has been incredible. I've got a page of notes here, and I want to get some of this to our Patreon subscribers; I'll have some of it summarized, as well as some of the takeaways. But you know, the complexity of decision-making is huge. There's so much there, and then you've got different ways of looking at it, especially now with this rapid adoption of technology and different technological solutions. Everyone wants the mathematical formula for arriving at really good conclusions and getting the best answer by inputting and analyzing all of this data, right? There are camps that kind of differ on this, which is one of the reasons I like a lot of what Gary Klein does, because he's a big proponent of expertise, meaning you've got to go to the experts in these fields. They build up this intuitive decision-making process over time, and you've got to unpack that. You're never going to build some sort of statistical model of what the best answer is just by putting in all this information, because it's just that: it's a model. It's not the real thing, and unless you're at the real thing, it's kind of hard to do. So because of all that technological stuff and the newest, you know, data science that's trying to come up with these things.

Speaker 1:

I see it a lot in health care too, where they're trying to just take all the information and have the computer go, hey, test for this, it might be this, it might be that. And the really good doctors are still beating those machines. But sometimes the expert can go wrong too. So it's like, I have this level of expertise, but I can still get it wrong sometimes, and it's still really important to unpack what I have. I see it as almost confusing for someone at the user end, who goes, hey, big-brain scientists, just tell me what I need to do and I'll do it. I have no problem adopting some new strategy if you're telling me to. But it's kind of not that simple. So what can I do, listening to this episode, to operationalize this information and contextualize it in my domain? That is a really hard question. There's a lot there.

Speaker 2:

Well, I know Gary talks about how you have to go to the top experts and unpack it. But I don't think you have to; those people are rare. My experience is you go to people who do the training, who have been in the job for a while, and we do what's called a critical incident assessment.

Speaker 2:

You know, what problems have you encountered that are hard? It's hard to make a decision, you're working in a team, and you've got a lot of different things going on that are creating problems. It's a stressful situation. Tell me the top, most important problems that you have to address in this kind of a job. You get the critical incidents, and then you start to unpack each one: what conditions make it really difficult for you to make a decision? And the picture pretty much pops out.

Speaker 2:

The reason why I focus on teamwork behavior so much is that with the four teamwork dimensions, you can ask people, tell me incidents or situations that are important to you. So, for example, with leadership: can you give me examples of situations where it's difficult to maintain leadership? And you can build some training around that. You can use a tabletop situation. You just create a tabletop scenario where people sit around the table, and you have that scenario set up so you can work through it and say, okay, at this point in time, this is what's happening, this is where people are. They're not communicating with each other, or they're not providing backup or support. People aren't prioritizing or providing guidance from a leadership perspective.

Speaker 2:

What can we do with this kind of an incident or critical situation to improve on that? It really just means people sitting around a table and discussing the pros and cons, the problems they encounter, and ways to get back on track. So I don't think you have to go to the top experts. People have to be open to each other and be willing to take criticism.

Speaker 2:

People fail at what they're doing, and they have to open up. A really easy thing for a leader to do is to start by saying, hey, here's where I made a mistake. We found that with the Navy teams, and with our Army and Marine Corps teams: when we set up the discussion so that the leader kicks off by saying, hey, I feel like I made a mistake by not providing information or not providing guidance at this particular point in time, and this is what I should have done, or this is what I can do, even that kicks it off, and people are so much more willing to join in the discussion than if the team leader just points at people and says, well, what did you think

Speaker 1:

went wrong.

Speaker 2:

It just creates this barrier to discussion. So breaking down the barriers and being more open to admitting mistakes is a really big deal.

Speaker 1:

Yeah, and that alone is a whole podcast episode on how to do that.

Speaker 1:

But I mean, there are so many examples. It instantly reminded me of a training event out at 29 Palms when I was in the Marine Corps. I had my sniper team, and it was a big infantry company-size maneuver exercise with indirect fire. So there's mortars and heavy machine guns, and we're up in this position providing support with the Barrett .50 cal, so you get some long-distance stuff. We don't really have a big role in it, but we're part of the observation to help call things in.

Speaker 1:

But you have to plan all of that stuff out, right? So the company XO, this first lieutenant, had put all this time and effort into creating this very complex, highly organized fires plan. And I had to go in there and sit in on the brief, and he made sure to walk me through everything they were doing. Here's the whole plan, just your normal preoperational stuff. He gives me the packet of information, everything that I need. I'm like, cool, got it. So he did a ton of work. All that planning, all that preparation, the briefing, everything's phenomenal, right? But then he decides that he doesn't want to carry a full-size radio on his back and only brings the smaller one.

Speaker 1:

And the problem was, where he was at during that operation, where he was going to be calling all that stuff in, he couldn't talk to anyone. He couldn't talk to the people he needed to. Because of our elevated position, I was the only one who could coordinate all that stuff. So I'm sitting there with my team and I'm getting my guys their practice. I'm like, all right, here's where the set is, you see this, call it in. They knew how to do it; they just hadn't done it before. So I was like, go for it. And so we're calling in all the stuff that he's supposed to be doing. And we're at such an advantage, because we're not in the situation. We're not walking through this field where they just blew up this massive Bangalore to clear it, and they're coming through. We're not down in there; we're up top, literally chilling, so we have this bird's-eye view. And obviously it was successful and it went well, but that Marine was defeated.

Speaker 1:

After that, he felt so stupid. The company commander was super pissed about it. The battalion commander's upset, like, hey, you did all this and you couldn't even get this done because of comms? He should have known better, right? So I ended up, during this debrief, talking to them, because they were talking me up and being like, hey, that was incredible, Sergeant, you were able to coordinate the fires, and if we hadn't been able to do this, this whole massive exercise would have been a waste.

Speaker 1:

So they're giving me the attaboy, which naturally I want to take, but I was like, hey, look, if it hadn't been for the planning and the preparation and the briefing that that person did, and what they gave me, I wouldn't have been able to do that.

Speaker 1:

I'm happy to take credit for it, but I deserve zero credit.

Speaker 1:

I was sitting up there leaned against my pack, comfortable, getting some sun, and having my guys call this in over the radio. I had the easiest job out of anyone there, and because they had planned and briefed me so well on it, I literally just had to follow a script. You could have put anyone there, you know what I'm saying? So it goes to show you what all that work went into. It was a great example of how you can have this sort of single point of failure: you can have the greatest plan ever and it goes wrong because you choose the wrong radio or something. But it's also an example of how things go right with the right planning, preparation, and rehearsal. Not everything is going to go right, but that's the key: being able to adapt, because you've done that planning. That was a huge takeaway for me, where I was like, oh man, this is how stupid mistakes can really come and get you.

Speaker 1:

So I know he still got hammered for that, but he was hopefully able to repair some of the damage and take off some of that hate and discontent they had for him. But it was just an example. So I really appreciate you coming on and talking about this stuff. Do you have any favorite resources or favorite things that people can check out or read up on or look into? Because I'll put links to some of the TADMUS, the Tactical Decision Making Under Stress work, and the small unit decision-making research, but those are scientific papers, and sometimes that gets pretty dry. Do you have any other resources that you'd recommend for trainers, for folks that are like, hey, I love this stuff, I want to learn more, I want to learn how to implement it? What can I do? Where else can I go for something like this?

Speaker 2:

So I think one of the best online websites, the agency that has a lot of this information, is actually in the medical training domain. Let me see if I can think of the name of it. I think it's the United States Health Research Agency; I'll have to give you more information about it. But there's a program of training called TeamSTEPPS, which is really a version of what I was talking about today. It's a generic kind of team training program, but everything that I've talked about is really encapsulated on this website. It's T-E-A-M-S-T-E-P-P-S. I think that's it. Team.

Speaker 3:

Steps.

Speaker 2:

I think you can Google it, and the website has videos. They have checklist tools, PowerPoint slides, and a lot of background material on how to conduct this type of team training. And you don't have to focus just on medical teams.

Speaker 2:

It's basically something that you can use in any kind of domain. So that's one. It's easy to get to, easy to understand, and easy to find the right links for. As for any other written materials, I'll have to look them up and give you the links to those as well.

Speaker 1:

Yeah, please. And for those listening, I'll include those links in the episode details so folks have them. I'll also have a summary and some of the breakdown of what we talked about in printed form for our Patreon subscribers to take a look at. These are all great resources, and I really appreciate you coming on and sharing your knowledge and experience. I always want people to know that the science and the research is out there. We know what goes wrong; it's about using that to your advantage, to train for it. So train for those things that we know are going to hit.

Speaker 1:

In every situation in life that you're in, whether it's an argument with your family or a high-stakes hostage rescue situation or a pursuit, there are things that are going to affect you without you even recognizing that it's happening and influencing the environment so much. So if I get better even at the recognition of it, even at understanding how this stuff affects me, I can plan and train to mitigate some of it. You can't mitigate everything; there's never going to be a perfect situation where everything goes exactly to plan, but you can certainly strive toward that and get better at it. So I really, really appreciate it. Joan, do you have any final words for our listeners?

Speaker 2:

Oh well, thanks for having me on. It's been a real pleasure and I hope maybe to do it again sometime soon.

Speaker 1:

Yeah, we'll get some feedback from everyone, and if there's some specific stuff we want to talk to you about, we'd love to have you back on. We thank you for coming on, and everyone for listening. Again, there's more on the Patreon site. Reach out to us with any other questions, and don't forget that Training Changes Behavior!

Speaker 2:

OK, yes.
