The Human Behavior Podcast

Hardwired For Outrage



What if fear, rather than fairness or loyalty, is the ultimate compass guiding our moral judgments? On this episode of the Human Behavior Podcast, we explore this provocative question inspired by Elizabeth Kolbert's article in the New Yorker. We dig into the work of moral psychologist Kurt Gray, whose compelling theory suggests our ethical decisions are deeply rooted in the fear of harm, a vestige of our evolutionary history. We contrast this with Jonathan Haidt's Moral Foundations Theory, which posits that multiple moral modules influence our judgments. Our discussion navigates through these contrasting theories, shedding light on how emotional storytelling often overshadows raw data in debates around polarizing issues like abortion and immigration.

Venturing further, we unravel how fear, an evolutionary advantage, impacts our perceptions of existential threats and moral discourse. We draw parallels between ancient survival instincts and modern challenges, such as artificial intelligence and political polarization. Using allegories like the cave, we highlight the tension between perceived safety and the unknown, illustrating how comfort zones can hinder groundbreaking achievements. As we dissect the media's role in amplifying fear, we caution against the oversimplification of complex issues, likening it to the tale of horses with fangs—a narrative that distorts scientific truths and manipulates public perception.

Turning our focus to decision-making, we probe the intricate interplay of instincts, ethics, and survival decisions. We discuss the role of training and adaptability in high-pressure situations, emphasizing how experience shapes our responses to unpredictable scenarios. By examining real-life examples, such as a controversial self-defense case, we illuminate the complexities of aligning personal instincts with societal and legal standards. Throughout, we underscore the timeless nature of human concerns and the importance of collective learning and adaptability in tackling both historical and contemporary issues. Join us for this thought-provoking exploration of the forces shaping our ethical landscape.

Thank you so much for tuning in! We hope you enjoy the episode. Don’t forget to check out our Patreon channel for additional content and subscriber-only episodes. If you enjoy the podcast, please consider leaving us a review and, more importantly, sharing it with a friend.

Link to Article: https://apple.news/AHHxMCXxbSPmYj7BmCU71RQ

Support the show

Website: https://thehumanbehaviorpodcast.buzzsprout.com/share

Facebook: https://www.facebook.com/TheHumanBehaviorPodcast

Instagram: https://www.instagram.com/thehumanbehaviorpodcast/

Patreon: https://www.patreon.com/ArcadiaCognerati

More about Greg and Brian: https://arcadiacognerati.com/arcadia-cognerati-leadership-team/

Speaker 1:

Hello everyone and welcome to the Human Behavior Podcast. This week, Greg and I explore moral psychology and how our deep-seated survival instincts shape what we consider right and wrong. We'll be framing our conversation around an article from the New Yorker titled Does One Emotion Rule All Our Ethical Judgments?, by Elizabeth Kolbert, which features the work of moral psychologist Kurt Gray. Along the way, we'll dive into why humans are evolutionarily wired to detect harm or threats, how bias and emotional storytelling often overshadow raw data, especially in politics, and the complexities of fear-based thinking that can oversimplify or polarize entire societies. We'll also look at how training, increased awareness and a focus on behaviors rather than just motivations can give us a clearer view of real-world actions. Thank you so much for tuning in. We hope you enjoy the episode. Don't forget to check out our Patreon channel for additional content and subscriber-only episodes. If you enjoy the podcast, please consider leaving us a review and, more importantly, sharing it with a friend. Thank you for your time, and remember: training changes behavior. All right, we're going to go ahead and get started here, Greg. It's a little bit different topic than we typically discuss. I mean, we're going to frame it in how we do things, but this was actually inspired by an article. It's more of a philosophical discussion about right and wrong and where we get our ethical judgments from. But I found it interesting for a number of reasons, and I'll put the link up, because the name of the article is Does One Emotion Rule All Our Ethical Judgments? So whether you want to take it from an ethical judgment and a philosophical standpoint or, as we do, from the behavior and what you actually do, it doesn't really matter. What interested me about it, too, is that it once again ties everything back to very primal instincts, a very unconscious way of thinking that is tied directly to survival, which is sort of the basis of what we do. Right, we tie things back to the unconscious brain, how it's tied to survival, and then how your behavior manifests out of that. And so, given the things that are happening in the environment, perceived threats, what you're thinking, what your life experience is, there's a lot that plays into it. But this was sort of an interesting one to start from. So, Greg, I think I'll give a recap and go over my thoughts initially to get everyone up to speed on what we're talking about, and then I'll have the link in there so you can go back and read it. But we'll cover a lot of it in here today. So, like I said, this came from the New Yorker, right, Does One Emotion Rule All Our Ethical Judgments?, and Elizabeth Kolbert's article centers on research from moral psychologist Kurt Gray, who proposes that our strongest moral convictions all revolve around one key emotion: the fear of harm.

Speaker 1:

So Kolbert begins by telling the story of Raymond Dart's 1924 discovery of the Taung Child, an ancient human ancestor in South Africa, which upended the notion that our earliest forebears were fearless predators. Instead, the evidence suggests they were often prey. This evolutionary history, Gray argues, may explain why we're so quick to see threats and feel outrage in the modern world. After millions of years of trying to avoid becoming dinner, humans developed a hypersensitivity to harm, like an innate hypervigilance, if you will. Kolbert then contrasts Gray's view with what's called moral foundations theory, most famously associated with Jonathan Haidt.

Speaker 2:

And.

Speaker 1:

Haidt's approach claims people have multiple modules, like loyalty, fairness and sanctity, which shape their moral judgments. But Gray counters that one overarching concern, perceived harm, drives almost everything. Even in scenarios that seem victimless, most people instinctively suspect that someone is getting hurt, even if they can't say exactly how. According to Gray, that's because we're wired to assume harm lies beneath any moral transgression. So that's kind of where my interest came from, and it goes back to this evolutionary way of looking at it. The article also examines why societies are so divided on issues like abortion or immigration. Both sides often invoke vulnerable victims but focus on different ones, so, for example, the fetus versus the woman, or local citizens versus immigrant children. And Kolbert notes that when people try to sway others with facts, it usually fails, but vivid, harm-based stories work far better at engaging emotions. Finally, she raises the question of whether a primal instinct designed to confront saber-toothed tigers can handle today's complex threats, like artificial intelligence or political discourse. In the end, fear of harm emerges as a possible unifying explanation for moral outrage. It also points to a deep-rooted challenge: our instinctive tendency to see, and fight about, danger that's seemingly everywhere.

Speaker 1:

Before we jump in and discuss some of the points from the article, there are a couple things I want to start off with. First, from a linguistic perspective, when the term threat or threatened is used, it typically means someone perceives or feels that something is a threat, and this subjective aspect of the perception is important to understand, because to your brain there's no difference. Meaning, if you feel that something is a threat to your safety or survival, then for all intents and purposes it is a threat and you will behave accordingly, regardless of whether or not it's true. It's true to you, and that's all that matters to your brain.

Speaker 1:

Secondly, as a behaviorist, I focus on what people actually do rather than why they say they do it. Your unconscious mind makes the majority of your decisions for you, and so it's not always clear why we make certain decisions or arrive at various conclusions, or at least it's not as clear as we think it is. It's also why you hear me refer to you and your brain as sort of two separate entities. Physiologically, there's a very strong and complex relationship between your conscious mind and your unconscious one, and they work together in a manner that we still don't fully comprehend. But for the purposes of training, education and discussion, I think it's important to distinguish between the two, because it helps delineate between what you can and cannot control. And then I had a note in there to insert some joke about stoicism, but I apparently never went back and thought of one. But I'll let you do that.

Speaker 2:

Well, a stoic wouldn't allow it.

Speaker 1:

Yeah. But no, that's sort of just the introduction to, you know, everything from the article and how we approach things. I just wanted to delineate a few things up front, and there's a lot we can get into. So before I go on, Greg, I want to throw it to you as well.

Speaker 2:

No, no, that's great. And Brian, you did such a great job of framing the discussion, so I'd like to just throw in a couple of definitions so we can all talk about it, meaning everybody at home and us as well. So first, a cautionary tale. I remember being called out by Green Gunny, and he came up right in the middle of class and he said, unconscious means you're asleep. And it was like, okay, Brian, some people you're not going to be able to spar with, no matter where you are, or what the

Speaker 2:

argument is or how intellectual it may seem. So that's the first one. So a couple quick things. First, a definition of moral psychology: moral psychologists put aside questions of how people should make moral judgments to examine how people do make moral judgments. Important distinction, important distinction.

Speaker 2:

Then a definition of moral foundations theory. There's two, so there's moral foundation theory and then moral foundations theory. In other words, one card, one ring to rule them all, okay, or many cards, many examples: the mental structure, the modules. People reach ethical decisions on the basis of either a mental structure or a series of mental models that have been pre-wired into our brains, and that involves feelings like being vulnerable, being empathetic, a resentment of cheaters, a respect for an authority figure, all of those things that we call file folders, that we know that people have, right. But again, it's somebody else's attempt to categorize something in a way that works for them. Do you get what I'm saying? Oh, we don't call that a snow cone here, we call that an ice dish, right. Then we've got a definition of bias. A bias is merely your tendency to respond one way compared to another when you're making any number of life choices. I prefer a red blanket, Brian prefers blue. I prefer a sports shoe, Brian's more of a ballerina. I like salads over fast food. Those biases are going to consciously and unconsciously impact all of our choices: the door we go in, when we start our car, whether the radio is on or off. Those are just things we do. So a bias isn't a bad thing. It's how we approach things, because of historical precedent, because of ease of movement, because of any number of things. So what's an implicit bias? And this is where we've got to jump on it just for a second.

Speaker 2:

Implicit biases have also been called implicit prejudices, or attitudes, or negative attitudes. You've got to remember, if it's implicit, it's a bias that you're not consciously aware of. So it means that you choose against a specific social group. Well, our consigliere, Sean, sent me a very interesting video of a dog herding ducks, and a number of the ducks were white and a number of the ducks were black. Now, this has nothing to do with race. What it has to do with is self-preservation. So when the dog was trying to cut the ducks, the ducks hid with other ducks that looked like them.

Speaker 2:

So a white duck would clearly stand out in a group of black ducks, and vice versa. So we have to understand, sometimes an implicit bias is beyond our control, Brian, and we're there because we don't understand what it had to do with our upbringing. We don't know what it was 3,000 years ago, the reason we turn the doorknob to the right or to the left, right.

Speaker 2:

We make up things to say, you know, we come into the church and we touch the holy water that's on the right, and we're not sure whether that water was there because of a fire risk, back before, you know, when it was the candles. You understand what I'm saying. So what happens there? And you brought up something I just took a note on, when you were talking about ethical or behavioral. Both of us are behaviorists, and that's not an insult, because, philosophically, when we think about threats, we think about an existential threat, and that means you question your life choices even though you're not in danger, or you might not be in danger but you're projecting a potential danger. Do you see what I'm trying to say?

Speaker 2:

And what I mean by that is, that's a very selfish fear of harm to me, and this is where I want to deep dive just for a second. When Dart was writing about the predatory transition from ape to man, it was profound, it was very influential at the time. And then he figured out, wait a minute, with more studies, the theory was wrong.

Speaker 2:

So, as a good scientist, you've got to update, you've got to change your theory when new science comes in, right. The Taung Child changed that. So instead of being that apex predator, guess what? He was running for his life most of his life, and he ended up getting eaten. So Gray, in Outraged, folks, if you want to look up the book, Gray argues that most of the problems of contemporary society can be traced to that being prey. That's simple. That's something that we love, because that means that all our thoughts and feelings arise from that mental model that evolved eons ago and that was hardwired for our survival. And we know that. We know that we're much more likely to choose survival-oriented thinking, predicting anarchy and anxiety, rather than saying everything's fine, having the everything's-okay alarm going off. But Gray also says harm is the master key of morality.

Speaker 2:

And this is where he starts going off, because he says our ethical judgments are governed not by these modules but by one overriding emotion, and that this fear emotion is making us hypersensitive. Well, what he's accounting for again, in my opinion only, is a very selfish view. What he's saying is, these things that I fear, whether they truly cause harm or not, are enough for me to go off and be who I am. And see, you can't do that, because an existential threat, to a scientist, means a living or non-living future event that threatens something else.

Speaker 2:

And it's realistic. Like, we don't know if a meteor is ever going to knock us out of orbit. We don't know if global warming is going to change stuff and trigger a nuclear winter. Do you see what I'm trying to say? So one is an irrational fear, and humans don't do irrational fears unless they have some other external scheme pressing on them.

Speaker 1:

That is just my opinion no, and and that's this is kind of the the what, what I, the part I want to get into, because you know, you, when there we have such a strong evolutionary sort of advantage by, by doing that, being fearful of things, right, and so then you can make the well, you know, that's plato's allegory of the cave, that's everything. It's like, okay, I'm gonna sit back here where it's warm and I know what's going on. I'm not going to venture out, because a lot of people that venture out die. But occasionally they do really well and they thrive. Not only thrive, but become, you know, leaders of some new thing or create something that never existed before. So there's always that play and it's, it's this fear.

Speaker 1:

And this is why, too, because you're talking about existential threat. Well, an existential threat is literally something that would threaten your existence. And this is why, because this was all originally written about politics and politicians, which didn't even matter to me, it was just the interest of the underlying theories, right. Because it's funny, even in the writing I found it a little bit ironic, because it's kind of like, wait a minute, you're saying all these things about how this person's bad, but you're actually helping prove your theory here, and you're kind of getting it wrong in how it's getting laid out here.

Speaker 1:

And the way I look at it is, you know, that's a subjective measurement of something. An existential threat, well, to a human being, that could be anything. What's a threat to you is nothing to me. What's a huge threat to me is nothing to you. I mean, literally.

Speaker 1:

One person eats a peanut and they'll die, like that's an existential threat to them, where everyone else is fine eating it, right. But what I'm saying is that it's so emotion-based too, especially when there are all of these seemingly very complex things that I don't fully understand coming at me. Well, that's fucking scary for my little pea brain. You know what I mean. It really is. So I always default to that fear. And so what they're trying to say is like, okay, that's sort of where morality, in a sense, comes from. And I'm like, I see what you're doing here, I see how you're connecting the dots, but I think you're taking something that's complex, you're addressing the complexity of it and how long this has been happening, but then you're providing too simple of an answer for me.

Speaker 1:

You know what I'm saying, it's like you're oversimplifying this part of it. So there's, you know, the things that I found interesting, obviously because we talk about it, like the article talks about what they call fear wiring and harm detection, but that's pattern recognition, right. That's why we're primed for that kind of stuff, precisely. And so this whole evolutionary backdrop is like, yeah, you're dead on, that's where these things come from. So it's the recognition of how these things fall in, like I'm making those unconscious choices of what's scary and what isn't, especially now, because this was written about politics.

Speaker 1:

Well, the saying used to be, all politics is local. Right, like if you're in the political world, you still have to get out there and do this, which is technically true, right, you've still got to get people to vote and all those things. But what's changed, because of the speed of communication and how connected everyone is, is that now everything is a national story and everything has to do with the United States. And it's like, how many of these actually affect you in your backyard today, in your family and your life?

Speaker 2:

But if I tell you it does, and if I put my thumb on the scale and if I make a dark cloud and lightning, then all of a sudden it does, Brian. That's what I don't think is fair here.

Speaker 1:

And I mean, what do you mean? What do you mean by that?

Speaker 2:

Let me explain. Okay, so horses, elk and elephants were once vampires.

Speaker 2:

And those are the vampires that changed how we write stories and scare children and make movies. Okay, so male horses have fangs, known as canine teeth, and all male horses have four canine teeth, two on the top and two on the bottom. Maybe a quarter or less of female horses have some canine teeth, but they're usually small, they're benign. Historically they were used for fighting, horses fought other horses. They didn't and don't help the horse chew. These canine teeth come in when a horse matures, at four to six years of age.

Speaker 2:

Elk, prehistoric elk, once had tusks, and even now, when you hunt elk, their modern canine teeth are remnants of those tusks, and we call them ivories or, as Jaeger would call them, whistlers. Ancient elk had the tusks for defense and to establish dominance, much like antlers establish dominance in the tribe. So a scientific study is actual research that's conducted to answer a question. If I pick and fucking choose from that, what I can do is come up with a story: well, the only reason a horse would have canines, and the only reason an elk and a gosh damn elephant would have them, is because they sucked the blood of other animals, and that led to a downfall or whatever else. Brian, it's unscientific to the point that it's non-scientific, and this is what I mean. So he wrote Outraged, and here's his own quote: when I wrote Outraged, it was written at a time of extreme political polarization, and it's coming out just days before the polarizer-in-chief, Donald Trump, is set to be inaugurated. Okay, Brian.

Speaker 1:

So that's what I'm talking about.

Speaker 2:

Listen, exactly, I do not care. We work at the behest of the government, and when we do, the commander in chief is our boss.

Speaker 2:

It's just simple, okay. And so the idea is that when you start saying, listen, I know he's in power, but we've got to do everything that we can to fight against him, what you're doing is fighting against your fellow Americans, you're fighting against our livelihood, whoever's in power. So there was no good in demeaning and bashing Biden for not showing up because he was, you know, in mid-Alzheimer's or whatever it was. But it also wasn't good for anybody to hide that. You see what I'm trying to say? Transparency and knowledge. Science is one of those things that, when we shine a light on it, it still exists.

Speaker 2:

It's not smoke and mirrors and it's not a parlor trick. So what Gray's done here is he brings up some incredible arguments, but then he throws all of these other politics-based gestures in with a solid argument, and that's the only reason I called horseshit when I started deep-diving into Outraged.

Speaker 1:

I struggle with those philosophical discussions a lot, because they're so general that you can throw anything you want in there, and I've never walked away from one of those conversations going, wow, that was enlightening. It's always like, that was a waste of time. Now, I will say this, there are some roles where it makes sense. So I look at, like, a lot of military leaders talking about different philosophical ways of approaching things. I actually understand it a little bit more in that context, obviously being from the military, but also because they're dealing at the nation-state level with the geopolitics of a country, with changing administrations, a changing public. They have a mission to do, and a lot of times it's not clear, and they're always trying to figure out what the next thing coming is. And so, in order to do that, you can't make up random shit and go, well, I think it's going to be the Klingons and they're going to come out of here, like, you know, you have to base it on something. However, you're in uncharted waters, in a sense, going, well, we don't know what's going to happen or what the next step is, maybe the strategic initiative here isn't very clear, so you have to fall back on something. So they'll have these moral discussions on what the values are and the ethics, because without that and the commander's intent, you're kind of screwed.

Speaker 1:

But a lot of places are just having these kind of philosophical things, and it's like, well, you have to frame it around something. And if you're going to frame it around something, then you have to include everything in there, and then you have to look at those laws, those evolutionary things that we know haven't changed or don't change very quickly, and look at that as the constant. You can't then dig down into these different weeds of what's happening today, because it's so micro, it might be a blip on the radar, and in 10 years it's not even a thing, because there's been some major thing, or we've realized, hey, wait a minute.

Speaker 1:

You know, that's why we'd always make the joke, like, this was what, the third presidential election in my lifetime?

Speaker 2:

That was the most important election in the history of the country, like, the history of the United States. Statistically, that's not even fucking possible.

Speaker 1:

So you're telling me I was alive for the three most important ones out of our nearly 300 years of existence as a nation? That's highly unlikely. What about the Civil War, where 600,000 Americans were killed fighting each other? Maybe that one was slightly more important. I'm not bashing Gray, because Gray brings up some good points, and we'll discuss those good points.

Speaker 2:

But I will tell you this: a selfish fear helps no one. So, for example, you know me, I'm always going toe to toe with somebody somewhere, and I'm in City Market and somebody was just talking smack about immigration. First of all, there's laws. We live in a country of laws, so read them; immigration laws are very specific. There's different rules that apply to immigration and protecting your borders than other laws, and if you're not educated on them, don't go spouting off on them. As a matter of fact, if you take a look at the base of the Statue of Liberty, there's the give us your, you know, that great speech that's down there.

Speaker 1:

Yeah, okay, but you know what was on Ellis Island? People stopping you and asking you for ID and where you were from and everything, and people were getting sent back out.

Speaker 2:

So we forget certain parts of our own history.

Speaker 1:

We do.

Speaker 2:

And we choose the ones that suit us best.

Speaker 1:

And that's the interesting part to me about these conversations, because you even said it. You said, you know, selfish fear. Well, technically, most fear is selfish, because it affects me, which affects the society. Right, we are thinking that way, in a sense, about our own survival in order for the species to survive.

Speaker 1:

So that's the inherent evolutionary trait in everyone. And then what gets really subjective is what, then, is an existential threat, or what's a fear? Because if I can link a few points together and see, well, if this occurs, then it's going to be chaos, it's like, oh my God, it really is. And that's where that storytelling comes in, and this is what they even brought up, how easy it is to polarize people, even though we're technically wired, in a sense, in the opposite direction, where we want to get along. But we do need to have some common threat or enemy. And since that's not happening in our environment, we're still hypervigilant, in a sense, to go, well, but it's in my DNA that there are threats out there, because there are. It's just that the development of the world has changed significantly, exponentially faster than the operating system we're working on.

Speaker 1:

But also, every generation has had this, this is more than humans are supposed to know or understand, or this is too much, you know what I'm saying? Everyone keeps going back to this: well, you know, back then we didn't have these problems.

Speaker 2:

Like, yeah, back then Plato was bitching about young Greek kids who didn't know their history. Same problems, and there was graffiti everywhere and crime, and the price of eggs was what was most on people's minds. You're right.

Speaker 1:

This is the important part to remember, because I remember even when we had Dr. Leah DeBello on a while back, who's great, she's like, no, it's called the adaptive unconscious for a reason. And I was like, yeah, that's right.

Speaker 1:

It can adapt very quickly. Your unconscious brain is constantly adapting, setting a new baseline, changing. That's why, when something comes out and goes, oh my God, this is horrible for people, or, oh my God, this is really good for people, it's like, dude, it's only been a few years, we don't know.

Speaker 2:

We still don't know. We don't know enough about the brain, we don't know enough about things, and when we learn new things, we have to adapt our conscious thinking, how we think of those things. So, Brian, I'm guessing you're 40 or early 40s? 42 this year.

Speaker 2:

Okay, there you go.

Speaker 2:

So that's how much I know about my fucking business partner and future CEO of the company. How stupid am I? You know me, I don't like numbers. But think about this for a minute, Brian. Since you've been able to reckon, you know that there's a thing called the Doomsday Clock, or other people call it different things, and they move the numbers and they make a big show of it. So, you know, we're eight seconds to annihilation, we're 65 seconds to annihilation. We've never been more than a minute or a minute and a half away in the 40 years that you've lived on the face of the planet. But you know what the funny thing is? We're still here.

Speaker 2:

Okay, four o'clock in the morning, when I get up and head down to the gym, the first thing that's on television is Match Game, and you know the jokes they're making on Match Game. They're making jokes about the price of groceries, the price of fuel, how politicians are untrustworthy, and how a new broom sweeps clean and everybody on the other side... Look, Brian, certain things never change, and therefore there's certain hardwiring that we have in our brains that is adaptable to current conditions. There's your adaptive unconscious. But what we do is we pick and choose. I read a LinkedIn article just yesterday, and the person was saying the thing that's going to save police work is changing how you teach cops to a Socratic method. No, no, you have to have a basis for a Socratic method.

Speaker 2:

Okay, Socrates himself would have shown you how to do that. No.

Speaker 1:

I agree 100%. Socrates was an intellectual bully and also didn't believe in writing things down.

Speaker 2:

So if you were writing down Socrates' words, he would make fun of you, he would laugh, it's so stupid. But the idea is that if you're telling me that you want semi-Socratic, I'm in your corner. What's semi-Socratic? Okay, so what do you mean by that? Well, how would that make somebody think? Well, what are other ways we would approach that? Brian, that's always been around, and we don't attribute that to Socrates. An intelligent mind is never bored and wants to learn more about things. So your intuitive mind is constantly thinking harmless things are harmful, because it's a defense mechanism, it's a predisposition that all humans have. Okay, but we abandon that sometimes, and when we do, we fucking die. People swim out to help their animal. I remember in Avon, the dog fell through the ice, they were playing with the Frisbee, the guy went out, and people drowned. People drowned trying to save a dog. Now, Brian, that's so counterintuitive. And then somebody would say, well, I love my dog.

Speaker 2:

I love my dog too, yeah, and my dog's laying right next to me. But you know what, if it's between her and me drowning, I'm not going with her. I mean, those are the type of things that keep the species going. Procreation is another one, war is another one.

Speaker 1:

Because bad ideas have to be punished somewhere, right. Well, and it's a different setting, because you're right, you hear about a Marine or someone who jumps on a grenade when it gets thrown into the group, and it's like, yeah, they sacrificed themselves. But that's part of the training and initiation process you go through. No, you don't physically train to jump on a grenade, like, hey, we're going to practice jumping on a grenade today, but the idea is instilled that, you know, someone has to walk point, right, someone has to do something, and so you'd rather take it all for the team rather than exposing everyone to that. But that's a process of sort of training and, in a sense, just part of what it is.

Speaker 2:

But that's what I mean. Like, nobody can thrive alone, nobody.

Speaker 1:

Meaning there are examples of people overriding their basic need for survival. For what, though?

Speaker 2:

For the survival of others or the greater good of the team.

Speaker 1:

You're spot on, Brian. And some of the big things, and we get into this all the time, because one of the big things in the article too, which is obvious, we know it and most people do, is, you know, storytelling versus data. Right, nothing works better than a story, a story is always powerful. And of course, politicians obviously have to create a story or an image or a way to get things sticky. And so with this guy too, he's bashing certain ones, like, well, but they all do that.

Speaker 2:

That's part of capturing it, and again, you can't pick and choose, because everyone has done it, and storytelling is the oldest form of knowledge and skill transfer.

Speaker 1:

That's how it is, that's why it's so focused on. But that's also because humans implicitly understand stories, and I can relate to a story. I don't understand statistics. No one really does. Good data scientists and statisticians still don't have a full comprehension of it, because it's so counterintuitive to how humans think sometimes.

Speaker 1:

So it's like, well, I can show you all the numbers, or I can just tell you this story, and that story will always be more popular, and that's where the urban legends come from, that's where everything comes from. And so the thing about the politics one is like, yeah, but you're not supposed to make policy off of stories, you're supposed to make it off of data and what will work best. But that doesn't sell. So that's why you have to tie a story to it, but sometimes they'll pick the wrong ones or use one that doesn't apply, and once the narrative's out there, it doesn't matter anymore. But that's true even of why we try not to do the, hey, one of these days this is going to happen.

Speaker 2:

You better hire us, because we want...

Speaker 1:

It's like, no, exactly, that was one of our first principles, that we would never, ever do fear-based advertising. Because, one, yes, it works, but it also creates fucking more bullshit. How many times have we been the ones that had to say the opposite? When people say anything can happen, no, it fucking can't.

Speaker 2:

And when you said that, tell me science says it can't.

Speaker 1:

Well, Greg, that goes right back into this. If you tell me that anything can happen when I walk out that door, I'm fucking terrified, because that's really difficult. I can't manage everything.

Speaker 2:

My Rolodex is going to be spinning a mile a minute, so I'm never going to be able to land on normalcy here. I just have to find it. If not, it's chaos, it's constant chaos. And you know what? You want your enemy to think like that. You want them to never understand what your move is. But if you're like that, Brian, that's unsustainable, and that means you're going to be a victim soon.

Speaker 1:

Yeah, no, and that's the whole thing. That's what leads to being overwhelmed, and now you're going to fall back on,

Speaker 1:

obviously, you know, what they call the harm bias or harm perception or vulnerability assumptions. There's some great terminology, you can call it whatever you want, but we're talking about those same primal instincts. It's the same thing that allows immediate group cohesion, that pulls people together or rips them apart. And if you're tying it back to evolution, if you're tying it back to the system, you're generally going to be right. Meaning, if that's really what's causing it or where it's coming from, if you're basing it off of these autonomic human reactions, you're probably heading in the right direction. Where I think these things go wrong, and there's plenty of examples out there, is when you try to make an overarching, general moral judgment on someone. That's where things go wrong. That's why, if you look at even our first principles, the first one being people are the same all over the world.

Speaker 2:

All over the world.

Speaker 1:

Do we have differences? Well, yeah, of course. But if I look in general across the time that humans have been here, there are some basic things that humans need, and beyond that, it's kind of made up, because everything after that changes with time and with culture and with, you know, advancements in technology. But there are some core elements here that affect every single human being, and they affect us all the same way. So why don't we start there and then go from it? And then, once you get into it, well, plus, there are the different moral degrees, so that's heavily influenced by different religions, by different cultures, by different traditions, right? So it's like, what are the core things? Well, everyone believes in, hey, you know, the golden rule, right, treat other people as you'd want to be treated. You're not going to go somewhere where they're like, no, treat people like shit, shit on your neighbor every day. Exactly. Even North Korea doesn't do that.

Speaker 1:

Come on. Yeah, so sorry, North Koreans.

Speaker 2:

Yeah, I don't know if we have a big following there anyway. Yeah, exactly. But look, you're right on, and let's talk about a moral, ethical truth that's also in the behaviorist philosophy: respect is easiest to build with harm-based storytelling. That's why, when you see the aftermath of a deadly attack or an explosion or a car wreck or anything else, it's much easier for me to sell you an airbag or a gun or a concealed holster or training. Okay, so we know that that's true. And then Gray cited the study from 2021 that showed that strangers who were offered anecdotes, stories, okay, turned out to be much more willing to engage with the researchers than those that were offered data. And you said that people don't understand when they see the data, even if it's the best chart in the world. And the group that got the stories also treated their interlocutors with more respect.

Speaker 2:

So what we're saying there is, there are two truths. Yeah. Is it fair? Is it fair to humans? Because you're again putting your thumb on the scale. An injustice collector that's going to kill, okay, needs to control the narrative. You caused this.

Speaker 1:

You made me do this.

Speaker 2:

You were the one that left me out, and look at what I endured. Same thing with the family annihilator. Same thing with the last three weeks, where each time a teen killed their entire family and then committed suicide. How do you think they rationalize that, Brian?

Speaker 1:

Yeah.

Speaker 2:

They look at what's going on in the media and in songs and in movies and all that other stuff, and they create a narrative. Now, whether that narrative is intertwined with reality doesn't matter, because it becomes a reality, and that's where you started the show. It is so powerful to humans that you can write a false narrative, but your brain will believe it just as if it's a real thing.

Speaker 1:

Well, this is the other, you know, one of the significant parts about this and understanding it. People are going, okay, yeah, I get it, it ties all back to, you know, survival and fear. It's like, yeah, but once you do that, now all of my actions are morally justified. I'm now taking the moral high ground myself, like, no, it's okay, because this is evil, and so now it doesn't matter, which is why it needs to be exterminated.

Speaker 1:

Or to be ostracized, exactly. Which is actually why I think the military kind of discussions about this stuff are important, because they understand, no, just because these people are bad doesn't mean the ends justify the means. We can't kill everyone in the village because there's a handful of bad guys in there. I mean, that's the whole thing. We can't be like them, otherwise we're just them if we do those same things, and then there is no more.

Speaker 2:

No matter what uniform you're wearing, no matter what cause or banner you're marching under. You're exactly right.

Speaker 1:

And then that also goes back to why I focus on behaviors and what people actually do, not why they say they did it, well, I did it because of this. It goes right back to the family annihilator thing. It's like, okay, maybe everything you said that was wrong was true.

Speaker 1:

Maybe all of those things did happen, but that doesn't justify you killing those people. Maybe your feelings are justified, maybe what is happening to you is unfair and bullshit, but a lot of people have dealt with that and they didn't go kill their family. I mean, you don't look into the sympathy side of it for that person and go, oh well, they had all these issues. Okay, well, are you going to address those issues in young kids, then? Is that what you're getting out of this? Or are you just trying to justify their actions? Like, you know, it's just, wow, look at all the issues they had growing up.

Speaker 2:

Yes, they had a chemical imbalance.

Speaker 1:

You can't conflate that. But that doesn't mean, then, okay, well, we've got to change how we do everything. No, we can get better at identifying those people earlier on and saying, hey, this is someone that needs more help than everyone else. I'm all good with that. But then you can't come in after the fact and use that as some sort of justifying action or justifying reason. That's the whole thing. I'm all good with the analysis and talking about it, but how you use that is incorrect. Your application of it going forward is wrong, you don't get to make these if-then statements. Just because I coupled together some great reasoning doesn't mean my hypothesis or my solution is right.

Speaker 2:

That's why we talk about the scientific method. Look, there's a line that perfectly epitomizes what we're talking about, and Gray says that we have, it could be argued, been surprisingly good at muddling through modern times with the impulses we inherited from our troglodytic forefathers.

Speaker 1:

First of all, what a great line. Yeah, she used trogs all the time, all the time, right. I just remember Shelley, I go, what did you just call that person? And she walked over to the board and wrote troglodyte and the definition, and she goes, learn this. And I'm like, oh my God. Learn this early.

Speaker 2:

Exactly, that was back in the T3, if you remember. Yeah. So the essential, the most compelling claim of moral psychology is that people make ethical judgments on the basis of intuition rather than reason. Well, you can't say that, because you threw the word ethical in there. Okay, because ethos does not apply to survival thinking. Because when you think of survival, the electrochemical neurotransmitters, your brain, the adrenal cortex, your limbic system, they override prefrontal cortex thinking, and all of a sudden you exclude certain things that your body and your mind work together to say are not mission-essential to my survival. And therefore saying that that's an ethical judgment, Brian, that's wrong, because it's not an ethical judgment. As a matter of fact, killing another human being is against all ethical judgments. But you know what? Sometimes we have to allow it, and sometimes people do it, and sometimes people do it for the wrong reasons. There are people that claim self-defense that are in prison now, right, because in their mind at that time they made a selfish choice.

Speaker 2:

I remember a caper that I did where a kid broke into a car. Remember that? A kid, and I call a kid anything from 13 to 21. Kid broke into a car. Homeowner comes out, car alarm starts going off, kid runs from it. Homeowner starts gunning him down, boom, boom, boom. Finally shoots him, with the third shot he's DOA, a DRT kid, because he was doing a property crime. And we arrested the homeowner, because it was outside the curtilage of his property. It wasn't a self-defense situation, and the community and that guy shit on us. Now, two streets over, they go, that guy should spend the rest of his fucking life in jail, as did the parents of the kid. Do you see what I'm trying to say, Brian? This isn't a moral dilemma. It's not an ethical judgment. What it is, is that our instinct and intuition, combined with our brain's survival chemistry, is always going to choose what it thinks at that time and place is best for us.

Speaker 2:

So therefore, if we skew the information coming in, if we change that and make you irrationally fear something, then that'll lead to prejudice, that'll lead to bias, the bad kind, that'll lead to conscious and unconscious fear of things that we shouldn't be fearing, you know. And that'll linger, that'll last a long time.

Speaker 1:

It's like rust never sleeps. And that's why, I mean, the law already has, you know, ethics and morals and values imprinted into it.

Speaker 2:

That's where, that's what the law is, the Constitution and the Bill of Rights.

Speaker 1:

It's not just about a legal standard, though.

Speaker 2:

Well, the legal standard comes from our collective, you know, societal judgments on things over time, and how they change, what they look at, and internationally, globally.

Speaker 1:

And then sometimes that can go wrong. It's like, well, we're not like that society at all. So you're comparing apples to oranges, so like that won't work here. But like the, the theory behind it is sound, but maybe that sound because they have a very, you know, homogenous group that they're working with, whereas in the united states, which changes your view after a while too.

Speaker 2:

Yeah, it does.

Speaker 1:

But you brought up a good one, and you said that, you know, ethos doesn't apply to a survival situation. I'm sort of paraphrasing, I think.

Speaker 2:

But you're close.

Speaker 1:

But you know, let's take that example right there, the kid breaking into the car and then running away and then getting shot and killed, and people are like, well, he shouldn't be breaking into the car. It's like, yeah, you're right, but does he deserve to die?

Speaker 2:

For that? There's no, there's no sentence like that, you know what I'm saying?

Speaker 1:

I mean, what are we talking about here? We don't do that anymore. We realize there are different, you know, classifications of criminal activity and where they fit in, and, you know, maybe it's unfair, but that's constantly changing. It's like, yeah, you don't have the right to do these things. But again, I agree that when someone's making judgments like that, yes, you can say their values as a person, their ethics, play into it, that shaped their perspective, but it's sort of detached from the specific decision.

Speaker 1:

You get what I'm saying, in a sense. They didn't say, well, you know, because I was taught this way, because my daddy taught me this, because I went to this school and learned that, I'm going to make this decision. No, the decision's been made, and all of those things may have informed it at some level, but maybe not. I mean, you see really bad people do good things sometimes, and you see really good people do shitty things sometimes. So it's so contextually based, in the moment, with the individual circumstances.

Speaker 1:

And everyone talks about that: oh, you have to take in the totality of the circumstances. But then they go, well, what about their ethics and morals? And it's like, well, does that even apply in the situation? Because if it's fear-based and it's survival-based, it's going to be limited in capacity in terms of what options, you know, your brain comes up with.

Speaker 2:

Maybe now you're talking about a bias, though, Brian. Now you're spot on about a bias, because things I don't even know about that influenced me when I was 11 years old in East Detroit come into play, and I don't even know they're coming into play: my fear of my uncle, and a situation and a smell and a sound that transport me back there, and now I'm feeling anxious and uncomfortable in a situation. Okay, I'll buy that. But that piece of a larger puzzle is going to be unpacked sometime later, not during the incident. During the incident, I'm going to do what it takes to get out of the incident, and that's why we can't claim psychology or moral psychology or an ethical judgment in a spontaneous nanosecond decision. You see what I'm saying? That's why we have to understand that biases can be both good and helpful, and they can be horrible and detrimental.

Speaker 1:

Well, and that's my point too. When you have those discussions, it's like, okay, given this set of circumstances, what are your thoughts, would you rather do this or that, or you have to choose between these. Sitting here talking about it and discussing it, that's one thing, but that's not how they play out in real life. And part of the reason I found this interesting too is it leads right into how we look at solutions, whether it's a policy decision or a training solution. Because some people will say, well, if you have a higher level of training and a higher level of experience, and you don't just know how to shoot but you know how to fight, you know how to do all that and you're really proficient at it, you've been through it, it's like, okay, I'm not saying that's bad, I'm saying that's all great, and yes, you're less likely to get overwhelmed. I would say, as a general statement, the more training and experience you have, the less likely you are to become overwhelmed. Unless you come in and everything is different, meaning that may not help you at all sometimes.

Speaker 1:

So if we focus too much on just doing that, you may still get yourself into a situation where that doesn't help. And if it's a spontaneous, you know, biologically primed response, you're going to fall back on fear. So, whether or not you have the wherewithal or the training to handle the situation, it's always going to bias the outcome, it's always going to bias your way of looking at it, your perspective. And so it's about weighing things out when you look at all these different contributing factors, because everyone's like, okay, well then it's morals and ethics. It's like, yeah, but in a thousand years, if we look back, will that be the same answer?

Speaker 2:

You touched on that earlier in my argument. The gift of time and distance means more than just survival in a firefight. It means that you sometimes have to distance yourself from an argument and 360 it. You've got to do the Holberman. This is the point that we consistently try to get across to these gosh damn trainers.

Speaker 2:

Look, I remember the early days of the Infantry Immersive Trainer, and you go in there and you get gunned down by guys in a hidden position, and the Marines walked out of there completely dejected.

Speaker 2:

What did you learn? You learned that you got a simulated zipper stitch on you, and that you lost half your squad and your squad leader, and you walked out all demoralized, and it was like, time out. What are we teaching people here? What are we doing? And so the idea is that your training, your understanding of an issue, your experience, all the things cumulatively that you do in your life, good and bad, are going to come into play, consciously and unconsciously, when you have to make those judgments. So the idea is, the more file folders you have, and the better weighted certain file folders are for logic and science, the better your decision-making will likely be. So if your sense-making is less flawed and if your problem-solving has been tested, then you're likely to come to a better decision and be more resilient, because you'll be more adaptive. And adaptive and adaptable are two words that I'll use interchangeably.

Speaker 2:

Interchangeably in this instance, and not in future instances. And so when you tell somebody, after a scenario in a machine, or you add realistic smells, or you shoot a person and now they feel it in their leg, you really think that you're making a decision that person's going to pay forward. But it's not, because what's going to happen is that person is going to have that additional file folder. It's not a panacea, it's not a cure-all. So by doing that, you're going to have that, wow, I certainly don't want to get shot, I know I don't want to get stabbed, just like watching a video, just like getting punched in the nads when you were younger, right. But that doesn't become a life philosophy. The brain is much more complex than that. So the brain is constantly doing its 5s and 25s looking for danger, and it's also looking at future danger.

Speaker 2:

So when the elections were coming up, a certain half of America said Trump is the antichrist — when he comes in, everything's going to change. And for the first couple of days, they're right, because everything he did played into their narrative. The people on the other side of the thing — oh, can't wait, got Biden out, this is going to be great. Guess what, Brian?

Speaker 1:

They're right. Yeah, they're right. But you can't be... you can't.

Speaker 2:

Okay, so what we've got to do is back off that ledge, give ourselves the gift of time and distance, and ask what exactly is helpful. Because if we think about what's helpful — for us, for society, for humanity — those are the types of things, those are the types of good choices.

Speaker 1:

Yeah, good choices. And the way — because you talk about time and distance, and we talk about information processing. People like to say, and we say it too, that we're getting more information coming at us than we're typically used to. And it's not that we can't process a significant amount of information. I think it has more to do with more things competing for our attention than we recognize, right?

Speaker 1:

So, like our phones and computers and all that stuff — because we've slowly adapted and started using it, it's actually taking a lot more of your attention than you realize. I think it's that, because one of the things they bring up, and people talk about this and so do I, is that there's a sort of mismatch between instinct and complexity. Okay, I have this natural human instinct, but it's simple. It has a bias for simplicity: is this going to hurt me or kill me? Can I eat it? Whatever. But if it's not that, if I can bypass that part, then I can process everything, then my brain can understand this, right? Stages.

Speaker 2:

Exactly. Modules.

Speaker 1:

Because it's a big one, and everyone talks about it when all this information comes out — then we find out this, and see, that proves this point — and everyone tries to point to these different things, especially when it comes to technology, of how it changed things and how social media changed this.

Speaker 2:

It's like, okay — when I go back to, you know... everyone said, well, the invention of the printing press.

Speaker 1:

You know, that changed — yeah, revolutionized the world. It had all these implications for the Catholic Church, because, you know, Martin Luther — that's when the Protestant Reformation happened, and he nailed his theses, his problems with the Catholic Church, to the door. And everyone's like, well, that kicked off all of this because now all these people saw it. It's like, yeah, but when they saw that, they didn't go, oh my God, what is all this? They went, see, I fucking knew something was up.

Speaker 2:

I had this feeling too. Meaning it wasn't a new revelation — it's just that someone had finally written it down and everyone went, yeah, I fucking see that too. And there were probably nine other Martin Luthers in different parts of the world nailing their treatises to a door somewhere who didn't get the press. That's what I'm saying.

Speaker 1:

So when these things come out and everyone goes, oh my God — now it's AI is the biggest thing — it's like, okay, well, did the internal combustion engine get rid of manufacturing? Did every new machine that came out get rid of it? No, it opened up an entire fucking economy and new things that no one had ever thought of before. But because we always look at it as this impending, big, complex thing — it's that clock again.

Speaker 2:

Tick, tick, tick. This is it. This is the end of life as we know it. And, Brian, you said something I want to make sure our listeners are paying attention to. You talked about attention. Look, all of the information overload that is hitting our kids and hitting us — everything with your phone and the news and the music around you and all this stuff that's going on — challenges our cognitive bandwidth, which means we don't deep-dive on critical issues. What we do is skim, if we want to get through all of it. Okay, you have to be careful about that, because what that does is diminish your capacity to discern what's essential and vital from what's just amusing and nominal. And if you don't do that, then everything is important, Brian, everything is an emergency and everything is dangerous.

Speaker 1:

You know, because in terms of our knowledge base, for humans it's very wide but very shallow. We can know about all these different things, but only at a certain level of depth, because there's not enough time. Whereas a machine or something can have a ton of depth, but it's very narrow — it can give me the specifics in real depth.

Speaker 1:

So with this way of looking at it, our perspective, our brain wants to simplify things for immediate understanding, but there's so much more to it. And if I just stick, like you're saying, with this immediate headline, headline, headline, then I don't actually get the detail — it's making me dumber.

Speaker 2:

So look, Max is 18 months to two years, and Mac will be one on February 1st. Okay, why is that important? Because if you look at the animal kingdom, a fucking giraffe lands on its feet when it's shit out and it's already trying to graze while the placenta's still there. But with human kids, you would never take Max, push him out the front door with a flashlight, and say go. Okay, we have to understand that.

Speaker 2:

For everybody that wants to argue this claim about moral psychology and ethical judgment, you have to understand it takes us a long time — a juvenile's brain is different, and a teen's brain is still growing. So what lessons? And those go back to our stories, Brian. What narrative do we want to follow? Because people are constantly bashing religion, and it's always been that way — some people are going, ah, religion's just screwing our kids, or it's just taking our money, or it's doing this.

Speaker 1:

And now it's like swinging the other way culturally sometimes. And what's happening?

Speaker 2:

is that religion gave us something to aspire to. Even agnostics said well, maybe it's not God, but it's a way of living, yeah.

Speaker 2:

It's a form, it's something. So I would rather have something than nothing, and no person alone can become anything. It's always a team. Nobody was born aware. Einstein was working at a patent office and he was working on great stuff, but guess what? He went to lunch with a friend, he discussed things with people, he was writing stuff. So whenever we see these outliers, Brian — and Gladwell can eat me — we constantly want to prop them up and say these people are special.

Speaker 2:

Somehow. Now they become that — and I hate using the term because people have stolen it and used it for different things — they become the lone wolf. And when you're that nail that sticks out, Gichin Funakoshi says pound it down, because it's not good for the tribe, it's not good for the team, and it's not good for society. So if your first reaction to buying Greenland is, this is fucking ludicrous, this is the dumbest thing in the world, and nobody should do that — then you're not giving yourself the gift of time and distance. Look, if it's going to come up on your radar and create anxiety, you've got to decide: is it worth it? Is the juice worth the squeeze? Is the cost-benefit analysis good enough? If it has to do with your family and your livelihood and prices and costs and getting a job and stuff, then immerse yourself in it. But then, and only then, will you be making judgments on the basis of reason rather than intuition. Because if not, you don't have reason — because reason is earned and it's learned and it takes time. That's all I'm saying.

Speaker 1:

Yeah, yeah, no. And you know, historical precedent is always one of the best things to look at. I agree. People say, well, that's the newest thing — well, it's not new. Like when people who don't believe in God, or thought religion was bad, are now picking up simulation theory: oh my God, this is all a simulation we're living in, this was built by someone, created.

Speaker 1:

You're all coming to the same conclusion — so there's some higher order here.

Speaker 2:

That, Brian, is a balance of reason, and it has nothing to do with your instinct or intuition. That means the more we know, the more we can know, and that's an important distinction. That's why the police academy, in my personal opinion, needs to be more about problem-solving and sense-making than it is about the decision. The decision will come. That's why we're fighting the shoot/don't-shoot mentality — because it's more about getting to the right decision, not just making a decision.

Speaker 1:

No, that's it. That's a great point, because that's what all of these are doing. Everyone's focusing on what the decision is, and it's like — you haven't clearly articulated the problem. How are you already at a decision?

Speaker 2:

Precisely. Why do you think we like reading case law? Because we like seeing how people think, how they got to the point of making those decisions. And that's our approach.

Speaker 1:

When people are talking about how they've got to learn to make these decisions, I'm like, no — humans actually know how to make decisions, and they're going to do it in a logical manner.

Speaker 1:

They can do it with empathy, they can do it with — they don't have to learn that stuff. That stuff is part of who we are, it's in our DNA. What we suck at is the sense-making and problem-solving part, because that's where all the things go wrong. That's where the fundamental attribution errors come in, that's where the cognitive biases come in, that's where confirmation bias comes in — all at this initial point. And you're already talking about what to do? It's like, hang on — you'll figure out what to do. Trust me, humans have, because we're still alive. Right? You'll figure out what to do if you're given the appropriate and correct set of fundamental assumptions, the appropriate and correct way of looking at things, and a general intended outcome that is good. Right? If you go into every situation like, yeah, I'm going to fuck this up — okay, yeah, you're going to. Or, hey, let's go fuck this dude up — yeah, well, that's what's going to happen then.

Speaker 2:

But if you're going in — look, both of us have survived in different cultures and different settings, in different lands, in different environments, meaning permissive, semi-permissive, non-permissive, without having a complete understanding of the language and the culture. And tens of thousands, if not by now a million, other people have done that. And I'm not just talking warriors, I'm talking vacationing tourists.

Speaker 1:

Yeah.

Speaker 2:

Teachers — you get what I'm trying to say. A lot of other people were kidnap victims.

Speaker 1:

High school yeah.

Speaker 2:

So the idea is the world is set up for us to either learn from our environment or be eaten by it. And this is right back to that original argument that Gray missed when he was looking at the overarching theory. Because when we take a look at Dart's account, it was flawed — Brian, I'll agree with that — but his principles were correct. What he was trying to say is, hey, this kid, the Taung Child, is really profound and prolific and important.

Speaker 2:

Here's why, and what it shows is he's the apex predator. No — wait a minute — when we dug a little deeper in the cave, we found out he was fucking lunch. But the idea was sound. So what we're talking about is reasonableness. We're talking about, for example, de-escalation. There was a video I saw that somebody purported was about training. It was a Philippine hospital, if I'm right, an emergency room, and a guy had a knife and was waving it around, and the security guard, who appeared to be unarmed, was kind of like, hey, relax, and sat down on the bench and crossed his feet and talked the guy out of the knife within a couple of feet of him. Okay — not my style, that wouldn't have been the way that ended. And then a bunch of people, thousands of people, were going, oh man, that guy was collected. You know what?

Speaker 1:

Brian.

Speaker 2:

He might've been shitting his pants. We don't know. He might not have known anything else to do. The other person might've said, hey, fuck it, it's not worth it.

Speaker 1:

There are so many possibilities. Or he had some other insight, so he knew nothing bad was going to happen, exactly.

Speaker 2:

Because this guy had done it every Wednesday at two o'clock for the last — right. So what we can't do is point to somebody and say that's what we need to do, because there's no time in your life or in your future when it's going to be exactly the same as that. That's why you've got to train for sense-making and problem-solving to drive the decision, and that'll make you more adaptive and resilient. Because if not, what you're doing is just adding file folders. Oh, okay, now there's a new CPR. Okay, wait a minute, now it's how many compressions — and Brian, when you get to that, you've got OB.

Speaker 1:

You're going to know. And that goes into — so much becomes based on, okay, this situation occurred, how do we never let this situation occur again? Exactly. One, you can't do that; but two, you don't need to — you're never going to see that specific thing again. It goes back to, okay, you're going to make a policy and a new law, and are you going to base it off this one case, or are we going to base it off something more?

Speaker 2:

Perfect example.

Speaker 1:

Like, what are you trying to do here? Because there are other unintended consequences. There are going to be second- and third-order effects that you're not accounting for. There always are — you can't account for everything.

Speaker 2:

Let's reinforce the glass, but the shooting takes place in a parking lot. Let's reinforce the door, but nobody has the keys to the lock. We're playing a game that you can't win.

Speaker 1:

You can't win.

Speaker 2:

That's not sense-making and problem-solving. What that's doing is saying, this is the problem — misidentifying the problem — and so we're going to throw all our money at that. You know, school resource officer, there's your answer right there.

Speaker 1:

Or picking the one thing — and this goes back to the political discussions too — picking the one thing they think was the biggest contributing factor and going, if we just didn't have these, this wouldn't happen. It's like, oh okay, we can't get past that one thing.

Speaker 1:

Okay, well, that's just too simplified a way to look at things. You can't oversimplify things that much — that's not how it is. You deal with all of these things beforehand and focus on, how do I get better at understanding the world and the environment I'm in? Because once I know the rules of the game — okay, now I get this, now I see what's happening, now I can make the most informed decision. And most people do. That's the thing — they do it every day, almost every day.

Speaker 2:

Most people make the right decision.

Speaker 1:

No matter what it is — whether it's any of our different clients, private sector or law enforcement — I'm going in thinking, these people want to do the right thing. People want to make the right decision, they want to be right, they want to do the right thing. So if this initial part, understanding the situation, is better...

Speaker 2:

I have a bias.

Speaker 1:

I have a bias for doing the right thing.

Speaker 2:

You have a bias for action, and the action will be sound because you followed a format. So when I talked about the elk being a vampire, it was a bitch slap of reality, and what I want to do is shake people up so they think about it. So I'll give you one more brief one, Brian. I was unpacking some stuff for a kid in Iraq, a young Marine who was all bent out of shape because it was his door — he didn't boot it, the other person did, and got lit up. And so I was like, walk me through what happened all that day.

Speaker 2:

And they were like, well, we were going from cloth to cloth, from courtyard to courtyard, and in the first courtyard we found where it must have been 30, 40 people had shit and pissed in this one area behind a wall. And he said, then we went into the next courtyard and, you know what, we found a bunch of damaged ammo — some didn't have primers, some were bent, and it was all piled up like they were throwing it away. And we went to the next place and it was fruit and vegetables, all damaged, all fetid stuff that had worms and maggots in it. And then we went to the next. And what he's doing, Brian, is laying out a sense-making that he missed. He never got to the problem-solving stage to realize that what he was following was the pre-attack phase, where the unit they were about to come up against — which they didn't know was going to ambush them — was going through and making sure all the ammo was ready, they were topped off, and everybody had eaten and shit and was ready for the fight. And you know what, Brian, when we took him back down that road, when we said, look at all these things, he became one of the best instructors you could ever think of, because it was all about sense-making.

Speaker 2:

And where's it got to lead? It's got to lead to the problem. What's the problem? Somebody's setting us up here. And you know what — smelling it, seeing it, feeling it, tasting it gives me the gift of time and distance. That's all we're about. Look, if you break these things down to their most basic elements, most of life is pretty gosh-damn easy and it's not anxiety-producing, even though you're bent that way — you're forced into being that way because historically you were predated upon and eaten. And that's why I like the Taung Child example in Dart, and I think Gray had a real good chance to make something profound, but his political biases caused damage that I don't think can be overcome.

Speaker 1:

Yeah, no, and that's always it. The other issue I have with any of these is that there's always some oversimplification or justification of something to fit—

Speaker 2:

A much bigger thing. Right, right. And to go, see—

Speaker 1:

—so this must be the solution, this is the reason why. It's like, no — because, one, you're on shaky legal ground, you're on shaky fucking scientific ground. You're just trying to come up with something. But you know what I can use every single day? Baseline plus anomaly equals decision. Because I can sit here and go—

Speaker 1:

Well, what's the baseline? What am I—? And then, when I get really deep into that, it's like, oh okay, this situation is either a little bit more complex or a little bit less complex than I thought it was.

Speaker 2:

I mean that's the whole thing. So I need more time and distance, or I need less time and distance. Right, Come on.

Speaker 1:

You know, I mean, I don't know — we covered a lot, and I thought this was a good one for us to discuss some of the philosophical and moral and ethical things, how that plays in and how it may inform. But the individual person — most people can't sit here and articulate it. It's like when people get all into these manifestos that someone writes: they're all over the place, they're fucking nonsense. You know why?

Speaker 2:

Because they're a fucking confused individual, and they haven't seen any clear answers — they're stringing things together because it sounded good to them.

Speaker 1:

Like someone who loves posting motivational quotes — it's the same thing, but it's a manifesto and it's not motivational. It's really bad stuff.

Speaker 2:

You know what I mean? But then if you took them back and said, you know where that came from? Who the—? No, they don't.

Speaker 1:

They're not making a rational or reasonable or logical decision — we're attributing value to influences that likely had less of an influence than people think. That's what my big point is on all of these, and with any type of moral debate stuff: you can have a nuanced discussion about a situation, but you can't reflect philosophically when you're in the middle of something. You can't reflect on what you—

Speaker 2:

Can't at the time either.

Speaker 1:

Yeah, well — that's why. What's the purpose of doing an after-action on something? It's because you can't reflect in the moment. Even during training, I can't reflect on what I'm doing.

Speaker 2:

While I'm doing it, I'm doing. That's what court is for — where I can go back through it for the judge and the jury and the evidence and stuff. And Brian, sometimes for an incident that took nanoseconds, it's months and months of preparation, and then the trial lasts weeks, if not months. Come on, that's what that is. That's exactly what you're talking about. That's life's after-action review.

Speaker 1:

Yeah, yeah, all right. Well, we covered a lot and kind of got our thoughts out there on this as best as I guess I could. But again, it reinforces — this is great, this is interesting, I love seeing why people do certain things, agreed. But I'm just more concerned about what they actually did, not what they said or why they said they did it. Just look at the actual thing that was done — what were the behaviors?

Speaker 2:

Behavior.

Speaker 1:

Behavior over theory, exactly. And that's who the person is — that's it. It's not what they say, it's not what they think, it's not what they feel, it's what they actually do, because there could be misalignment on all of those things cognitively within individuals. But I think we'll do some extra stuff for the Patreon folks. For the Patreon listeners, we've got some good little — I don't know what you'd call them — limited-objective experiments, little simple tasks you can try out in the moment, when you're hearing stuff or looking at stuff. We'll give some exercises for our Patreon folks to do, to kind of reinforce what we're talking about.

Speaker 2:

I guess I can't wait to hear what they might be.

Speaker 1:

Yeah, all right, everyone. We appreciate you guys tuning in. If you like the show, please share it with a friend — that's the best way to get it out there. And reach out to us, of course; we've got a ton more on our Patreon, so if you have any questions, we'll hop on and answer them there. We appreciate everyone. We've been super busy and have a lot going on, so we're going to be more consistent here as we go forward, but there's just a lot happening right now. Thanks, everyone, for tuning in, and don't forget that training changes behavior.
