Dec. 22, 2022

222 [BEST OF] How Minds Change: the Surprising Science of Belief, Opinion and Persuasion with David McRaney | Partnering Leadership Global Thought Leader

In this episode of Partnering Leadership, Mahan Tavakoli speaks with David McRaney, a science journalist, host of the You Are Not So Smart podcast, and internationally best-selling author of books including his latest, How Minds Change: The Surprising Science of Belief, Opinion, and Persuasion. In the conversation, David McRaney shares the origin of his fascination with persuasion and his thoughts on how we form our beliefs. He shares examples of the psychological factors that contribute to the rigidity of our opinions and beliefs. Finally, he discusses the many challenges in changing minds, and how minds change.


Some Highlights:

-David McRaney on why he became fascinated with the origin of our belief systems

-Where do our beliefs come from, and how do we form opinions?

-David McRaney on why all reasoning is motivated 

-Why we modulate our responses and how that impacts teams and organizations

-The status game and the role it plays in all aspects of life, including at work

-David McRaney on the impact of pluralistic ignorance on groups and organizations

-Why we are more worried about disagreeing with "us" than with "them"

-David McRaney on the fear of social death and how it impacts team interactions

-The reason why some become part of conspiracy theory communities

-David McRaney on what leads to the "false flag operation" explanation

-Impact of minimal group paradigm on decision making

-David McRaney on why we are persuaded by some points of view and not by others

-Why and how people change their minds

-The one question that can help you become more persuasive

-David McRaney on the most crucial question to ask ourselves before seeking to change someone else's mind

Connect with David McRaney

David McRaney Website 

David McRaney's You Are Not So Smart podcast

David McRaney on LinkedIn

David McRaney on Twitter

How Minds Change: The Surprising Science of Belief, Opinion, and Persuasion on Amazon


Connect with Mahan Tavakoli:

Mahan Tavakoli Website

Mahan Tavakoli on LinkedIn


More information and resources are available at the Partnering Leadership Podcast website.


Mahan Tavakoli:

Welcome to Partnering Leadership. I am so excited this week to be welcoming David McRaney. David is an author, journalist, and lecturer. He's the creator of the blog You Are Not So Smart, which became an internationally best-selling book, later followed by You Are Now Less Dumb. And he is the host of the popular podcast You Are Not So Smart.

It's one of the very first podcasts I started listening to more than 10 years ago while I was traveling internationally, listening to some of his episodes repeatedly. He does such an outstanding job with his episodes and his books. So, it was such a joy speaking with David, spending most of our time on his most recent book, How Minds Change: The Surprising Science of Belief, Opinion, and Persuasion.

It's an outstanding book. I have listened to it a couple of times, and I would highly encourage you to read or listen to it yourself after you hear this conversation with David. I've learned so much from David on how minds change, and he specifically says how minds change, not how to change minds. In the conversation, you will find out why.

I also enjoy hearing from you, so keep your comments coming. There's a microphone icon on the website, and I enjoy hearing your voice messages there. Don't forget to follow the podcast: Tuesday conversations with magnificent changemakers from the Greater Washington, DC, DMV region, and Thursday conversations with brilliant global thought leaders like David.

Now, here is my conversation with David McRaney.

David McRaney, welcome to Partnering Leadership. I am thrilled to have you in this conversation with me.

David McRaney: 

I'm thrilled to be in this conversation. Let's talk about stuff.

Mahan Tavakoli: 

David, we will talk about stuff, because for more than 10 years I have been in love with the content you produce: your writing, your podcast. I still remember listening to your podcast, You Are Not So Smart, for the first time; it was mostly out of curiosity.

And then I fell in love with both your approach and your humility in doing the podcast. But before we get to that and your outstanding book, How Minds Change: The Surprising Science of Belief, Opinion, and Persuasion, I would love to know whereabouts you grew up and how your upbringing impacted the kind of person you became.

David McRaney: 

You know, back in the day, especially 10 years ago when the first book came out and the podcast started, I was very hesitant to share this part of my life. I just wanted people to think I was from the internet or something. And I was very careful about not letting my I's get too long: like, rat, mat, fat, fight.

I still have problems with words like want, I wanna go, won't. In a previous era, I would've joined the circus or run away from home or something. I would've certainly found a way out sooner than I did. But thankfully, I was just on that cusp of millennial and Gen X where the internet came along, and I was able to find all the weirdos and the others, as they say, through that, and I had some nice people who were like, hey, check out this, check out that.

So, I grew up in Mississippi, and my family's all still in Mississippi. I grew up in a very red state, but also a very fundamentalist region of that area. In the book, I talk about how, as LGBT issues were shifting in the United States, I remember as a young man the intensely anti-LGBT attitudes where I was.

I talk in the book about how I witnessed my uncle, who is a gay man, open a flower shop. My first job was delivering flowers for that flower shop. He was being bullied and harassed, and I remember he called my father, and we went there, and my dad roughed up the landlord who was bullying him.

But it was very well understood that we were not to share that with anyone, ever, for fear of what would happen to my uncle. At the same time, there was a lot of anti-evolution, anti-science, all sorts of stuff. When I finally worked my way up through college and got out and was doing work as a journalist in newspapers, and then for TV, I write in the book about how You Are Not So Smart didn't exactly start this way, but part of its incepting moment was when we had a meteorologist talk about climate change on the air. I was running the social media; this was back when social media was very new, and I was part of the team trying to make sure every comment on Facebook was okay.

You can imagine the nightmare that might be, especially today. And I was doing this sort of fact-check Whack-A-Mole as people were saying, I can't believe you've let the weatherman say this; now I can't even watch the weather; you've got an agenda, and all that stuff.

So, I was doing this fact-checking and sharing links and everything, and our news crew was out there in the van, the one marked so you know it's the news, and someone approached the van and said, hey, who runs your Facebook page? They told them, and the person said, is he at the station right now?

And they said, yeah. The person said, thank you, and left. And then the crew called me on my desk phone, and I remember the phone call. They were like, hey, I think we really messed up. I was like, what do you mean? And they said, I think somebody's headed your way. And sure enough, they were. They came and were in the receptionist area trying to get into the building, trying to get back to where we worked.

We had to call out law enforcement and have them patrol the station. That was one of the first things that clued me in that there was something going on with misinformation and tribalism and people's reactions to being corrected, all those things. So, all of that together, along with a lot of other things from my upbringing, slowly led me to want to get involved in a journalistic way.

Plus, I went to school for psychology, so I had all this psychology knowledge, at least the kind of psychology knowledge you get before you switch majors, which isn't always popular on road trips and at parties, because people are saying, here's this and there's that, and I'm like, actually, that's just called the law of large numbers, or, I know, that's called pareidolia, actually. You know, the kind of thing where people are like, ugh, why do you have to ruin everything? So, I tried to find an outlet for it, and You Are Not So Smart ended up being that thing.

Mahan Tavakoli: 

David, a lot of other people going through the experiences you went through would have resorted to anger, resentment, and more pushback. One of the things that I find fascinating in a lot of the work that you've done is your curiosity. Where does that come from?

David McRaney: 

I don't know how innate this is or how nature-versus-nurture this is, but I've always had this sort of alien-sociologist feeling. Like, when I was really young, I was in Bible school, and somebody there was reading these stories to you out of these coloring-book-style books, and one of them was about Noah's Ark. I remember very plainly the woman who was showing us this stuff and telling us about it. She was talking about Noah's Ark, and I was looking at the picture, and I remember asking, hey, how come the animals didn't eat each other?

I was a little kid, and I was not questioning the faith; I wasn't questioning anything. I just thought, hey, how come there are no details here? It seems like a logistics problem. And my father had a lot of old sci-fi books he kept in a liquor cabinet. I read them, so I already had that in my mind.

I was an only child, in a trailer in the woods, very isolated. A very hermitage, Leaves of Grass upbringing in an isolated area. As an only child, I was living the cabin-in-the-woods lifestyle that people think they're gonna go do one day.

I just started out that way. My mom was a big romance novel reader, so I had books of that kind around me, lots of pulp books, and those were my portal into the adult world, the outside world. Luckily, I also had PBS, so I had a lot of Sesame Street and Mr. Rogers, and there was this intense curiosity about what's going on out there.

At the same time: are there clues in these books as to why I do not feel the same as the people around me? And that moment about Noah's Ark was a big one, because my father is a Vietnam veteran. He didn't go to church, but my mother did. He didn't go to church for all sorts of PTSD reasons.

He was not trusting of certain types of authority. When I asked that question, she said, oh, we don't ask those questions. And I just felt embarrassed. I didn't feel angry; I just felt embarrassed that I'd been made to look foolish. So, I went home and told my dad, hey, this is what happened.

And he said, hey, if you don't wanna go back there, you don't have to go back there anymore. I was like, oh, great, now I don't have to waste my time there; I can do other stuff. That probably took me off of a certain path, a certain kind of cultural shaping. But the curiosity thing has just always been there.

How we make sense of things, subjective reality versus objective reality, has just always been an intense fascination, from storytelling first of all. And I've spoken to other people who had similar childhoods to mine, and they often talk about how, if you had sci-fi books as a kid, it was really a way into thinking differently about things, because sci-fi books often explore social issues. They often explore the idea of subjective reality versus objective reality. And then you walk out into the world around you, if you're in South Mississippi, and nobody cares and nobody's talking about that stuff.

Also, in my early university classes, I had a great psychology professor, Gene Edwards, and she brought all the goods, because she had been a practicing psychologist for many years before she was a professor, and I just could not get enough.

Everything that she introduced, I was like, this is it. It's philosophy, but quantified; philosophy, tested. And that excited me to no end.

Mahan Tavakoli: 

David, that questioning that you had and have nurtured is one of those things that, because of tribal identities, when I associate and interact with people in one tribe versus another, everyone thinks that they question and the other tribe doesn't. And it's interesting that in all of the tribes we end up in, there are rules and assumptions that, if questioned, get us pushed out of the tribe.

And one of the things you keep emphasizing is just that fact: belonging trumps accuracy in all cases.

David McRaney: 

In all cases. We are motivated reasoners. If you're not familiar with that term, I'm sure anyone listening may have heard it in the past. But I think a lot of these things, whether it's cognitive dissonance or motivated reasoning or confirmation bias, often have these sort of folk definitions that have made their way to the surface of public consciousness and drifted from what the actual phenomenon is in psychology.

With motivated reasoning, the easiest way to describe it, and I've developed this over the last few weeks, I really like this framing, is when someone's falling in love with someone and you ask them why. What you're doing when you ask somebody why is asking them to produce their reasoning for you.

And if you wanna boil that down even further, you're asking them to produce justifiable, plausible reasons for the thing they have just stated, this emotional state or whatever. Okay, you love this person; you're falling in love with somebody. Why? Oh, the way they talk, the way they walk, the way they cut their food, the music they introduced me to, the movies we watch together.

Now, when that same person is breaking up with that same person and you ask them, why are you breaking up with them, they'll often say: the way they talk, the way they walk, the way they cut their food, the awful music they make me listen to, the dumb movies we have to watch. So, reasons for become reasons against when the motivation to cherry-pick from all the evidence available changes.

So, the motivation has changed, and the conclusions come out different. The reasons that were previously for become reasons against, but it's not always apparent to the person experiencing all of that how obvious it is, thanks to a couple of things like the introspection illusion and naive realism and all these other things that I've talked about over the years.

All reasoning is motivated. There is always a drive, a motivation, a desire. There's always an end goal, even if it's not apparent to you, even if it isn't salient or articulated. In pursuit of that, we'll often cherry-pick evidence to provide justifications, rationalizations, and in some cases just conclusions. And when you meet other people across any kind of disagreement line, we often just dump the justifications on each other and assume the other side will just go, oh yes, of course, because the reasoning chain that we went through, that slog through trying to find something justifiable, is invisible to us at the moment we're asked. And it's also invisible to the other side.

So, it feels like if I just show them the stuff, they should think what I think, but that totally eliminates giving them an opportunity to reason the way you do. All that being true, the highest motivation, the strongest motivation, the motivation that drives our behavior more than any other, is the one we will sacrifice our life to serve.

There are many different definitions for it, but I think the best framing came from Brooke Harrington, who told me that if there were an E = mc² of social science, it would be that the fear of social death is greater than the fear of physical death. So, it's SD greater than PD. Another way of looking at that: it's your reputation or your status that you're worried about, and if the ship is going down, that's what you put on the lifeboat, and you'll gladly let your mortal self sink to the bottom if your reputation survives. So, people will often do things for fame or status or legacy, and via that motivation they will destroy other parts of their lives; they're willing to sacrifice other parts of their lives. And in the most extreme cases, you'll have things like anti-masking, anti-vaccine attitudes, or just war, where people are willing to die for the motivation to be a good member of their group, to signal to other people that they're a reasonable, trustworthy individual.

It seems odd. It seems odd that you would think, if I die, I'm no longer a person who gets to enjoy that reputation. But that's thinking in some sort of philosophical, greater-than-an-animal way. These are algorithmic responses baked into the hardware of our brain, the results of millions of years of evolution and natural selection guaranteeing that we will behave in a way that benefits the group, because we are social primates. And once you start seeing us in the frame of social primates, and not even just social primates, we're ultra-social primates, we survive by forming and maintaining groups. We survive by pursuing group goals, by deliberating in a way where we all work toward the same thing, and we are very aware of when somebody is troublesome. We're very aware when someone is not pulling their weight, or they're causing harm, or they're a troublemaker in any regard. We're very careful about that.

And one of the most important things is trust. It's very difficult to coordinate and operate in units and groups and institutions if there's any question about the trustworthiness of individuals. And that trustworthiness comes in many forms: it could be about the competency of that person within a certain context, or the history of their behavior in certain domains.

It could just be about wondering whether that person is only out for themselves and trying to manipulate the situation. You know, the Game of Thrones thing that goes on. So, I guess, in short, when you hear that whole "man is a political animal" thing, it comes down to the fact that we're all social primates, and we're very careful about our reputation management.

So, what is the point of all this? Why is it important to know? You may have noticed recently a lot of weird stuff when it comes to conspiracy theories and fundamentalism and polarization and extremism in politics, stuff that seems really fringy in ways that didn't seem possible just 20, 30, 40 years ago.

A lot of that, if not all of it, comes from this propensity to pursue belonging goals over accuracy goals. And once we were handed the internet, it became very easy to cherry-pick information to find that which will signal to others that you're a really good member, a really trustworthy member, of your group. And that fuels polarization. That fuels extremism.

Mahan Tavakoli:

I just wanna underline this, David, because I've listened to every one of your podcast episodes, read the book a couple of times, and listened to it on Audible in trying to understand a lot of this. And I wanna underline it knowing that many of my listeners will be nodding and agreeing, viewing others as being that way.

David McRaney:

 You're so right. Yes.

Mahan Tavakoli: 

So that's why they...

David McRaney: 

...why they do it.

Mahan Tavakoli: 

They do it, rather than recognizing that this is inherent in all of us. We all prioritize belonging, and therefore we all face some of the same issues and challenges. So, I just wanna make sure people aren't nodding and saying, so that's why those people who are anti-vaxxers or anti-maskers or you name it believe the way they do, but it doesn't apply to me.

David McRaney:

In fact, that very feeling you have of, uh huh, that's what they do, that's what they do, banging around and echoing in your skull: that's the thing I'm talking about. That inability to see that you're doing it too. If you've ever gotten on social media and started to type out something and thought, maybe I better not.

Who are you worried about? You're not worried about the people in the other political party, or people in groups you don't belong to. You're thinking about how you will be questioned, shamed, ostracized. You're avoiding a certain type of humiliation or a certain type of reputation hit, and you're not avoiding it from them.

You don't even care what they think. You disagree with them; you're okay with letting them know that. You're worried about us, and you wanna stay us. You wanna stay in us; you wanna be a good member of us. So that's part of it. And the same goes for a great deal, from the car you drive, to the music you listen to, the food you eat, the shows you watch on television.

Social scientists can take just that information and determine who you voted for, very easily. There's almost nothing a person does behaviorally, especially when it comes to purchasing things or watching content, that can't be combined with everything else you might know about that person demographically to very accurately determine all the other aspects of their choices, because we tend to group up around almost everything, behaviorally speaking.

And you have to be careful which groups you belong to. And it's not just that. Something else that fascinates me in that domain is that, as we become more secular, at least in the West, especially the United States, the landscape of the groups to which we belong has become very fragmented, to the point that people are eager to identify.

And, as far as psychology is concerned, your identity is that which identifies you as a member of your group and not the other group. And you see the rise of these identities that beforehand would've just been stuff I'm into. Veganism: it's fascinating that that can become an identity.

Or, on the other side, I'm-not-a-vegan becomes an identity. Like, I'm gonna put a bumper sticker showing me shooting a deer and eating it; see, I'm not a vegan. The Star Wars fandom has gotten to a state where people say, we can't be friends anymore over the fandom that I'm in. People will spend hours every day arguing on Reddit about Game of Thrones and the like.

In a way that isn't just, yes, I like this actor, and this is interesting cinematography. It's all about identification and how does this affect me. That's another aspect of all of this: there are large swaths of our global community that have found themselves like the Road Runner off the cliff, legs just spinning in the air, because what would've been the identity that kept them going, that kept them motivated, that put them into the sort of brackets that help make sense of life and the dynamics of the people around them, is no longer there.

A lot of that has become confused and epistemically chaotic, and there's a scramble at play. We're just in a phase; I do believe we'll pass through it. But we're definitely in a phase where there's a restlessness, a desperation to discover: which groups do I belong to? And once I find a group that gives me that belonging feeling, how do I definitely stay here and not get kicked out? And that's the on-ramp to a lot of the weird things we are witnessing right now.

Both in discourse and, if you have anybody in your family falling into a conspiratorial community, the word "community" is the important part of that phrase. We all entertain conspiratorial thinking. It's conspiratorial communities that have more power than they have ever had to affect how people manage their personal lives, away from the internet, away from the library, and so on.

Mahan Tavakoli:

So, David, to that point, before we get to how minds change: how is it that we make up our minds? I know you have a couple of great episodes on vaccine hesitancy and other factors. How come some people make up their minds to be more hesitant about vaccines and then get involved in that kind of community? How do we make up our minds?

David McRaney: 

We all start out with this pattern-recognition ability that brains have. Human brains are incredible at pattern recognition. Whenever we're in a moment of uncertainty, we feel this dopamine rush that brings our attention to the situation so that we can try to notice the patterns within it and reduce that feeling of surprise, and the recoil we have from uncertainty and ambiguity, should we be back in that situation again.

So, all of this is very neurological, very chemical. If you want to go up into psychological terminology, I like the two words assimilation and accommodation. These terms are really powerful, and it's worth adding them to your vocabulary.

This comes out of the work of Piaget, who has this great work called Genetic Epistemology. I actually have one of his old musty books over here; I was just glancing at it. If you remember Piaget from school, oftentimes when you learn about his work, you learn about the stages of childhood development.

He was bouncing out of the old intelligence research that was trying to determine at what point you can start teaching children certain things, and researchers were noticing that there were stages of development that seemed to come online at certain ages. My favorite one, and I always see people still getting tricked by this one, is where they give kids a wide glass and then a tall glass, and they pour what's in the wide glass into the tall glass.

And children are like, wow, you just made more drink. Every time I see a cocktail delivered in a tall glass, I'm like, yo, Piaget's stages of development, right here in this cocktail. But he has this beautiful architecture for understanding how we update and slowly build more robust models of reality, and it rests on assimilation and accommodation in the face of ambiguity.

If you encounter ambiguous, uncertain information, or an experience that doesn't seem to fit into what you have experienced up to that point, you'll either assimilate or accommodate. Assimilation is taking novel information and trying to fit it into your existing understanding.

He would use the word schema; it's easier to think of it as models of reality. So, you can interpret that which was ambiguous or uncertain as confirmation of that which you already understand. Accommodation is acknowledging that the model is incomplete or incorrect in some way, and then updating the model to accommodate the new information.

One example I often give in this domain: imagine a child sees a dog for the first time and asks, what's that? You say, dog, and something categorical takes place: non-human, walks on four legs, covered in fur, no clothes, dog. And if they see a dog of a different color, they can very easily assimilate.

Like, okay, they come in different colors now, thanks. If they see a horse for the first time, they may attempt to assimilate by seeing it and saying, look, dog, big dog. And you say, that's a horse. This doesn't seem to make sense: okay, it's non-human, walks on four legs, isn't wearing clothes, covered in fur.

But okay, there are other things I need to pay attention to here, and now I must accommodate. And the accommodation is categorical as well, because if there are dogs and horses, there must be a greater category I wasn't aware of before into which they both fit. And we don't need to be taught that; the brain will just take care of it for us.

It builds another layer of abstraction to make up for it, and you're truly accommodating the new knowledge. Now you have a new category, maybe creature or animal or mammal. Someone needs to give you the word for it, or you'll make one up yourself. We are constantly doing that, and that's how our minds are made, if you want to be very reductive about it.

Because this is happening nonstop, all day long, every second of our existence. As children, there are so many novel, ambiguous experiences, and we're assimilating some and accommodating others, but it never stops. Right now, in this conversation, we're both assimilating and accommodating constantly.

What happens, though, is that over time these models become extremely complex and robust, and a risk-versus-reward calculation enters into this. I like to think of it as a tightrope. If you update when you shouldn't, you might become wrong, which, in the previous environment in which the brain formed, could get you eaten; it could also lead to you never getting to eat again. And not updating when you should could cause you to remain wrong, to remain incorrect.

That could get you eaten too; that could also lead to you not getting something to eat. And when I use the word wrong, that's a very suitcase word: if I pop the locks on it, a bunch of stuff comes out. Wrong can mean factual, moral, political, ethical; it can mean something very empirical; it can mean something very attitude-based, value-based.

But when it comes to what we're talking about here, if I update my model in a certain way, I could become wrong, and if I don't update it, I could stay wrong. So, you're walking that tightrope. But as your model gets more and more complex, the risk versus reward magnifies to the point that it's probably better just to err on the side that everything I thought before I came into this new experience is still true.

That's why, if you open the door to your kitchen this afternoon and there's a bunch of sea slugs playing in a marching band, your first thought is not, oh, I didn't know that could happen; I wonder where these slugs came from; should I introduce myself?

It's going to be, okay, somebody's playing a trick on me; this is a hologram; somebody must have slipped something into my drink. You're going to try to assimilate it. You're not going to immediately accommodate with, oh, I didn't know that could happen. You're erring on the side of assimilation instead of accommodation.

But that's because you're motivated not to make the mistake, and get yourself in trouble, of updating when you shouldn't or not updating when you should. There are other motivations that come into play that make it even less likely that you will accommodate, and we've spoken of some of them already.

It could just be, this might mess with my paycheck; this might mess with my chances at an ally or a mate, or something like that. But it could also be concerns like, will this make me look like an untrustworthy individual? Will this make me look like someone who should be shamed or ostracized within my in-group, my affinity group?

It's in those cases that you see people really bend the map to try to assimilate things that are pretty obvious to people outside, when people who aren't motivated in that way are like, wait, what? I usually see this when it comes to false flag claims.

Whenever someone sees an event, and that event would paint their in-group in a bad light, that's when you hear the false flag explanation bubble up in discourse. Not to get too deep into politics, but a great example of this is the insurrection on January 6th. People have these positive emotions when they think about their political party, or the people they consider within their political in-group.

They hold them in positive regard; they have a positive affect, as they would say in psychology. So, they just have a positive attitude toward people of a certain political persuasion, and then they watch what happened on television, and they have this other value set, which is: you shouldn't do that.

You shouldn't attack the Capitol; you shouldn't hurt people. So, now they have cognitive dissonance: I have a positive attitude toward this particular group, and now I have a negative attitude about what they just did. You're faced with this assimilation-or-accommodation conundrum. You could accommodate and say, maybe I need to think about whether the group I'm part of is always good, or whether there are certain members of this group that are problematic.

These are things you'd have to do to accommodate. Or you could assimilate, which is, you could just say, that didn't happen. How could that be? Those were not members of my group. Those were paid actors. They were members of some sort of government organization. Maybe those were people from the other group pretending to be members of my group.

And that's when you get the false flag thing. That's a very easy way to get out of accommodating something and changing the way you view it. You can just assimilate and say, Oh, everything I thought going into this is still the same. It's just that that was a false flag and you're trying to trick me. That's an extreme example, but we're constantly doing this.

People do this very often in romantic relationships. I'm sure you've seen this when you're on the outside of a romantic dynamic and somebody does something that they clearly should be chided for, or something that should result in the end of the relationship.

And you see your friend forgive it in some bizarre way; you can hear them doing this. We see it, but we only see it from the outside. On the inside, you're so motivated to assimilate that it almost feels like it's pulling you by the nose, like it's got you hooked in some way. And it's true. It does.

Mahan Tavakoli: 

So, as people are assimilating: episode 157 of your podcast. I got emotional listening to it because you have a couple of clips from Jonestown. I would recommend everyone listen to that.

David McRaney: 

Thank you for mentioning that. That is the best episode I ever did. It's about pluralistic ignorance and yes, it has a lot of audio from Jonestown.

Mahan Tavakoli: 

I want to highlight pluralistic ignorance and get your thoughts, because it's not just an issue with others; it also happens within organizations and communities.

I would love some of your thoughts on pluralistic ignorance and how leaders can create environments that minimize it.

David McRaney: 

I'm so glad you're bringing this up. I'm giving a lecture to a group of government officials soon, and I was thinking to myself, what do I want to highlight? Because they're worried about institutional things.

I was like, what do I wanna highlight? I very quickly landed on pluralistic ignorance. I recently talked to Jay Van Bavel, who's been on the podcast many times, and we were just talking about different things that were happening in research, and we got on this topic.

Pluralistic ignorance is one of those things where it's so vital that it becomes part of our common understanding of humanity, because you can do something about it. Let me explain what it is before I talk about some of the interventions and the prescriptive advice.

Pluralistic ignorance: in the episode we define it 75 times because there are so many different ways you can define it. Let me define it with an example.

One of the biggest problems on college campuses, especially in the 1990s and early 2000s, was binge drinking: drinking to such excess that you are a danger to yourself and others, you black out, and you can die just from alcohol poisoning. But even if you don't go that far, you become a big gelatinous mass of dumb that causes bad things to happen to yourself and others. Now, if you've ever had that much alcohol, and I'm sure many people listening have, we all have had that experience.

We were playing around in that space, especially when we were young. You don't look back on it that fondly. You often talk about it in terms of what an idiot I was, or thankfully you were there, or I'll never do that again. It's very rare that people pursue that level of intoxication every single time they drink. But if you were a new member of a college campus trying to fit in, it was a bizarre situation where you felt very pressured to drink to that level of excess every single time you went out, every weekend, sometimes every night. So, it was a big problem, and campuses were all trying to figure out what to do about it.

Some of the researchers I talked to on that episode, when they were looking into this (through a very Byzantine, circuitous process that I don't have to go into), came up with this idea: why don't we talk to people about how much they want to do this? And what they discovered in that process was that almost every single person they talked to said they hated it, wished it wasn't a thing, and didn't want to do it. Yet almost every single person who told them that did it.

So, this is the essence of pluralistic ignorance. It's when most of the people in a group (it could be an institution or a subculture, but it could be as large as a nation) all feel like their internal attitude is private to themselves, unique, and, if we were going to measure it, part of a minority. That's because everyone is behaving as though they feel the opposite. So, behavior is not matching people's internal attitudes, but since we can't read each other's minds, the assumption is that everyone else is behaving the way they want to.

So therefore, I am the only naysayer, or I'm part of a small group of naysayers. But the truth is, no one wants to do this. No one agrees with this norm or this behavior. Everyone is just afraid to be the only person who seems that way, because that risks ostracism and shame and all the other social stuff we mentioned earlier in the conversation. That may seem like, okay, I can see how that can be a thing here and there, but there's so much research on this going back to the 1940s. This is one of the most powerful forces, if not the most powerful, behind the extreme lag time between attitudes changing and norms changing: attitudes on segregation, on women's rights, on LGBTQ rights, on everything from marijuana to gun control and so on. Attitudes on these issues often shift within the public, yet the behavior, or the norm, or a law will persist for a decade or so after that, because there's something about the dynamic in the social network where people cannot communicate their internal state without the fear of shame and ostracism.

So, one of the most powerful ways to introduce change in one of these dynamics is to, as they say in psychology, surface the attitude, surface the norm, surface how people feel about it. And there are millions of ways to do this. With binge drinking on campuses, one of the ways they solved it, in the places where they did solve it, was just to put up signs and billboards saying how other people felt. Just telling people how other people feel. This is also sometimes the work of standup comedians. Oftentimes their job, if they have any job socially, is to say out loud what no one else is saying out loud, and to do so as if they're going to take the hit. But when they say it out loud, if everybody laughs and we all look at each other and go, wait a second, you think that too? It's a way to bust pluralistic ignorance.

There are many different interventions, and all of them are basically doing the same thing, which is: how do I get people's private attitudes out into the open in a way where they will feel safe expressing them? Or how do I do that on their behalf, and then serve as a middleman who says, Hey, by the way, I spoke to everybody, and this is how everybody actually feels.

There are many different ways to do it. Sometimes it's done by great movies, great television shows. They'll be considered subversive, but subversive in a way where we only assumed it was subversive; we all actually have no problem with it until it's on television. There are so many things that go into it. Obviously it's very complex and very nuanced.

In the episode, I talk about how it's often framed incorrectly as "the emperor has no clothes." That isn't always the best way to bust pluralistic ignorance, and I use the Jonestown Massacre as an example, because in that case there was a naysayer who stood up and said, Hey, we don't have to drink the poison, and that person was shouted down by everyone and made to drink that poison at gunpoint. And then they drank the poison, and they all died along with their children.

Yet oddly, if we had done a poll, most of those people didn't want to do that. But there were so many things at play at Jonestown that just the revelation that people had different private attitudes wasn't enough. It was a confluence of many things, and that's a complex topic, but I guess one of the big takeaways here is that you definitely don't want to let your organization get to the level of a Jonestown, where there are so many levers being pulled.

There are so many motivations at play there that even revealing the private attitudes won't be enough. The good news is that's rare. There are many ways to bust pluralistic ignorance that don't take much effort at all.

Mahan Tavakoli: 

David, you also touch on different studies, including the ones at Cornell on wine tasting, where people were willing to express their opinions about wines very differently when they were asked individually versus when they were asked in a group.

So, there is an impact, even in the smallest environments and with issues that are not that important to us, on how we respond based on our perception of what other people expect of us.

David McRaney: 

Yeah. Very quickly, in a new social environment, we will begin to modulate our responses to what we think the expectations of the people around us are going to be. We'll also quickly onboard into whatever status game is being played around us right now: what do I do that will give me more respect in this particular context than it would in another context?

And what can I do that would lose that respect? And you will quickly titrate and modulate your behavior to sync up with that. And you're right, it could be anything. Wine tasting, or it could be people having a conversation about the Transformers movies. You very quickly go, okay, I'm in a group of people where, if I'm the person who says I don't like that, it makes me cool. But you could also be paying close attention and go, wait, if I'm the person who says I don't like that, I'm not cool. Same behavior, different context, different status reward or cost, different sanction. So, we're very quick to onboard. And if a group of people are onboarding simultaneously, what do you end up with? A lot of people pretending to act in a certain way so they can all play that game.

My favorite example of this phenomenon is a vegan colony where 70-plus percent of the people were sneaking away to eat fish. What they wanted to be was a pescatarian colony, but they were afraid to be found out that they wanted that. In your organization or institution, this is something you have to think about. Whether you're a business or you're part of the government, one thing you have to be aware of is that there is no such thing as a social vacuum. So, how are you communicating the values of your company, governmental institution, whatever it is? How are you communicating the game to the people who are onboarding into your group? Because the way you communicate that will result in different types of behavior, depending on what they discover is the naughty and the nice of your group.

How do I gain social reward? How do I avoid social sanction? The other thing is, how strong is the network that you've created? Have you created nodes that are in isolation? Do you have a department that never talks to another department? Ever? And have you created a hierarchy where only people at certain elite status levels are able to talk across groups?

You're setting yourself up for this scenario, is what I'm saying. Because what you're creating are pockets of interaction that have their own cultural norms and values. And they will be more motivated by those than by the overarching norms and values of the organization. And when they come into contact with other pockets of the organization, they will feel a little bit of us versus them, and they won't feel the overarching us in that situation.

Do you have the ability to cross-communicate across all levels of the hierarchy? For that matter, how hierarchy-driven are you? There's a hierarchy built on risk, where you don't want to risk making the person above you fire you, versus one that's based on how much can I support the person beneath me, because that person has contact with the thing: if you're in government, it's contact with the voter, contact with the constituent. If you're a business, it's contact with the client or the customer.

Whoever's the contact person, that's like the cilia of your organism. That's the fingers of your body, of your corpus. The person above that person needs to be supporting that person, because they're the one doing the thing that matters.

The one that brings in the money and the influence and all the other things that are important to you. In a hierarchy where the person above that person is supporting that person, by the time you get to the top, that person's job is to support everyone with all their might, with every molecule of their being. That's their role. There are many hierarchical structures that don't work that way. They work in a "you do what I tell you" way; they work like kingdoms. You're setting yourself up for pluralistic ignorance, because everyone becomes terrified to say what they think and feel. That contact person probably knows what's going on, what needs to happen, and what's not working.

They probably have a lot of conversations with other people at their level, because those are the people who can help them, the people who can communicate, who can use the private language of the thing they're doing. They may be very aware that, man, the company wants us to do this, and it is the dumbest thing we could ever do.

They're asking us to switch to this policy. I can't believe they're asking us. I wish they would switch to such-and-such policy. They probably have all of this knowledge about what to do. They probably have these very strong feelings, and they're terrified to say them out loud, so they only talk about it in certain company. You can tell when this is happening, when pluralistic ignorance is starting to fester: people have meetings, and then they have the real meetings.

I bet you've done this. I've done it. I remember doing this when I worked in TV journalism. You have the board meeting, or for me, it was the heads of each department; I was the head of a department. So you have this department head meeting, and we're all like, this is happening, this is happening. And okay, that's good. We're worried about this. That's great. Okay, cool. Then I'll walk away, and then my friend, the person I trust in that group, or the people I hang out with most, we go have the real meeting: man, you won't believe what they're saying. That piece of communication is floating free from the hierarchy of the org, where it needs to be moving around, being disseminated. And why would a person do something like that? The same reason you'll type out a tweet and delete it. The same reason we do a lot of things: fear of the social cost of expressing my attitude. What will be the social cost of me suggesting maybe that's not a good idea? That's the essence of how pluralistic ignorance can destroy an organization, where everybody knows the right thing to do, but nobody's doing it. Where everyone disagrees with the policy and the norm, but no one does anything about it. It just runs on autopilot right off the cliff, and that's happened many times in human groups.

Mahan Tavakoli: 

The other thing I see in many organizations, David, that you address is the minimal group paradigm, and how departments, divisions, groups, whatever you call them, get into a certain level of competition. And there's a lot of research where you give people different color shirts and they start treating people who are wearing the same color shirt differently.

So, that is a big challenge for leaders of teams and organizations, where some people are in the accounting department, others are in marketing, sales, and so on. How can that be addressed in organizations so that people can align and collaborate better together?

David McRaney: 

Wow. This is a tough one. For anyone who's not familiar with the minimal group paradigm, this is the work of Henri Tajfel. A lot of the psychology of the 1950s, '60s, and '70s was trying to understand how World War II happened. How did large groups of people commit heinous acts?

At the time, the assumption was that it's probably very charismatic bad actors who are to blame. Henri Tajfel was very skeptical of all this and thought, okay, I'm just going to run an almost physics-, chemistry-, biology-style experiment on human behavior.

I'm gonna get groups of people and strip away, one item at a time, the things that would identify them as being one group or another, and he called it the minimal group paradigm. What is the least amount of obvious information, something I can gather just by looking at another person, or maybe something I could learn about them written on a piece of paper, that would cause me to start treating them as a member of a group and have my behavior affected by that?

And he was like, okay, how about I just start at nothing and work up from there? That was his solution to how you do an experiment like this. And he found that anything you add to a person that identifies them in any way will start to generate this sociality, this tribal behavior, this us-versus-them thing.

He showed them pictures by abstract painters and said, which of these two do you like better? Then they'd say, okay, you're a fan of so-and-so, and anybody who said they liked the other painting, they'd say, you're a fan of so-and-so. And then, when they put them into other experiments where they could divide money, choose who got a certain reward, or who got favored in tasks of fairness, they'd always give more to their own group. They'd always punish the other group; given a chance to give equal rewards, they never would, that kind of thing.

My favorite version is the dots. They'd have them look at a picture of dots for three seconds; there were 40 dots on the screen or on the piece of paper, and they asked them to estimate how many dots they saw. Now, he didn't actually record what people said. He just randomly sorted people into either the over-estimators or the under-estimators, and then they would do all sorts of things, like showing them, okay, hey, here's another experiment you could help us with.

We had a group of over-estimators spend some time with some under-estimators, and blah, blah, blah happened. And at the end of it, they had this opportunity: both groups could get a large financial reward, or your group could get a lesser amount while the other group gets an even lesser amount.

And people always go for the second option. You would rather get less if you were guaranteed that they got even less. So, we will choose to live in a worse-off world. We'll choose a lesser world than the one we could be living in if we're sure that, at the end of it, the balance is in our group's favor. And remember, the groups that instigated this type of response were randomized, meaningless, and arbitrary, and people became members of them right before the experiment.

So, you can imagine the difference when you've been a member of that group for a long time. What if your entire livelihood depends on it? What if the way people treat you in your hometown depends on it? If we'll do this at a minimal level, imagine what happens when we have all these other extra motivations.

So, knowing that people will act in this way, there are things you would want to avoid in any institution. One is giving people the opportunity to make decisions in that way, empowering people to make that kind of choice between the in-group and the out-group. You need to be aware that people are going to behave this way if they're given that choice. And anything that identifies you as a "them" is going to generate this. If it feels like this makes me an us and that makes you a them, it's going to generate this type of behavior in scenarios where fairness is in question. And you might do it just a little bit, but a little bit adds up across a large organization, across a lot of decisions. So even at a very light level, it's going to put a finger on the scale of whatever you care about. Also, anything can become politicized once you do this.

This is something that blew my mind. Once people are thinking in terms of us versus them, then if anything becomes a signal of us versus them, people will start acting in this very strange way. So, like you said, in replications of this, they just have people wear different colored shirts, or different colored hats. In your organization, it's very easy to have a communication department and a research department. It's very easy to have a user interface department and a hardware engineering department. What happens when you do that? A little bit of us versus them is going to come in. It seems crazy, because aren't we all in the same company?

Man, I come from a military family. Military people still do this too: that guy's a tanker and I'm a soldier; I know we're in this together, but they always do it. One of the ways around this is having people from different groups in meetings together, thinking about the problem as a group, and also creating orgs where multiple people from different departments are working together, where people don't feel so isolated at all times.

You're also creating those social bonds that are important for creating the cascade effect. But the thing they found in Tajfel's work is that what matters is constantly communicating that we are dealing with a shared problem and working toward a shared goal. If you're working for Super Donut Incorporated, we're working together on a shared goal. We're trying to get away from the thing that David McRaney said not to do.

I've worked for plenty of companies that told me we were all working toward a shared goal, and I knew from my desk that the goal was to make my boss rich, to make the company rise higher in the stock market. That goal wasn't my goal; that was your goal. So, be careful about that. You need goals that are pertinent to the particular group of people working toward something. And you also have to have problems that you want to solve together.

In related work, a long study by Muzafer Sherif called the Robbers Cave Experiment, they basically created Lord of the Flies. They had groups of children, divided them into two different groups, and those two groups ended up almost killing each other. The way they got them out of that thinking was that the bus leaving the camp broke down and they all had to work together on the bus. That's not just a shared problem; that's a shared problem that directly affects those people, not at some abstract level. They all needed to get out of the camp, and if they worked together, they got that thing.

So, the easiest way to bust this up is to create a system where people can communicate freely. Create a system where people feel that though we have specialties, all of us are appreciated for those specialties, not grouped by them, and those specialties work in tandem. This is a never-ending, multidimensional handshake with one another toward some sort of problem. We're all trying to reach some sort of goal, some sort of value that brought us all together in the first place. And this is going to do a lot of work toward reducing the negative impact of these things. You can never reduce it to zero. We are social primates: bonobos, chimpanzees, us. We might have suits and shoes and everything, but we still act in very primate ways.

So, you can't reduce this to zero. But we can get away from groupthink. There have been moments in the history of the United States, in the decision to drop bombs, the decision to invade countries, the decision to institute policies, where most of the people in the group making the decision did not want to do it. But these social dynamics led them to a decision that seemed as if it were unanimous. It was not unanimous; it was unanimous in its expression, but inside the heart and soul and brain of each person at the table, people didn't agree with what was about to happen. There are ways to mitigate that.

Mahan Tavakoli: 

That's really important, David; at least awareness of it makes you try to address it.

David McRaney: 

Huge inoculation, as they say. Huge.

Mahan Tavakoli: 

It's not realistic to totally eliminate it; however, this exists in all organizations.

Now, one of the other important things I wanna touch on, because your book is about How Minds Change.

First of all, I love the title, and I know it's intentional; it's not How to Change Minds. CEOs and business leaders read all kinds of books about how to change minds. And I've gone through a journey with you, David McRaney.

At first, when I had come out of business school, I felt like with data and facts I could overwhelm people into seeing things the way they are, and they were always the way I saw them.

So, at first, that's what I felt. Then I gave up on that: there are people who, for whatever reason, don't get it, and you're not going to change their minds. And I love your optimism and the work you have done, including studying deep canvassing, showing that minds do change.

So how can minds change?

David McRaney: 

You're right. I didn't want the title to be How to Change Someone's Mind or How to Change Minds. I had a very nice phone call with Simon Sinek. I think he's a wonderful human being, and I took quite a bit of his advice, but one of the things he advised was, you should change the title.

He said, I'd buy that book more than I would the other one. He's right; for that purpose, it would be a better title.

Mahan Tavakoli: 

This is exactly what I love about you and the content that you put out, and it makes it different from a lot of content that is out there: there are times when we can do or say things, especially in the social media age, that get a lot more traction but are further from real value and what can benefit people.

I wanted to make sure I highlight that because that is a sign of who you are and the kind of value you share with your community.

David McRaney: 

Thank you. I really appreciate that. There are a lot of decisions I could have made that would've pumped up the volume of my signal; I just can't do it. And with this book in particular, it was really hard to put together. If I had wanted to write a book that was just pure persuasion, no problem.

But I didn't wanna write How to Win Friends and Influence People, Part Two. I didn't want that book, and I am personally, ethically, morally opposed to coercion and manipulation. What I was more interested in was the resistance to change and the insistent question: why don't the facts work on people?

Things like that. The frustration was fascinating to me, for two reasons. One, I was giving a lecture before the book was even a real idea, and someone came up to me and said that her father had fallen into a conspiracy theory, and at the time I was very pessimistic. I was very cynical.

I was only talking about motivated reasoning and things like that. I still saw human reasoning as flawed and irrational, which I don't see it as anymore. I see it as biased and lazy, which is way different. And she said, how do I get my father outta this conspiracy theory? And I just told her, you can't.

And I felt so awful, before the words had even come all the way out of my mouth. For one thing, I didn't know enough about the topic to give that kind of advice. And for another, I didn't even believe myself. I thought there must be more to it. So that started me on the path of trying to understand this more.

And the other thing was, while I was saying that, the norms and attitudes about same-sex marriage in the United States were completely flipping. Over the course of just a few years, it went from 60% opposed to 60% in favor, and I just couldn't get it out of my head that you could take the majority of this nation, put them in a time machine, send them back just 5, 7, 8 years, and ask them how they feel about this issue, with both versions of them standing in front of each other.

And they would disagree. They would argue. And I couldn't get it out of my head: if this person was going to eventually feel this way, why did it take this long? What happened between here and here? If I could understand that, I could understand a better way of persuading, a better way of coming to understand things.

So, that's what I wanted to understand: how do minds change, and how do you go from seeing things one way to seeing them another, from believing one way to believing another, from having an emotional response to having the opposite emotional response? I thought one of the best ways to understand this was to spend time with people who had changed their minds in drastic ways, but also to spend time with people who changed other people's minds in drastic ways.

And I had no idea there were such groups, but there are many. They are all in the book: deep canvassing, street epistemology, Smart Politics, and then the therapeutic models, motivational interviewing, cognitive behavioral therapy, and so on. Deep canvassing was my entrance into this world, my entrée into all of this. But I want to mention all of them at once, just to say what blew my mind, and still to this day blows my mind, about all these groups that I visited in person and learned techniques from.

They were not aware of each other. They were very advanced in their techniques, yet they oftentimes had never looked at the scientific literature that supported them, or at the therapeutic models that had similar practices. It was astonishing that in each of these isolated silos, intense A/B testing had resulted in a technique similar to what had emerged in another silo.

And that seemed important, because I kept thinking: if you were to invent the airplane, it wouldn't matter where on the planet you invented it. It's going to look like an airplane, because it needs to fly, it needs to follow the laws of physics, it needs to work with gravity and wind resistance and lift.

So, if these techniques were invented in isolation from each other, yet they all seem to follow the same order of operations, and they all say, don't ever do this, but always do that, in the same way, it feels like this is onto something important. In the book, I wait all the way till page 200 before I get into this stuff, because I want you, the reader, to understand the neuroscience and the psychology and the sociology and the political science that are the foundation of why this works.

I could have done it the other way around, but I wanted to do it that way. However, I do introduce deep canvassing up front because, full transparency, I did want you to go, huh? And then read the whole book. So, here's how all this works together. Deep canvassing is a technique that was developed by the LGBT Center of Los Angeles through the Leadership LAB.

The Leadership LAB is their political action arm. What makes the LGBT Center of Los Angeles so unique is that they're incredibly well funded, millions and millions of dollars, and they can do the kind of work that you couldn't do in a lab, or that other organizations could not do at their scale.

And they had been dealing with the crushing blow of Prop 8, when same-sex marriage went up for a vote and they lost, and they were astonished. How could this happen in California? How could this happen in Los Angeles and San Francisco? So the LAB part of this organization, which stands for Learn, Act, Build, was headed up by a man named Dave Fleischer.

And Dave Fleischer said, what if we just go ask people why they voted the way they voted? Which was a bonkers idea at the time, 'cause it required lots of logistics. So they set up these squads of 40 to 75 people at a time, and they knocked on doors and did an inverted version of canvassing, where they knocked on the door and said, hi, I'm here to ask you why you voted this way. And what blew their minds, though, was that people really wanted to tell them. They really wanted to tell you why they voted the way they voted. So they did that for a very long time. They recorded lots of answers, and there's one aspect of the story, which I tell in the book, about what they discovered from the answers they were getting over time, and what they figured out about certain attack ads.

But the other thing they discovered was, every once in a while, when a person told them why they voted the way they voted, by the end of the conversation they would talk themselves out of their position. And they were like, what's going on there? That's new. And so they started recording the conversations, first on audio, then on video.

And by the time I met with them, they had recorded 17,000 conversations, and they didn't just throw them onto a hard drive. They were searchable, they were labeled. And I spent hours in those archives watching these things. I specifically wanted to look at the conversations in which people had changed their minds.

And you could run the thing back to the beginning, and they're like, here's how I feel about this issue. And then you run it to the end, and they're telling you they feel the other way about the issue. And I had that same feeling: if that person was to go back in time and talk to themselves from 20 minutes ago, they would argue with themselves.

So, what's going on in the middle between those two things? And also, I love, by the way, when people do change their minds in that way and you start asking them, what about this? They'll start getting angry with you, as if you haven't been listening to them the whole time. Oftentimes they don't realize that they've done it. They don't realize they've talked themselves out of their position. So, over lots of A/B testing, trying to replicate this, throwing away what doesn't work, keeping what does, they landed on this. And this is a technique that works. I can tell you a similar story about street epistemology; something very similar happened there.

For the sake of time, and for people who are like, yeah, get to it, how do you do it? You only need these two steps. I'll tell you all the steps, but these are the only two you actually need if you wanna get started today. Step one is build rapport. You need to, at the level of the social primate, communicate that you are not out to shame or ostracize that person, nor are you there to put them into a position where, should they leave this conversation, they'll be shamed and ostracized by the groups to which they feel allegiance.

You are communicating: I am interested in how you feel about the issue, and I wanna understand how you feel about this issue. And I'm gonna hold space; we can explore it together.

You'll notice this is a different framing from "I'm right and you're wrong," or "I wanna change your mind," or "I wanna win this argument and I want you to lose it." If you even wanna say anything about how you feel about it, just go so far as to say:

It is interesting. I find you a rational, intelligent, reasonable human being, and it's odd that we're both looking at this and seeing it differently. I'm curious as to how that can be, and I wonder, if you don't mind, I'd love to have a conversation with you and explore it a little more deeply to understand why we disagree about it.

I'm curious as to why we would disagree. It's a completely different framing from the debate frame. Then, to get into it, what you wanna do is encourage metacognition, introspection. It's gonna be nuanced depending on what we're talking about. Whether it's attitude-based or fact-based changes how you talk about it.

If it's a fact-based or evidence-based issue, you'll want to investigate the person's confidence, their certainty. If it's an attitude-based issue, you just wanna feel out how positive or negative they are on a certain scale on this particular issue. So if it was gun control, you wanna say, is it zero to 10, or one to 100?

So, on gun control, you could frame it as, let's say, zero to 10, and you need to identify what the zero and the 10 are. You'll say: on gun control, if you're a zero, you think merely looking at a gun gets you 10 years in prison, and if you're a 10, you think everybody should get a gun in the mail once a week.

You could also flip that; it doesn't matter. Just make sure you put something on the numbers, and then say, where are you on that scale? If you're doing it with certainty, you could say, is the earth flat or round? And the person has to make a claim in that regard, and they say, I think it's flat.

And you say, okay, how certain are you of that, on a scale from zero to 10? And you can do it if you're just talking about, like, a movie: how much did you enjoy Mad Max: Fury Road? If you open with, did you enjoy it? they might say, I loved it. Okay, what would you give it on a scale from zero to 10?

And this is that moment. In all these instances, you just change it depending on the type of mental construct we're working on. There's usually a moment where a person goes "hmm," or they go "well..." That thing that people do, that hesitation, that hedging: that's a person slipping into metacognition.

And it's amazing; you can do it to yourself. If you're listening to this right now, let me ask you: do you like pumpkin pie? It's probably very easy for you to go, I love it, or, I hate it. But if I ask you what you would give it on a scale from zero to 10, you can feel the difference in what's going on inside of you. You go, if I rated it from zero to 10, I guess I'd give pumpkin pie a seven. Seven. I'm gonna give it a seven.

This is the amazing power of street epistemology and deep canvassing and smart politics and motivational interviewing: stay in that space. That's the space where change takes place, 'cause the next thing you're gonna ask the person to do is produce reasons, justifications. Why a seven? Why did you give it a seven? If you wanna go even deeper, say, how come you didn't give it an eight? Why didn't you give it a nine? And if you happen to have an opinion that you're hoping the other person moves toward, you ask them why they didn't go closer to the extreme, and they will produce arguments in favor of why you wouldn't go that way.

And oftentimes what happens is these are new counterarguments they've never considered before, but they're the author of those arguments, and they start moving away from the position they were already in. That's the essence of it. You only need those two steps. If you wanna go deeper into it, I go into it in the book, but you ask a person, once they've given you justifications, to explain by what method they arrived at that level of certainty, how they vetted that method, and so on. But they all pretty much use that, and it's all about guided metacognition.

Now, I ask people to have a step zero, and that is to ask yourself why you're doing this. Why would you want to change somebody's mind about this particular topic? Oftentimes, people haven't considered that, and I would ask you to really, deeply Socratic-method yourself in that regard, because I'm not saying you have bad reasons to wanna change somebody's mind about something, but I would ask you to understand what your specific motivation is.

My own father had some strange conspiracy theories get into his politics, and on the surface, it felt like I wanted him to change his mind because he was wrong. But when I deeply introspected and did the Socratic method on myself, what I discovered was I didn't want him to be the "them," with me being the "us," in that dynamic. I didn't wanna ruin our relationship. I wanted to be able to discuss that issue and not worry about it. That was the real motivation. And you know what I did? I just told him that. I just said out loud, I want us to be able to talk about this. I don't wanna lose this relationship; it matters to me. And when I said that, he's like, I love you, I don't wanna lose this either. And once we said that, we started from there, and it was much more likely that I could go through those steps. I could go on about this forever, but what I deeply recommend is, before you start any of these techniques, do it.

My friend Will Storr said it's a great takeaway. That step zero is sometimes really hard, and it's very hard to Socratic-method yourself. So here's a fast track to doing that, whatever the topic is. Since this podcast deals a lot with business and institutions and groups like that, we can get very specific.

It could be some policy that you're talking about this week. If it's politics, it could be something very broad. If it's something that's happening in the zeitgeist, it could be something very specific that's being argued about. Whatever it is, ask yourself, regarding that issue: do you think you're right about everything? Or, if this is your first time to ever do this kind of thing, just ask that question in general. Do you think you're right about everything? Okay. If your answer is yes...

Mahan Tavakoli: We have a different book and podcast for you 

David McRaney: 

Yes. But let's say you say, no, I can't be right about everything. I must be wrong about some things; at the minimum, I must be incorrect about some things. Okay? Then ask yourself, what are you wrong about? And if your answer is, I don't know, which it must be, because if you knew what you were wrong about, you'd stop being wrong about it, right?

So that's step one: am I right about everything? If the answer is no, ask, what am I wrong about? And if the answer is, I don't know, ask yourself, why don't you know, and what could you do about that?

This is the exercise that I recommend. It's one of the biggest takeaways of You Are Not So Smart. Start there, and you'll land in the very middle of some delicious, beautiful, all-encompassing epistemic and cognitive humility. And then, when you approach the other person, offer them that same thing; they likely don't know what they're wrong about either.

And on this particular topic, what you can offer each other is your shared perspectives. There are some things that you've seen that they haven't; there are some life experiences you've had that they haven't. They have concerns that you would never even imagine being concerned about.

And at the end, what you end up with is: we don't have to try to prove that I'm right and you're wrong. We can now work together to try to understand why we disagree, and maybe it's true that we're both wrong. Maybe there are some places where we're both right, and we could gain from this interaction instead of trying to leave it in some sort of zero-sum space where one of us has to walk away going, ha-ha, you are stupid, and now I've shown that you're stupid.

It's very likely that neither one of you has the big picture, and both of you can borrow from each other to get a little bit more than what you had walking in.

Mahan Tavakoli: 

What a beautiful challenge to everyone, David. I'm sure the audience will wanna become more familiar with your work by reading your book and following your podcast. As someone who has followed your work, one of the things that adds a lot more value to your statements for me is that you have done this yourself with humility, changing your own mind and your own approach as you have learned more.

So, this is not just your thoughts and perspectives as advice for others; it's something you do yourself, with that empathetic, non-judgmental listening and asking that first question, which is, why do I even want to change this person's mind? Really powerful and impactful. And then there's the humility it requires of us to challenge our own thinking.

If I'm not right about everything, and none of us are, then what am I wrong about? So maybe, with how minds change, we should first and foremost start with ourselves. As I mentioned, David McRaney, I can spend hours talking to you.

David McRaney: 

I can spend hours answering a question. 

Mahan Tavakoli: 

The audience is going to want to read your book, follow your podcast.

How best can the audience find out more about you, David?

David McRaney: 

There's two ways. All my You Are Not So Smart stuff is under youarenotsosmart.com; you can find me on Twitter, notsmartblog, that sort of thing. The podcast name is You Are Not So Smart. And for all the other things I'm doing, go to davidmcraney.com to find out what those are.

And on Twitter, I'm just @davidmcraney.

Mahan Tavakoli: 

That's outstanding. And you mentioned, why do you want to change a person's mind? One of the stories that really touched me is your conversation at the tech conference on changing minds. I think it's important, when we think about changing people's minds, that there is almost a level of, for lack of a better way of putting it, dominance in "I want to change people's minds."

I want to correct the way people are thinking. So for you, it's how minds change; a lot of it has to do with us, the work we do on ourselves, and why we want to change people's minds.

David McRaney: 

By the end of the book, I felt like I almost had a superpower, in a way that I'd always wanted, which was that on issues of morality, issues of scientific evidence, issues of politics, I felt like I had this ability to encourage other people to question themselves. And if they were really dogmatic, or they held something very strongly that I felt was empirically incorrect, I could loosen that up.

And in the beginning, well, there's no zealot like a convert, and I was eager to use it. I was invited to this conference where I was given an opportunity to demonstrate this; somebody there wanted me to demonstrate it on them. And I promised them that I wouldn't repeat the story, so if you wanna hear the full story, you'll have to get the book. But I can tell you that, in essence, this was their origin story of how they arrived at their faith. And I listened to the story and was so intensely moved by how they had arrived at their faith.

It didn't matter to me that I didn't share their faith. I actually gained something from that story that was unique and helped me feel closer to my humanity and to them. And I could not think of any good it would put into the world to try to get that person to eject from their faith. I also couldn't think of any poison it would take out of the world; I couldn't think of any harm I would be reducing. I talk about non-judgmental listening, and I talk about trying to share perspectives. Look, I understand there are people who want to put harm into the world. There are people who want to harm you, and if you don't wanna offer them that space, hey, I am not saying there's anything wrong with that.

Like, pick and choose your battles, for sure. I do say that if you want to change minds, you will have to engage with people in this way. And I have met people who have been absolutely on the receiving end of prejudice. I've met people who have reached out to people who were heinous and wanted to cause them harm, and I have watched those people change their minds.

So, it is possible, but I do not ask anybody to do that just because, and I don't want you to feel like you are lesser if you don't wanna offer that olive branch. I get that, a hundred percent. But I also say there's another side of that, which is, I have been in situations, this one in particular in the book, where a previous version of myself would've loved to tackle and mess with this person's belief system.

After visiting Westboro Baptist Church, after spending time with conspiracy theory communities, and after spending so much time with people who have experienced post-traumatic growth, and just running the gamut of humanity with this book, at that moment I could not think of any good reason to shake this person up in any way.

And I actually felt I was gonna leave that conversation with something incredible, and I just said, you know, we're done here. So in the end, he changed me more than I could ever possibly change him. And we hugged; as I tell in the book, this ended up with a group hug. It was one of the most powerful moments of my life.

And I do draw from that still, daily. Ask yourself why you wanna do this. And if you're not taking harm out of the world, and you're not putting good into it, if this conversation is just gonna make you feel super smarty-pants, please question your motivations.

Mahan Tavakoli: 

That's not just a powerful story, David, but a powerful example of you living with deep values, which is what I really appreciate about the content you put out. Thank you so much for this conversation for Partnering Leadership, David McRaney.

David McRaney: 

Thank you so much for having me. You're the best.