Bonus Newsletter: Interview with John V. Petrocelli, author of "The Life-Changing Science of Detecting Bullshit."
John V. Petrocelli is an experimental social psychologist and professor of psychology at Wake Forest University.
The following is a transcript of the podcast: an interview with John V. Petrocelli, author of "The Life-Changing Science of Detecting Bullshit."
Jeske: So today we are going to be talking to, I'm very excited, John Petrocelli about his book, "The Life-Changing Science of Detecting Bullshit." I'll tell my listeners how I found you. It's totally random. You had posted something. I don't remember the context. I don't remember the comment on Twitter. And I thought, well, that's really insightful. And I checked your bio and I was like, what? That's his book? I have to get that book, because this is my life with Fox News, constantly detecting bullshit. And another little funny caveat is I never curse on my podcast, just because I live in New York, and if I allowed myself to curse, it would be every other word. So it's gonna be funny that I'm gonna be saying bullshit a lot on this episode. So why, and I'll just let you talk, like what drew you to this type of work? You are an experimental social psychologist, which already, that's a fascinating thing. So what drew you to this type of work?
Petrocelli: So yeah, I'm an experimental social psychologist. And for a long time in my work, my research, I had focused primarily on persuasion and attitude change, antecedents of attitude formation and change, consequences of strong attitudes. And that was a pretty typical experimental social psychological topic, but it doesn't get the attention it deserves. It gets a lot of attention, I think, in marketing and in business, and maybe also in law, but outside those applications, there are only maybe 5,000 to 10,000 people who would ever come across my work.
So to be quite honest, I was looking for something that might make a bigger impact. I also did a lot of work in counterfactual thinking. So the would-have, could-have, should-have, "if only" thoughts that people mentally simulate and kind of play out. And what I was finding in the qualitative data was pretty elaborate stories of these would-have, could-have, should-have-beens, the "if onlys."
And people almost appeared to be sort of believing that they were entitled to a desirable outcome, you know, that if only they had thought more about the situation beforehand, or if only I had worn my red sweater, I would have gotten her number, or if only my friend had shown up to the party first and mentioned Rachmaninoff, I'm the one who likes Rachmaninoff's music, not him. So they'd go on and on about this, all of these pretty elaborate things, and it never seemed to occur to the authors in my experiments that perhaps things just weren't in the cards for them. Maybe even if you had worn your red sweater, you still wouldn't have gotten the number.
Jeske: It's funny too because you call it, and this is from your book, the Bullshit Studies Lab, and you're a professor at a university in North Carolina, Wake Forest University. And so your students, when they sign up for this, they're signing up for the Bullshit Studies Lab. I just think that's hilarious.
Petrocelli: Yeah, well, we take it very seriously. I don't think there's a better word for the social substance that we study. It does get a fair amount of attention. But I think part of the solution to potentially reducing bullshit in our communicative atmosphere is to start calling it out and actually referring to it as what it actually is. For whatever reason, it's not treated in the same way that lies are treated.
If you call someone a liar, especially in the South, anywhere, I guess, maybe south of West Virginia, I mean, those can be fighting words, you know, calling someone a liar. But people usually shake off bullshit like it's a mild social offense, and they assume it doesn't have any harm. And that's really where people couldn't be more wrong. There are a lot of negative, undesirable consequences of bullshitting for learning, memory, opinions, and beliefs about what is actually true, and there's nothing more important than what you believe to be true for optimal decision making.
Jeske: That's huge. I knew I was going to like your book when you started ripping on, or criticizing, I don't know the correct term, Deepak Chopra. I was like, this is my guy, because I have never fallen for him. I always thought he was a fraud and just a charlatan, basically. And you also talked about astrology being absolute nonsense. And I'm on board with you on that, because it's just, if you want to believe it, you'll believe it. And Fox News also kind of plays into this. And what really got me excited was your research into, kind of, I would call it peer pressure. You would probably have a different term for that, but the idea of behavioral contagion, copying behavior, you talked about that in your book, sort of when people feel like they're part of something, they don't want to get out of line.
So what have you found, and maybe you can explain it a little bit better: for instance, on Fox, they use terms like we're a family, "Hey, family." It's common. They use that throughout "Fox and Friends." They use the same term on "The Five," the sense of we're all in this together. So what have you found about that in this type of work?
Petrocelli: Well, I have not done any direct work in the groups-and-bullshitting arena yet. But we have a ton of earlier social psychological research that really speaks to these types of issues. One of the earlier ideas in social psychology is that if you get a group of people, like three or more people, together, and they're like-minded, or they have similar goals, and you get them to talk, they tend to become more polarized in their attitudes. Now they've got other reasons, reasons that others have noted, for believing what everyone else believes.
And then, whether they're for or against, pro or anti on some issue, they'll be even more so after talking with like-minded individuals. You also see, with small groups that have similar goals or something to accomplish, that everyone seems to be thinking in the same way. We call this groupthink. It was first believed that you needed several conditions in play. You needed people who had some cohesion, you know, they're part of maybe some team, they have goals in mind, they have what we call gatekeepers, people who might keep out outside points of view, or insult those people, or disparage them in some way. They also might have people who are related in the group. They have some other kinds of conditions that keep them focused on the same page.
What was found though more recently is that all you need is a little bit of cohesion to get symptoms of groupthink. You don't need all of the other factors or the pressure or the stress on the group. You don't need all of these factors. You just need some people who are like-minded and they've identified with some sort of group.
And usually we call this a loose association, people who are just kind of, you know, maybe standing at a bus stop, or maybe in a social category, you know, that have some similar demographic. That would be enough to encourage liking and encourage sort of perspective taking.
And so you don't even need to know people's names; you just have to have sort of an identifier. And then what happens is that people find even the threat of being rejected from the group, or maybe being ostracized by a group, is enough to prevent them from sticking their neck out and going against the current. And we also know that if you give four or five people in a group similar information, let's say you give them four or five pieces of information and they all have the same information, but then you add one unique piece of information to each of those five individuals, rarely, if ever, will they all talk about the one unique piece of information that they have. They'll all tend to talk about the four things that they all have, that they all know, the shared information.
And this is a form of group process loss, in which, across the four individuals, you might have an equal amount of positive and negative information about an issue, or about a person, or maybe some policy or a law, or maybe somebody to vote for. You might have an equal amount of positive and negative information, but if all of the shared information is positive, then that's what they'll tend to talk about; they won't bring up the negative information, and vice versa. So it's just very difficult. Social norms are real, and they're powerful, and it's very difficult to cut across the grain over a long period of time. And looking at the existing literature, the only time that people can stick their neck out and have influence on the group is when they're consistent at it, you know, over a long period of time. But most groups a lot of times won't let you stick around long enough if you're going to be the one dissenting voice, or the annoying person that's always asking more and more questions for clarity.
Jeske: So, another thing that you brought up was facts and logic. And I know this is true for my own work. I've designed my project for the friends, family, coworkers, loved ones of people who are addicted to Fox. I don't think I'm going to pull anybody away from Fox no matter how much I fact check, no matter how much I show them like, you're being lied to, this is deceptive, this is manipulative media. They're just going to double, triple down, get more defensive, more angry.
But I'm at least keeping more people from getting sucked in. And the other thing is I'm helping friends and family understand their relatives or their friends a little bit better, so they go, "Oh, this is what they're talking about," "Oh, this is part of the Hunter Biden nonsense," or this part of whatever.
So what do you think? You did discuss this a little bit in your book, and when you were talking about it, I was like, yes, it's spot on that no matter how much evidence you show somebody, they're just going to triple down. Because again, it's like a security blanket. They want to hold on to what they have there. They can't let go of that reality because it's too scary. What do you think is the best way to, or do you have any advice on how to, reach somebody like that, who's being bullshitted? You know they're being bullshitted. You want to help them, but you don't know how. What would you say about that?
Petrocelli: Yeah, Juliet, I would say that when you're speaking with them, maybe you're not having an immediate, direct influence. And yeah, you're probably not going to pull any diehard fans of Fox News away.
But I would venture to guess that you're at least modeling for them a new way of thinking, and maybe a new way of asking questions, or at least planting the seed of considering asking questions in a different way. I mean, usually, what we want to do is find a way to first clarify the claim: what exactly is being said. If you can speak directly with the bullshitters and get them to clarify their claims, a lot of times they will start to reduce the bullshit exposure for you. And so that's a really nice thing. Clarity is an excellent antidote to bullshit, because people will usually start to clean it up and say, well, what I really mean is this, or they'll start to give you conditionals, specific instances in which it's true.
Because when it first comes out, it sounds like a blanket statement, you know, like, defund the police, build the wall, stop funding science, whatever, and it's just this blanket kind of solution or statement of a problem. So you have to ask questions. What do you really mean by that? What would that look like? How would it work? Tell me the logistics. How would we know it's working? Those types of basic questions, just to show that you're interested and you'd like to understand what it is they're saying.
And then another important question you can ask is, well, how do you know that's true? What are the reasons behind what it is that you're saying? How do you know? Not why do you know? If you ask why, or why do you think that, usually what you will get is a sort of philosophical, value-laden kind of explanation, as opposed to pointing to verifiable, observable, genuine evidence, something that actually supports the claim.
Lay people often confuse explanation for evidence. Explanation is not evidence. So just because you've got some reasons doesn't mean that you're pointing to tangible, verifiable, observable evidence for a claim. But if you ask "How do you know?" that tends to elicit what we call a more concrete level of construal, where people are more likely to talk about things like facts and evidence. If you ask why, you'll tend to get what we call a more abstract construal level, and you'll get the philosophical, theoretical stuff that's fascinating but might be full of explanation and no evidence. But there are two other questions that you should ask if you have a direct connection with the potential bullshitter. Have you considered any other alternatives? Are there any other solutions? And the very best question to ask, in line with that other-alternatives question, is: how might the claim be wrong? Because even the how-do-you-know-it's-true question doesn't always work the way we intend it to. But if you ask people to consider any disconfirming evidence for their ideas, you'll tend to get very different responses. And it might be very difficult for individuals, because quite frankly, people do not naturally consider any disconfirming evidence for their ideas. I mean, it's very natural to focus on confirming evidence, all of the things that point in the direction of your belief.
Jeske: Well, I think that's why we have such a problem right now with disinformation, misinformation and conspiracy theories: if you want to find your own echo chamber online or in, you know, just any media, you can find that echo chamber, you can stay in it, and you can just hear the same voices confirming whatever you believe, even when it's false. My Instagram account is very interesting, because the comments are very interesting for me. Whenever I post anything about Trump, and I just did, he did a town hall that wasn't really a town hall, because nobody in the audience asked a single question.
It was an interview with an audience. So I loved making fun of that, because I'm like, that is not a town hall, to just have people cheer. They're supposed to ask questions, but anyway. I've noticed in the comments on my Instagram, whenever I post anything about Trump, the diehard Trump fans show up and they'll make these comments such as "The election was stolen," and then all these people pile on going, no, it wasn't. And they'll start using evidence, or what they think is gonna work. And it never works. The people just double, triple, quadruple down. And they're like, he had 60 court cases. He lost them all. He lost this, he lost that. Georgia, Michigan, they go through all the states, Arizona, and it just doesn't matter. It's like they cannot let go. It feels like, again, that security blanket of: this is what I believe, this is what I've been told. Everyone I agree with thinks this. When I turn on my, you know, OAN or Newsmax or Fox News, this is what they tell me. When I see, you know, Trump on Truth Social, this is what he tells me. So how dare you? They just will not see it.
And I was just wondering, I don't know if you've done this type of research, it was in your book a little bit: what causes people to be more likely to be sucked in? How do you think emotion plays into that? Because Fox uses fear and paranoia, like fear, paranoia, and "you're the victim." Because when you watch MSNBC or CNN, it's a different message. They're not telling their viewers constantly, every five minutes, that they're a victim and that, you know, they're persecuted and everything's going to hell, but that's what Fox does. So I'm just curious what you think of that.
Petrocelli: Yeah, I mean, there's a lot in there. Typically, we've looked at the qualities of the receivers of bullshit: what is it about the receivers, and under what conditions do people appear to believe bullshit? And typically, we find people will, especially if something plays to their emotions or plays to their values. That's about all you need. You don't even need to have good bullshit. I mean, you asked, you know, about what makes somebody a good bullshitter. That's very hard to classify, because good bullshitters don't usually do the very things that would make them appear as though they are concerned with evidence. The best bullshitter would be one who pulled off the actual appearance of being concerned about the evidence, and this is evidence beyond the one or two books in philosophy or history that they've read and go back to dogmatically. Are they really concerned about facts and evidence? That would be the best bullshitter, but that's extremely rare.
You know, so it's really more about the individuals receiving bullshit that will help us better predict whether or not bullshit is going to have any influence. But it's certainly easier to believe something that one wants to be true, hopes to be true, wishes to be true, than to go through the really hard work of finding any genuine evidence, empirical evidence, that might support an idea, a claim, a belief. So that's really all you need, because even the effort that would be needed to debunk or unpack some of these claims that people make in all areas of life, whether it be beliefs about nutrition, politics, law, whatever, to really unpack that would take a lot of time, could take weeks, months, years, relative to the amount of time it takes to generate bullshit. I mean, we can generate some good bullshit right now, and it might take people a lot of time to unpack that. And we found in our bullshit studies lab that if the message aligns with one's attitudes, so it appears to be coming from a like-minded source, it's difficult to be caught in bullshit. We call this our ease of passing bullshit hypothesis: if you're going to be communicating with people that you think have less knowledge than you do, or might also agree with you anyway, it's really easy to pass bullshit, right?
So the only condition in which we found that people will appear to refrain from bullshitting is when they're not obligated to share their opinions. Like, we've made it very clear that you're under no obligation to share your opinion; you can write some thoughts if you'd like, but you're under no obligation. And we've also coupled that with: your thoughts are going to be evaluated by an expert, somebody who's an expert in this area. Right. So for example, I'm not going to bullshit a mechanic, an auto mechanic, because I don't know enough about cars. They're going to know that I'm bullshitting them. So that's where I'm going to refrain from it. But in all other conditions, whether people feel obligated or don't feel obligated, or they have experts or non-experts, the only time they seem to refrain from it is when they're not obligated and there appears to be somebody who has more knowledge than they do who might be evaluating what they say. So people use bullshit in all sorts of conditions, and if you understand when it's going to occur, and you're also concerned about the potential consequences of it, then you can turn your bullshit detector up a few notches on the throttle.
People will bullshit to promote their status. So, I'm responsible for this, I'm not responsible for that. I'm partly responsible for all the good things happening. I take no responsibility for all the bad things. They also use it to connect with others. So we might talk about connections: you know, I've never been to Montana, but I've watched "A River Runs Through It," so I might connect with somebody on that. But that might be somebody that I want to connect with.
Now, other people, maybe in the workplace, that I don't want to connect with, I just talk about the weather, or I don't talk about what I actually did over the weekend, because I don't really want to connect with these individuals. So people feel obligated to share their opinion, because at some point or another, people are going to look to you and ask you, well, what do you think?
The gloves are really off, but there's a real preference for ignoring the truth and evidence, especially when it's painful. It's hard to get to and it's time-consuming.
Jeske: Yeah, that's huge. Well, one thing that Fox does all the time is where they look like they're using data, they look like they're using evidence, but they're leaving out context. I'll give you a couple of examples. With COVID, they would say, oh, well, there's X amount of soldiers who won't get the vaccine. And it sounds like a big number. I can't pull it out of my head, but I would do the math. I would get out my spreadsheet, because you talk about spreadsheets quite a bit in your book too, and that cracked me up, because I am the queen of spreadsheets. It's in the logo. For Decoding Fox News, there's a spreadsheet in the logo. It's subtle, but it's in the logo. But one of the things they would say is, let's say 10,000 soldiers won't get the vaccine. Sounds alarming until you realize how large the military is. And it's not even 1%. Or here's another one I know off the top of my head: 1,400 municipal workers in New York City won't get the vaccine. Oh my goodness, it's not even 1%, because New York City is 8.4 million people. Huge, huge government. So that's one thing they do quite a bit. And if they say it with authority, they just, you know, hey, look at this. This is really shocking. And they throw the number up there and it looks really scary. I used to study extremists, so I've watched a lot of right-wing commentators, and the scary thing is what I found is all it takes is a little bit of charisma and confidence. That's it. You don't have to be smart. You don't even have to be good at talking. And what happens with the right wing is, if you're just angry the whole time, like there's so many of them on the right who just sit there and rant. Mark Levin is one who just yells a lot. I'm really angry. I'm really angry. Do you know these communists, these socialists? And that's it, he'll just keep yelling and people fall in line, they'll follow like ducks, and you're just like, what are you saying? And that's the scary thing: it doesn't actually take much to be a good bullshit artist. It really doesn't, which is absolutely terrifying in many ways.
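(As a rough worked check of the denominator point above, using only the figures cited in this exchange, 1,400 refusers set against the 8.4 million population number Jeske mentions:

\[
\frac{1{,}400}{8{,}400{,}000} \approx 0.00017 \approx 0.017\%
\]

well under the 1 percent she describes; the raw count sounds alarming only because the denominator is left out.)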
Petrocelli: No, it doesn't. It doesn't. I mean, and you gave a lot of really good examples there. I think, to add to the basics, though, in sort of selling BS, if you can keep the message simple, you know, keep...
Jeske: That's huge.
Petrocelli: Keep simple conceptions. This is probably, I think, part of the reason why I've sometimes referred to these as fifth-grade sort of tactics of argumentation that can work really well. You know, when Vivek Ramaswamy said in the debate that Nikki Haley doesn't know three provinces in Ukraine, that can be really effective for people, because it's like, well, what are the real implications of that?
Probably, I don't know, who knows what the implications really are, but nobody ever goes there, because they're sort of waiting for an answer. Like, yeah, you don't even know. You don't even know what you're talking about.
So if you make other people look like they don't know what they're talking about, you don't really have to know what you're talking about either. But going long-winded, people who are long-winded can do very well with bullshit as well because...
Jeske: Rush Limbaugh was another one. Rush Limbaugh would take something you could say in two minutes, he'd stretch it out for 20 minutes. And he was just saying nothing. But he knew how to, you know, like that.
Petrocelli: Yeah, we know from the experiments in our lab that if you signal to others that you're really not concerned about what the evidence suggests, you know, like, if you say, I don't care what the research shows, I don't care what the data say, gosh darn it, this is the way it is, and you signal to people that you don't really care about the evidence, what appears to occur is that both strong and weak bullshit arguments will perform equally well, and people don't recognize the difference between weak and strong bullshit, because they've already turned off what we call cognitive elaboration. They've already turned off the depth of thinking that they're willing to put in, because, look, if you're not interested in the evidence that might otherwise support your claims, why should I be?
And I won't really have to bring my A-game to argue against you, because you're not even concerned about the evidence. What's interesting, though, is that when people do signal they're interested in the evidence, then their audience will recognize strong versus weak arguments. But if they've already signaled, I really don't care, this is the way things are, I don't care about the numbers, then people tend to engage in what we call peripheral route processing, where the things that might shift their attitudes and their thinking would be peripheral. These would be superficial things like maybe how attractive the person is, or maybe the number of arguments. So that's where I'm going with this long-winded thing: if you have nine garbage arguments and somebody else has three strong ones, but the two individuals both sound like bullshitters, then the nine arguments are going to win.
So if you can be long-winded, you can also appear consistent, even though you might have contradictions all over your theory and your philosophy. If you can appear consistent, you can say, well, I was talking only about this particular situation, you know, so I add in conditionals to easily disguise any contradictions. And if you do all of these things, the basic moves are about making it difficult to reach beyond the basic critical thinking questions that one would ask, while in a clever way saying, look, no, I actually have looked at all of the evidence.
It can be difficult not to be influenced by the information. But you had mentioned the echo chamber types of effects, and what we find in the lab is that you don't really even need an echo chamber. All you need is one exposure to a claim. Lisa Fazio at Vanderbilt University has shown that this illusory truth effect, this kind of recognizing "I've heard this claim before" and then confusing it for truth, happens even when people know better, even when it's in an area of their expertise. So if I tell you that Charles Dickens wrote "Of Mice and Men," along with a lot of other claims, and then later on in the day, even though you know John Steinbeck wrote "Of Mice and Men," when you see the claim again that Charles Dickens wrote "Of Mice and Men," you're more likely to believe it or rate it as slightly true, because you'll likely confuse the familiarity with truth. And you only need one exposure. I used to think that you'd probably need like 16 exposures to a claim that wasn't true before you'd start to believe it's true. But the evidence doesn't suggest that. It's only one prior exposure.
Jeske: Well, that's one of the things that's interesting about covering disinformation and misinformation: we have to be careful that, by even covering it, sometimes you're actually elevating it, you're promoting it. And it's a delicate balance where you have to kind of call it out. You can't ignore it, but you also don't want to feed it. And sometimes, if people are not careful about how they cover fake news, false stories, they actually make it worse. And it's like Ben Collins, I quote this all the time, because he's on a similar beat, he said, as soon as a fake story is out there, all the people who do the kind of work that we do are just chasing it, and it feels like it's constant. A lie moves much faster than the truth, unfortunately. I wanted to read this section from your book because I think it's so important. It's a bulleted list, and you're probably like, what are you doing?
This is from the Columbo Mindset of Expert Bullshit Detectors, which I just adored. I loved the fact that you named it after, if you don't know the reference, there was a TV show in the 1970s whose main character was named Columbo, and he was this detective who figured out the case every time. But you have this great bulleted list of what the Columbo critical thinking mindset looks like in practice.
And this is the list:
Having a passionate drive for clarity, precision, accuracy, relevance, consistency, logic, completeness, and fairness.
Having sensitivity to the ways in which critical thinking can be skewed by wishful thinking.
Being intellectually honest, acknowledging what they don't know and recognizing their limitations.
Not pretending to know more than they do or ignoring their limitations.
Listening to opposing points of view with an open mind and welcoming criticisms of their beliefs and assumptions.
Basing beliefs on facts and evidence rather than on personal preference or self-interest.
Being aware of the biases and preconceptions that shape the way the world is perceived.
Thinking independently and not fearing disagreement with a group.
Getting to the heart of an issue or problem without being distracted by details.
Having the intellectual courage to face and assess ideas fairly, even when they challenge basic beliefs.
Loving truth and being curious about a wide range of issues.
Persevering when encountering intellectual obstacles or difficulties.
Wouldn't it be great if people actually did this? I love that you wrote this all down. I think it's great. I just want to be like, oh, please.
Petrocelli: Even if you do just one of those, I think it would help, any one of those dozen or so. You'd mentioned emotions earlier. I mean, the fact is that we are social animals, you know, and we do connect with others, connection with others is important, and we often look to others, especially when we feel uncertainty or ambiguity about what's correct. And usually that works pretty well, but it can be wrong.
People will just kind of look around and sort of take a poll and see what their neighbors believe and what their friends and their family believe. And again, that usually works pretty well for people. But when there's something better out there, evidence, you know, just taking your social poll is probably not going to be enough. I think there was a really nice website that Daniel Dale had put together, I think in the first couple of years of Trump's presidency, that tracked basically all of the claims that turned out to be either lies or bullshit or misinformation. And it was a catalog of thousands of instances, and it had links, and it had all of the easily accessible references. So some of this is already being put together, because otherwise this is like a full-time job, and I think that's why he ended up stopping it, because it was just taking too much time.
Jeske: Oh, I know all about trying to keep track of lies. That's all I do all day.
Petrocelli: So you can poll the masses sometimes, I think, in more effective ways. There's also a really nice new search engine that gives you some basic stats on any report. It's called S-E-E-K-R. It's basically another Google kind of search engine, but what it will give you is some stats. It'll also give you a byline, and it will tell you the political leaning, and it gives you some stats to help you make a decision as to whether this is misinformation or fake news or this is legitimate. So it keeps track of these things and it provides a score, and they even tell you how they derive the score.
Jeske: How do you spell that again?
Petrocelli: It's S-E-E-K-R. And it's a wonderful search engine, because to be honest, Juliet, I mean, I'm about as politically independent as they come. And I'm not a political expert in any way, shape or form. So I rely on this thing quite a bit, actually. You know, I'll say, oh, this seems to have a lot of spin on it, which people can detect quite readily, and then I'll check the same article, the same source, and find out what the SEEKR score is for the write-up or the newspaper article online or whatever. And it's usually pretty informative, sometimes surprising, and people will be surprised. I mean, if you love the truth, and that's actually my favorite one on the Columbo list, loving truth and being curious about a wide range of issues.
I mean, if you really believe that, you're probably going to be surprised quite often. As an experimental social psychologist, I can tell you from experience, and just even in the field, we're often wrong. A lot of our findings turn out to be pretty counterintuitive and cut across what you would intuitively expect to find in an experiment, and that just kind of comes with the territory. But yeah, if you really want to know the truth and you're really interested in and concerned with evidence, you'll probably be surprised at least once a day. There was a book I was reading, I can't remember the author's name, and it said, I'm continually surprised by how stupid I was a week ago. But that's a good thing, because if you're continuing to learn, then you're less wrong. You know, you're less wrong if you find out that your expectations, your predictions, your hypotheses were incorrect after seeking out evidence that speaks to them.
Jeske: The book was great. I flew through it. I really enjoyed it. I love the bulleted lists, the fact that you break things down. You have great examples about people who are bullshit artists, and the whole intermittent fasting one cracked me up, because I've been doing "IF" for years. And you're like, yeah, you just eat less. And I'm like, that makes sense.
Petrocelli: No matter how many nutritionists might tell you that that's true, there's no evidence for that at all.
Jeske: I believe you completely. It's just that when you do IF, you can't snack. You have to be very like, this is what I'm eating, this is how much I'm eating, and I'm stopping. Whereas when you're just casually eating, you could just sit there and eat and not realize that you're putting 3,000 calories in your mouth. With IF you are specific about your food. You can't be casual about it. But yeah, there's a lot in here. It's great. I highly recommend it. I really enjoyed it. And like I said, as soon as I got to Deepak Chopra, I was like, yep, I'm going to like this book, because you were like, this guy's a huckster. And I just want to ask you very quickly before I end it: did you get any backlash from anybody that you called out in the book?
Petrocelli: Oh, absolutely. I'm glad you asked. I got primarily a lot of backlash from people who support various political figures. But I tried really, really hard not to make this a political book at all.
Jeske: I didn't think it was.
Petrocelli: I mean, if I hadn't used any Donald Trump examples of bullshit, I would have sounded like I'm living under a rock somewhere. But even the examples that I give of Trump, they're not political issues. Talking about the number in a crowd, or the weather, those are not political issues. Firing or hiring some people, those are not political issues either. In fact, of all the political figures, the one I was most damning of, most critical of, was Mao Zedong, the dictator in China. But all of the other political examples I gave, of political candidates or presidents, were not political issues.
But that didn't stop quite a few people from emailing me and saying, you know, how could I be so disrespectful? And so I explain, and usually when they write back, they seem to understand. But yeah, even the fly on the cover, that had nothing to do with Mike Pence and the vice presidential debate; the cover was done six months before that debate.
Jeske: Well, that's what you think of with bullshit. Actual shit, you think of flies, right?
Petrocelli: Yeah, they're the quintessential bullshit detectors. You know, they're the best. I mean, I just thought having that on the cover was great. And to use that as a sort of index of the potential harm of bullshit, as one, two or three flies, I thought was a nice touch. But yeah, it's usually that first chapter, I think, that people write me about, or they say they stopped reading because I was using political figures. And again, in no way do I intend for it to be a political book. As you said, I look at everything from decisions that physicians make to... but even the examples that I did use, I don't think they're political examples. But like I said, that didn't stop people from seeing it that way.
Jeske: Wow, no, I didn't see that at all. You know, I deal with politics all day long, and I did not project that onto this at all, because both parties can bullshit. But I mean, Gavin Newsom just had his debate with DeSantis, and even though I think he kicked DeSantis's butt in that, there were times where he was bullshitting, you know, he was talking about stuff about California, and I was like, well, come on, I could probably fact-check you, and you're not being completely honest. But every politician does that, you know; it's sort of this game they have to play, where they have to kind of pitch things and sell things. And I think Newsom's better at selling than DeSantis. DeSantis is like an awkward weirdo. And here's one thing my sister picked up on, linguistically, and this was just an interesting use of bullshit: you could see one person was stronger. Newsom kept saying, "It's a fact." "No, that's a fact." And DeSantis kept going, "You're lying." "You're a liar." Which one's stronger? "That's a fact." Much stronger. One's aggressive, one's defensive, you know?
Petrocelli: Trump had used a similar tactic, I think, in his debates with Hillary Clinton: every time she would say something, he would say "wrong," you know, and that resonates. Again, it's simple, it's a simple conception, and it's easy to remember too. And it fits, too, especially if somebody has sort of negative views and evaluations of the other candidate; you really want to see your candidate really stick it to them. And so if you can sort of dumb it down, keep it simple, and make it so easy that a fifth grader can understand it, then it can be very effective.
Jeske: John Petrocelli and his book, "The Life-Changing Science of Detecting Bullshit." Thank you so much.