E329 -

Bobby Duffy joins the show today to talk about human biases and how they shape our understanding of the world.

Hosts
Trevor Burrus
Research Fellow, Constitutional Studies
Aaron Ross Powell
Director and Editor
Guests

Bobby Duffy is Professor of Public Policy and Director of the Policy Institute at King’s College London.

Prior to joining King’s in 2018, Bobby was Managing Director of Public Affairs for Ipsos MORI, which is a team of around 250 researchers in London, Manchester, Edinburgh and Brussels, and Global Director of the Ipsos Social Research Institute, across around 30 countries.

He has worked across most public policy areas in his career of 25 years in policy research and evaluation, and has been seconded to the Prime Minister’s Strategy Unit and the Centre for Analysis of Social Exclusion (CASE) at the LSE.

Bobby also sits on several advisory boards for think tanks and universities, chairs the CLOSER Advisory Board and the Campaign for Social Science, and is a trustee of British Future and the Centre for Transforming Access and Student Outcomes in Higher Education. His book, The Perils of Perception: Why We’re Wrong About Nearly Everything, was published by Atlantic Books in several countries, drawing on a set of global studies of how people misperceive key social realities.

Bobby Duffy draws on his research into public perception across more than forty countries, offering a sweeping account of the stubborn problem of human delusion: how society breeds it, why it will never go away, and what our misperceptions say about what we really believe.

How do we consume information differently now than we did in the 1950s? How do we gain knowledge about the world around us? Why are Americans high in confidence but low in true knowledge? Are Americans building false beliefs from false information?

Further Reading:

Who Elected Donald Trump?, Free Thoughts Podcast

How the Media Really Works, Free Thoughts Podcast

Social Media’s Moral Panic, Free Thoughts Podcast

Transcript

[music]

00:07 Trevor Burrus: Welcome to Free Thoughts. I’m Trevor Burrus.

00:09 Aaron Ross Powell: And I’m Aaron Powell.

00:10 Trevor Burrus: Joining us today is Bobby Duffy, Director of the Policy Institute at King’s College London. Formerly he was Managing Director of the Ipsos MORI Social Research Institute and Global Director of the Ipsos Social Research Institute. Welcome to Free Thoughts, Bobby.

00:24 Bobby Duffy: Yeah, great to be here, thank you.

00:25 Trevor Burrus: So you worked at an opinion research firm. What kind of work did you do there, and what did you learn about opinions while working there?

00:34 Bobby Duffy: I worked in opinion research for about 25 years. I started off focusing on the UK, and then, particularly when I started looking at people’s misperceptions of social realities, I did a whole series of studies looking at what people thought was happening to crime rates, immigration rates, those types of things in the UK. What that showed was that people were incredibly wrong about a whole series of social realities in the UK, which made me wonder whether the UK was uniquely bad at this or whether it was a global phenomenon. So I started looking at it internationally about 15 years ago, and I’ve done a whole series of studies across all different policy areas and all different countries, up to 40 countries now, and what it shows is that there are some very consistent biases and misperceptions in how we see some of the key social realities of the world.

01:41 Trevor Burrus: That doesn’t seem that surprising to, I think, most people. They probably understand that most people don’t spend a lot of time reading crime reports or immigration reports or things like this, so the fact that they’re inaccurate doesn’t seem that shocking.

01:55 Bobby Duffy: Yeah, I guess it depends what level of accuracy you’re looking at. These are not tests of knowledge, and we’re not expecting people to have a perfect view of the world; that’s not the point of the studies. It’s much more that people are biased in consistent directions. If this were about a lack of knowledge, as opposed to a positively held misperception, it would be a lot less interesting. These are not trivia quizzes. What we find is that it’s not a random lack of knowledge that shows up: if people really didn’t know something, you would expect them to be wrong in all sorts of different directions, but what you find is that people are biased to be wrong in a particular direction. Taking crime as an example, people think it’s higher than it is and more likely to be going up than it actually is. So this gives us an insight into how people think, how people see the world. It’s a long way from a test of knowledge.
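
To illustrate the statistical point, here is a made-up simulation (all numbers invented for illustration): guesses driven by pure ignorance scatter around the truth and average out, while a shared directional bias shifts the mean guess, which is what the studies detect.

```python
import random

random.seed(1)
true_rate = 10.0  # imagine the true crime rate is 10 per 1,000 (invented number)

# Pure ignorance: guesses scatter symmetrically around the truth and average out.
ignorant = [true_rate + random.gauss(0, 5) for _ in range(10_000)]

# Systematic misperception: everyone's guess is pulled upwards by a shared bias.
biased = [true_rate + 8 + random.gauss(0, 5) for _ in range(10_000)]

print(f"mean guess, random ignorance: {sum(ignorant) / len(ignorant):.1f}")  # ~10
print(f"mean guess, directional bias: {sum(biased) / len(biased):.1f}")      # ~18
```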

03:05 Bobby Duffy: In the literature there’s a big distinction between ignorance, not knowing something, and misperceptions, which is thinking you know something but being wrong about it. It’s that latter group of things that I’m interested in, and what it then tells us about how we see the world and why we’re so often so wrong.

03:27 Aaron Ross Powell: You have a lot of charts in the book with people’s responses broken up by all sorts of categories and all sorts of questions, but one answer that doesn’t show up much is the “I don’t know” answer, and I’m curious about this, because there are lots of things I don’t know about, and because of that I simply don’t have beliefs about them. Why is it, then, that we don’t see (or I guess, how much do we see) people simply say, if you ask them what crime rates are like or how many people from different countries live in their country, “I guess I don’t know”?

04:12 Bobby Duffy: Some people do. These are mean scores based on the people that do give an answer; typically we’ll find between about a fifth and a third of people won’t give an answer, but will actually volunteer that they don’t know. Depending on the nature of the question, it can go anywhere from 10% of people saying they don’t know up to a third of people, but most of the time it’s around about a fifth of people who say they positively don’t know. So people are allowed to give those responses, but the majority do give a response. They are encouraged to give a response because, again, there’s that point of it not being a test of knowledge; it’s a test of how they see the world generally.

04:58 Bobby Duffy: I think one of the most interesting charts in there comes from one of the studies where we also asked people how certain they were that they got the correct answer. So this is the interaction between certainty and correctness: if you think you’re right, are you more likely to be right? And what it showed was almost an opposite correlation between those two things: the people and countries that were most confident that they’d got the answers absolutely correct were the most likely to be the most wrong. This is related to that kind of Dunning-Kruger effect, where over-confidence in your view is related to just not knowing when you don’t know something. So that is a very important subset of these types of questions, understanding that confidence in how you see the world is very unrelated to actual levels of knowledge; it’s much more about your own identity, your own values, just how you see things, rather than depth of knowledge.

06:00 Trevor Burrus: I’m going to take a guess and say that Americans rated high on confidence and low on knowledge.

[laughter]

06:21 Bobby Duffy: You were quite high on there.

[laughter]

06:23 Bobby Duffy: America doesn’t do very well in the overall index. In fact, it’s second worst, behind Italy, in terms of being correct on these answers across the 14 countries that have been in the study from the beginning, where we did a mega-index across all of them. The US was second worst, Italy was worst, and countries like Sweden and Germany were most accurate.

06:54 Trevor Burrus: I think I remember seeing that Italy was 100% wrong on the questions you asked.

07:00 Bobby Duffy: That’s, no, that’s just an index.

07:02 Trevor Burrus: Oh, it’s an index, okay. [laughter]

07:04 Bobby Duffy: They were top of the kind of table we’re all aiming to get away from, but that’s all indexed to 100, with everyone scored relative to the most wrong country, which was Italy.
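
To make the arithmetic concrete, here is a minimal sketch of what “indexed to 100 relative to the most wrong country” means, assuming made-up error scores per country (the numbers below are hypothetical, not the actual Ipsos figures):

```python
# Hypothetical average error scores per country (illustrative, not real Ipsos data).
error_scores = {"Italy": 28.0, "US": 26.5, "Germany": 13.5, "Sweden": 12.0}

# Index each country against the most wrong one, which is pinned at 100.
worst = max(error_scores.values())
index = {country: round(100 * score / worst) for country, score in error_scores.items()}

print(index)  # Italy lands at 100; every other score is relative to Italy's.
```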

07:17 Aaron Ross Powell: So this wrongness, is it largely the result of external factors: we’re simply consuming poor information, incorrect information, and then basing false beliefs on it? Or is it more of an internal factor thing, where we’re engaging in, say, motivated reasoning, or discounting information that conflicts with pre-existing views? I assume it’s a combination of both, but is it weighted more towards one than the other?

07:49 Bobby Duffy: That’s a great question. Effectively, that is the crux of the book: pointing out that it’s exactly those two buckets of effects that we need to bear in mind. Sometimes there’s a great deal of focus on our post-truth world, fake news and alternative facts, as if this is the sole cause of people’s misperceptions and biased views of the world.

08:18 Bobby Duffy: And then sometimes, in other books and analysis, there’s an absolute focus on our biases and heuristics, the shortcuts that we take and how they determine how we are wrong about the world, when the reality is that it’s a systemic issue where one interacts with the other. The media know which biases we have and how to play on them; we respond in a particular way; that gives the political rewards, or the hit and page-view rewards, that you get from that; and it creates this cycle where our biases become built into the system of our information environment, really. So saying that it’s caused more by one than the other is impossible, and kind of misses the point that one and the other interact and reinforce each other in this cycle of delusion that we create.

09:19 Bobby Duffy: It’s not an accident: the media and political statements are curated around our biases, and our biases react to those and give the feedback loops that the media and politicians are after. And that sounds quite scary, to have it as a systemic problem where there’s not just one thing you can fix. You can’t just teach kids critical literacy, news literacy, and suddenly solve this problem, ’cause you can’t teach the human biases out of our kids. And, equally, you can’t just tell the social media platforms to sharpen up their act, or get stricter in regulating them, and have that solve the problem. How you deal with it as a whole group and system is the crux of what we need to do.

10:17 Aaron Ross Powell: If a feedback loop is playing this role, do we see differences, either over time in changing levels of wrongness, call it, or between countries, based on how facilitated that feedback loop is? A very rapid version of the feedback loop comes with sophisticated analytics, where we can see that if we put up a headline that plays to this bias, we get this many clicks over the next hour, and then we can adjust it. So we can have very rapid formation of that loop. Versus, you go back to the 1950s, when you had a handful of television stations and they didn’t really have a good way to measure whether people were actually tuning in or how they were responding, ’cause they’d, I suppose, wait for letters to come in to their reporters, but that’s not a terribly fast thing, and so the feedback loop can’t operate as quickly. Or between countries that are more wired up or less wired up, do we see differences that would play into that narrative?

11:20 Bobby Duffy: That’s again a really great question, ’cause that phenomenon of change over time, or difference between places and why one country is better than another, is something that I looked at quite a lot, both the trends and the differences between countries. On the trends, first of all, we’re lucky that in the US you have a great tradition of academic study that goes back to the 1940s and 1950s looking at political ignorance or rational ignorance. There was a stream of thought to do with the fact that the electorate doesn’t have huge amounts of knowledge of political and social realities, as part of the argument around the size of the state and how big a government operation you should have, because people don’t pay much attention to this and they’re not making informed choices.

12:19 Bobby Duffy: From 1940s and 1950s America, there are a few questions that are similar to the questions that I’ve been asking. Things like unemployment rates, which people massively overestimate, and what you find is that people were just as wrong in the 1940s and ’50s on things like unemployment rates as they are now. And then there are the more recent trends; like I say, I’ve been doing this for 15 years. What we haven’t seen is any particular improvement or deterioration in how wrong people are on things like whether the murder rate is going up or down, or the level of immigration in their country; there’s not been a massive change. Similarly, when you look across countries, I tried to get measures of media quality or plurality, and measures of the political system and the extent to which it was controlled, and again, not much relationship between those things and misperceptions. You can see individual relationships between people who consume particular types of media, like Fox News in the US or the Daily Mail in the UK: they do tend to have attitudes that relate to those. But you can’t tell that that’s cause and effect, because people pick the media that already reflects their view.

13:51 Bobby Duffy: Overall, my view is these haven’t changed hugely despite our changed information environment. The question that raises is why, and it may be that it’s just too soon to feed through. But the second thing that I look at, and I think is probably more important, is whether misperceptions themselves are the most important indicator here. I’m not so sure we would expect people to become more wrong about something over time, as opposed to, say, people becoming more certain that their own world view is correct and other people’s world view is wrong, as people increasingly see more of the things they already agree with. I think the endpoint of this is much more to do with polarization in politics, and one side drifting apart from the other, than it is to do with these kinds of answers on how people see reality. So that’s the endpoint I’m looking at much more than whether people are getting more wrong or more correct; it’s much more, how do they see the other side of the argument. And there’s a much more worrying story on the state of polarization than there is on misperceptions.

15:10 Trevor Burrus: Just to fill in our listeners: the book centers around asking people what they think something is and then seeing the difference from the reality, on immigration rates, crime rates, how much sex you think other people are having, things like this. But it seems like a lot of these misperceptions, especially the more policy-oriented ones, are pessimistic; the bias is pessimistic. You don’t see people systematically erring in the optimistic direction, thinking the unemployment rate, or the immigration rate, or the crime rate, is much lower than it really is. And maybe some of that has to do with just nostalgia: we think back on being 10 years old and say, I didn’t feel as endangered by crime when I was 10 years old. And you probably didn’t, because you were within 10 minutes of your house and you’d watch cartoons all day, so you didn’t see the news stories. And now that you’re, whatever, 35, you say, well, the world is so much worse than when I was 10 years old. And that’s the way these pessimistic biases go as you get older.

16:18 Bobby Duffy: Yeah, I’m sure that’s a thing. Actually, I’m working on my next book, on generational differences, and that sense of nostalgia is a key aspect of the sense in which we have a foreboding about generational decline. I think there are probably two [16:35] ____ biases going on here. The first is negativity bias, which is rife throughout the results: in the US, the murder rate has gone down significantly over the last 20 years, but very few people within the US say that’s the case, and the same pattern shows up on all sorts of different aspects where people think things are getting worse. That negativity bias means we are programmed to pay more attention to negative information than positive information.

17:12 Bobby Duffy: This goes back to our [17:13] ____ days, when negative information was more often threat-based information. If you didn’t take notice of that warning of a lurking sabre-toothed tiger, you were edited out of the gene pool. So we are literally the descendants of people whose brains take more notice of negative information and store it more readily. And secondly, related to your nostalgia point, social psychologists talk about rosy retrospection, which is that we literally forget the bad things from the past. It’s partly to do with our childhood, definitely, and the fact that we didn’t notice those things in the first place, but there’s a lot of experimental work showing that even if you did notice something bad in the past, you’re more likely to forget it, to let go of it.

18:06 Bobby Duffy: And again, that’s not a dumb fault of our brains; it’s actually good for our psychological health not to dwell on those bad things from the past. But it does also have a negative impact: it means that we think the past was better than it was, and therefore that the present and the future are worse than they really are. So we do have both of these things, and it’s a consistent pattern, not just in the US but across all countries: more focus on the negative, and more likely to think things are getting worse rather than better.

18:41 Trevor Burrus: Did you do any studies that try to connect a certain belief with political ideology? For example, and I’ll ask listeners this question and take a pause so they can think of what the actual answer is: out of every 100 people in your country, about how many are Muslim? You go through a bunch of countries with what the actual percentage of Muslims is. So if you’re in the US, the actual percentage is about 1%, but the average guess was 17%. And in the UK… trying to see if I can find the UK here. Well, Germany pretty much over-estimated: the actual reality is 5%, and the average guess was 21%. And in Great Britain the actual is about 5%, and the guess was 15%.

18:48 Bobby Duffy: Yeah.

19:31 Trevor Burrus: But that would seem to correlate with… I mean, just commonsensically, if you think that Muslims are a huge danger to the world, then maybe you’re more likely to overestimate the share of Muslims.

19:46 Bobby Duffy: Yes, that’s exactly that effect. I think one of the key explanations for lots of our misperceptions is that they’re more emotional than they may seem. We may think we’re asking about a very neutral fact: what proportion of the overall population does this part of the population make up? But that’s not how people react; we react much more emotionally to those types of questions than we think. And that sense of threat from the Muslim population is very clear-cut. The media coverage of the Muslim population in the US and the UK is very similar: 80% to 90% of all media coverage of the Muslim population is negative. It’s all associated with threat. And we are story-telling animals in the end, where a vivid, unusual story draws our attention and takes more of our notice than the boring, normal stories, and the consequence is that it grows in our heads, and we think it’s a bigger thing than it really is, even on these realities. The Muslim population one is interesting because we also asked what people think it will be in a few years’ time, not 10 or 20 years, just four or five years’ time. And in the US, people thought it would go up again, to 23%.

21:16 Bobby Duffy: When the expected reality is that it’s still going to be around 1%. So there’s this view that nearly a quarter of the population in the US will be Muslim in a few years’ time. In France, it was 40% when people projected forward. So it’s not just a sense of current threat; there’s a sense that the Muslim population is growing at an enormously fast rate and that the threat is increasing, and that is driven by that emotional reaction much more than by a rational consideration of the facts.

21:51 Aaron Ross Powell: How much, too, does the fact that humans are bad at big numbers play into this? We’re pretty good at manipulating numbers on a small scale or doing some basic arithmetic, but if you ask us to imagine a big number, we have a hard time distinguishing it from an even bigger number. So if something’s going up, people figure that means it’s going to be huge, and name an excessively huge number.

22:18 Bobby Duffy: Yeah, absolutely. In the group of things that I look at about how our brains work, mathematical and statistical literacy is a key element. We don’t really think in precise ratio terms; we think much more ordinally, as in an order of things, where something is big or small. And there is this really fascinating effect, looked at by a US academic called David Landy using our data, from psychophysics. Psychophysics is the study of our psychological reaction to a physical phenomenon, and basically what it shows in this context is that we tend to overestimate small things and underestimate big things. So if I ask you to estimate the brightness of a light, you will overestimate a dim light and underestimate a bright light; the same with a heavy weight and a light weight. It’s kind of built into the mechanics of our brains. And what it boils down to for this type of effect is that when we’re uncertain about a figure, we hedge towards the middle.

23:38 Bobby Duffy: We kind of know it’s relatively small, but we don’t want to go stupidly small in our thinking, so we go a bit higher; we hedge, and in our case, because it’s usually out of 100% of the population, people hedge towards 50%. So there is also a very mechanical effect going on here, where we’re bad at numbers, as you say, bad at understanding the scale of things and switching between different sorts of scales, so we kind of hedge our bets a little. There’s definitely some of this which is not emotional, not driven by politics; it’s just the way our brains work. And the good thing about that is you can partial that effect out of the data, to look at what people are really wrong about: if you take those psychophysics effects into account and it still leaves people very wrong about certain social realities, they’re even more wrong than you would expect from the psychophysics models. But it’s a really useful thing to remind ourselves that we also have these mechanical challenges. It’s particularly prevalent when we’re talking about risk; people have a terrible understanding of risks, because we are drawn to the attention-grabbing risks rather than the ones that will actually cause us harm.
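
A minimal sketch of the kind of log-odds shrinkage model this psychophysics work describes: true proportions get pulled towards a 50% anchor, so small shares are overestimated and large shares underestimated. The shrinkage strength gamma and the 50% anchor here are illustrative assumptions, not Landy’s fitted parameters:

```python
import math

def logit(p):
    return math.log(p / (1 - p))

def sigmoid(x):
    return 1 / (1 + math.exp(-x))

def predicted_guess(true_share, gamma=0.6, anchor=0.5):
    """Shrink the true proportion towards the anchor in log-odds space.

    gamma < 1 compresses the scale: small proportions come out too high,
    large proportions too low, which is the hedging-to-the-middle effect.
    """
    return sigmoid(gamma * logit(true_share) + (1 - gamma) * logit(anchor))

for true in (0.01, 0.05, 0.25, 0.75, 0.95):
    print(f"true share {true:>4.0%} -> typical guess around {predicted_guess(true):.0%}")
```

With these made-up parameters, a true share of 1% comes out around 6% and a true share of 5% around 15%, purely from the mechanics of hedging, before any emotional bias is layered on top.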

24:58 Trevor Burrus: Have you found any partisan bias, whether partisans are all sort of worse at this? Because you go through immigration, you talk about Brexit and things like this, and I imagine some listeners looking at the title of your book, Why We’re Wrong About Nearly Everything, might say, “Well, it’s because they watch Fox News, with their bias,” or, “Oh, it’s ’cause they read the Daily Mail,” or they watch MSNBC or read the New York Times, if you’re in the kind of Trump camp. But did you find any sign that maybe one group might be a little bit more often wrong than another group?

25:35 Bobby Duffy: There are differences in the errors that people make. For example, one of the key questions that we asked in the US and other countries was about interpersonal violent deaths, so not suicides: “What’s the biggest killer of people, between guns, knives, and other types of physical violence?” And overall, in the US, it’s guns: guns are responsible for 68% of interpersonal violent deaths in the US. And the average guess in the US is not too bad; 59% correctly choose guns, well ahead of everything else. The most interesting aspect was the difference in views between supporters of different political parties. It goes from 83% of people who strongly identify as Democrats saying that it’s guns, down to 27% of people who strongly identify as Republicans saying it’s guns. So here we have the same social reality, seen entirely differently depending on your pre-existing political views. That’s one of the more extreme examples of a difference between media consumption, or political views, and misperception. We do see similar sorts of effects on immigration and deaths from terrorism.

27:20 Bobby Duffy: Where your media consumption and/or your political views will bias your views to some degree, though not as much as that guns example. The overview, though, is that there’s only really a handful of issues where partisan bias makes a big difference to how wrong people are. And that’s really important, I think, because it shows that different partisan views matter, and the confirmation bias that we were talking about earlier does have an effect, but they don’t explain everything; there’s only that handful of things. And that’s really important because there’s been a lot of other work in this area that’s talked about things like backfire effects, where if you tell someone they’re wrong about a factual point and give them the correct answer, that can actually make them more likely to hold on to the wrong answer, to try to argue against the correct answer, because they’ve got such a strong partisan bias that they just find ways to hold on to that view, and it reminds them of their identity. And I think more recent work on the backfire effect has shown that it barely appears at all; on hardly any issues do you see backfire effects where telling people the right answer makes them more likely to believe the wrong answer.

28:52 Bobby Duffy: And that’s really important, because it shows that people are not completely siloed by their partisan views, or by the media that they consume. It’s not nearly as bad as we thought for a while, when it seemed you couldn’t even use information to help inform people because it might have the opposite effect. And that’s a massively important thing to bear in mind, because it means that facts and information aren’t useless, or actually detrimental. In most cases, for most issues, they don’t have that negative effect. It’s still questionable whether information actually changes people’s minds; you can’t just tell people the correct information and expect them to change their world view. But at least it doesn’t have that negative effect, and that’s really important to hold on to, ’cause we’d got to a point where you were starting to think information is useless because people are just so wedded to their side of the argument that they won’t look at or think about anything else. And that doesn’t seem to be true. It’s a more hopeful view of how people can reach outside their own bubbles, their own echo chambers, on these things.

30:06 Aaron Ross Powell: How does that process work? We’re saying that this blowback effect is relatively minimal, or at least constrained to a handful of issues. But as you just said, if you give people the correct answer, it’s not necessarily clear that that has a huge impact on them updating their beliefs. How then do people update their beliefs? How do we set about correcting some of these issues? Is it simply a matter of reinforcement? Someone who identifies heavily with the Democratic Party and thinks that 83% of violent deaths are from guns, when the actual answer is 68%: if you tell them once they don’t update, but if they hear it a bunch of times they do? Or is it more about the source of the information: if it’s coming from someone who also identifies strongly on the left, so they’re not suspicious of the person, then maybe one time is enough? Are there features of updating that we could use to help correct these problems?

31:10 Bobby Duffy: Yeah, that’s again a great question. What you’re aiming for, really, I think, is skepticism, but not cynicism, and not utter credulity about everything everyone tells you. You don’t want people flip-flopping on their entire world view depending on the last thing they heard; equally, you don’t want people to dismiss everything they ever hear if it doesn’t fit with their previous view of the world. So you want people to have that skepticism, to be able to test new information, different information, and question it. And I think that takes time. I still find the old theories of cognitive dissonance really useful here. This was mostly 1960s America, great academics developing this approach to thinking about cognitive dissonance: when you hear something that doesn’t agree with your world view, it causes you some element of psychological pain.

32:19 Bobby Duffy: So your initial reaction is to kind of ignore it, denigrate it, and look for stuff that does confirm your world view. Eventually, though, the psychological pain of avoiding the new information, as opposed to updating your view, reaches a tipping point, where it’s actually better and easier, less painful, to update your view than to stick to your guns and say everyone else is wrong. There is a tipping point here, and how you get to it depends very much on the individual, the issue, and the approach, as you say. I am personally a fan of deliberative approaches to democracy, where you get people together, with experts and with each other, to present the evidence to them, allow them to discuss it between themselves, and try to come to a consensual view: the kind of town hall model that was developed in the US, and the deliberative polling models that James Fishkin and others advocate.

33:33 Bobby Duffy: We have a lot of growth in that in the UK right now, with various citizens’ assemblies, as they’re mostly called, where people are given a task to answer a question or come up with an approach to something; they’re given a few days and information, and they’re allowed to question that information and talk to each other. And what you find from those processes is that people don’t absolutely up-end their world view, but they do listen to others, they do take on new information, and they do update to some degree on particular issues. And that’s the point: if you get the environment right and the questions right and the approach right, you don’t have to expect people to come out of these things with a completely different world view, to completely change how they see themselves and their own identity. It’s much more about getting people to reasonably discuss these things, and people are much, much more amenable, capable, and interested in doing those types of things than the impression you get from all the arguments you see on Twitter or other social media.

34:49 Bobby Duffy: That’s not how people are in real life. They are much more capable of compromise and of listening to the other side in real life, in my experience. And I think holding onto that, and then looking at ways you can actually get to those more reasonable traits that people have, is a key aspect of how we are going to renew democracy and renew people’s faith in the system in the future.

35:17 Aaron Ross Powell: This, though, would seem to raise the issue of the overall costs of wrong beliefs. Lots of people have wrong, inaccurate beliefs about lots of issues, as you’ve demonstrated at length in the book, but the mechanism that you just described for correcting these false beliefs is quite a costly one. You say you put people in a room for a few days to talk about this stuff; that’s a few days that they’re not spending doing other things. We all could be immersing ourselves in learning the correct answers to the kind of data that you present in the book, but there’s an opportunity cost, there are other things that we could be doing with that time, and potentially for a lot of people it wouldn’t be a super valuable use of their time compared to the other things they could be doing. So is there an optimal level of wrong beliefs? How do we weigh that, in the sense of saying, yes, we could fix them, but it would mean taking a lot of effort away from a lot of other things that are valuable?

36:25 Bobby Duffy: Yeah, this is one of the core elements of the rational ignorance or political ignorance tradition that the US has explored in a number of different ways. An entirely different response, which is the logical conclusion of much of the rational ignorance literature, is that because of that opportunity cost, and because of the incentives to inform or not inform yourself, an alternative approach is to shrink the size of government and decentralize decisions hugely, bringing decision-making much more down to the individual level. Instead of having these centralized decisions, the more you can push towards individuals the better, because we invest more in researching which washing machine to buy than in which political candidate to back.

37:25 Bobby Duffy: That is a different, legitimate group, whose approach is, instead of asking how we can make these decisions better, to ask how we can reduce the number of decisions. It runs into stickiness and difficulties, because people still have preferences, and how do they express them? The idea that Ilya Somin and others develop is foot voting, where people move to the areas where they agree with the basic aspects of how that state is run, and again, that runs into the practical difficulties of how people live, and whether they are as free to move as that kind of system would require.

38:17 Bobby Duffy: If we’ve got a system which requires centralized decision-making, which we do in many different ways, the question, I suppose, is: do we come to better decisions as a result of those types of efforts than we would otherwise? Will they have more legitimacy with people, and are they likely to stick for longer? It’s incredibly difficult to measure the monetary or social value of those types of things. I think the potential is only going to increase, though, in terms of deliberative democracy. What’s been notable is how little we’ve managed to use technology for those deliberative democracy approaches to date, because it seems very old-fashioned to get everyone in a room for that length of time.

39:11 Bobby Duffy: And there are online methods being developed at a much lower cost in terms of investment and effort, much more in tune, I guess, with how people live today. Tools like vTaiwan, which Taiwan has developed into a more deliberative model for lots of different decisions, where people don’t have to commit the same sort of resources and time and can fit it in around their lifestyle. So I think what we’ll see is a blending of these different things over the next few years. I guess the value of it depends on how you view the current system and the future threats to it.

39:49 Bobby Duffy: If we see it as a proper existential threat to liberal democracy, that people are losing so much faith in the system that they are going to reject the responses from the system, then those kinds of threats actually warrant quite a bit of investment and quite a bit of new thinking. If we think we’re just going to muddle along and the implications are not going to be nearly as dire, then yes, you could ask whether it’s worthwhile and whether there are other things we should work on instead.

40:23 Trevor Burrus: On that point: it was hard for me to figure out, in reading your book, whether you’re optimistic or pessimistic overall. We’ve discussed sources of these false beliefs other than just the internet, but the internet and social media are definitely one. And we’ve also seen throughout the history of the internet, let’s say since the late ’90s, so not the really early history, that it became known that the internet is a source of misinformation and spam and fraud, Nigerian princes and things like this, and then you had the development of antibodies, so to speak, like Snopes, to come in and help correct that misinformation.

41:10 Trevor Burrus: And that’s a still-developing process: when we find some sort of, quote unquote, fake news site, there can be a method to flag it, one that is in demand, because people actually would like to know whether something is coming from a reputable source. And then we have people like Hans Rosling, who you write about in the book, people who do go viral telling us how things actually are. So it seems like we’re always developing new technologies and institutions to combat some of this misinformation, and maybe, since we’re still in the infancy of the internet, this could go more in a positive direction than a negative one.

41:51 Bobby Duffy: Yeah, that was definitely the vision; it was an optimistic vision to start with, where the natural assumption was that this would be a freer resource of information that people would have access to, and that it would improve both the accuracy of our world view and our decision-making. That optimism left out the human biases that would be interacting with it: negativity bias, rosy retrospection, confirmation bias, all of those types of things were not factored in sufficiently. But that doesn’t mean to say, as you suggest, that we’re stuck with this model of the internet, where misinformation and disinformation can spread incredibly quickly and the fact-checking process comes in later, by which point it has minimal effect.

42:49 Bobby Duffy: The fact-checking world is very much looking at second- and third-generation fact-checking, which is much more about how you build it into the system, because this is a system we have built in a relatively short period of time. You can imagine possibilities in the future where, instead of a relevance ranking coming up in your Google search, you have a veracity ranking showing how well checked the information is, and there is nothing to stop us starting to build those types of things in, in many ways.
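
Purely as a toy illustration of that idea, blending a veracity signal into a relevance ranking, here is a sketch. The scores, the weighting, and the field names are all hypothetical; no actual system is specified in the conversation:

```python
# Hypothetical search results with relevance and fact-checked veracity scores (0 to 1).
results = [
    {"url": "a.example", "relevance": 0.95, "veracity": 0.30},
    {"url": "b.example", "relevance": 0.80, "veracity": 0.90},
    {"url": "c.example", "relevance": 0.60, "veracity": 0.95},
]

def combined_score(result, veracity_weight=0.5):
    """Blend relevance with a veracity signal; the 50/50 weight is an arbitrary choice."""
    return (1 - veracity_weight) * result["relevance"] + veracity_weight * result["veracity"]

# Rank by the blended score instead of relevance alone.
for r in sorted(results, key=combined_score, reverse=True):
    print(r["url"], round(combined_score(r), 2))
```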

43:25 Bobby Duffy: There are things that would have to change, revenue models that would have to change; there’s a big stickiness in this, but it is something that we have created. I do think the danger with all kinds of technical control and regulation is that we’re always regulating or controlling the last thing rather than the future thing, and there will be new developments that make it more and more difficult to keep an accurate view of the world. We all need to start thinking more about the principles that we want to uphold in our information environment, rather than a response to a particular type of technology. That more principle-based thinking about what it is we’re trying to achieve is more difficult, but it’s going to stand us in better stead in the future.

44:16 Bobby Duffy: On the optimism-pessimism question, I am optimistic in the end, partly because we need to be. Our biases are so pulled towards being negative that playing into the sense that there’s nothing we can do about the information environment at the moment, nothing we can do about people’s misperceptions, nothing we can do about that sense that everything is going downhill, reinforces those who want to tell us that the system is broken and we need to rip it all up and start again.

44:20 Bobby Duffy: So we need to hold on to that sense of optimism, because we’re naturally pulled in the other direction, and if we get pulled in too pessimistic a direction, that leaves space for things to be ripped up that are actually doing a lot of good for people.

44:20 Aaron Ross Powell: Someone listening to this episode might be thinking, “Oh my God, what all am I wrong about?” Or they’ve just read your book, and let’s say they got basically every single one of the questions in it wrong, and they want to get better. And you’ve just told us, well, your brain is wired to misconstrue information, or to overestimate in certain directions; you’re enmeshed in a media environment and a technological environment where there’s lots of information coming from everywhere, and not only is a lot of it wrong, but many of the sources you go to for information, or the platforms you find information through, are incentivized to show you not the most accurate stuff. You’ve told us there’s a sense of optimism, but the overall picture is pretty grim. So, as an individual, what can I do in my own life, in the way that I approach things, to try to mitigate all of these forces? How can I go about consuming information and trying to understand my world in a way that’s going to at least bias me back in a less wrong direction?

46:27 Bobby Duffy: Yeah, so again, there are three or four tips. There are technical tips about how you get your information: mixing up your feed and looking across the divide for where you find information, effectively popping your own filter bubble as much as you can, and that’s not just online, that’s in day-to-day life as well. There’s that element of seeing the other side and not trying to avoid it, bearing in mind that confirmation bias is one of our strongest biases; just being aware of that is helpful. The three or four other things to bear in mind are much more about modes of thinking than technical tips or what app to download.

47:16 Bobby Duffy: I think it does start with remembering that things are not as bad as we think, quite literally: our brain is telling us that things are worse than they are, because that’s what it focuses on. But second, within that, accept the emotion. You’re going to have these emotional reactions to things, and you can’t really stop that. When Daniel Kahneman was asked whether he was getting better at this over time, he said, “No, absolutely not. I’ve been studying it for 45 years and I’m no better than I was.” But what he meant was that he couldn’t stop his system one, his fast thinking, from ruling his first, initial reaction; you can, with training in some circumstances and with focus, get your system two, your slower thinking, to kick in. So accept the emotion, but then challenge your thought.

48:13 Bobby Duffy: Then there’s the third point about cultivating skepticism but not cynicism. It is just as bad to be utterly dismissive of other points of view as it is to accept everything you hear, so aim for that more skeptical outlook, but not an utterly cynical one. And then two final ones. The presumption that everyone is like us, more like us than we think, that we are normal, often leads us astray: we generalize from our own experience to other people, and we think we and our friends are representative of the world. So one of the key messages is just to remember that we’re not as normal as we think.

48:58 Bobby Duffy: And on the other side of that, there’s an element of not being drawn to the extremes. So, we’re not the norm, but also be careful, when you see extreme stories, vivid emotional stories, how they change your view of reality, because they play a bigger part in our responses and how we see the world than they often warrant. We are drawn to the vivid and unusual, and then forget that they’re vivid because they are unusual; they are not the norm, and we shouldn’t generalize from them.

[music]

49:43 Aaron Ross Powell: Thank you for listening. If you enjoy Free Thoughts, you can find our Free Thoughts discussion group on Facebook or on Reddit at r/Freethoughtspodcast. You can follow us on Twitter @FreeThoughtspod. As always, please rate and subscribe to us on Apple Podcasts, Spotify, or wherever you get your podcasts. Free Thoughts is produced by Tess Terrible and Landry Ayres. To learn more, visit us on the web at www.libertarianism.org.