E196 -

Emily Ekins has identified five different types of voters that elected Donald Trump as president. Do these groups represent a shift in American politics?

Hosts
Trevor Burrus
Research Fellow, Constitutional Studies
Aaron Ross Powell
Director and Editor
Guests

Emily Ekins is a research fellow and director of polling at the Cato Institute. Her research focuses on public opinion, American politics, political psychology, and social movements. She leads the Cato Institute project on public opinion, in which she designs and conducts national public opinion surveys and experiments.

Emily Ekins has identified five different types of voters that elected Donald J. Trump the 45th President of the United States. Do these groups represent a big shift in American politics? In this episode we also discuss polling methodology and analysis. How reliable are public opinion polls and voter surveys?

Show Notes and Further Reading

Here is the Democracy Fund Voter Study Group Ekins participated in.

And here is her report on “The Five Types of Trump Voters.”

Ekins also mentions FiveThirtyEight’s Pollster Ratings project.

Transcript

Trevor Burrus: Welcome to Free Thoughts. I’m Trevor Burrus.

Aaron Powell: And I’m Aaron Powell.

Trevor Burrus: Joining us today is our colleague Emily Ekins, a research fellow and director of polling at the Cato Institute. Welcome to Free Thoughts, Emily.

Emily Ekins: Thank you for having me.

Trevor Burrus: You’ve been studying public opinion for quite a while. Actually, spoiler: Emily and I were actually interns together, and she was doing it then in 2010. At that time you were studying the Tea Party [00:00:30] and doing a lot of polling about the attitudes that go into, I guess especially, Republicans’ political philosophy when you were doing Tea Party work. Then we had the Trump phenomenon, so I guess you’re a really good person to ask the question: Where did Trump voters come from?

Emily Ekins: That’s a big question.

Aaron Powell: Yeah.

Emily Ekins: It’s the question that everyone keeps trying to answer. You want me to give it a try?

Aaron Powell: Yeah, you give it a try.

Emily Ekins: Well, so the first thing I will say … I think that people have been a little too quick to [00:01:00] try to look for a simple explanation, like what’s the one thing that explains why people voted for Donald Trump. Since he’s such an unusual candidate who has said so many things that have offended people, people think: How could he have won? I’ve just recently conducted a new study with the Democracy Fund Voter Study Group. This was actually put together by the Democracy Fund. They brought together an ideologically diverse group of academics and pollsters to field an original survey right after the election [00:01:30] and do some really in-depth analysis of Hillary Clinton and Donald Trump voters and try to understand the dynamics.

I contributed one of four reports that were released this past week on the 2016 election. What I found is five different types of Trump voters came out to vote for him on election day, and I think that that’s really important because people keep looking for this single explanation to explain this surprise, and [00:02:00] I think the answer is there is no simple explanation. There are certain things that make this election distinctive, and we can talk about that, but at the end of the day Trump’s voters are a typical coalition, which is how it always is when it comes to politics.

Aaron Powell: Who are those five? What are the five kinds?

Emily Ekins: All right, so the five kinds. The first one I call the American preservationists, and I think these most closely align with the media accounts of a Trump voter. They have lower levels of education [00:02:30] and income. They’re underemployed. Among the working-age members of this group, half are on Medicaid, which is quite a lot as you can imagine. You wouldn’t really think of them as Republicans. They want to raise taxes on the wealthy, they’re very concerned about Medicare, so they’re more economically progressive.

What made them vote for Donald Trump? Well, we can’t be sure but what I can tell you is that they are very, very skeptical of immigration. Not just illegal immigration but [00:03:00] legal immigration as well. About 8 in 10 want to make it much harder for people to legally immigrate and as you both know very well, the system is already very difficult to navigate. They want to make it even harder. They seem to have really been drawn to Trump on some of that. They also have less favorable attitudes towards racial minorities. That’s another kind of … You’ve seen that caricature in the media. That’s one of five.

Aaron Powell: [00:03:30] Can I ask about that one really quick? Is there a reason why they’re anti-immigrant? Are they very skeptical of it because they think immigrants take jobs? Are they very skeptical of it because they think immigrants cause crime or because immigrants change American culture?

Emily Ekins: I’m so glad you asked that question. By doing this analysis we could kind of see that it appeared that different motivations drive people to be concerned about immigration. Some of the different reasons could be security concerns, [00:04:00] concerns about fairness. People talk about how it’s unfair that some people get to come in illegally when other people come in legally. Others are concerned about assimilation. Then there’s also those who are kind of just flat-out ethno-nationalists that don’t want people who aren’t white coming into the country. Where do all these groups fall? It’s always hard to tell what motivates a person, right? They usually won’t tell you. For this group I can tell you a few things that give us some clues. About half [00:04:30] of them thought that you need to be of European descent to be really American, to be truly American.

Trevor Burrus: That sounds pretty ethno-nationalist.

Emily Ekins: Yes, it does. Now again, there’s the other half that didn’t feel this way. But still, I mean, that was very surprising and shocking. None of the other four groups came close to this. This group is the most likely to think of their own identity, which was mostly white, as being very important to them. Most people don’t go around thinking about their race, [00:05:00] but this group does. They also feel like … It’s something called linked fate in the academic literature: some people believe that what happens to their racial group will impact them, so they’re more likely to feel that way. They’re more likely to think of whites as their group, and so what happens to, quote, “other white people” will affect them. That’s I think the media caricature that we definitely saw going on.

When Hillary Clinton talked about “the deplorables,” I think that those individuals were more likely [00:05:30] to be found in this particular group. They’re the most likely to think that you have to be Christian to be really American, to have lived here almost all of your life, or actually have been born here. That makes immigration difficult. If you think people have to conform to a really kind of narrow set of characteristics to truly be a member of society, especially things that are immutable and things that people can’t easily change, that makes it very hard to become accustomed to immigration. [00:06:00] But other Trump voters were very, very different on these very same questions.

Trevor Burrus: So the American preservationist was what percentage of the total?

Emily Ekins: About 20%.

Trevor Burrus: Then the next category would be?

Emily Ekins: Well, how about I give you a contrast? I’ll show you the group that was the most dramatically different from the American preservationists. I call them the free marketeers. They make up a slightly larger chunk, 25%. They have the highest level of education and income. They [00:06:30] are very favorable towards immigrants and racial minorities. They look just like Democrats on those questions. They are like Democrats in terms of wanting to make it easier to legally immigrate to the United States, but they’re also very fiscally conservative.

They don’t think government should be so involved in healthcare, they don’t want to raise taxes on the wealthy, and they’re very supportive of free trade. Basically all the things I told you about the preservationists, the free marketeers kind of in many cases [00:07:00] had opposite answers. Not on every single one, but had opposite answers. You think, “How on earth were these individuals all voting in the same party?” It’s not that uncommon. I mean, parties are coalitions of very different types of people. They just don’t seem to realize sometimes how different they are. What they do have in common is that they both really hated Hillary Clinton. Really.

Trevor Burrus: Did you have a question on your survey that was like with maybe faces, kind of like the pain scale … It’s like, “How do you feel about Hillary Clinton?” Like smiley face … [00:07:30] Or something to try to measure the hatred of Hillary Clinton?

Emily Ekins: Well, we used something called a feeling thermometer where we asked people-

Trevor Burrus: See, that’s what I told you. It’s like the pain scale.

Emily Ekins: Well, okay. You rate someone on a scale of 0 to 100, 0 being very cold and unfavorable, 100 being very warm. I mean, people did not like Hillary Clinton. What’s interesting though is that for some of these voters, they liked her in 2012. They turned against her in 2016, which shows that [00:08:00] all of the coverage, all the negative media coverage, her emails, her servers, the charges of corruption and all of that, did seem to have an impact on these voters.

Trevor Burrus: When you said some of these voters, I mean, are we talking about specific people? Did you have data from the same person and what they did in 2012 or were you just taking groups?

Emily Ekins: You’re absolutely right. I’m glad you mentioned that. We do have data on these same individuals from 2012 and that’s what makes this data set so exciting. We did field this survey [00:08:30] in 2016, but what we constructed is something called a panel survey or a longitudinal survey. We asked people to participate in the survey that had also participated in a survey in 2012. Then we asked a lot of the same questions too so that we could see how their attitudes change, which is how we could see that some groups changed on trade and others didn’t. We can go back and see: Back in 2012, how did you feel about Hillary Clinton? I’m not asking you to remember how you felt, [00:09:00] but I’m actually looking at what you said.

So it’s far more believable and credible. One of the groups in particular that we haven’t talked about yet, I call them the anti-elites, they make up about 19% of the coalition, half of them had a favorable opinion of Hillary Clinton in 2012. On economics they lean progressive. They’re pretty friendly and moderate on immigration, maybe not quite as liberal as Hillary Clinton on immigration, but you think, “Why did they not vote for her?” I mean, something happened [00:09:30] that really turned these voters against Hillary Clinton. We can all just guess what we think it was, but obviously all that negative media attention made a difference.
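(To make the panel mechanics concrete: a minimal sketch, in Python, of joining two survey waves on a respondent ID so the same person’s 2012 and 2016 answers can be compared directly. The file and column names here are invented for illustration; this is not the Voter Study Group’s actual data pipeline.)

```python
# Hypothetical sketch of a panel (longitudinal) survey design: join two
# survey waves on a respondent ID, then compare the same person's
# answers across years. File and column names are invented.
import pandas as pd

wave_2012 = pd.read_csv("wave_2012.csv")  # e.g., columns: resp_id, clinton_therm, ...
wave_2016 = pd.read_csv("wave_2016.csv")  # e.g., columns: resp_id, clinton_therm, vote_2016, ...

panel = wave_2012.merge(wave_2016, on="resp_id", suffixes=("_2012", "_2016"))

# Which 2016 Trump voters rated Clinton warmly (above 50) four years earlier?
switchers = panel[(panel["vote_2016"] == "Trump") & (panel["clinton_therm_2012"] > 50)]
print(f"{len(switchers)} Trump voters were favorable toward Clinton in 2012")
```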

Aaron Powell: The thing that strikes me about … The first group, the American preservationists. Was that what they were called?

Emily Ekins: Yes.

Aaron Powell: The line from them to Trump seems somewhat clear to me. The things that they want are, to my mind, abhorrent, but Trump wanted those abhorrent things too and so it makes sense for them to vote for him. [00:10:00] The second group, the free marketeers: were these people just terrifically naïve? How do you get from having that set of beliefs to thinking Trump is your guy, who has campaigned against all of those beliefs?

Emily Ekins: Well, a couple of things. They’re loyal Republicans, and a majority said that their vote wasn’t a vote for Donald Trump. Their vote was a vote against Hillary Clinton. If you really, really despise Hillary Clinton, then it’s just [00:10:30] who’s the other guy and you vote for them.

Aaron Powell: Do these same groups show up during the primaries?

Emily Ekins: Yeah. I mean, they voted for different types of people in the primaries. As you can imagine, the preservationists, you’re absolutely right. They are the core set of Trump supporters. They are the ones that catapulted him through the primaries. The free marketeers, a majority of them voted for one of the other 16 candidates, primarily Ted Cruz and Marco Rubio, if they voted in the primaries. The anti-elites, [00:11:00] a lot of them also did vote for Donald Trump, but if they didn’t vote for him they voted for John Kasich. You can kind of see there are different flavors of Republican. The preservationists, Donald Trump was truly their flavor.

Aaron Powell: Okay. With the anti-elites, so they disliked Hillary Clinton a lot or did they dislike elites across the board?

Emily Ekins: It’s hard to say. We could definitely see that they dislike Hillary Clinton a lot. We can also see that they don’t like elites. Then if you look at their immigration attitudes, a plurality [00:11:30] of them supported a pathway to citizenship for unauthorized immigrants. They’re not like super hardline on immigration, but compared to where the Democratic Party platform was they weren’t quite there. They were a little bit less comfortable with immigration, and in particular it seemed like that might’ve been related to the temporary travel ban on Muslim immigration. That was one thing that made these Trump voters kind of stand out from the non-Trump voters. [00:12:00] Majorities of Trump voters supported the idea of a temporary travel ban. That being said, the intensity of the support was very different. Those preservationists, like 80% of them support this kind of policy, strongly support this kind of policy.

For the anti-elites and the free marketeers, a majority of them supported the policy but only 1 in 10 strongly supported it. What you can kind of get a sense of is that they don’t want to support this kind [00:12:30] of policy, but they’re frightened. They see things in the media, they see things that are happening in Europe, and it scares them. Although we can have our colleague Alex Nowrasteh explain the statistical probability that they would be harmed, that’s not how humans usually think. People are strongly influenced by the stories they see in the media. I think the fact that Trump was rising in the polls right after the Paris attacks, after the Orlando shooting and what happened in San Bernardino, [00:13:00] all of those things scared people. Trump responded. He responded in a way that Hillary Clinton did not, and that may have helped him among some of these groups who otherwise would’ve been a little bit more reluctant to vote for him.

Trevor Burrus: So we have American preservationists, free marketeers, anti-elites. Fourth is …

Emily Ekins: The fourth actually are the largest group, but they aren’t quite as distinctive. I call them the staunch conservatives. They make up 31%. They’re just conventional social and fiscal conservatives. They’re loyal [00:13:30] Republicans. They are going to vote for the Republican candidate. They’re not as hardline on immigration as the preservationists are, but yes, they are skeptical of it. It seems like they might be skeptical of it for slightly different reasons than the preservationists. They weren’t like the preservationists in saying that you had to be white to be American, but they seemed to be a little bit more concerned about assimilation and ensuring that the community has a sense of cohesion and belongingness.

[00:14:00] For individuals like that, immigration can pose some challenges. It can be hard when you have different groups of people with different traditions, different languages kind of coming together. If you really like people to be cohesive, that can be challenging. I think that’s what we saw with this group. They’re very fiscally conservative. They look a lot like the free marketeers on all of the role-of-government-in-the-economy issues, but they’re kind of more in between the free marketeers and the preservationists [00:14:30] on some of the immigration issues.

Trevor Burrus: I’m sure that they would not be caught dead voting for Hillary Clinton.

Emily Ekins: Right.

Trevor Burrus: That’s another thing too: the Clintons have been the absolute devil of the Republican Party for 20 years now, so maybe they’re like, “I don’t like Trump but I will never, ever vote for Hillary Clinton.” That’s a very common opinion.

Emily Ekins: You’re absolutely right. If you look at what they said in 2012 it was like 4% said they were favorable, and probably those were mistakes. [00:15:00] Those are people not really paying attention to the survey.

Aaron Powell: [inaudible 00:15:03] pregnant [inaudible 00:15:04], things like that. Yeah.

Emily Ekins: Yes.

Aaron Powell: What’s the fifth?

Emily Ekins: The fifth, this is a small group. They’re only 5%. I call them the disengaged. Really these are the types of people when they take a survey they just say don’t know, don’t know, don’t know. They don’t really have a lot of opinions except for issues of immigration and elites. It’s like if you say I don’t know about every public policy question I ask you except for issues of immigration and distrust of elites, that tells [00:15:30] me something about who you are and why you voted for Trump. That fits with the Trump rhetoric. They tended to be a little bit younger, have a little bit less education. They don’t pay attention to politics but they do have a skepticism about immigration. Donald Trump made it very clear where he stood on a lot of those issues and so it got their attention.

Aaron Powell: We’ve got these five groups making up the Trump coalition. Why weren’t we talking about this more nuanced view [00:16:00] up until now? Why during the campaign and then immediately after the election was all of the conversation about the first group?

Emily Ekins: Well, I mean, think how long it took us to just go through all five of those groups. By now people have changed the channel, they’ve stopped reading the op-ed. It’s easier to have just a simple explanation. I actually have a document where I’ve been cataloging all the different theories that come out about why people voted for Trump. Collective narcissism was one. Racism, nativism, [00:16:30] populism, class anxieties-

Trevor Burrus: Rust Belt woes.

Emily Ekins: Rust Belt woes. Exactly. Because that’s easier for people to remember. I mean, reality is far more complicated than that.

Aaron Powell: It’s all of those.

Emily Ekins: It’s all of those, and for different people. That’s another thing: people think, “Oh, well, sure, okay, it’s not all racism. It’s economic anxiety at the same time.” But then they kind of think that that’s true for all of the voters. They were all a little bit racist or maybe a lot [00:17:00] racist, and then a little bit concerned about the economy, when perhaps it was that some people have racial animus towards people of color and others do not. Some are concerned about the economy while others are not. That’s I think the piece that was missing. Why do we care? I mean, that’s another question. I think there’s a couple of reasons why we care, but I think it’s important to understand how diverse this coalition is if we are to understand the future of American politics.

Trevor Burrus: Well, it seems that this [00:17:30] sort of goes into a lot of the work you do in general and things that you spend a lot of time thinking about, which is public opinion and why people hold the views they do. If you’re sitting on one side … If you’re a Democrat, most Republicans look the same to you, and it’s really easy to tell yourself a story that they’re just racist xenophobes. Republicans tell a similar story about Democrats. They say Democrats are all just whatever, socialists, slow socialists, as I think [inaudible 00:18:00] [00:18:00] said on our last episode. You definitely typecast the other side and lose the nuance, and that gets into the bias that is often in polling, I think, and the way people think about politics. They’re not really thinking in a nuanced way. There are a lot of reasons to hold opinions beyond just the other side being stupid and dumb and evil.

Emily Ekins: You’re absolutely right. I guess to that point, yeah, it can be used as a weapon. It’s a lot easier to try to delegitimize your political opponents if you try to boil it down to [00:18:30] a straw man that’s very easy to knock down. I mean, I’m not defending either side here. I’m just reporting the data as it is, but it seems like that is probably part of the reason why people grasp onto those kinds of single theories.

Trevor Burrus: But they did it with the Tea Party too, which you have done work on, as I mentioned at the beginning, trying to say, “Well, the Tea Party are just racists.” I’m thinking specifically about a book review you did for the Cato Journal discussing one polling set that tried to figure out the Tea [00:19:00] Party, where they basically just concluded they’re all like KKK members. I mean, I’m overstating this a little bit, but there was no nuance to their analysis, and you see this a lot in polling.

Emily Ekins: Yeah.

Trevor Burrus: That they just kind of put their own biases in there and say … You could obviously tell these authors really didn’t like Republicans, and they did some polling and: “Wow, we were right. They are KKK members. What a surprise.”

Emily Ekins: I think that a lot of the work that was used to describe the Tea Party would be more accurately applied to certain segments of Trump voters, [00:19:30] particularly those preservationists that we were discussing. Like you said, I wrote my dissertation on the Tea Party movement and I did something similar with them to what I did with these Trump voters. I did a cluster analysis. The statistical tool is called a latent class analysis, and you basically allow a statistical algorithm to try to find these natural groupings of people. It’s less dependent on your own judgment.

Your judgment can impact it in some respects, like what questions you even put into the statistical algorithm, but what it spits out [00:20:00] you aren’t really controlling. When I did that I found several groups within the Tea Party. The Tea Party, the central thrust of it really was more about limiting government’s role in the economy and far, far less about immigration, changing demography, racial issues. It was far more about the economy, spending, and deficits. With Trump I would say … Where is the center of gravity? The center of gravity [00:20:30] with Trump is far more in the area of immigration and concern about demographic change.
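(A rough stand-in for the clustering idea described here, as a minimal Python sketch. True latent class analysis fits a mixture model over categorical survey responses; k-means from scikit-learn is used below only to illustrate letting the algorithm, rather than the researcher, find the groupings. The data file and column names are invented.)

```python
# Rough stand-in for latent class analysis using k-means clustering.
# True LCA models categorical responses with a mixture model; the
# "let the algorithm find the groups" idea is the same. File and
# column names are invented for illustration.
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

responses = pd.read_csv("trump_voters.csv")  # hypothetical survey file
items = ["immigration_attitude", "raise_taxes_wealthy", "free_trade",
         "clinton_thermometer", "elite_distrust"]

# Standardize so no single item dominates the distance metric.
X = StandardScaler().fit_transform(responses[items])

# Ask for five clusters; in practice you compare model fit across several k.
model = KMeans(n_clusters=5, n_init=10, random_state=0).fit(X)
responses["cluster"] = model.labels_

# Profile each cluster by its average answers on each item.
print(responses.groupby("cluster")[items].mean())
```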

Aaron Powell: Do any of these five groups or the coalition of them represent something new, like a big shift in American politics, or are these kind of groups that have always been there and they just happened to coalesce around Trump?

Emily Ekins: There’s two things. I think what surprises people is to see some of these groups like the preservationists or the anti-elites that hold views [00:21:00] that seem very out of step with certain Republican orthodoxy … Even though Republicans may not actually cut spending, they talk about it more so than the Democrats, right? The preservationists, that’s not even their rhetoric. They’re not speaking the language of tax cuts. Many of them actually used to be Democrats. About a third of them four years ago said that they identified as a Democrat. It’s surprising that you would have individuals in the party that are so different from the Republican Party’s kind of stereotypical platform. [00:21:30] I think that surprises people. I don’t think it’s unusual.

Aaron Powell: One of the narratives of the election is that it wasn’t so much that Trump won but that the Democrats lost.

Emily Ekins: Yes.

Aaron Powell: The election. Is it possible, using the statistical methods, [inaudible 00:21:44] kind of control for Hillary hatred, such that we can answer the question of: Had the Democrats run anyone but Hillary, would they have won?

Trevor Burrus: This would be like one of Alex’s synthetic controls, Alex Nowrasteh, when he tries to imagine a city if immigrants didn’t come. [00:22:00] Can we run something where some very just stereotypical Democrats run? This would be like … In baseball there’s a thing called wins above replacement where you postulate the average baseball player, and then you figure out how much better or worse some player is. We should be able to do that in politics.

Emily Ekins: I’m not sure how you would do this. Statistically, counterfactuals are always very difficult to try to prove. I mean, if you look at 2012, Barack Obama won [00:22:30] a lot of the preservationists and the anti-elites, which really surprised people because people say, “Well, if the preservationists have so much animus towards racial minorities, why did they vote for the first black president?” People’s attitudes are far more complicated than people realize, and he had an economic message that resonated with them. Hillary Clinton did not emphasize those issues the way Obama did. She seemed to focus more on kind of identity politics. I would argue Obama [00:23:00] didn’t really do that during the 2008 and 2012 campaigns. As a consequence I think he won over these types of voters. What would’ve happened had it not been Hillary?

Well, it depends on who the other guy was or other woman was. If they had a message more like Barack Obama, perhaps they would’ve won because … I think most people were surprised that Donald Trump won. Political scientists have these models where they’re able to predict the outcome of an election based on just economic [00:23:30] indicators alone. That had predicted a Republican win regardless. I thought that there was some limit to the efficacy of these models but apparently they’re pretty strong. That would’ve predicted that any Republican would’ve won, regardless of if it was Donald Trump or Ted Cruz or John Kasich. Perhaps if the Democrat had a message like Obama that was more unifying and had kind of this economic element to it, they could’ve captured a lot of the preservationists and the anti-elites.

Trevor Burrus: [00:24:00] Let’s talk a little bit about polling itself because I think a lot of our listeners … Everyone knows that these polls happen. Especially in an election year they come out every week. There’s the Pew and there’s the Rasmussen. There’s all these different names. How is polling generally conducted? I mean, I know there’s multiple ways, but what’s the general process if you were putting this together? To call people or get them to come over or fill out a survey, and then how do you kind of work with the data after that?

Emily Ekins: Well, there’s several different ways [00:24:30] to contact people. I mean, in the olden days people would identify people based on addresses. They would figure out what a representative sample would look like and they would fly interviewers to the cities and they would literally walk to the door and knock on the door and sit down with a family, or a person depending on what kind of survey it is, and conduct the survey.

Aaron Powell: Sounds expensive.

Emily Ekins: Very expensive. As companies and government decided they didn’t want to spend so much money, and technology [00:25:00] was evolving and more and more people were getting access to telephones in their homes … We’re talking about a long time ago. People started to transition, and a lot of people pushed back. They said, “Look, not everyone has a telephone in their house. You’re not getting a representative sample.” They said, “Look, I mean, more and more people are getting telephones in their homes. This is prohibitively expensive. I think we can do a good enough job.” Then they switched to the telephone interview. So they’ll have a list of questions and they’ll have people in a call center, and they have machines that will [00:25:30] call people and they create these representative samples beforehand. Then someone will call a person and ask them if they would participate in a survey, and they’ll ask them the questions.

Now that’s becoming prohibitively expensive for a couple of reasons. One, more and more people are not using landlines anymore. They’re using cellphones. There is a government regulation that says it’s illegal for a machine to call a cellphone. You have to have a human being actually dial [00:26:00] out the number. That’s very expensive because you have to call 100,000 people or something like that for these surveys, so to have someone dial 100,000 different numbers is just insane. People are doing that. They have big call centers that will do half landline, half cellphone, but this is really giving an incentive to look for new survey methods. In addition to that, people are taking surveys less and less on the phone. Even if you do get them on their cellphone they’re just like, “I’m too busy, I’m in the middle of something,” [00:26:30] and they don’t take the survey. Now with technology with the Internet, people are starting to switch more and more to surveying people using the Internet.

Now what I mean by this … This isn’t like where nbc.com sets up a poll and says, “Who do you think won the debate?” and everyone voted for Ron Paul. Like 90%, right? That is not what I’m talking about. I’m talking about firms like YouGov or Knowledge Networks, Ipsos, and what they do is they just create these huge [00:27:00] panels of people’s emails. They will contact you. They will determine if you should be sampled, and they will contact you and ask you to participate in a survey. You get a unique link. You click the link and then you can take the survey online. What’s really great about this way of surveying is that people don’t have to share their opinions over the phone with a stranger, so they’re more honest with you. You can imagine the impact that has on issues today: [00:27:30] immigration, Donald Trump, Brexit in the UK. Whether or not people feel comfortable telling you their true feelings, it’s probably going to be better ascertained using an Internet survey than over the phone.

Aaron Powell: Is there a skewing in kind of the kinds of people who respond to Internet surveys? It seems to be like you could imagine there are certain demographics, certain kinds of people who are more likely to answer a survey that shows up in their email box than others.

Emily Ekins: Look, in [00:28:00] any kind of survey method there’s always a problem with non-response bias and coverage issues, if there are certain types of people that would just never even have a chance to be included, or certain types of people that even if they had a chance to be included would always say no. That’s always been a problem with surveys. What I would suggest is when it comes time for elections, you can actually look at what the survey predicted the results would be and compare that to the election outcome and see how good of a job they do. [00:28:30] You could also compare these surveys to large-scale census data collection activities and see how good of a job they do there. Now, people are pushing back because they think the polls in the election were so bad. They actually weren’t that bad. The election result was in the margin of error, and Hillary Clinton did get more votes than Trump.
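(For reference, the textbook 95% margin of error for a simple random sample proportion is MOE = 1.96 × √(p(1 − p)/n). A short sketch with a hypothetical poll; published polls usually report slightly larger margins to account for weighting and design effects.)

```python
# Textbook 95% margin of error for a simple random sample proportion:
# MOE = 1.96 * sqrt(p * (1 - p) / n). Real polls adjust for weighting
# ("design effects"), so published margins are usually a bit larger.
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    return z * math.sqrt(p * (1 - p) / n)

# A hypothetical poll of 1,000 respondents showing a candidate at 48%:
moe = margin_of_error(0.48, 1000)
print(f"48% +/- {moe:.1%}")  # roughly +/- 3.1 points
```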

Trevor Burrus: Is that [00:29:00] true for Brexit too, and the British general election, where all the polls just seemed to be inaccurate?

Emily Ekins: It seems like a lot of the polls got it wrong in the UK, although if I’m remembering correctly, some of the online survey firms like YouGov did a pretty good job predicting what was going on. People think that in part that might be because it’s online. If you are afraid to tell someone because it’s not, quote, “politically correct” to say you support Brexit, you’ll say so on an online survey but you won’t tell them over the phone.

Aaron Powell: [00:29:30] When you were doing your survey research, when you’re conducting a survey, do you partner with one of these firms? I assume you’re not setting up a machine that’s calling people from your office at Cato.

Emily Ekins: Correct.

Trevor Burrus: They’re just dialing the phone all day.

Aaron Powell: So you write up a list of questions and then pay a firm to conduct this thing?

Emily Ekins: That’s exactly right.

Trevor Burrus: When you write the questions … That gets to my question, which is about biasing a question. [00:30:00] You and I talk a lot about writing these questions and how they can be biased. When I was asking about finding racial bias in the Tea Party and stuff, there are ways you can ask things that really can force … It’s sort of like a force in magic, where you can kind of force someone to take a card. It seems like there are ways you can ask things where you can look at the questions and say, “These questions are really, really bad.” Do you see that a lot, in terms of how people use these questions to bias their results?

Emily Ekins: [00:30:30] From the reputable firms, not too much. We do everything we can at Cato to make sure that our survey questions are unbiased and straightforward and that we’re doing our best to measure what people actually think, but there are some limitations to how you ask the question that can create some of these problems even if you don’t want to insert any bias at all. One of those is if you ask a question without any costs. That’s what I think most of the reputable firms often find themselves doing. Part of it [00:31:00] is that it’s hard to insert all the possible costs.

If we pass this repeal and replacement bill of the healthcare law, it could have … You could have 100 different consequences, right? Are we going to poll about all of them? What we often see is polling about benefits, as though policies are benefits only. So on healthcare we saw things like: Would you favor or oppose a law that would allow children, of course they call them children, [00:31:30] to stay on their parents’ health insurance policies until they’re 26? Even though most people would call a 25 or 26-​year-​old an adult. Worded that way, worded that way-

Trevor Burrus: Aaron just said under his breath, “I wouldn’t.” I wanted to point that out for everyone. If you follow Aaron on Facebook, he really loves the Millennials. Anyway, sorry, continue.

Emily Ekins: Technically you are an adult by that age.

Aaron Powell: True. True.

Emily Ekins: But again, wording aside here. These questions will find like 75% of the population say yes. [00:32:00] Because why not? Now, what we did in one of our surveys is we asked that same question the same way that everyone else does and found the same result, because we’re not trying to manufacture results. We found that. But then we asked a follow-up question, and that’s where I think we’re really adding some significant value, by adding these follow-up questions so that we can show the nuance. This time what we did is we inserted real costs that come from academic studies. A new study coming out of Stanford I believe has found that this policy …

[00:32:30] It’s called the Dependent Coverage Mandate, where children are allowed to stay on these plans until they’re 26. These economists found that this policy costs workers on average $1,200 a year, and this is whether or not you have a dependent child. You could be 50 years old with no children living at home, and you would be losing $1,200 a year. It’s not just one time. It would be every single year if you have employer-provided insurance, which [00:33:00] is many, many people. This is kind of the median voter, right? We inserted that into the question. This is the follow-up question: Would you favor or oppose allowing these young adults to stay on their parents’ plans until they’re 26 if it costs you $1,200 a year? Want to guess what happened?

Trevor Burrus: I bet it changed. I’ll go out on a limb and say that one.

Emily Ekins: It flipped. It flipped. Strong majorities oppose the policy [00:33:30] now when they learn that it would cost them $1,200 a year, which is what many of our colleagues are constantly saying: there are all these unintended consequences. Obviously no one wanted to charge these people this much money. Maybe some did, but a lot of them didn’t realize they were going to do it, right? They thought it was a free benefit. With polling I think that’s where a lot of the problems come from, where we say, “Do you want to increase or decrease spending on education, on healthcare, on veterans, on roads?” With no cost. [00:34:00] I mean, are we going to cut spending somewhere else? Are we going to raise taxes somewhere else, and on whom and by how much? Without those questions, without those costs included in the question, what they essentially ask you is: Do you like education? Do you like children? Do you like veterans? And 75% of people say yes.

Trevor Burrus: 75. Wow, that’s-

Emily Ekins: Oh, well, it depends.

Trevor Burrus: I know. I guess 75% of people probably like children, maybe veterans.

Emily Ekins: It depends on the question.

Trevor Burrus: [00:34:30] Do you like fun? Yeah.

Emily Ekins: Yeah, I like fun. Then a lot of our friends on the progressive side, the economically progressive, will say, “Look. Americans really are in agreement with us. They want to raise spending on all of these programs.” My response is, “Because you inserted no cost into the question.” All you’re asking them is if they like the outcome. What I think we’re doing here is providing very needed nuance to these types of policy questions. Yes, people would [00:35:00] love a free benefit, but they do not implicitly associate a cost with that benefit. When you provide that for them, we find out that Americans make trade-offs in a much different way. They don’t want to raise their taxes. They don’t want to cut spending on these other areas to make room for this new program.
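(A minimal sketch of tabulating the kind of split-framing experiment described here: compare support for the same policy with and without a stated cost. The responses below are invented placeholders, not Cato’s actual survey data.)

```python
# Compare support for the same policy under two question framings:
# one with no cost mentioned, one with a $1,200-per-year cost attached.
# Responses are invented placeholders.
import pandas as pd

df = pd.DataFrame({
    "framing": ["no_cost"] * 5 + ["cost_1200"] * 5,
    "favor":   [1, 1, 1, 1, 0,   0, 0, 1, 0, 0],  # 1 = favor, 0 = oppose
})

# Share favoring the policy under each framing.
print(df.groupby("framing")["favor"].mean())
```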

Aaron Powell: If this nuance is as easy as adding a follow-up question that just mentions even a cost, why aren’t we already being provided that nuance in our polling?

Emily Ekins: [00:35:30] Well, I’m doing it.

Aaron Powell: Well, you’re doing it, but why isn’t it more widespread?

Trevor Burrus: I want to interject that Emily and I had had this conversation before, where you kind of were like, “This is so easy and it is amazing that people don’t do this.” You kind of mentioned that. Anyway, on to Aaron’s question.

Emily Ekins: Well, to be fair, some pollsters do do this occasionally. Occasionally. But not all the time, and I think the argument that they would give, and it’s a fair argument, is to say, “Well, we didn’t know what the costs [00:36:00] would be before we passed it. We had to pass it to find out what was in it.” Or alternatively there are a gazillion costs and a gazillion benefits. How are we supposed to accurately put those all into one question and ask someone to pick between the two? I think that that’s a fair point, but what I would say is: if we really want to know how people think about this issue, let’s ask a variety of questions, let’s ask about several different benefits, several different costs, and we can get a sense of [00:36:30] where that median voter is, rather than go around and say, “75% of Americans support X. There is a clear mandate for the policy that I love.” Let’s have a bit more humble approach to public opinion.

Trevor Burrus: For people who encounter these polls all the time, and they are used increasingly for policy purposes … I don’t know if it’s increasingly, but it’s not just during an election season. [00:37:00] You see politicians using them to push policies. In one of my areas, Second Amendment and firearms policy, we have the claim that 90% of Americans support common-sense gun control rules, which is just a magnificently frustrating, empty statistic that has the same problems you outlined there. For intelligent laymen who want to look at a poll and try to figure out if they’re being manipulated or lied to, is there a way you suggest [00:37:30] for them to look behind the numbers and easily spot some “this is probably a bad poll” kind of indicators?

Emily Ekins: Well, some questions are more obviously bad questions than others. I would say polling that you see from the reputable major outlets like CBS, CNN, New York Times, I mean, those are good questions. They don’t insert the costs very often for the reasons that I’ve described, but if you just know that going in, realize, “Well, these [00:38:00] results would probably change if people thought about X, Y, or Z.” It does matter if support for policy A is 51% with no costs included, versus 90% with no costs included.

That gives you some sense about where people are, right? On a lot of these gun questions, like assault rifle bans, support is just marginal. I mean, you see polls that are under 50% and over 50%. That tells you that as soon as you insert a few more costs in there you’d probably see support decline. [00:38:30] Now, advocates of these types of assault weapon bans, as they are called, would say, “Well, you’re not including all the benefits that would accrue by banning these weapons.” Yeah, if we asked a variety of questions we could kind of see where people shake out.

Trevor Burrus: Do you see many non-representative sample problems in polls that are widely discussed, or just question-wording problems? Are there many [inaudible 00:38:55] this is obviously a bad representative sample or bad ways of doing [inaudible 00:39:00] [00:39:00] or something like that?

Emily Ekins: Like I said, from the major outlets that do polling like the Pew Research Center, CNN, New York Times, I haven’t seen that be a problem. For some of these, like certain political consulting firms where they will get hired by a campaign or a group, and they won’t release their top lines … The top line is where you have the actual question wording and then the actual answers with the numbers associated with it. If they don’t tell you their methodology, a lot of those pollsters [00:39:30] are not to be trusted. That’s actually to your earlier question.

If you want to know whether to trust a poll, see if they’ve posted their full results online somewhere. See if they’ve explained their methodology. If they haven’t, they’re probably one of these consulting firms that gets paid to kind of weave a story. If you are very interested you could go to 538’s pollster ratings, where they use an empirical method to rate pollsters on how well they predicted various outcomes. [00:40:00] They have an A through F rating. Some of these pollsters were getting F’s. I don’t hear about them anymore.
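(A crude sketch of the idea behind empirical pollster ratings: score each pollster by its average absolute error against actual results. FiveThirtyEight’s real method also adjusts for race type, sample size, house effects, and more; all numbers below are invented.)

```python
# Crude version of an empirical pollster rating: average absolute
# error of each pollster's final predictions against actual results.
# All pollster names and numbers are invented placeholders.
import pandas as pd

polls = pd.DataFrame({
    "pollster":  ["Firm A", "Firm A", "Firm B", "Firm B"],
    "predicted": [48.0, 51.0, 44.0, 55.0],  # predicted % for one candidate
    "actual":    [49.5, 50.2, 49.5, 50.2],  # actual % in the election
})

polls["abs_error"] = (polls["predicted"] - polls["actual"]).abs()

# Lower average error = better track record.
print(polls.groupby("pollster")["abs_error"].mean().sort_values())
```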

Aaron Powell: The one I want to ask about, one poll that always stood out during the run-up to the election and now with the approval ratings, is Rasmussen. Trump would always tweet out the Rasmussen results because it was always … He was always doing substantially better there than anywhere else. Why?

Emily Ekins: [00:40:30] Rasmussen does a unique method that a lot of pollsters maybe aren’t totally on board with. They, if I’m remembering correctly, use a combination of what’s called a robo-call. It’s one of those machines that will call people, but it’s not a live telephone interviewer. It will be like a computer that says, “Would you please take this survey? Press 1 for opposed. Press 2 for support. Press 3 for don’t know.” I believe they combine [00:41:00] that with some sort of online Internet panel to try to get a younger cohort, because you can imagine these robo-calls are only able to call landlines. They can’t call cellphones because of federal law. So how do you get the people that don’t have a landline? The first thing is that people who have landlines are disproportionately more conservative, because they’re older and they’re more likely to have a landline. Rasmussen tried to address this by adding in this Internet panel. Again, I may not be remembering this 100%, so [00:41:30] I don’t want to be unfair to Rasmussen. I think this is what they do.

Trevor Burrus: But it’s the kind of conservatives that exist even if it’s not capturing a good enough group.

Emily Ekins: Yes. That online sample’s supposed to get the younger group, but then the question is: How good is your online panel? There are only a few firms that are widely recognized to have a really good online panel, and these are firms like YouGov, Knowledge Networks, Ipsos. I’m not naming them all but those are some that come to mind. [00:42:00] That’s the other issue. But to be honest with you, I like a lot of the questions that Rasmussen asks. I think they’re good questions. Since people aren’t 100% sure about the methodology, it’s very easy for them to dismiss the questions that they don’t like.
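(The coverage problem described here is usually handled with weighting: respondents from under-covered groups, such as younger cellphone-only adults, are up-weighted so the blended sample matches known population shares. A minimal sketch, with invented shares and data.)

```python
# Minimal sketch of post-stratification weighting: up-weight groups
# the sample under-covers so it matches known population shares.
# Shares and responses below are invented placeholders.
import pandas as pd

sample = pd.DataFrame({
    "age_group": ["18-34", "35-64", "65+", "35-64", "65+", "65+"],
    "support":   [1, 0, 0, 1, 0, 1],  # 1 = supports the policy
})

population_share = {"18-34": 0.30, "35-64": 0.50, "65+": 0.20}
sample_share = sample["age_group"].value_counts(normalize=True)

# Weight = population share / sample share for each respondent's group.
sample["weight"] = sample["age_group"].map(
    lambda g: population_share[g] / sample_share[g]
)

# Weighted estimate of support, correcting for the skewed age mix.
weighted = (sample["support"] * sample["weight"]).sum() / sample["weight"].sum()
print(f"weighted support: {weighted:.1%}")
```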

Trevor Burrus: This might be too complex a question, so forgive me, because I don’t know anything about polling really. Is polling getting worse? That’s the first question. We discussed that some people think it is. [00:42:30] And maybe there are more difficulties in polling. We’ve discussed reaching the young through email addresses, and landlines and cellphones are probably also biased by race and all these different cohorts. It might be harder to get a representative sample of people increasingly, as people settle into their own niches in a variety of ways. I guess I’m asking these two questions, which are maybe related, and maybe I’m just out to lunch and I have no idea what I’m talking about. Is polling getting better or worse, [00:43:00] and is it becoming increasingly difficult to get a representative sample because of the sort of diversification of opinions? They may not even be related questions.

Emily Ekins: Well, so some people think it’s getting worse. Some people think it’s getting better, like a lot of different areas. I think that what we’re seeing now is very similar to what we saw before. I mentioned earlier that they used to do polling by going to your doorstep and sitting in your living room with you and going through 100 questions on [00:43:30] a survey. That was not sustainable. Technology came in and provided a new opportunity, a new way that was less expensive and arguably in many ways more effective, more accurate, through telephones. Well, now people are kind of abandoning their landlines and only using cellphones, and now we have the advent of the Internet. I think it’s like 85, 90% who have Internet [00:44:00] access in their house or get it on their cellphone.

It’s the same idea where technology is coming in and providing a less expensive and I would argue more accurate way to measure people’s opinions because they’re able to answer privately, without having to share what might be an unpopular opinion with an interviewer. I mean, interviewer bias is a very serious problem for certain types of questions. Also online polling offers interesting ways to ask [00:44:30] the question. For instance on the phone you would say, “Who are you planning to vote for?” and you may be given a bunch of names. Well, they’re not really that informed, but online you could show them a bunch of pictures and see which of these people are you going to vote for. Which one of those is more predictive of getting at the final vote at the end of the day? What we’re seeing is that a lot of these online polls, these reputable online pollsters, are doing very well at predicting the [00:45:00] outcomes of elections, particularly in cases like Brexit and things where people feel like they can’t share or express their true feelings.

Trevor Burrus: Thanks for listening. This episode of Free Thoughts was produced by Tess Terrible and Evan Banks. To learn more, visit us on the web at www.libertarianism.org.