E305

Milton Mueller joins us to discuss how social media is not a medical addiction that requires government intervention.

As social media platforms grow, it is apparent that they will never be able to make decisions that appease everyone. We should also recognize that calls for government-induced content moderation will make these platforms battlegrounds for a perpetual, intensifying conflict over who gets to silence whom.

What is a moral panic? Why are people panicked over fake news? How addictive is social media? What is Section 230 and what implications does it have for social media companies? What is a social media platform?

Further Reading:

What Made the Internet Possible?, Building Tomorrow Podcast

Free Speech Online: Unfriended, Building Tomorrow Podcast

The Problem with “Fake News”, written by Ryan Khurana

Transcript

[music]

00:07 Aaron Ross Powell: Welcome to Free Thoughts. I’m Aaron Powell.

00:09 Trevor Burrus: And I’m Trevor Burrus.

00:10 Aaron Ross Powell: Joining us today is Milton Mueller. He’s a professor at the Georgia Institute of Technology School of Public Policy and director of the Internet Governance Project. He recently published a paper with the Cato Institute on moral panic and social media. Welcome to Free Thoughts.

00:24 Milton Mueller: I’m happy to be here.

00:26 Aaron Ross Powell: At the beginning of your paper, you make an interesting point, that it’s not the activity that we do on social media that’s new but instead that everyone can see us doing it.

00:37 Milton Mueller: Yes, that’s… I think the surprising and shocking thing about social media is that it’s so easy to find out what millions and even billions of people are doing in some dark corner of the internet. You have these powerful tools for searching and locating things, and for tagging things, and of course these things can be manipulated or exploited in various ways to get attention, but fundamentally nothing that’s happening on social media wasn’t happening in some way before.

01:16 Trevor Burrus: So is that just… Part of what this says, I’m thinking too, is that there were racists before social media, and now it just seems like we might be able to see them better or hear them better, or they’re willing to speak out in certain ways. And of course we interact with people in various ways for other social purposes, so maybe we’re a little bit overblown on how big of a problem these things are.

01:38 Milton Mueller: I think it’s actually even worse than that, in a sense: we are pretending that the problems that are revealed to us by social media are actually caused by social media, and I think that’s a mistake that’s going to lead to crackdowns on freedom of expression. So let’s take this latest case, and let me deliberately take the most offensive group of people I can imagine, which is the ones that seem to be congregated around 8chan, right?

02:09 Aaron Ross Powell: Mm-hmm.

02:09 Milton Mueller: And these are sort of misogynistic, racist, whatever; all kinds of bad feelings are expressed on this group. But it’s not clear, number one, that the mass murders committed by people who happened to be on that forum were caused by the fact that they were on social media. It is, in fact, probably better that we can actually see that these people are saying these things and doing these things.

02:48 Aaron Ross Powell: Is this related to what you call the fallacy of displaced control?

02:53 Milton Mueller: Exactly. So what people seem to be saying is, “Okay, we’ve just had a mass murderer, and that mass murderer appeared on 8chan, so let’s pull down 8chan, let’s erase 8chan from our vision. And now we can kind of pretend that mass murder is not a problem, that this will never happen again, because we have eliminated the discussion board that this guy happened to use.” And that strikes me as an obviously fallacious way of thinking, but it’s such a powerful, symbolically satisfying way of dealing with the problem that it’s hard for people to get beyond it.

03:36 Aaron Ross Powell: Let me push back a little bit on that. It seems like there’s a critique of what you’re saying: the way that you’re presenting it sounds like basically the equivalent of sticking your fingers in your ears, right? There’s stuff happening, there are people saying things you don’t like, and the way you’re gonna solve it is by putting your fingers in your ears so you don’t hear it anymore, but it still is, of course, going on just like it’s always been going on. I think that’s one way to look at it, but it does seem like these platforms can potentially play a role, in that they’re providing a medium for communication that didn’t exist before. So it’s not just that these racists are more visible than they used to be, but also that the racist who used to be one guy kind of angry at the world, but hanging out around mostly normal people, is now able to find a bunch of other people who are angry about the world in the same ways, producing feedback loops that make his views more entrenched or more extreme or more dangerous, more dug in, in ways that maybe wouldn’t have been possible when he didn’t have thousands of people spread across the world to commiserate with.

04:53 Milton Mueller: Yeah, I think that there’s a grain of truth in that, and there’s something to pay attention to in it, and that is: yes, social media is so powerful in terms of locating information and locating people that you can form like-minded groups of people who probably would not have been able to find each other before. But it’s not at all clear that that, in fact, leads to more violence or even causes the violence to begin with. For example, it’s not difficult to come up with many instances of mass murder that either predate social media or that actually seem to have had nothing in particular to do with a social media community.

05:44 Milton Mueller: Now, what I do think happens is that there is what you might call a genre or imitation effect. Somebody hears about a mass murder, like the school shootings by alienated high school or junior high school students in the ’90s, and this becomes some sort of reference point that people can imitate because they’ve heard about it, and it’s definitely true that that happens. However, the problem there is not that there are these communities of people who have located each other. In that case, the problem is simply that we heard about it. And so what are you saying, that the media should not actually report on these events? Fundamentally, that’s what you’re saying if you’re saying that.

06:40 Milton Mueller: The problem is that people hear about these things and then can imitate them. I think that may in fact be true, but it’s not a problem unique to social media, and it isn’t a problem that is necessarily solved by trying to suppress the communication and interactions that these people have. And I still think it’s better for us to be able to see these interactions going on and to be aware of these communities as they’re forming.

07:08 Trevor Burrus: Now, of course, the moral panic around social media is not only about this. We’ve recently had these shootings, and so that’s in the news, but for a few years now, especially since the election of Donald Trump, we’ve talked about another thing that people are panicked over, which is fake news. Is that something that we should be concerned with?

07:28 Milton Mueller: Well, we should be concerned with fake news, and people have been concerned about various forms of media bias, manipulation of the media, and so on. And again, it’s true that the interconnected network society and the massive scale of user-generated content create new problems and new techniques for propaganda, for disseminating information. But the idea that social media is inherently a one-way ticket to all of these bad things you hear about is definitely a manifestation of moral panic. These people want to blame social media for almost everything bad that’s happened that was in some way connected to it; they want to attribute the causation to the existence of social media.

08:29 Trevor Burrus: We’ve said “moral panic” a couple of times, but how are you defining that? You have a more specific definition than just the colloquial “moral panic.”

08:39 Milton Mueller: Right. The term “moral panic” comes from sociology, and it’s about a sort of self-reinforcing feedback loop in which people take a threat or a problem in society and amplify and elevate it to the point where it takes on a life of its own, and people start attributing everything to that cause without any sort of proportionality or rationality in the relationship between cause and effect.

09:17 Trevor Burrus: You use a line from sociologist Stanley Cohen in describing moral panics, “the untypical is made typical,” which is kind of interesting because it reminds you of things like the satanic panic in the ’80s, when something that is very, very uncommon comes to be seen as extremely common, and possibly in your child’s kindergarten right now, and fake news is possibly another one of these. But back to the question of fake news, it’s interesting because we have a baby boomer generation, and people younger than baby boomers, who remember the three networks and the centralizing forces of those three networks, and how that was some sort of great period of time when there was some overlapping consensus due to listening to Walter Cronkite or David Brinkley or Tom Brokaw, or any of those.

10:07 Trevor Burrus: But that was also a pretty strange moment in American history. For most of American history we didn’t have three sources of news that most people turned to, plus a few newspapers; it was all spread about in different ways around the country and pretty localized. And, of course, people are always prone to believing in fake things and biases. So maybe all that fake news is showing us is that this is just an inevitable fact of human nature that we’re gonna have to deal with, and not eradicate via some strange social media programs.

10:41 Milton Mueller: I think that’s correct. I think the disturbing thing about the moral panic around fake news in particular, but more generally, is the claim that this is eroding democracy, that this is undermining fundamental democratic institutions, which is a very strong claim. And what is ironic about that claim is that it’s actually a surfeit of democracy in the media that has everybody concerned: we really are getting the unfiltered, unvarnished expressions of the masses thrown before us through social media. And the sort of elite establishment people who have been in control of public discourse are frightened to death by this. It’s just incredible to watch.

11:42 Milton Mueller: There’s a very astute and very intelligent social media scientist named Danah Boyd; I think she works for Microsoft Research, at least she used to. You hear her talk, and one of the criticisms she makes is that YouTube videos show you both sides of issues that she considers there to be no debate about, where there just shouldn’t be any other side presented as a legitimate position. And even if that other side is a position that is horribly misguided, let’s say Holocaust denial, the idea that you can simply rule certain forms of discussion out of the public sphere strikes me as extremely anti-democratic. And yet this is what’s motivating a lot of the critiques of social media: the idea that there are just certain things out there that should be suppressed, that shouldn’t be visible.

12:48 Trevor Burrus: I like that you point out that, indeed, in the 1970s, progressives tried to force media outlets to include marginalized voices in their channel lineups through public access channels. Nowadays, apparently, the media system is dangerous because it does precisely the opposite, which I think is an interesting point.

13:05 Aaron Ross Powell: Though it seems to me that there’s something of a difference between the fake news phenomenon, at least as people imagine it, and what Trevor just described, because the concern with fake news is not about enhancing or promoting the voices of marginalized or fringe groups and viewpoints. The “fake” is that there is intentionally misleading news that is made to look like real information, with the intent of deceiving people, which is a slightly different thing. And it makes me think, going back to Trevor’s remark about the three networks and the boomers, of a study that came out maybe a year or so ago, after the election, about the spread of fake news online. What it found was that sharing of legitimately fake news items was heavily, heavily concentrated among boomers.

14:06 Aaron Ross Powell: Basically, older Americans were sharing the overwhelming majority of it. And that would seem to click with the idea that if you grew up in an era where you only had a handful of networks, and they by and large were reputable, they weren’t intentionally lying to you or intentionally sending out fake news, you come to think of news from sources that look like news sources as being real. Prior generations of Americans grew up in an era where we had lots and lots of competing newspapers that all very much had their own viewpoints; the kind of journalistic non-biased thing was not part of the journalistic culture at the time. But if you’re used to an era where everything looks legit, you don’t really know how to operate in an era that looks more like the past.

15:02 Milton Mueller: I think there are a number of issues that you’re raising there that maybe have to be dealt with separately. One of them is the fact that this environment can be manipulated, that there are new mechanisms and new techniques, automated and scalable, that can be used to inject disinformation or so-called fake news into the public sphere. And of course that’s true. And again, if you studied public relations campaigns in the past, you know that that happened in the past as well. I think the main difference now is just the multiplicity of potential sources and the scalability of the techniques, and in some cases, the incredible… I don’t know what’s the word here, the…

15:53 Trevor Burrus: Reach, the viral-ness, the…

15:56 Milton Mueller: Well, the ability to, say, make a video that really looks like it’s some celebrity having a pornographic interaction with somebody…

16:03 Aaron Ross Powell: Oh the deepfakes.

16:04 Trevor Burrus: Yeah, deepfakes.

16:05 Milton Mueller: Deepfakes. That’s new in the sense that it’s different from just telling a lie in words; it’s a little more immediate and present. So definitely this is going on. But again, number one, it’s actually more democratic than it was before, in the sense that if you had a handful of big gatekeepers, such as the Associated Press in the Spanish-American War, manipulating your news, it was probably a bit more consequential than if you’ve got dozens of bots or thousands of bots or millions of bots all doing automated forms of disinformation and competing with each other to catch people’s attention. You were also making a kind of psychological argument about my generation, that is to say the boomers, people like me who grew up in the period when we did have the three networks and a more monolithic media environment, as being the ones who are possibly more worried about fake news, because they’re more susceptible to it.

17:28 Milton Mueller: And I don’t know if that’s true or not; that would require some real social psychology research that I certainly haven’t done. It’s a plausible statement, but again, I think it’s kind of missing the point, which is that the real difference now is that the gatekeepers have been blown apart, and if we have disinformation and fake news, it’s coming from all kinds of places. Among them, foreign powers, which is what’s panicked a lot of people. They attribute enormous power and weight to these efforts, for example, of Russian bots and the Russian Internet Research Agency, and some people actually believe that that’s why Donald Trump was elected.

18:24 Aaron Ross Powell: The people who are most convinced of that argument, that Russian bots and Russian advertisements on Facebook won the election, always seem to have a hard time grappling with the scale that we’re talking about. You see that this tweet from a Russian bot was shared 10,000 times, and you think 10,000 looks like a really big number, but in the scheme of how big Twitter is, it’s vanishingly small. Or these Facebook ads: we heard about how 100,000 voters were reached by these Facebook ads. And I like to point out that I have run ads for libertarianism.org that reached more people than that, and I have yet to swing an election, right?

19:16 Trevor Burrus: Maybe we did, we don’t know.

19:18 Milton Mueller: You’ve really fallen down on the job.

19:21 Aaron Ross Powell: So the scale of the numbers feels like it throws people, because Facebook has billions and billions of active users, and so any vanishingly small fraction of them is still going to be an extraordinarily large number, but it’s a vanishingly small fraction that is totally insignificant.

19:39 Milton Mueller: Right. And this is a very interesting debate for an academic involved in this, because it’s all about networks of communication, how they work, how influential they are. To choose the most recent example of this, you probably watched the Democratic debates, in which Tulsi Gabbard really slammed Kamala Harris, and immediately there were a lot of tweets. I think I made a tweet, a very boring tweet, saying something like, oh, her attack on Harris was much more effective than Biden’s. But there was another one that turned into a hashtag.

20:20 Milton Mueller: I can’t remember exactly what it was, but something like “Kamala Harris is toast,” or “Tulsi destroys Harris,” something like that. And all of a sudden, within probably 20 minutes of that interaction, I got a tweet from an academic friend of mine who was retweeting something from Kamala Harris’ PR person saying that Tulsi Gabbard was a tool of the Russians. Yeah, she was a tool of the Russians, and the Russians wanted to take out Harris. And so our whole discussion now is kind of poisoned by this idea that we’re being manipulated by evil foreign forces. Then somebody did a network analysis of all the tweets and hashtags related to the Tulsi-Harris exchange, and they discovered that most of the tweets and retweets were actually done by normal Twitter subscribers and not by anything associated with Russia, though there were a few bots in there, marked as little red dots in the network graph.

21:33 Milton Mueller: And so the people who believed in this manipulation theme said, “Well, you see, that’s exactly what they want: they want to tweet something and have it picked up by ordinary people.” So it’s one of these empirically almost non-falsifiable claims: if there are a lot of Russian bots, you can say the Russians are doing it; if there are not a lot of Russian bots, you can still say that they caused it and that they’re successfully manipulating us, because everybody’s retweeting this stuff. This is something that requires more scientific research, but fundamentally, it’s all about the old debate in communications about influence, and how opinion is molded and shaped by various hierarchical relations among people. And I don’t think, again, there’s anything new here. People learn that some sources are credible and some are not; some people are ideologically wedded to certain people whom they’re gonna believe no matter what they say, and others are not.

22:40 Milton Mueller: And there’s just a tremendous lack of confidence in the rationality of the electorate, of the people. It’s a very anti-democratic approach to things. I think people really believe that the masses are sheep who are just pushed one way or another by a few powerful gatekeepers. And that sort of leads you to the conclusion that the only way to deal with this is to have a more authoritarian communication system, in which the good guys tell us all what to think and protect us from bad things.

23:18 Aaron Ross Powell: Speaking of authoritarian communication systems, Senator Josh Hawley just introduced legislation to basically give him control over social media. And one of the things he says this legislation is meant to address, and it’s one of the other moral panics that you mention, is addictiveness: that there is something exceedingly and uniquely addictive about social media, that it’s been finely tuned through careful visual design and psychological studies and rigorous A/B testing, and all of that, to make these things inescapably addictive in the way that we’re told heroin is.

24:03 Trevor Burrus: Basically exactly what the cigarette companies did, but with apps.

24:06 Aaron Ross Powell: So is there anything to that?

24:10 Milton Mueller: Very little. Certainly there is an incentive among the platforms to encourage engagement in ways that keep people on the platform and looking at advertisements. But again, remember, we’ve heard the addiction argument about almost every new medium that comes down the pike. We heard it about comic books, we heard it about video games, we heard it about television itself. How many of us were not accused of being television addicts? I guess you’re maybe not old enough.

24:44 Trevor Burrus: We’re children of the ’80s, and definitely that came up with my grandma. I remember we were watching some show when I was over at her house, and it ended on a cliffhanger, and she said, “That’s exactly how they get you, that’s how they get you to come back.” And now we have these binge-watching shows, and everyone says, “I binge-watch and it’s wonderful.” But that’s the same kind of “manipulation.”

25:09 Milton Mueller: Yeah, there are definitely incentives for the providers and the platforms to keep people engaged, and they’re competing for our attention, and there is an attention economy. But the idea that this is some form of medical addiction that requires government intervention to save us all is, I think, a pretty dicey proposition. And when you look at Hawley’s legislation, it gets to ridiculous levels of micromanagement, of the layout of the screen, of the period of time that you can do things. This is a crazy piece of legislation.

25:50 Aaron Ross Powell: As a way to illustrate how these moral panics can lead us wrong, or how we can misinterpret what’s going on, you tell the story of Myanmar, which is both extremely tragic and an interesting case study. Can you tell us what went on there?

26:09 Milton Mueller: Well, the first thing is to put it in the broader historical context. Again, I’m amazed at how progressives have taken this up. It’s just now an article of faith among them that Facebook causes genocide. That’s a pretty big accusation.

26:27 Trevor Burrus: That’s a strong claim.

26:28 Milton Mueller: Yeah. Facebook causes genocide. And if you unpack that, they end up talking about Myanmar. So let’s first put that in historical context. There are ethnic cleavages in Myanmar. The majority are Buddhist and ethnically different from the Rohingya, who are Muslim, so there’s a religious as well as an ethnic difference. The Rohingya were brought in as workers, and as in many countries, say the Turks in Germany or the Mexicans in the United States, they were poorer, immigrant laborers. Then, when Myanmar declared independence, you had a nationalist military dictatorship, and there was just no place in their minds or in their political system for ethnically different people. And so, since the 1950s, they have been oppressing the Rohingya minority, often violently, repressing them politically, shutting them out, expropriating them. So did they embrace and use Facebook during this process to manipulate the information that the majority heard about the Rohingya? They definitely did. They created a few incidents in which there were viral displays of panic formed around fake threats, and most interestingly, and this is something that should be posed to the advocates of this content regulation, they successfully manipulated the content moderation policies of Facebook in order to take down many of the exposés of their violent activities.

28:44 Milton Mueller: So, for example, if a critic or a human rights advocate posted a video or a Facebook post that was exposing the violence, then it would get taken down because it had violence in it. That’s something people need to be aware of: the manipulation works both ways. And the other interesting thing here is that the democratic, or semi-democratic, transition in Myanmar was also said to have been facilitated by the rise of Facebook, because people could more freely exchange ideas about how the country should be democratic and not a military dictatorship anymore. Suddenly that’s been forgotten. I use that in the paper as an example of how moral panics filter facts for people. We’re suddenly forgetting the good things Facebook’s presence there did, and we’re now turning it into a totally negative and demonic force that’s literally responsible for genocide.

29:49 Trevor Burrus: And it seems like we have to put this in context, as you do in the paper. Facebook is a communications device, a way of networking people together, and so were the newspapers and radio of the past, and of course governments and others have been manipulating those to serve their dastardly ends, whether it’s William Randolph Hearst during the Spanish-American War, or, I know, during the Yugoslav wars there was a lot of manipulation of the media to rile up the different ethnicities against each other. So some of that’s just a fact of what humans and governments do with media, and giving governments more control over it seems like a bad idea in light of that fact.

30:30 Milton Mueller: Yes, that’s another thing I point out in the paper: the only reason we were able to ultimately moderate and stop the manipulation by the Myanmar government was that Facebook was not a nationally regulated or government-controlled entity. It was based in another country, it was relying on transnational communications across borders, and that meant that the normative and political pressures on Facebook were not just coming from within Myanmar, where the dominant powers were, of course, the ones oppressing the Rohingya.

31:07 Trevor Burrus: And of course, China has known this from the outset. By not letting these things into its country, it’s trying to control those narratives, make them not as democratic and bottom-up, and keep control of the information and therefore the people.

31:21 Milton Mueller: Exactly, and this is probably the ultimate tragedy of the progressive critique of social media: everything they say, literally everything, has already been said by China and other authoritarian countries in terms of why they regulate and control social media. So they have to think: where exactly are they leading us?

31:46 Aaron Ross Powell: We’ve touched a bit on prior moral panics; we mentioned comic books, and I think we mentioned radio as well, and how those played out and the similarities. But there’s a section of the paper where you go into this, and into how we can learn from the past ones in order to better understand and have better perspective on what’s going on now. One of them I thought was particularly interesting, and I’d love for you to tell us more about it: the panic in the 17th and 18th centuries about the rise of literacy.

32:19 Milton Mueller: Right. Well, the whole Catholic Church was based upon the idea that the clergy were the intermediary between God and the people, and so they did not encourage readership of the Bible. You were supposed to get your exposure to the sacred texts through these intermediaries, who fundamentally controlled what people could hear, or tried to anyway; there were still various forms of heresy even before then. But the whole Reformation and the Protestant revolution was all about…

33:00 Milton Mueller: Direct access to the text. And that presupposed the growth of literacy in the 15th and 16th centuries, to the point where pretty much ordinary people could read the Scriptures for themselves and make their own interpretations of them. Now, did this lead to instability? Sectarianism? Yes, it did. There’s no question about it. But again, that poses the question: what is your remedy for this? Is it to keep people in the dark? Is it to keep them illiterate? Or are you just saying that as society develops new media and new forms of communication, we have to learn to deal with them in ways that are consistent with notions of individual freedom and individual choice?

33:50 Trevor Burrus: Now, in your paper you also deal with something important that’s in the news right now, and I think will be in the news for quite a while: Section 230 of the Communications Decency Act. There have been some prominent op-eds on this, one in The Wall Street Journal by Dennis Prager, which made a ton of errors, and Charlie Kirk wrote one for The Washington Post. It seems like, first of all, no one understands what Section 230 actually is. I think The New York Times printed a correction to one of their 230 op-eds that had said that Section 230 protects hate speech online; the correction was that that was an error, it’s actually the First Amendment that does that, not Section 230. So what does Section 230 say? What does it do?

34:34 Milton Mueller: So Section 230 is a very interesting piece of law, and sometimes I compare it to squaring the circle. Essentially, in communications you have, let’s say, three different models that you can talk about. The First Amendment model basically applies to the government, and it says “thou shalt not”: thou shalt not censor anything, thou shalt not establish a religion. It just keeps the government out of the picture. Then, in the private sector, you’ve had two models. One of them is what you call the common carrier model: if you’re a telephone company, you carry calls, and because you’re a common carrier, you’re open to anybody. You’re not responsible for me plotting a crime on the telephone; you’re not supposed to be policing that, and if I commit a crime, you’re not supposed to be responsible for it. And then the other model is the editorial model, the newspaper model, where you are responsible for what you publish, and that also means that you have complete discretion as to what you publish and what you don’t publish. You’re not like a telephone company; you don’t have to take information from anybody who wants to transmit it, non-discriminatorily, and publish it.

35:54 Milton Mueller: So what Section 230 does is give platforms the best of both of those worlds. It says to them, “Hey, in some ways you’re a common carrier, in the sense that anybody can put information up on your platform and you’re not gonna be legally responsible for it,” unless it’s copyright or intellectual property, but that’s another issue. And at the same time, you can act with editorial discretion. You can say, “Hey, we don’t want violent videos, we don’t want nudity, we don’t want child molesters, we don’t want conspiracy theories.” You can exercise any kind of editorial discretion you like, and you’re still not given the liability that a newspaper would have for exercising that editorial discretion. So that’s what Section 230 is.

36:50 Aaron Ross Powell: Just for clarification, what’s a platform?

36:52 Milton Mueller: A platform is a social media entity. We call them platforms because of the economic theory about how a platform is a multi-sided market that matches providers and seekers of some kind of value unit. In most cases, it’s just information or videos. YouTube is a platform. Uber is a platform; it matches drivers and people seeking rides. Facebook is a platform; it matches people with people they wanna hear about.

37:24 Trevor Burrus: So this would be the difference between, say, The Wall Street Journal’s op-ed pages, or just its own pages as a publisher, and the comment section of The Wall Street Journal website.

37:35 Milton Mueller: Exactly. So with the op-ed pieces, where they publish something, if I write something completely illegal and defamatory, then The Wall Street Journal might have to share some of the liability for my comments. Whereas if I published those defamatory comments on Facebook, I might get prosecuted, but Facebook would not.

38:00 Trevor Burrus: Now, in that Dennis Prager op-ed I mentioned, he writes, “Big tech companies enjoy legal immunity premised on the assumption they’ll respect free speech,” which is…

38:11 Milton Mueller: Wrong.

38:11 Trevor Burrus: So wrong. The exact opposite is true. They can censor, as you point out; if they wanted to be a website that was only about one thing, they could censor it all. They don’t have to respect free speech at all. So, what do you think is going on here, especially, I guess, with American conservatives in particular? They’re pretty upset about Section 230, and sometimes they call it a subsidy, which is itself bizarre.

38:37 Milton Mueller: Well, I’m not sure what’s going on with the conservative movement.

38:41 Trevor Burrus: Yeah, okay, that’s a big question.

38:42 Milton Mueller: Ever since 2016. [chuckle]

38:42 Trevor Burrus: Yeah, exactly, yeah.

38:45 Milton Mueller: But I think, in terms of their intellectual approach to it: we got Section 230 because of conservatives who wanted to enable platforms to get pornography off the internet. And again, it was sort of squaring the circle: they wanted to do that without violating the First Amendment and while encouraging freedom of expression. So Section 230 does that. It says you can take stuff down that you think is offensive or bad for your subscribers or your users, and you’re not going to be held responsible or legally liable for anything else that you leave up. So it at once both limits expression and enables it. It enables it because, by making platforms not legally responsible for whatever goes up, they don’t have an incentive to constantly suppress things; but it limits it because it does give them the right to take down or not allow certain kinds of content that they think will be bad for their platform. And I guess it’s hard for people to understand that dual-edged character of Section 230.

40:08 Aaron Ross Powell: Most of our conversation today has been people saying, “Here’s what’s wrong with social media, here’s why it’s bad,” and you saying, “Wait a second, what you’re saying is overblown, or not true at all.” But you do think that there is something broken with our social media environment right now. So what is the real problem, and how do we go about fixing it?

40:34 Milton Mueller: Yeah. Again, the problem is really almost a social-psychological one, in the sense that the platforms, particularly Facebook and YouTube, have gotten so big that they raise all of these normative questions. Their ability to exercise their Section 230 freedoms is almost self-contradictory at this point. Whatever they do is going to make somebody unhappy, right? It’s very hard for them to know the optimum, the proper way to suppress and not suppress in order to optimize the value of their platform. They are under so much normative pressure from so many different conflicting viewpoints that there’s simply no way they can escape this pressure for regulation.

41:37 Milton Mueller: And that’s kind of the thesis I made in the hyper-transparency argument: they are exposing things, and they get blamed for the things they expose simply because those things are out there. And then our natural, knee-jerk reaction is to eliminate that stuff from social media, pull it down, block it, don’t allow people to see it. That’s a natural reaction: once people have seen bad things happen by people who use social media, they want to block the expression on social media. And so it’s very hard for the platforms to actually figure out what is the optimal thing to do. The one thing that I argue in the paper that they should not do is this Facebook idea of shifting the responsibility for content moderation onto the government, and having the government set the standards.

42:40 Milton Mueller: And this was actually a conclusion that surprised me when I came to it. I went into writing that paper thinking maybe we do need to modify Section 230, maybe we do need to make some regulatory adjustments. But the more I thought about it, the more I thought, “No, that’s precisely what you don’t wanna do. You want the platforms to maintain the responsibility for making those choices and to bear the economic consequences in terms of their user base.” And if they’re learning that they’re too big, that there’s no way they can satisfy everybody, and that there’s no way they can maintain a cohesive community by being so huge, then maybe that’s a good thing. Maybe new competitors and alternative platforms will arise and people will migrate to them, and we won’t have this incredible concentration on one platform.

43:39 Aaron Ross Powell: The concentration was something I was thinking about as I was reading your paper, because one of the things that always seems to happen when we’re talking about policy and the internet is that the internet moves very, very quickly, and things change very quickly. So we think, Facebook is huge, and this is the way it’s always been and always will be, so we’ve gotta get this solved. But Facebook is, in the scheme of things, quite new, and the underlying tech changes; the world shifts much faster than the policy can catch up. And it struck me that, as you said, one of the things that sets Facebook apart from other communications mechanisms we’ve had is that it’s a single entity providing a single platform to billions of people worldwide, and these people are then communicating in ways that bother us.

44:31 Aaron Ross Powell: But prior to Facebook, we all communicated via email, and people were passing all sorts of crazy stuff around. You’d get these emails from some distant family member that were just riddled with conspiracy theories and whatever else, and you would just eventually mute those people. But that all happened, and we didn’t have the kind of moral panic around email; we didn’t have the calls to regulate email. The problems that did exist on email, say, spam, ultimately were solved, or mostly solved, with technological changes; spam filters got very good. And I wonder if what sets email apart, and the reason that we didn’t get a Josh Hawley Act for email, was the decentralization, the fact that there wasn’t much that you could do about it. There’s Gmail, and Gmail has a lot of users, but it’s an open system; anyone can set up an email server. So do a lot of these problems, and the kind of psychological drive to blame Facebook for genocide in a way that we wouldn’t blame email for genocide, go away if social media eventually becomes decentralized?

45:47 Milton Mueller: I think it does. Yeah, definitely. And this is an amusing sort of interaction I’ve had with some of the advocates of content moderation, because typically they’re your typical kind of liberal or progressive digital rights advocates. They’re very concerned about these sorts of problems with social media, and they’re calling for various forms of cracking down on hate speech, and so on and so forth. And at the same time, they’re complaining about the concentration of economic power and dominance in these big platforms, and they’re calling for, “Oh, let’s disseminate the freedom box,” or, “Let’s have these completely decentralized forms of social media.” And I’m telling them, “Look, you can’t have it both ways. If you want content moderation, if you want to regulate hate speech or all the other kinds of speech that you don’t like, you have to have concentration. If you have a decentralized system, all that stuff is actually going to flourish. All that stuff you don’t like is going to be there, and you won’t be able to reach it. So make up your mind as to what you actually want.”

47:03 Milton Mueller: Me, personally, I’m fine with a decentralized system. If people commit crimes, I of course think that they should be discovered and prosecuted. But the idea that you’re going to concentrate everybody onto a single platform, and then impose various forms of content regulation and moderation on the platforms, working secretly or openly with the government in the background, this, to me, is the worst possible outcome.

47:36 Trevor Burrus: I think it’s also important, and Aaron alluded to this, that if you look back at world history, and we mentioned comic books and literacy and all these television scares, they’re often perpetuated by people who are new to the medium, older generations who didn’t grow up on it, who aren’t digital natives, as the term goes. And in terms of the history of the internet, we are in infancy. The scares over Facebook…

48:05 Milton Mueller: Actually, adolescence.

48:06 Trevor Burrus: Adolescence, then. But scares over Facebook might be a weird footnote to history in 30 years. Just like people were scared of violent video games, or video games in general, and then a lot of people who grew up on video games became adults and realized that that wasn’t really a problem, internet generations are gonna understand better what the problems are and what they aren’t than baby boomers do. And so maybe some of us should just chill and wait and see.

48:34 Milton Mueller: Well, I hope so. That’s an optimistic approach. The other thing that could happen is that we institutionalize the dominance, like we did with AT&T around 1920, and we say, “Okay, you’re a big dominant monopoly, and it looks like you’re gonna be here for a while. So we’re going to create our regulatory system around the presumption that the telephone system is a monopoly, and we’re gonna regulate prices and regulate this and regulate that.” That could happen, particularly when you have Mark Zuckerberg giving us four ideas about how to regulate the internet.

[music]

49:36 Aaron Ross Powell: Thank you for listening. If you enjoy Free Thoughts, rate and review us on Apple Podcasts or in your favorite podcast app. Free Thoughts is produced by Tess Terrible and Landry Ayres. To learn more, visit us on the web at www.libertarianism.org.