E58

Paul and Matthew discuss the history of, and threats to, Section 230. Jennifer Huddleston rebuts the argument that Section 230 was a gift to big tech.

Hosts
Paul Matzko
Tech & Innovation Editor
Guests

Matthew Feeney is head of technology and innovation at the Centre for Policy Studies. He was previously the director of Cato’s Project on Emerging Technologies, where he worked on issues concerning the intersection of new technologies and civil liberties, and before that, he was assistant editor of Reason.com. Matthew is a dual British/American citizen and received both his BA and MA in philosophy from the University of Reading in England.

As the threat of government regulation of the internet mounts from both the political Left and Right, Paul and Matthew sit down to talk about the foundational law that made the internet as we know it possible. Ironically, Section 230 was one of the few bits of the Communications Decency Act of 1996 that wasn’t struck down by the courts as a violation of the First Amendment, giving the internet more legal protection than it would have if the moral scolds of the 1990s hadn’t passed the law in the first place. Then, Paul talks with Jennifer Huddleston from the Mercatus Center about her research into the common law origins of Section 230, which rebuts claims that the amendment was some kind of unprecedented “gift” to tech companies.

What was the primary purpose of the Communications Decency Act? What are the ramifications of Section 230? Without Section 230, what would the internet look like? Why is censorship important to the conservative movement?

Transcript

00:05 Paul Matzko: Welcome to Building Tomorrow, a show about the ways tech and innovation are making the world a better place. As you can tell from our tagline, we’re a pretty upbeat show focusing on the good being enabled by emerging technologies. It’s not that tech doesn’t have downsides. If you listen to this show, you hear us talk about the potential pitfalls of emerging technology: the ways it’s responsible for social disruption, government and corporate misuse, and a litany of different ways that tech can go sour. But what I find increasingly commonplace today is a cottage industry of techno-phobic doom and gloom obsessives; a species of moral entrepreneurs who build their careers by fixating on the potential pitfalls of innovation while failing to properly appreciate the upsides of technological disruption.

00:51 Paul Matzko: I was recently speaking at a conference for college-age libertarians and conservatives, and I was struck by how speaker after speaker called for government regulation of big tech and the internet. Any allegation by the speakers of anti-conservative bias from companies like Google or Facebook was met with the jeers and cheers of the crowd. It was really a telling moment. It showed me that the surge of anti-tech backlash is coming from both sides of the political spectrum, and it’s winning the hearts and minds even of folks who ostensibly subscribe to principles of limited government, private property rights and free markets, all principles that would be violated by creating some kind of government speech police to patrol the internet.

01:35 Paul Matzko: We are facing a groundswell of bipartisan support for government regulation of the internet unlike anything we’ve seen in the past two decades, and it’s really been accelerating over the past six months. And I think that means that Building Tomorrow has to get down to brass tacks and examine some of the basic protections and liberties that made the internet the enormous engine of technological innovation, global communication and economic prosperity that it is today. And no single decision is more important to the rise of the internet than two little sentences in a mostly defunct law from the 1990s called the Communications Decency Act. It’s kind of an inauspicious start. Even at the time, nobody realized how important these two sentences would become. You’ve probably heard them referenced before. We talk about them on the show frequently. They are known as Section 230.

02:29 Paul Matzko: Without them, the development of the internet would have been radically different. But before we talk about the specifics of Section 230 and how it shaped the internet, let’s start with a little history lesson about the moral and technological panics of the 1990s. That’s the moment that birthed Section 230, the law it amended, and really the internet as we know it today. In the mid-1990s we’re at the tail end of a decade-long bipartisan culture war waged by groups that called themselves family values organizations, that was commonly the phrase, family values, and their goal was to restrict “indecent” content from reaching children’s eyes or ears. This is coming out of the cultural revolutions of the ’60s and ’70s, the rise of rock and roll music, drug use and that kind of thing. There was this idea that this kind of cultural libertinism was leading the youth astray, into lives of crime, deviancy and dissolution.

03:27 Paul Matzko: Now, perhaps the most famous moment in this culture war backlash, in regards to what children could see or hear, was the 1985 congressional hearings with Tipper Gore. This was Al Gore’s wife, meaning that Tipper had real influence; the ability to get a congressional investigation called. And Tipper Gore and a variety of other spouses of government officials created what was called the Parents Music Resource Center and held these congressional hearings about 15 groups. They called them the Filthy 15, so points for alliteration: groups like AC/DC, Black Sabbath, Madonna, Twisted Sister. What they said was, these groups were peddling indecency and obscenity; they’re swearing, they’re referencing sexual acts in their music, it’s bad for the children. They were targeting the most popular musicians of their day and saying that what those musicians were doing should be shut down by the government to protect the children.

04:32 Paul Matzko: Now, there was a musician pushback. Rock legend Frank Zappa said, “If it looks like censorship and smells like censorship, it is censorship, no matter whose wife is talking about it.” John Denver testified and compared the hearings to a Nazi book burning. There was real pushback from a variety of major artists, which prevented, I think, the government from acting on the hearings, but the hearings themselves were a kind of implicit threat that if the music industry didn’t do something, the government would do it for them. And so, it was out of this moment that the music industry agreed to self-regulation. This is where those little “Parental Advisory: Explicit Lyrics” labels come from. They’re very recognizable, the black and white labels that go on CD covers.

05:14 Paul Matzko: Though as Dee Snider of Twisted Sister said, “Well look, these labels are badges of honor. I mean, our fans love knowing that this has a parental advisory sticker, ’cause it tells ‘em which CDs they should want to purchase.” So there was a certain amount of backlash in the 1990s. This was a real moment of moral panic across all of American culture, the idea that new technologies were going to twist the brains of children in ways that could not be remediated; that they are permanently affected for ill by new technologies enabling new forms of culture. And it’s in that moment that the internet arrives on the scene, at least in terms of consumer access to the internet. The technology itself is older, but it’s in the early ’90s that folks are starting to go online in real numbers.

06:03 Paul Matzko: And so, all of the same family values scolds worried about music and movies and violent video games were worried about the potential for indecency and obscenity online. And the question was this: Should the Federal Communications Commission, which regulated obscenity and indecency for radio and television broadcasting, also regulate the internet? It’s the FCC, remember, who said that network television couldn’t show nudity or certain swear words or violence during prime time, when kids were ostensibly awake. Should those same rules apply online? Should the FCC apply those kinds of bans on obscenity and indecency to protect the kids online? Good luck with that. It seems quaint, from a contemporary perspective, the idea of keeping people from swearing and being indecent online. But that was the intent behind the Communications Decency Act of 1996.

07:00 Paul Matzko: The purpose of this law was to make it a criminal offense to share online pornography or other obscene or indecent material with anyone under the age of 18. One of the relics of this law is that a lot of adult content will be gated with a little click button to confirm that you are above the age of 18. That’s a relic of this law, but it’s impossible to enforce; it’s kind of a meaningless artifact of what’s left of the Communications Decency Act.

07:28 Paul Matzko: Now, what counted as obscene or indecent was always ill-defined. I mean, under the strictest interpretation, most of the content that you stream, that you consume online, from YouTube to popular TV shows like Netflix’s Big Mouth or HBO’s Game of Thrones, all of that would’ve fallen afoul of the rules unless HBO, Netflix or YouTube could have guaranteed that no minor, no person below the age of 18, could access their content. And that’s impossible. It’s not really a reasonable standard. They would have been liable for any case of a kid accessing their content online according to the terms of the CDA of 1996. Now First Amendment advocates, thankfully, rallied against this law. They challenged it in the courts, and the courts struck most of it down. The courts said, “Look, indecency is in the eye of the beholder. And given that gray area, it should be parents who decide what they want their kids to see. They can use internet filters, they can restrict their kids’ access in various ways, and they should do that on a case-by-case basis, not with a one-size-fits-all government rule.”

08:39 Paul Matzko: And I get this. I grew up in a very conservative, fundamentalist Christian community and let me tell you, you wouldn’t want those folks to have the power to decide a blanket government-enforced rule for what internet content counts as indecent and what doesn’t. I wasn’t allowed to watch Teenage Mutant Ninja Turtles ’cause it encouraged rebellion and violence. You don’t want that standard for the internet, no matter how much better the world would have been if Michael Bay had been barred from making those awful Turtles reboot movies. I think we can agree that the government shouldn’t be telling us what terrible Bay movies we should or should not watch. But only one piece of the law survived these court challenges. Most of it got struck down. The courts said, “Look, your standards are too vague. This is not a legitimate government power; it’s unconstitutional.”

09:25 Paul Matzko: The one bit that survived, ironically, was Section 230, the operative bit of which is just two little sentences. 230 was not part of the original bill. It’s an amendment that was added by Christopher Cox and Ron Wyden, two congressional representatives. They worried that the law as written, unamended, would have violated the First Amendment, and they weren’t wrong, as the courts found. Wyden warned of “an army of censors” that “would spoil a lot of the net’s promise.” So the irony is that this problem bill was invalidated, but the protection, Section 230, was sustained. The net effect of all of this legislating, all of these hearings, was to carve out a significant new freedom for the internet, granting websites a kind of protection against moral scolds that the music, video game and movie industries did not have.

10:22 Paul Matzko: To discuss how significant Section 230 was for the creation of the internet, I brought in Matthew Feeney, the director of Cato’s Project on Emerging Technologies. So, Matthew, what do you see as the ramifications of Section 230?

10:34 Matthew Feeney: Well, Section 230, which, as you mentioned, is getting a lot of play recently, has a few parts to it. I think the most important part, the one people like to cite most often, is subsection (c)(1), and the important provision there is that no provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.

11:02 Paul Matzko: Okay. What’s that mean?

11:02 Matthew Feeney: It’s something that people have been discussing a lot recently when it comes to the social media sites, but it’s important to point out that this doesn’t just apply to big tech; it applies to basically any website that allows users to contribute content, whether that’s a comment section or posting a review, anything like that. The crucial part of that particular section is that the website itself, or the owners of the website, Facebook or Twitter, for example, won’t be considered the actual publisher of whatever the content is. That’s important because a lot of that content could potentially be content you could be civilly or even criminally liable for: defaming or libeling someone, posting child pornography, other illegal content. And I think an interesting part of the history of 230 is the court cases before it that really highlight why it was so important that it pass. You had a case, for example, in 1991, where the US District Court for the Southern District of New York decided a case called Cubby v. CompuServe, and there you had this interesting situation where the court decided that CompuServe was a distributor rather than a publisher of the content that was on its forums, but it said…

12:32 Matthew Feeney: The court interestingly said that CompuServe could potentially be held liable for defamatory content if it knew about it, right? So it said, you can’t be held liable for defamation, but if you actually knew about the content, then maybe you could be, which sort of implies that a hands-off approach is best: that to avoid liability, CompuServe should just not moderate, should just keep its hands off.

12:56 Paul Matzko: So let me put this in brass tacks, using a contemporary comparison that’s not just social media. Let’s say Yelp. You go on to Yelp to review a restaurant. We’re in a world in which Yelp is liable for all the content you upload, and basically all of its content is user reviews of restaurants and such. Let’s say I go to the restaurant, and I say something bad about it, give it a single star and complain about how rude the wait staff was, etcetera. That restaurant doesn’t appreciate it, and accuses me of libeling or defaming them, I always forget the difference, of some sort of misrepresentation of them, and wants to sue me. Now, that’s something that has to be decided in court. But once upon a time, prior to Section 230, they could have sued Yelp for hosting my review that they claim is libelous or defamatory, right?

13:52 Paul Matzko: But what you’re pointing out here with the CompuServe case, and we’ll substitute Yelp for CompuServe, is that here the court says: well, you can sue Yelp if Yelp knew that the review was defamatory or libelous and didn’t remove it. But if they didn’t know, if they don’t do any kind of moderation of any of the content, then you can’t sue them. And that creates a perverse incentive, right?

14:19 Matthew Feeney: Well, I think if you’re someone who uses the internet, then you don’t want to live in a world of the CompuServe case, right? Because it turns out that people actually like content moderation to a certain degree.

14:29 Paul Matzko: Yeah, yeah.

14:30 Matthew Feeney: But the interesting thing about the CompuServe case is that it’s followed a few years later by this Stratton Oakmont v. Prodigy case, which is a ’95…

14:41 Paul Matzko: Stratton Oakmont, that’s the movie, Wolf of Wall Street, right?

14:44 Matthew Feeney: The Wolf of Wall Street guy, yes. So, for any of you who know the DiCaprio film, the Martin Scorsese one, this case does not get a mention in the film, but…

14:53 Paul Matzko: It should.

14:54 Matthew Feeney: Yes. This was an interesting case which was heard by the New York Supreme Court, where someone anonymously posted on a Prodigy forum, saying that this company was engaging in fraudulent activities.

15:07 Paul Matzko: Yeah. And they were.

15:09 Matthew Feeney: Which, yeah. But what’s interesting about this case is that the New York Supreme Court came to a bit of a different conclusion than the Southern District did in CompuServe. It actually said that providers could be held liable for users’ speech and that Prodigy was a publisher, because it engaged in certain activities: namely, it posted guidelines for users and it had software that screened out certain offensive material. That was the holding of the New York Supreme Court, which also doesn’t sound great, because before Section 230 passes in ’96, you have CompuServe and Prodigy, these two cases. One is saying, well, you’re not liable if you just have a totally hands-off approach, and the other is saying, if you have content moderation policies of any kind, it seems, you could be held liable. And this is the problem that the authors of 230 were trying to fix, which brings us back to the crucial bit that I read earlier, Section 230(c)(1): “No provider or user of an interactive computer service shall be treated as the publisher or speaker.” So that’s an important part.

16:18 Matthew Feeney: But I also want to highlight subsection (c)(2)(A), which is another important part, one that protects companies that engage in content moderation. Specifically, it says, “No provider or user of an interactive computer service shall be held liable on account of,” and then it goes on, “any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or,” importantly, “otherwise objectionable, whether or not such material is constitutionally protected.”

17:00 Matthew Feeney: And this is a really crucial piece of the legislation. It’s saying, look, number one, if you’re an interactive computer service, you are not liable for content that users post. So that’s a good protection. And the other is, oh, and by the way, you can do whatever you want to get rid of material that your company finds offensive. And it’s not, I think, an exaggeration to say this legislation allows the internet as we know it to exist, because so much of the internet now relies on websites hosting other people’s content, whether that’s Facebook or Twitter, the big ones, YouTube of course, Airbnb, any website that deals with online dating. It’s not a complete immunity, though; there are carve-outs. It doesn’t protect against, for example, copyright infringement. So YouTube has a strong incentive, even after 230, to make sure that people don’t just upload entire films onto their YouTube channels, right? And this has become a focus of recent political debates, in large part because of accusations of censorship and so forth.

[music]

18:25 Paul Matzko: As far as why this is so significant, I agree completely. As much as people complain about the particular content moderation decisions of internet companies… We’ve been talking a lot about content moderation in these episodes, because it’s a very important trend right now in the tech industry. If we don’t fix this, the cultural and political backlash could destroy the internet. So it’s a very, very important topic right now. But it’s a reminder that this current debate over exactly how we moderate content is healthy, compared to what might have been. Because in a world without Section 230 or Section 230-like protections, the internet would look a whole lot different. It would probably still be something called the internet or something related, but rather than having big websites with large audiences that try to do some moderation without controlling things too much, you’d end up with a series of tightly walled gardens…

19:30 Matthew Feeney: Yes.

19:31 Paul Matzko: With very homogenous communities, very heavily regulated, probably smaller user bases, because they might not even be able to be free, since you have to pay for that tight moderation somehow. So you likely wouldn’t have the size, the network effects, from things like Facebook and Twitter. Or everything else would be just a free-for-all. It would be 8chan; 4chan and 8chan would be the dominant expression of, you know.

19:58 Matthew Feeney: Well, that certainly was, I think, absent 230… It’s impossible to do a historical counterfactual, of course. But it would have been interesting to see, in an alternate universe, how eventually the courts would have had to figure out whether the CompuServe or the Prodigy standard was the best one, right? And…

20:14 Paul Matzko: Yeah.

20:15 Matthew Feeney: One is… The CompuServe standard is basically 4chan and Gab. Not that that would have been the design, but that would have been the result, because companies would have had an incentive: “Hey, if we don’t moderate, we can’t be held liable, so anything goes.” And anything that’s First Amendment protected is on the platform, which suggests, I imagine, a kind of environment that most people wouldn’t find appealing. But the other is, if Prodigy wins out, if that sort of standard had emerged instead of 230, you would have a very boring internet, because websites would have to take the approach, “Well, look, the courts are gonna consider us liable because we have content moderation guidelines, so what we should do is make sure that every single piece of content, every photo, every comment, every essay, every posting is screened beforehand.”

21:05 Paul Matzko: Doesn’t have a hint of possible exposure to any kind of liability.

21:09 Matthew Feeney: And this is over-correction on both sides; neither of those is particularly desirable. The worry, at least from my perspective, is that this kind of important legislation is under attack at the moment. And it’s important to point out that Section 230 is important, but we shouldn’t deny that it can lead to undesirable outcomes, especially for people on the receiving end of harassment. One of the most important cases shortly after 230 passed was Zeran v. AOL, where someone anonymously, on an AOL forum, I think one used by militias, in the wake of the Oklahoma City bombing, was posting ads for offensive t-shirts about how great the bombing was and listing this guy’s phone number as the contact.

22:02 Paul Matzko: Sort of early doxing.

22:03 Matthew Feeney: Right, and it was… But this guy had nothing to do with the creation of the t-shirts; it was just a random phone number that had been thrown out there. And he was receiving angry, understandably furious phone calls constantly, to the point where he had to just unplug the phone. But the federal circuit court that heard this case, the Fourth Circuit, said, “Well, you know what? 230 protects AOL here. You can’t sue AOL, even though it hosted the content.” But in discussions about this piece of legislation, I think it’s important to highlight, how to put it, pervasive myths about it, because I think they’re unhelpful. One is the widespread myth that Section 230 creates this publisher versus platform distinction.

22:56 Paul Matzko: So, what does this mean, this publisher-platform…

22:58 Matthew Feeney: Well, you’ll find this a lot. Whenever you have debates on, ironically enough, social media, you’ll have people who jump into the comment section saying, “No, no, no, no, Facebook and Twitter, they’re acting like publishers, not platforms, and that’s really, really important.” I don’t know how else to put it: this is a legal fantasy. There isn’t a distinction in 230 between publishers and platforms, because…

23:24 Paul Matzko: All it says is publisher.

23:26 Matthew Feeney: Sorry?

23:26 Paul Matzko: All it says is publisher, in 230.

23:28 Matthew Feeney: Well, it says publisher, but it also discusses interactive computer services. And when people think of a publisher, they might think of something like the New York Times or the Wall Street Journal. And it’s true, if you wrote an op-ed that libeled me, I could sue not only you but also the New York Times. But it’s silly in the 21st century to think of the New York Times, for 230 purposes, as a publisher, or purely a publisher, because the New York Times also has a comments section. The New York Times’ comment section is exactly the kind of interactive computer service that Section 230 discusses. And the New York Times, like Facebook, needs a certain liability protection in order for its comment section to be remotely interesting or of value to its subscribers or users. Despite the fact that I think this is rather clear to anyone who actually reads the legislation, it’s a pervasive myth that there’s an important legal distinction for 230 purposes between so-called platforms and publishers. Another myth is that 230 was passed contingent on political neutrality. And I think the history, which you did a really good job of outlining, shows that this wasn’t actually part of the debate.

24:48 Matthew Feeney: Nonetheless, what you’ll see is senators such as Senator Cruz, who has held and participated in numerous hearings about alleged social media censorship, quoting the findings from Section 230. One of the findings is, quote, “The internet and other interactive computer services offer a forum for a true diversity of political discourse, unique opportunities for cultural development, and myriad avenues for intellectual activity.” You’ll always see Senator Cruz at these hearings asking Jack Dorsey, Zuckerberg, Google representatives, “Do you consider yourself to be a politically neutral forum?” as if the answer to that question really means anything. All Section 230 is saying there, in a finding, is that the internet and other interactive computer services are an interesting forum for a diversity of political views. There’s nothing in the legislation that says, “You lose this liability protection if you aren’t politically neutral.” Nonetheless, despite being a Princeton- and Harvard-educated lawyer who was the Solicitor General of Texas and someone who has argued cases before the Supreme Court, he and his allies continue to say, incorrectly, that in order to enjoy 230 protection, interactive computer services have to demonstrate political neutrality.

26:08 Paul Matzko: I see a political utility, or partisan utility, I suppose, to this line of argumentation. There have been decades’ worth of work, I study the rise of the new right in the ’60s and ’70s, and it’s been going on since then, easily, decades of work to encourage conservative audiences to distrust the “mainstream” media. I’m doing air quotes around mainstream, ’cause that’s an ill-defined concept, what counts as mainstream, what doesn’t. But to distrust the media as an institution, which people used to think of as the fourth branch of government, in a sense, a check on government abuse. And some of those complaints are legitimate, but this is not the place for that conversation.

26:53 Paul Matzko: But I see this as an extension of that. It was a politically useful rhetoric: don’t trust what these liberals and Democrats in the mainstream media are telling you; turn instead to conservative alternatives, like talk radio and then eventually Fox News and the like. You can see the utility there: they’ll give you the truth, the conservative truth. And whether or not that’s true doesn’t matter at this point for this conversation. But you can see the utility now shifting from radio and television to the internet. Don’t trust the main outlets; we have an alternative. So it’s a new set of similar institutions, social media versus traditional media, digital versus radio spectrum. And I think there’s the potential, over the next couple of decades or at least in the coming years, for it to serve a similar kind of partisan utility, as a way for conservative politicians to rouse their base’s distrust of these big media institutions.

27:50 Matthew Feeney: I think what you’re seeing here is another example of a persecution complex, which seems to be a necessary component of American conservatism. And at the moment, like you outlined, it’s Silicon Valley. Silicon Valley is a bunch of lefties who live in the Bay Area, and they hate conservatives, and they’re engaged in a concerted effort to make sure that Trump fans don’t have a voice on the internet. I find the evidence for this actually lacking. And what’s interesting is, this goes back and forth; you’ll find plenty of left-wing complaints about social media content moderation as well. But despite the fact that I think the evidence for this kind of censorship campaign is lacking, conservative politicians are running with it. Now, it might just be good politics. You’ve just outlined a compelling case as to why this is important to the conservative movement. And whether or not it’s a legal fantasy to say the things that Senator Cruz has, he’s no doubt saying them to people who have real jobs and don’t have the time or inclination to research this stuff, who find the narrative compelling and will be more likely to support candidates who use this kind of rhetoric. Senator Cruz is one; Senator Hawley, from Missouri, is another. But I think we should expect more of this as the Republican party increasingly embraces this kind of anti-“coastal elites” populism.

29:18 Paul Matzko: You mentioned Senator Josh Hawley from Missouri, who’s the youngest senator in the Senate. He’s gonna be around for a while, and I think he represents… He just spoke at the National Conservatism Conference. Who knows how history will look back at the next decade, so that’s an imponderable, but there is an arguable case that the future of the political right is much more represented by voices like Josh Hawley’s; he might run for president some day and the like. Which implies this is not just of the moment; this is going to be a continuing issue, this kind of skepticism about big tech, and 230 in particular. And Hawley himself actually proposed a piece of legislation, though it didn’t go anywhere in DC. But I think it shows how the Overton Window is starting to shift when it comes to tech. Once upon a time, there was a broad bipartisan consensus that the internet is good, that Silicon Valley is doing good, useful work, and both Democrats and Republicans basically protected the internet from calls for regulation. There was a bit of a consensus, and that’s starting to break down on both sides for different reasons, sometimes overlapping reasons… But Hawley’s proposal was to give…

30:35 Paul Matzko: And the way he put it was, “With Section 230, tech companies get a sweetheart deal that no other industry enjoys: complete exemption from traditional publisher liability in exchange for providing a forum free of political censorship.” So it’s your point about this neutrality standard, and there’s this…

30:54 Matthew Feeney: Notice that he’s also calling them publishers.

30:56 Paul Matzko: Publisher versus platform kind of thing going on there, so this is inaccurate. But he used that argument to call for giving the FTC, the Federal Trade Commission, the power to certify or license online platforms that prove themselves, and the burden of proof would be on companies to show, “Look, we’re being neutral in our content.” Which is a breathtaking extension of government regulation of speech.

31:25 Matthew Feeney: It is, and it’s also directly aimed at what people today call big tech. It’s limited to companies that I think have a minimum of 300 million users, and I think there’s also a revenue requirement, half a billion dollars or something like that. I could be wrong, but my point is that it’s directly aimed at the Googles and the Facebooks of the world. And it’s very, very concerning. I think people should really consider what the implications of this kind of legislation are: that a private company would have to demonstrate to federal bureaucrats that it is being politically neutral in order to get a certification that it is allowed to function.

32:06 Paul Matzko: Yeah.

32:08 Matthew Feeney: And like you said, people don’t appreciate enough, I think, especially in this town, that what happened in Silicon Valley over the last couple of decades is rivaled only by the movable type printing press in Europe centuries ago, insofar as it’s a revolution in the ability of people to express themselves, to explore new ideas, to publish things. And it seems as if, in the name of some kind of national conservative populism that I don’t really understand, this should be somewhat stifled; that the burden should be on a private company to demonstrate to bureaucrats that it is politically neutral, whatever that means. And I’m not being flippant when I ask, well, what does true political neutrality imply or look like? If you’re Google, how would you even prove that? Especially since I haven’t seen any good explanation of what Google looks like in this Hawley world, where it’s had to demonstrate to the FTC that it’s politically neutral.

33:11 Paul Matzko: Well, and the cautionary tale with this proposal is that folks in the DC area immediately started calling it a fairness doctrine for the internet, and as someone who’s written about the actual fairness doctrine for radio and television broadcasting, that is what it is. And the same thing was true with the fairness doctrine: it was ill-defined, it was well-intentioned. It was like, “We want radio and television broadcasting to be fair, to be equitable, to represent both conservatives and liberals. Who could be against fairness?” You know?

33:41 Matthew Feeney: Well right.

33:41 Paul Matzko: And… But what it ended up being used for was the most successful government censorship campaign of the last half century. The fairness doctrine was used to shut down right-wing broadcasters, because what counts as fair is in the eye of a bunch of bureaucratic beholders, and he who appoints the bureaucratic beholders gets a lot of power over fairness doctrine enforcement. And they used it to shut down, ironically, the ideological ancestors of the conservatives currently making the argument for a fairness doctrine for the internet, right? It’s just bizarrely…

34:14 Matthew Feeney: Yeah, and people should consider what exactly this would look like.

34:19 Paul Matzko: Yep.

34:20 Matthew Feeney: If Google had to demonstrate political neutrality, would they have to demonstrate that when someone Googles the Sandy Hook massacre, Alex Jones’ allegations that this is all a total hoax and that these people…

34:33 Paul Matzko: Both sides. Yeah, right.

34:34 Matthew Feeney: And that this is all actors and part of a gun control gun grab. Should that appear just below Wikipedia or Reuters, The New York Times, The Wall Street Journal? This is like…

34:45 Paul Matzko: Oh, yeah.

34:45 Matthew Feeney: Or even a website that hosts the reports about the atrocity?

34:51 Paul Matzko: Yeah.

34:53 Matthew Feeney: These are the serious questions that people sitting at Google would have to consider in Hawley’s world.

34:58 Paul Matzko: Or when the White House, I think last year, was encouraging people to send in complaints about anti-conservative bias to a White House website. One of the proofs the White House gave that there is an anti-conservative bias among online news outlets was that the number of articles critical of Trump during the election was larger than the number of articles critical of Hillary Clinton. Entertain the possibility that you have a situation where the idea of neutrality means there is meant to be an equal number of good and bad things said about someone. You can imagine the trouble you get into when it’s like, “Well, maybe there are more bad articles about a candidate because that candidate’s less popular, or did some bad things that are newsworthy, or you know.”

35:55 Matthew Feeney: Right. People should make sure not to confuse being unbiased with being objective.

36:00 Paul Matzko: Yeah.

36:00 Matthew Feeney: In the sense that to draw another example, if you Google flat Earth theory, you’re gonna find a lot more articles refuting it than…

36:11 Paul Matzko: Yeah, right, right.

36:12 Matthew Feeney: Arguing in favor of it.

36:13 Paul Matzko: Not neutral.

36:14 Matthew Feeney: Right. But is that a demonstration of bias? It’s not necessarily like Google has a vested interest in the earth being round.

36:24 Paul Matzko: They’re reflecting public opinion.

36:25 Matthew Feeney: Well, they’re reflecting what people believe in the world, yeah. And all of what we’ve said in the last few minutes, I think, just compounds the fact that this would be an almost impossible standard to reach.

[music]

36:42 Paul Matzko: So what’s our response to these kinds of criticisms? What do we say? I mean, something we were talking about before was that there are alternatives. It’s not like people are locked into Facebook or Twitter or any of these outlets. Why does that matter?

37:00 Matthew Feeney: It matters because competition is preferable to anything we’ve just discussed as far as the Hawley proposal and other proposals to treat social media companies as publishers. The first thing I would say to conservatives listening who believe that there is this concerted anti-conservative campaign in Silicon Valley is to…

37:23 Paul Matzko: Yeah.

37:24 Matthew Feeney: I beseech you to look at the evidence. All the evidence I’ve seen of this is based on anecdotes, misunderstandings of how these algorithms function, misunderstandings of how the companies function, and it also ignores a bit of history, namely that left-wingers have had similar concerns in the past about alleged bias. If that’s not convincing, though, if you still believe it, then exit is always your right; you have the ability to start alternatives. Nothing is stopping people who want to build websites that host conservative speech. Now, some people might argue that these companies are monopolies and that competition is impossible. Maybe that’s a topic for another podcast, but I just don’t buy that.

38:10 Paul Matzko: Yeah.

38:10 Matthew Feeney: Our colleague Ryan Bourne has written quite a bit about big tech monopoly concerns. I’m not convinced that it’s impossible to compete.

38:18 Paul Matzko: There are alternatives even if they are smaller, right?

38:21 Matthew Feeney: There are alternatives.

38:22 Paul Matzko: You could use DuckDuckGo instead of Google.

38:23 Matthew Feeney: You can use DuckDuckGo. You can use Gab. People…

38:26 Paul Matzko: Right.

38:27 Matthew Feeney: Gab is not at risk of being shut down; it just doesn’t have as many users as Twitter. I also think… I’d much rather have private failure than public failure. If we live in a world in which something like the Hawley proposal passes, you’re just one really bad bureaucratic decision away from the internet as we know it being really hamstrung and ruined. I’d much rather have the environment we have at the moment, in which YouTube, Facebook, Twitter, Airbnb, Tinder, all these companies come up with their own content moderation policies. Some are better than others, and none of them are gonna get it perfect, but they all have an incentive to try to do the best they can for the largest number of users. And that system is better, especially in a world where competition is possible.

[music]

39:18 Paul Matzko: Now I’d like to bring on Jennifer Huddleston. She and Brent Skorup are from the Mercatus Center, and they recently wrote a paper about how Section 230 is evolutionary rather than revolutionary: how the courts were already slowly making their way toward a system like the one Section 230 provided, but it was taking time to get there, time and money and delayed development that we were able to forgo because of Section 230. And this kind of contradicts some of what’s coming out of congressional hearings right now. Could you explain that for our audience?

39:52 Jennifer Huddleston: Certainly. So one of the things critics on both the left and the right often criticize Section 230 for is this idea that it’s a deviation, this huge gift to the tech companies. What my paper with Brent Skorup shows is that it actually wasn’t so much a deviation or a gift as it was an acceleration of trends away from strict liability for publishers that had begun in the 1930s. So, a good 60 years before Section 230 and the internet were even a conversation, we saw these trends with earlier mass media technologies like radio and the wire services, developing via things like the wire service defense and conduit liability.

40:38 Paul Matzko: So you’re, like, the Associated Press or some wire service and you’re sending a news story, relaying it along the wires. Should you be held liable for the content of what you’re just transmitting? Is that the kind of idea?

40:53 Jennifer Huddleston: Well, certainly. Looking at radio and various other forms of media: if a host said something defamatory on the radio, what should be done? And again, just like with internet platforms, you certainly could say go after the individual, but can you go after the conduit that is carrying that information?

41:12 Paul Matzko: And you’re saying that even in these other media forms, there was some kind of protection for the platforms or the conduits that they couldn’t be sued successfully in court.

41:24 Jennifer Huddleston: Right. So we had started to see an erosion of strict liability, a shift from saying that carrying this type of information was defamation per se to a much more nuanced look. We also see this with libraries and newsstands that might have carried a book containing some questionable material: how much responsibility could you reasonably be expected to have over everything that was contained there?

41:53 Paul Matzko: So you don’t want someone to be able to sue the library because they carried a book that they just happened to not approve of.

42:00 Jennifer Huddleston: So a lot of this involved different kinds of questionable or even defamatory material, and the question of how much responsibility to assign, given that something like defamation is not always clear-cut in a lot of cases.

42:14 Paul Matzko: There’s this evolving legal defense going on for radio, printed matter, television. Take us up to the 1990s, the early days of the internet. How do we see that evolving into the digital age?

42:29 Jennifer Huddleston: Right. So two courts come to two very different conclusions. In one case, CompuServe, the court says that the internet intermediary is not liable, and in the other case, Prodigy, it says that it is. And out of this kind of divergence came Section 230, because there was this concern that the courts could potentially go down the wrong path. And that’s part of the interesting phenomenon of Section 230: while it does resemble a continuation of what was evolving at common law, it accelerated that evolution at a really important time, which allowed the internet to really flourish with all sorts of user-generated content. A lot of times when we talk about Section 230, we think of things like social media sites, search engines, or even, in the earlier days, online message boards. But it also impacts a lot of other areas of the internet that rely on user-generated content; things like sharing economy platforms and review services have also been allowed to really flourish under Section 230. Whereas, if the law had continued to evolve at common law, there would have been this concern about what to do in those cases where the courts might deviate from the traditional trend.

43:54 Paul Matzko: So we might have still ended up at a point where we’ve eroded strict liability for conduits or platforms, but it would have taken many more years, a lot of frivolous lawsuits, a lot of just delayed development, I suppose.

44:09 Jennifer Huddleston: Absolutely.

44:10 Paul Matzko: Interesting. Well, Jennifer, thank you for your time, and we’ll put a link to your and Brent’s paper in the show notes. And for our listeners, until next week. Be well.

44:26 Paul Matzko: Thanks for listening. Building Tomorrow is produced by Tess Terrible. If you enjoy Building Tomorrow, please subscribe to us on iTunes or wherever you get your podcasts. If you’d like to learn more about libertarianism, find us on the web at www.libertarianism.org.