E443 -

How big is too big for Big Tech?

Hosts
Trevor Burrus
Research Fellow, Constitutional Studies
Aaron Ross Powell
Director and Editor
Guests

Will Duffield is a research assistant at Cato’s First Amendment Project.

Matthew Feeney is head of technology and innovation at the Centre for Policy Studies. He was previously the director of Cato's Project on Emerging Technologies, where he worked on issues concerning the intersection of new technologies and civil liberties, and before that, he was assistant editor of Reason.com. Matthew is a dual British/American citizen and received both his BA and MA in philosophy from the University of Reading in England.

SUMMARY:

Why are we talking about "Big Tech" now in a way we weren't 5 years ago? Cato's own Matthew Feeney and Will Duffield join Trevor to discuss how the 2016 election changed the political landscape, the value of moderation, and how digital infrastructure influences a platform's power.

Transcript

[music]

0:00:07.6 Trevor Burrus: Welcome to Free Thoughts, I’m Trevor Burrus. Joining me today is Matthew Feeney, Director of Cato’s Project on Emerging Technologies, and Will Duffield a policy analyst in the Cato Institute Center for Representative Government. Welcome to Free Thoughts, gentlemen.

0:00:21.6 Will Duffield: Thank you.

0:00:22.7 Matthew Feeney: Thanks for having me.

0:00:24.5 Trevor Burrus: Now, we have this idea of big tech, and it seems over the last couple of years, we are talking about big tech a lot, and we can get into what that all means, but why does it seem that we didn't really talk about this maybe five, seven years ago? Has anything important happened in recent years that made suddenly Facebook and Apple and whatever else seem like concerns?

0:00:50.9 Will Duffield: Well, after the 2016 election, they became salient to politics in a way that they hadn't been before. Previously, people used them to talk about politics, but they were mostly on the sidelines. With the election of President Donald Trump, however, the power of social media came to the fore.

0:01:10.5 Matthew Feeney: Yeah, I think Donald Trump has a lot to do with it. The left-wing reaction to the emergence of Trump, I think, involved a lot of complaints about social media platforms being irresponsible in checking alleged Russian interference in the election and the spread of misinformation and disinformation. And since Trump's election, I think the right has gripped onto a narrative that these companies are staffed by left-wingers who are intent on stifling conservative speech. I think both of those concerns have prompted a lot of what we've seen surrounding big tech in politics recently.

0:01:50.0 Trevor Burrus: Does it matter?

0:01:53.2 Will Duffield: The center-left has really fallen out of love with big tech or social media as a lot of its sort of predictable effects and uses have come home. When Obama used it to great effect and we saw things like the Arab Spring in the Middle East, there was a celebration of its emancipatory potential, but when that emancipation was of populist attitudes in the American heartland, it looked different.

0:02:21.4 Trevor Burrus: Yeah, that was actually exactly my question, does it matter… And we'll get back to this, but does it matter that it was Donald Trump versus Obama? Because Obama famously, let's just say, tried to influence the electorate with an, at that time, extremely novel and complex social media strategy that was praised, and as you pointed out, these companies are very, very left-wing in terms of their employee base. So it seems that using this power to elect Donald Trump might be the thing that upsets them more than using this power to elect Barack Obama.

0:03:00.8 Matthew Feeney: Yes. Yeah, I think that's certainly possible. What I think we've seen over the last few years is an understanding across the political spectrum that these kinds of social media platforms can be used for a variety of reasons, for good or ill, not just by political candidates, but by foreign intelligence agencies, by charities, by non-profits. And all of that, I think, has complicated the conversation, because I find it's often very difficult to pin down exactly what people are upset about. Even when you think about the right-wing complaint about the alleged bias in Silicon Valley, it's not clear if that's a complaint about a culture at large, as in, this is just an extension of a complaint about San Francisco being very lefty, or a complaint about a particular business model, or a complaint about a particular company. I think a lot of the discussion about big tech lacks clarity, and it's very unhelpful. Something I think we should point out is that no one has a definition of big tech that is consistent. It's not clear to me that it means the most visited websites; Wikipedia is very rarely mentioned, for example, in these contexts, and there are sometimes discussions about social media that neglect streaming services. It's all very, very complicated and lacks a lot of clarity.

0:04:19.1 Trevor Burrus: I wanna go back and we're gonna clarify as much as we can, but I think we should start even at the bottom, 'cause there's a lot of misunderstanding on what social media companies are actually selling. What is the business model of a social media company? We know that they connect us and they let us see our friends and see what they're doing, but what is the actual product that they're selling at the end of the day?

0:04:46.2 Matthew Feeney: Well, this is something that often doesn't get discussed enough, because the right, and the left to some extent, get away with calling these companies monopolies, when I think if you actually put on your Econ hats and look at what's going on, it's clear that most of these companies are competitors with one another, because what they're selling is not social networking. No one listening to this podcast ever received a bill at the end of the month charging them for their month of social media networking consumption. What they're selling is digital advertising space, and Google, Facebook, Twitter are all competing with one another to be the place that businesses want to place their digital ads in, and that's where the competition is. So it is wrong to think of there being a market in social networking or search per se, because these are all giant companies competing with each other over primarily digital advertising space.

0:05:39.9 Will Duffield: And so they aren't just competing with one another, but also with a host of other ad providers, be they billboards, network television, that sort of thing.

0:05:51.6 Trevor Burrus: So in their minds, what is a good product then? I think this is important when we're gonna get into the content moderation, but what, in the most abstract… And you guys have had more meetings with these companies to see their viewpoint, but what does a good product look like? It has ads, but it has some amount of social interactivity, but ultimately they care about how many ads they sell. Correct?

0:06:16.5 Will Duffield: It's someplace that people enjoy spending time, because if people are spending their time on some platform or website over another, they're viewing the ads on that platform or website rather than something else. Now, you'll often hear a complaint that because of this, platforms are incentivized to do anything they can to capture people's attention, to sort of titillate them and keep them engaged, and while engagement is important, it's not a one-off phenomenon. If you saturate your platform with sort of outrageous, inflammatory content, people will start to tune it out pretty quickly. There's a reason that places like 4chan aren't all that popular. So when we think about what platforms want to do, what they're trying to cultivate on their platforms, and ultimately why they moderate, it's often in order to maintain a pleasant environment, both for users and for advertisers.

0:07:19.7 Matthew Feeney: Yes, and it's actually a rather difficult balance, because you want to bring as many people to the platform as possible, you want them to be interested in a new MacBook or lawn mower or whatever, while also making sure the content on the platform doesn't repulse whatever the critical mass there is, and they use a lot of information about people to figure out how to tailor this. Something I try to point out to people is that these companies are only doing something very familiar, just a lot better. So for example, it was worth beer companies buying advertising space for television during the Super Bowl, 'cause they could just make a guess that a lot of beer drinkers watch the Super Bowl, so it was worth them spending millions of dollars to place ads there. And what you see Facebook and Google doing is basically the exact same thing, which is using information that they have about customers to put advertising in front of the correct eyeballs. It creeps a lot of people out in ways that billboards don't seem to, and that's a whole other discussion, and a complaint you primarily see from the left, but yes, that is the underlying business model.

0:08:25.6 Will Duffield: And because it's targeted, it can be sold much more cheaply, which is great if you're running a small business or are just a consumer of some niche product, because most products don't warrant a billboard or Super Bowl advertisement. They aren't such a mass phenomenon, and it's been hard to sell those niche things previously, but now it's much easier to connect with the consumers who actually want them.

0:08:55.9 Trevor Burrus: Now, when it comes to Google, this is a little different. Matthew, you talked about competition between Google and these other companies, Facebook, Twitter, for example, but Google is not a social media company. It seems like it's selling a different type of product, so how do Google and, say, Facebook compete?

0:09:16.1 Matthew Feeney: Well, they are competing in the sense that they are hoping that a digital advertiser, that is, someone who seeks to place a digital ad, will spend their dollar on advertising on Facebook over advertising with Google, and Google has over many, many years developed algorithms that ensure, to Will's earlier point, that people who have a tailored product will see ads when they search for them on Google. And it is true that Google doesn't run a social media network, although we should mention that now, I suppose technically, Google is a subsidiary of a bigger animal called Alphabet, and YouTube is one of Alphabet's other businesses. And YouTube, I think, certainly is what can be called a social media network. But I think it's a good question, because oftentimes, as I said earlier, there's a little bit of a lack of clarity, where people seem to be in one sentence complaining about a search engine, and then in the next sentence complaining about a social media site. It can all get rather confusing. But despite all that rhetorical confusion, it's still fundamentally the case that Facebook is competing with Google for the money from people seeking to advertise their goods online.

0:10:31.3 Will Duffield: They’re also competing within the search market. We tend to think of general search as the only kind of search in town, something like Google, which queries the internet for some sort of information, but lots of searches begin other places. When we’re looking for different sorts of things, we use different platforms to search for them. Product searches may begin within Amazon. If you’re looking for an old high school flame, you might look through Facebook, or if you’re looking for a video, you might start within YouTube itself. Google has multiple search products that in some cases compete and cannibalize from one another, but this is fine because we, the consumers, the people who use all these different search tools, then have the most ways to search for things.

0:11:20.1 Trevor Burrus: In the process of doing this, Google, of course, does not want… If you search the Loch Ness monster, it does not want a completely random assortment of websites to come up concerning the Loch Ness monster, from the smallest conspiracy theorist website to Wikipedia, because that becomes quite unusable, so it puts an algorithm on it and it ranks and de-ranks different search results. But we have to admit that that is a disturbing level of power to kind of frame the world, given to a company that is not in any way politically or democratically accountable. Isn't it proper to be concerned about that power?

0:12:03.3 Matthew Feeney: I think it's fair enough to be concerned. Something that oftentimes gets lost in the conversation is that Google's search engine is powerful because it uses links to sources as a proxy for reputation. So for example, when you search Loch Ness monster, the Wikipedia page for the Loch Ness monster will be one of the first hits, not necessarily because someone at Google headquarters has thought, Well, by design we'll always make sure Wikipedia is on page one; it's that a lot of people link to the Loch Ness monster page. And this, I think, does actually inform some of the conservative complaints about Google, where they might take issue with a lot of what might be on Wikipedia, for example. And in this kind of conversation, I always return to the classic question our colleague Peter Van Doren often cites, which is: compared to what, and at what cost? Well, yes, it's true that you can run a search engine like Google does, which is to use links as a proxy for reputation, but it's not the only way you could do it, but the…

0:13:09.0 Matthew Feeney: You need a way to ensure that when someone searches Jaguar, they're getting results for the animal they want, not the car that they may want to buy, and that requires a lot of heavy lifting behind the scenes to ensure that people are getting what they want. So Google's method is one way to do it. It seems very popular, but it's by no means the only way to do it, and I don't blame people for being wary of Google, because it is very popular, very powerful, and a lot of internet searches do occur there. But the question is, why hasn't someone come up with a system that would be better? It seems to do what a lot of people want it to do.

0:13:50.7 Trevor Burrus: Do we know that Google… Aside from the… Whatever the algorithm is doing in the background, do we know that they… Whether they intentionally de-​rank pages?

0:14:01.0 Will Duffield: They will, yes.

0:14:03.0 Trevor Burrus: They will.

0:14:03.7 Will Duffield: And they have, both for their own reasons and in response to government requests at times. However, I think it does illustrate that this is far from a process merely of Google deciding exactly what comes up in each search result, because while sometimes they do intercede, this is mostly an algorithmic output. People are constantly trying to change their pages and get linked to by other sites in order to improve their search ranking, and Google is constantly retuning its algorithm in an attempt to surface what it considers to be the best information. So this is a more combative or adversarial process that's always going on than I think a lot of people give it credit for.

0:14:58.8 Trevor Burrus: Now moving to social media… I guess I’ll just first ask the question that in some ways is the elephant in the room for a lot of people, Are social media companies censoring conservatives… Or maybe more specifically, are they censoring conservative viewpoints at a higher rate than other viewpoints?

0:15:19.0 Matthew Feeney: Well, the glib response would be, Define conservative. A lot of people here would, I think, say, Look, we understand that you want to remove white supremacist content, but it's not fair to call that conservative; we're concerned about mainstream Republican candidates being censored. I, in my work on this, have yet to find any conclusive, credible evidence that there's a systemic censorship campaign in Silicon Valley to halt the spread of conservative messages. I understand that a lot of conservatives find that hard to believe, given what they see on the platforms, but no, I am not convinced that there is a concerted effort to stifle conservative speech. And I should mention that if you tell left-wingers that there is, they'll look at you like you have grown a second head, because to a lot of people on the left looking at these social media platforms, they see the spread of MAGA content, the spread of white supremacist content, the spread of domestic terrorism content, and they can't believe that anyone outside looking in could say that there's a bias in favor of left-wing content.

0:16:40.9 Matthew Feeney: And something else I should mention, and I'm sure Will has something to add, but I'll just mention that it's an interesting trend in American conservative history that American conservatives are quite good at seizing onto new communications platforms, the rise of conservative talk radio being perhaps one example. And you do see, on Facebook in particular, that a lot of conservative pundits do have widely shared articles, but nonetheless, that does nothing to dampen the rhetoric that there is an anti-conservative crusade in Silicon Valley.

0:17:12.9 Will Duffield: I think there are some rules that cut against conservatives more heavily; Twitter's ban on deadnaming prohibits certain expressions of conservative views on gender. However, yes, as Matthew says, the relative effect here can't be ignored. If these were media outlets so tightly controlled by the left, or by liberals in Silicon Valley, that they could effect these sorts of changes, they wouldn't be home to such a wide variety of right-wing content. I think Facebook in particular has become more and more of a boomer platform over the past half decade at least, and with that age cohort comes a lot of conservatism, so we should expect that pundits who speak to that group in particular, people like Ben Shapiro, would do well on the platform, and they have. So I think the general relative effect of social media, particularly in relation to cable news or the kind of dominant national papers like The New York Times, has been to advantage conservatives by giving them a huge new megaphone, when they had less access to the big megaphones of the '90s or early aughts.

0:18:50.0 Trevor Burrus: With Donald Trump, though, I think a lot of conservatives believe that, especially in the 2020 election, these were left-wing companies, and it is true if you look at their… I think at Twitter in particular, the political donations to Democrats versus Republicans are something like 75 to one. So at these companies, you can assume that the average employee, if you took them as a sort of mainstream Democrat, and especially a mainstream Democrat in Silicon Valley, really did not want Donald Trump to be re-elected. It was regarded as an existential threat to the country, if not the world, and so therefore, it makes sense that maybe there were conversations, that there was some sort of discussion, that if they could just put the smallest finger on the scale and say, "De-rank Breitbart" or not have this stuff show up, then they could save the country from what in their perspective was a possible cataclysm. And so I think that a lot of conservatives believe this is what happened, and that it's reasonable to believe that that is what happened. Is there any truth to that, or is there a reason to be suspicious, I guess?

0:20:03.1 Will Duffield: I think what really muddies the water is the presence of Russian propaganda, regardless of its effect in 2016, because that was seen as a matter of national security. Now, I don't think that Russian propaganda tipped the election to Donald Trump; their ad spend was relatively minor, and they were mostly saying the same things as other American conservative populists. However, because you'd seen this foreign propaganda effort, platforms were pushed to take steps, and wanted to take steps, to limit how foreign countries could interfere in our elections, and over the four years between 2016 and 2020, they implemented new policies to prevent even the appearance of foreign meddling. On Twitter, one of these new policies was a prohibition on hacked materials. It was first used against a trove of leaked police files in August of 2019, but… 2020, but no one really cared about that. It wasn't terribly politically salient. However, when it was applied to the New York Post story about Hunter Biden, it stirred up a hornet's nest. A lot of people felt that Twitter had intervened there to tip the election. Now, your concerns about the power of these platforms, particularly over the press or media entities, are, I think, perfectly reasonable.

0:21:48.1 Will Duffield: However, in that case, being knocked off Twitter for a short period of time does not mean that the article didn't spread, and looking at the Google Trends around searches for Hunter Biden, you see a kind of twin-humped curve, with the second, larger peak after the Twitter ban, as discussion of the New York Post Twitter ban really drove a lot of interest in the underlying story that might not have been there otherwise. Beyond that, I don't think there are any particular incidents during the election, or in the lead-up to it, that people can or have pointed to, and I'm skeptical that the New York Post Twitter fracas did anything to diminish support for Donald Trump.

0:22:39.7 Matthew Feeney: I do think it's funny that… It's possible that Twitter just knows a lot about me, obviously, but I oftentimes hear about these Twitter content moderation controversies because Twitter tells me that it's trending, and a lot of the news surrounding what goes on on these platforms trends on social media. But yes, I share Will's assessment that I'm not aware of anyone likely having changed their vote because of the Hunter Biden story. That doesn't mean there weren't people out there who did, but post the 2020 election, I think that is the story that conservatives most often cite in content moderation concerns, because it did seem like it was a thumb on the scale in a really important political election. And there is something… I think people in Silicon Valley realize this, but there is something rather odd about an American company saying, "Our goal is to get as many people online as possible, but actually, if you're the President of the United States, we might make an exception." I understand how people think that's a little odd.

0:23:50.3 Trevor Burrus: Well, we can criticize certain content moderation decisions too, because the Hunter Biden one seemed to be a little rash, maybe, in hindsight, and we could talk about other ones, such as the lab leak hypothesis with COVID-19 and whether or not this should have been shut down. But the…

0:24:07.9 Will Duffield: Oh, I mean…

0:24:10.0 Trevor Burrus: Go ahead.

0:24:10.1 Will Duffield: One of the worst ever, and it really doesn't get much attention at all: early in the pandemic, when the CDC was still telling people that masks didn't work and weren't helpful, Facebook and most of the other platforms prohibited ads for masks in an attempt to prevent price gouging or prevent shortages, but again, at that point, people can't find the masks as easily. Now, what makes that sort of worse, or most concerning, in a way, is that the government did lead the charge. And I think, looking around at other content moderation decisions, so long as there's a diversity of rules, if one platform among these 12 or so big sites decides to ban something, then that's one thing; when, led or spurred by the government, they all turn to ban something, I think that should raise more of a concern. It looks less like a business, advertiser-led decision there. Not to say that there aren't cascades that happen in the real world. When ISIS came along, it sort of prompted a crackdown on video content everywhere, as ISIS videos joined the mix of what people uploaded and advertisers across all platforms reacted to it, but again, that's more of a cascade than Jen Psaki asking for something to be removed and platforms doing it.

0:25:43.6 Trevor Burrus: So why do we let them do this though?

0:25:45.6 Matthew Feeney: Well, the answer to that question, I think, depends on what you think an unmoderated kind of platform looks like. So there are easy cases, or comparatively easy cases, where… What might a good example be… So listen to this, you might remember the Tide Pod challenge, right? Where there were teenagers filming themselves eating Tide Pods, and I hope listeners know well enough that you probably shouldn't deliberately or accidentally ingest detergent. Not good for you. And you don't need to be an MD to know that that's going to have serious adverse effects on your health. But a new virus emerges and it's clear that it's serious, but there are still a lot of unanswered questions about its transmission, what we know about the effectiveness of masks and whatnot. And you're the CEO of a company that hosts millions of pieces of content every day, and if you make a serious error in allowing certain content to spread, you don't wanna be held responsible, not legally, but even morally, or in the eyes of the world, for making a pandemic worse.

0:26:52.3 Matthew Feeney: And I think it's understandable that Facebook might say, "Well, we don't know the answer, so we'll defer to the CDC. So if the CDC says something is nonsense, then we'll say it's nonsense." And of course that turns out to be a disaster when the CDC screws something up. But I think it's understandable that they would want to do something, because the potential for real world harm is pretty significant when you're talking about the effects of a pandemic, but there are obviously imperfect applications once you decide to do content moderation there.

0:27:25.7 Trevor Burrus: Well, we let them do this in such a way that their own political opinions can drastically skew the world. The broader question is why we don't demand or apply something like a First Amendment standard to Twitter. Because say they decide that they don't want people to talk about anti-immigrant views, and I'm radically pro-immigration, but I think that the anti-immigrant view is a legitimate political viewpoint, and one that is worth discussing. If Twitter decides it doesn't want that to be discussed, then they can do various things to make it not discussed, and that becomes an issue. So why do we let them do this kind of content moderation where they make so many mistakes, and why don't we just have a First Amendment standard and let people talk about whatever they wanna talk about?

0:28:15.8 Will Duffield: On one hand, I think it would tend to make these platforms unusable. No matter what the purpose of a forum is, there's always some speech that's going to be off-topic or even just spammy. If someone attempts to contribute, maybe they have something good to say, but they do it 15 times a second, no one else will be able to speak. So that, at the outset, makes a First Amendment standard difficult. There are almost no public spaces in America that are actually run under only the First Amendment. Instead, the First Amendment sits behind; it prevents the government from interfering in whatever private rules do govern how people are expected to conduct themselves in the place. That could be a bar, that could be a sporting stadium, whatever.

0:29:11.3 Matthew Feeney: I would add that the First Amendment is not something that the founders put together so that we'd all be able to host really interesting dinner parties. It's not a sort of broad "free speech is great" provision of the Constitution. The point of it is to prevent the government from censoring speech. So the government can't come in and tell a newspaper, "Well, we really like that candidate, so please don't publish that damning op-ed." And what that means, though, is that the responsibility for deciding what good speech and appropriate speech looks like is on private institutions like Twitter and Facebook. So on the one hand, there's this utility point, which is, if you just said, "We're gonna turn Twitter into a First Amendment theme park," it's just gonna be unusable because of spam, and it would be very difficult to actually navigate it usefully. But secondly, there's the legal point, which is, I think, fortunately, we all live in a country where the government is not allowed to censor speech that in many other countries would be censored.

0:30:13.1 Will Duffield: So in the United States, a lot of speech that would be classified as hate speech in European countries is permitted, images of cruelty to animals are permitted, and a wide range of racist speech is permitted. I think you can simultaneously say, I'm glad the government doesn't censor that, but also that it should be legal for private companies to decide to distance themselves from that sort of content. And I focused in this comment on racist speech, but the same, I think, applies to medical speech and conspiracy theories.

0:30:50.9 Will Duffield: I think it's really important to recognize how unique and strong the First Amendment is in protecting speech, not merely because it's such a safeguard of our liberties, but because it explains why we've ended up with the politics of the internet that we have. The internet dramatically lowered the cost of speaking for almost anyone. Eugene Volokh has a great paper in which he refers to this as cheap speech: all of the new things that people use the internet to say, which previously they wouldn't have thought worth the cost of the paper and ink. And some of this off-the-cuff cheap speech isn't very nice or good, some of it's even illegal, and the courts have had to deal with a lot more libel and defamation claims over people's Instagram restaurant reviews and the like than they ever had to before. In other countries, the deluge of cheap speech has been met by new laws either restricting what people can say or placing new duties on intermediaries to filter speech, but in the United States, the First Amendment prevents that. Government hasn't been able to make new laws to restrict what people can do with the internet. However, the demand to censor some of this new speech hasn't gone away; instead, it's sort of routed around, into or through other policies. And you'll see antitrust, for instance, used really more in order to warrant attempts to control who controls which media.

0:32:33.8 Will Duffield: Or product liability used to constrain what kinds of speech tools are available. And, with increasing relevancy, politicians informally command or demand that platforms remove or leave up certain speech, a phenomenon called jawboning, where, instead of using a legal power to censor that they don't have, they'll instead threaten some consequence, something they might be able to do, an inspection, the end of government contracts, that kind of thing, if the platform doesn't privately make some speech decision that they would like. So the First Amendment is great in providing us with such a free internet and such safeguards for our speech on it, but that doesn't mean that the demand for censorship has gone away, and in many cases, it's found sort of novel avenues to weasel in.

0:33:34.7 Trevor Burrus: Let's talk about some of these alternatives to Twitter that have existed or still exist, I'm not actually sure. One of them is Parler. Another one was, I think it's called GETTR. And then there's The Truth, which is, I think that's the name of Donald Trump's, and I'm not sure what the status of that is right now. But they're trying to compete with different content moderation, so what happens to these alternative content moderated places when they get into the game of content moderation?

0:34:09.9 Will Duffield: I think very quickly, they end up having to come up with content moderation policies of their own, born of their unique circumstances. Parler was launched as a free speech platform and tried to be a home for conservatives who were upset with Twitter. Of course, these conservatives wanted to see conservative celebrities there and listen to their speech; these were important users for Parler to retain. So when trolls began creating fake parody accounts in the names of MAGA celebrities, despite its orientation towards free speech, Parler, I think quite understandably, cracked down on these, because they threatened its core premise, which was being a home for these right-wing speakers and their audience. And if someone was going to come in and prevent that from happening, then you either give up the platform or you have to make a rule about it.

0:35:12.9 Matthew Feeney: I do think it’s worth emphasizing here that the First Amendment is oftentimes a distraction in these debates, because I think it is viewed across the political spectrum as a talisman that just represents a free speech value; to say that you want a First Amendment space or that you believe in the First Amendment is nine times out of 10 a way of saying, I like free speech. And the reason that Parler, GETTR and all these others emerged is because there’s disagreement in a country as big as ours over what is acceptable speech, and that is something we should expect. What I think is interesting about Parler in particular is that they ran into trouble with the app stores where people would download the mobile app. So whether you’re on Android or Apple, a lot of people don’t think about this, because it actually goes deeper into the stack, as people say, which is that Google and Apple have policies about what you can and cannot do if you want to sell your app on their store. And so there you had Google and Apple saying, Look, you’re allowed to make whatever you want, but it’s not gonna be on our store. So you could still access Parler just using a web browser and typing in www.parler.com or whatever it was. The issue reared its head in a significant way when it was yanked from these mobile app stores, and there, I think, you get into a slightly more interesting conversation about the role of app stores and the cloud providers and web hosts, not the platforms themselves.

0:36:50.7 Trevor Burrus: That was my next question. How should we view that differently? So we went right to my next question: going down the stack, as you said.

0:36:56.7 Will Duffield: I think their power as gatekeepers is concerning, but at the same time, there are ways to ameliorate or circumvent it. And in many of the cases of these sort of dedicated right-wing apps, they haven’t done so, and have instead turned to the political bully pulpit at the first sign of trouble. And I think this causes a bit of a divide within the internet community about these sorts of apps, where they sort of run against an older cypher-punk ethos in their failure to take what some understand as baseline reasonable steps that others have taken to avoid the power of these gatekeepers, and then complain about that power when it’s exercised. Truth Social is built as an iPhone app first, and so yes, they’re very vulnerable to Apple’s App Store policies. However, there are platforms which built web first, in a very open source fashion. Something like Reddit has its app, and it has to comply with App Store policies and change how it presents content a bit within the app, but you can get the pure unfiltered Reddit through your web browser, and because it was built browser first, that takes priority; the whole product isn’t being shaped by the iPhone app. And so there are thoughtful steps that can be taken here, but I think unfortunately, when platforms are spun up for political reasons, they’ll often look to politics first as the arena in which to voice complaints or make changes.

0:38:45.0 Trevor Burrus: What should we think about the arguments? We’ve heard these arguments about common carrier provisions applied to platforms like Facebook and Twitter. But what about applying them to the web hosting services, deeper down into the internet? Because we had different websites, The Daily Stormer was one of them, that just couldn’t find web hosting. And sure, we can all say we don’t want Nazis with a bunch of websites influencing people. But what if that happens to you? What if you’re mainstream libertarians, and suddenly free-market views are considered to be immoral and on par with Nazis, so then they can take down Cato’s website, they can take down libertarianism.org? Should we be more concerned about the web hosting than we are about the websites like Facebook and Twitter?

0:39:38.9 Will Duffield: I think the market for web hosting itself is pretty thick. There are lots of providers, there are lots of domain registrars, and this includes providers of last resort like a company called Epik, which The Daily Stormer did end up with. I know a number of other sort of far-right sites have ended up there over the years. Now, this isn’t a foolproof solution. The fellow who runs this host has his own personal preferences; I know he’s, on Christian ethical grounds, objected to a pro-suicide site in the past and kicked it off. That doesn’t seem to extend to Nazism, however. If there is a place anywhere within the stack where you could imagine common carriage making sense, it is with some of these low-stack service providers, things that look and function a little more like commercial infrastructure and don’t sell ads, and therefore don’t sell content moderation as a selling point of the platform. But at the end of the day, it’s still dangerous in a sense to begin regulating willy-nilly, especially at the lower levels of the stack, when you expect or want changes at the edge layer.

0:41:09.8 Will Duffield: You’re sort of expecting a couple extra steps or things from regulation, because often there’s an edge platform that doesn’t exist and you want it to exist. And it might not exist for normal market reasons, or it might not exist because there’s something missing lower in the stack. Merely mandating what you want lower in the stack may be insufficient to get you the edge experience that you feel you’re missing.

0:41:36.4 Matthew Feeney: Something that is worth highlighting here is that common carriage is a species of regulation that emerges out of natural monopoly. Classically, this is how people are supposed to think about it. And as Will said, there’s quite a bit of competition within web hosts already. And so it seems like, on a classical understanding of common carriage, this would be incoherent even if you weren’t talking about platforms. This conversation really became more and more common in the technology policy space in the wake of a Justice Thomas opinion, where he outlined that maybe Google and other companies like that could be considered common carriers. But there I think it’s just a conceptual error, because you have to ask yourself, “Well, what’s Google carrying? And how is it the only provider?” Yeah, suffice it to say, I thought it rather incoherent. But like Will said, the argument is still flawed, but makes more sense the deeper into the stack you go.

0:42:45.6 Trevor Burrus: When we look at proposed regulations, and of course, there have been a lot of proposed laws, some of them just entirely political posturing, whether it’s Josh Hawley or Elizabeth Warren. But there’s this idea that some companies, Facebook, or I guess Meta now, have asked for regulation. As libertarians, we say, “Always be wary of a business that asks to be regulated,” which is a good general rule. But insofar as certain regulations are proposed, what should we be concerned about in terms of what they could do to innovation and the internet in general, if some of these regulations got passed?

0:43:24.7 Matthew Feeney: You need to first think about the costs associated with compliance. This is something libertarians talk about all the time, which is, private firms will object to regulations until they view them as inevitable, and then they will seek to write the regulations themselves. And it’s not surprising that powerful companies that have millions of dollars at hand, thousands of lawyers, thousands of engineers, are all well placed to go to Capitol Hill and tell senators, “Look, I understand that you want to regulate, but here’s a good proposal,” one that might require a lot of investment into content moderation or something similar. There’s a lot of cost associated with that sort of thing. Europe is a market to look at when it comes to regulations on technology companies. I think that in the wake of GDPR, Europe’s big tech regulation, you did see that Google and Facebook did quite well out of it as far as market share. When you hear about companies siding with certain politicians or legislation, listeners should keep an ear out for what that legislation might require and what it might do to the market.

0:44:36.7 Will Duffield: Yeah. If you aren’t happy with the current business models of social media, the last thing you want is regulation that’s going to lock in those business models and force every new competitor to essentially follow in their footsteps. I do think sometimes dominant platforms’ attempts to pre-comply with regulation that they either support or don’t can illustrate the flaws of regulation before it’s even passed. Facebook has made its own platform compliant with a proposed bill called the Honest Ads Act, which would regulate political advertising online in a number of ways and really expand the definition of what the FEC can police beyond the traditional electioneering statements about candidates to issue statements. Now, Facebook has struggled mightily to decide what speech is political, whether certain product advertisements for guns or an anti-Trump spice company are considered political. And in doing so, it shows what a bad idea the Honest Ads Act would be, because you would be expecting the FEC to come up with these sorts of definitions for everyone, and for platforms with far fewer resources than Facebook to apply them at scale. So sometimes, when we look at this proposed regulation and platforms’ attempts to comply with it, we can see why it would be a very bad fit for the ecosystem at large, even if the dominant platforms can bear it.

0:46:13.8 Will Duffield: I will add quickly that some listeners might be thinking, Well, couldn’t you just write legislation for only the big players, just to help competition? You could say, Look, this only applies if your revenue is at least half a billion a year, or if you have so many users. I’ll take those in reverse order, just so that listeners can ponder them. The first is that just because a website has a lot of visitors doesn’t mean that it’s wealthy; the best example of this is Wikipedia, one of the most visited websites on the planet, but it’s a nonprofit and only employs a comparative handful of people compared to Google and Facebook. Secondly, on the revenue point, my argument is that it just provides another incentive for smaller companies to sell to big companies. If you know that you’re building a social media company and you’re slowly gaining revenue, but you know that once you tip over into the half a billion dollars you’re gonna have to comply with new regulation, actually selling to Google for a few billion dollars, or a few million, looks kind of attractive. And so I don’t think it solves the competition problem at all.

0:47:22.7 Will Duffield: It can introduce real issues as companies scale as well. Imagine if you’d had a law saying, Companies with more than 50 million users have to have a non-algorithmic feed as well, like Twitter used to have. Well, now TikTok can never really be a proper competitor to YouTube or Twitch or Facebook, because it relies entirely on an algorithmic feed, it never had anything else, and it knows that at that size, it’s going to have to change its business model or its product offering dramatically. So that threshold always poses an issue when companies near it.

0:48:00.5 Trevor Burrus: That was where I was going next, with TikTok and other things that I probably am not even aware exist, ’cause that’s the oddity of these tech companies: as soon as 40-year-olds or 50-year-olds are paying attention to them, they’ve probably been way left behind by the 15-year-olds and what they’re doing. And so, as you pointed out earlier, I think, Will, Facebook is not used by 15-year-olds, as a rule; they’re using TikTok and Snapchat and other things. Facebook is sort of a boomer phenomenon, and we’ve seen that some of these misinformation concerns come from the fact that boomers share this stuff more than other people. But I think 10 years from now it would be hard to say which platforms might be dominant; 10 years from now you’ll have people who are 18 be 28, and there’ll be something new, and TikTok is of course very different in terms of moderation. So I guess the question is, what’s coming next? If we think of the ones that everyone talks about as being legacy players, what’s coming next for content moderation and social media, in terms of the difficulties with regulation and creating sort of rules that apply to all these, even not-guessable, technologies coming in the future?

0:49:21.9 Will Duffield: The guessable one is virtual reality. Everyone, certainly Facebook, wants to create the metaverse, and if you line up all the different sorts of content and platforms on a kind of spectrum, you can place text, like Twitter, with static searchable text, on one end, and live video all the way out at the other, with VR then extending the line further. Because instead of something that you can search back through or even easily record as a viewer, you’re suddenly talking about ephemeral interactions; they may not even be verbal, but instead someone gesturing at you in virtual reality. And so for platforms to moderate this in any kind of top-down fashion is going to be much, much more difficult than it is to search through Twitter and remove the bad tweets.

0:50:25.6 Trevor Burrus: I’m picturing a VR space where your Ben Shapiro of 10 years from now holds some sort of virtual speech that people have turned out for in a virtual universe, or some Nazi holds a virtual speech, and then everyone gets very upset about how these people are influencing the election via misinformation. Do we just have to accept it then? That does seem quite impossible to do much about, even on a private level. So do we just accept misinformation, and that people will be spouting off hateful rhetoric, and that’s how things are?

0:51:03.0 Will Duffield: I think the anxieties about other people’s speech and other people having the wrong or awful ideas will always exist, so long as there’s a diversity of views out there. However, there are reasons to believe or hope that new mediums may not have the same issues of scale as our current crop of platforms. It seems as though, given concerns about cancel culture, being the main character of Twitter, being publicly shamed, that sort of thing, a lot of particularly younger users are moving to more private platforms; they’re having conversations in a Discord channel rather than publicly on Twitter. And in some ways, VR re-imposes the physical constraints that we lost when we moved to the early internet: the hypothetical demagogue giving the speech, unless he has a kind of digital assist, is only going to be able to speak to so many people, only so many people can crowd in and see him, and someone else may get up on that stage and say something else. So I don’t know if the current crop of scale-driven problems will exist within every new social media platform, but concerns about what other people may come to believe will always be with us.

[music]

0:52:45.4 Speaker 4: Thanks for listening. If you enjoy Free Thoughts, make sure to rate and review us in Apple podcasts or in your favorite podcast app. Free Thoughts is produced by Landry Ayres. If you’d like to learn more about libertarianism, visit us on the web at lib​er​tar​i​an​ism​.org.