E55 -

What was the real beef between Crowder and Maza? Matthew Feeney and Paul Matzko dive into the mines of content moderation.

Hosts
Paul Matzko
Tech & Innovation Editor
Guests

Matthew Feeney is head of technology and innovation at the Centre for Policy Studies. He was previously the director of Cato's Project on Emerging Technologies, where he worked on issues concerning the intersection of new technologies and civil liberties, and before that, he was assistant editor of Reason.com. Matthew is a dual British/American citizen and received both his BA and MA in philosophy from the University of Reading in England.

As recent scandals on social media platforms have shown, content moderation is hard, thankless work. The lines between political satire, hate speech, historical documentation, and obscenity get blurry very quickly even in a single country, let alone when trying to create a one-size-fits-all global moderation standard. Companies like Facebook and Twitter are attempting to routinize their content moderation processes, but Matthew and Paul discuss whether those efforts—however well-intentioned—are too little, too late.

What happened between Crowder and Maza? What debates are happening in the tech space about content moderation? How do we determine hate? How does Facebook respond to questionable content? What is too radical to be posted online? What are the limits to hate speech?

Further Reading:

Free Speech Online: Unfriended, Building Tomorrow Podcast

Speech Police, Building Tomorrow Podcast

Practical Problems with Regulating Tech in the Public Interest, written by Will Rinehart

Transcript

[music]

00:09 Paul Matzko: So I think if you were even slightly connected to online social media in the past couple of weeks, you had the Crowder v. Maza imbroglio show up in your feed in some way, shape or form. And for those of you who haven't, don't worry, you didn't miss out on all that much. The rest of us are oversaturated with it, but let me fill you in. Crowder, Steven Crowder, is a conservative YouTuber who said nativist and homophobic things about Carlos Maza, which is sadly commonplace on social media.

00:43 Paul Matzko: Maza then complained to YouTube, which at first didn't do anything, saying that Crowder's words, while hateful, didn't violate the terms of service, because hate speech alone, with lots of qualifiers there, isn't necessarily a disqualifier under YouTube's terms of service. But then the situation went viral, the fact that YouTube wasn't pulling down Crowder's words or somehow punishing him for his comments. Folks criticized YouTube's flaccid response and YouTube said, "Oh, well, better do something."

01:20 Paul Matzko: So then they demonetized Crowder's channel, which Crowder then parlayed into a victimhood narrative that went viral on the right. So it's the kind of situation where no one was very happy with how they handled it. It was kind of a mishandling kind of situation. How did this pop up on your feed, Matthew?

01:40 Matthew Feeney: Oh, yeah, well, I do tech policy in the 21st century, so I have a Twitter account, and it's the kind of thing that was certainly making the rounds because it was prompting the kind of debates that I think we're getting used to in the tech space; namely, conversations about how some of the largest, most popular social media companies and content platforms handle content moderation. When people say "content moderation", they're talking about the processes that sites such as YouTube, Twitter and Facebook go through in deciding what to keep up and what to take down, and these are very difficult decisions to make that involve a lot of difficult questions.

02:27 Matthew Feeney: There are a lot of grey areas, and on top of the fact that it's difficult, there is an added… how best to put this? I suppose a cultural problem, which is namely that a lot of people feel particularly aggrieved and feel that a lot of the takedowns and demonetizations are politically motivated by the people who work at the headquarters of these companies, which are, of course, based out west in California.

02:55 Paul Matzko: And I think it can be tempting, when you look at any given situation, to think it's easier than it actually is. Neither of us is going to be sympathetic with the things that Steven Crowder said, they were hateful things, but determining what's hateful versus what is not… Determining hate can get really complicated really quickly. Determining things like nudity, things like violence, all of these are really complicated decisions.

03:28 Paul Matzko: I thought, actually, before we get to some of those complicating factors, we would talk about what that process looks like, and without talking about any specifics, you've kept an eye on how some of the corporate content moderation processes have evolved over the last couple of years. So why don't you walk us through? Let's say something pops up, something like the Crowder-Maza situation or any kind of potentially objectionable content. How does Facebook or YouTube or Twitter discover that? And then what does that process look like as it makes its way up the chain?

04:04 Matthew Feeney: I think something to point out at the front end is that a lot of this may be automated, depending. So there's known offensive content that's automatically flagged by machines, artificial intelligence can automatically flag a lot of this stuff and take it down. The more complicated question, I think, and I don't wanna pretend that that's easy, but the one that I think is more on the minds of people these days, is when the humans get involved.

04:35 Matthew Feeney: Someone sees content that they find offensive or they think is hateful, they can flag it, and depending on the company, it will end up in front of a human sooner or later. These human beings will have to make a decision by consulting the policy that the company has about content and then deciding whether that content is compliant or not, and this is difficult. One of the things I think is missing from a lot of the conversation about content moderation is that people seem to think this is really easy and that social media sites are making it too complicated, but I think it would be good if anyone listening…

05:22 Matthew Feeney: Just imagine that you've set up a website like Facebook. It's a website where you want people to come, it's gonna be free, it'll be funded through advertising, but you'll allow people to post photos and commentary, to share links to news, to different websites, and you've got to decide, "Well, what kind of content do we want to have here? What will we allow and not allow?"

05:46 Matthew Feeney: And you can imagine being at a meeting with some of your co-founders, and the co-founder who might be called Paul will say, "Well, I think we shouldn't allow images of nude children on this new website we're building." And of course everyone at the table says, "Yes, that sounds like… This is, that's just good policy."

06:04 Paul Matzko: Good point, Mark. I mean, Matthew.

06:06 Matthew Feeney: Thank you, yes.

[chuckle]

06:08 Matthew Feeney: So we'll go around the room where… Okay, so we'll write that down in the policy. That's good policy, and then inevitably, a policy that sounds good and is good on paper runs into problems. Someone, let's say they're a high school teacher and they want to educate their followers, who might also be public school teachers, posts that famous photo from Vietnam of the napalm bombing, which features a naked Vietnamese girl running down the road.

06:42 Matthew Feeney: So based on the policy that we don’t allow images of nude children, we take that down and some people think, “Well look, we really had that policy to stop child pornography.”

06:51 Paul Matzko: Pedophiles, yeah.

06:53 Matthew Feeney: For pedophiles. But this is historically significant, it's a well-known photo, so maybe we should keep it up. And then someone else, who's doing a thesis, say, on extermination camps, posts photos from Auschwitz, which include not only images of nude children but dead people. Now we have, "Okay, well, do we want to allow footage of dead people?" Well, no. We find that offensive. But then not only do you run into the problem of "Well, what about historically significant photos?" but "What about fictional portrayals?" Or what about…

07:22 Paul Matzko: Or if somebody shares that photo but to make a point you don’t like.

07:25 Matthew Feeney: Right, yeah.

07:26 Paul Matzko: Like to say, “Well, Auschwitz was good.”

07:28 Matthew Feeney: Right. That’s…

07:30 Paul Matzko: So it's the same photo documenting a historical event, but if you make a carve-out for that, how do you… Layers of complexity, yeah.

07:36 Matthew Feeney: And you can have situations where you're not dealing just with the kind of stuff we've discussed already, but cell phone footage of events that people film, like these horrible videos of kids bullying each other or sometimes even beating each other up, and they might film it and put it up on Facebook. And I was watching, I believe it was a Channel 4 undercover documentary, where a reporter embedded himself in Dublin, where one of the Facebook content moderation centers is, and they were told, "Well, if you put a photo like that up and it's condoning the behavior, the bullying, then we take it down, but if an anti-bullying charity puts it up and comments, 'this is horrific, we need…', then we'll keep it up and not take it down."

08:22 Matthew Feeney: And all this is very, very difficult. I believe, I don't wanna say for sure, but I wouldn't be surprised if YouTube had a policy about cruel footage. But I've seen footage on YouTube of chefs preparing food in some foreign countries that I certainly view and say what they're doing is cruel. You're viewing animal suffering, and I can easily imagine that some people viewing that would find it emotionally disturbing. The only reason I've been going on as long as I have is to just emphasize the point that these are very difficult questions. There are a lot of gray areas, and when thinking about these kinds of problems, we shouldn't let the perfect be the enemy of the good. Right? These companies will inevitably fail, but the question is, what's the alternative?

09:13 Paul Matzko: And to your point, it's complicated… It would be complicated for any individual one of us, examining our own set of norms, mores, virtues, ethics, to decide some of these questions on our own, let alone trying to come up with a standard that covers billions of people across the globe. A one-size-fits-all standard that covers everyone.

09:37 Paul Matzko: I was thinking, as you were talking, about… And actually, we'll put a link to the article in the show notes, but there was a line from an examination of Facebook's content moderation policies in the early days, when things were simpler, when it was a college-based social media system only. They had a slogan printed out on a piece of paper in their office that was, "Here's our standard: Hitler is bad and so is not wearing pants."

10:09 Matthew Feeney: Yeah.

10:09 Paul Matzko: So no Nazis, no pro-Nazi stuff and no going pantsless.

10:15 Matthew Feeney: So, clever but…

10:16 Paul Matzko: Yes, but yeah.

10:18 Matthew Feeney: You have the inevitable question of, "Well, if you are someone who posts a video of a Hitler speech, is it pro-Nazi or is it historically interesting?" If someone has a grandparent who served in the US military and might have come home with some Nazi memorabilia that they keep in the house, is a photo of that pro-Nazi? You're gonna run into these problems, and I think that's led some people… This kind of problem, plus the allegations of politically motivated bias, has motivated some people to say, "Look, Facebook and Twitter and Instagram, YouTube, all these, they should just adopt the First Amendment as their standard."

11:02 Matthew Feeney: And I can understand why some people instinctively think this is good. The United States has the best free speech protections in the world. The amount of speech that is allowed is vast, but you're gonna quickly run into problems, because keep in mind that Facebook and Twitter are profit-seeking companies and there's First Amendment protected speech that they clearly do not want on their platform.

11:24 Paul Matzko: Right, that I don’t like on my platform. [chuckle]

11:25 Matthew Feeney: Well, I would… Right. And so most people do not want to see footage of animals being crushed to death or beheading videos or footage of horrible things. It's First Amendment protected. But it seems reasonable to me that the people at these companies have made the decision that, "No, we shouldn't have the First Amendment as our standard, because our sources of income might dry up to a significant degree." And that's understandable.

[music]

12:00 Paul Matzko: Yeah, and this actually comes up in conversations right now about Section 230, Josh Hawley's proposal to combat what he sees as conservative bias by requiring platforms to apply to the FTC for licensing. Or register… I forget the term. Certification. Certification that they have been trying to avoid political bias. And very quickly, well-intentioned efforts, like saying the State is going to guarantee that private platforms like Facebook and the like are going to avoid content that we generally find disagreeable, whether it's child porn or hate speech or etcetera, whatever the intent, very quickly that can have a real chilling effect on kinds of speech that are fine, that we would probably recognize as fine, but that become questionable, that represent a risk.

13:03 Paul Matzko: So if you force platforms to remain content neutral, you can have one of two effects. Either they can become more hands-off, which turns platforms like Facebook and Twitter into something more like 4Chan or 8Chan or even the bad old days of Reddit, where they say, "Look, if we're gonna be liable for the potential that one of our users is a crypto-Nazi and posts something praising Hitler, and we face the potential of liability for that, well, we're just not gonna do any content moderation whatsoever."

13:38 Paul Matzko: And I don’t think that’s the world most of us want ’cause… Those of our listeners who’ve been to 4Chan or 8Chan, I’m not sure we want all of our social media to look like that. I’m fine with them existing, I think they should have the right to exist but that’s not what most of us want from our social media.

13:52 Paul Matzko: But the flip side is that they become even… Like they start moderating out stuff that we weren't trying to get rid of: satire, historical video. So you can either encourage their content moderation to be non-existent or to be really stringent. If you want well-calibrated, reasonable content moderation rules from these companies, using the State, applying a First Amendment kind of justification or whatever it may be, is a really… it's a blunt instrument. Let me put it that way.

14:27 Matthew Feeney: Well, right. The bill we're mentioning, introduced by Senator Hawley, I think is problematic for a whole host of reasons, but one of the interesting things is that it's really making social media speech politically controlled, subject to the FTC, which is one of these alphabet soup agencies, and we'll end up where you mentioned. If you're a lawyer at Facebook, imagine a world where this bill is implemented into law and you're at Facebook, you think, "Well, okay, so we need to get certified by the FTC, and to do that, we have to make a good faith effort to be politically neutral." Well, what does that mean?

15:10 Matthew Feeney: And it could mean, and I think it probably does if you're doing a fair reading, that the KKK get a Facebook page and so does the Nazi Party of America, and they get to post what previously would have violated Facebook's hate speech policy. And I can see why some people feel that this kind of approach is the way to go, especially given the media environment where people are being told that there is a crackdown from these companies on certain speech. But one, I do think that fear is based on no evidence, and secondly, I don't think people appreciate how it would ruin the internet that most of us have grown up with, and I think people should proceed with caution when it comes to those kinds of proposals.

16:00 Paul Matzko: There's a flip side to that, the moral of the story from the era of broadcast regulation. The Fairness Doctrine, which applied to television and radio broadcasting, is fairly similar to what Hawley is proposing. Some of the mechanisms are a little different, but it's close. And what happened there, you have the potential, you could see it going towards "We're willing to let everyone have a voice or else we face lawsuits from the American Nazi Party, etcetera," so you actually have more hate speech, when the point of the policy was, in part, to discourage hate speech.

16:31 Paul Matzko: Or it can flip in the other direction, which is what happened in the '60s: a lot of radio stations and television broadcasters stopped airing any kind of political content that was vaguely controversial. Anything but the most anodyne, most mainstream kind of political positions. Those are okay, a critical mass of people agree that those aren't offensive, even if they disagree with them, but anything radical, anything conservative, anything too left-wing, new left, new right, none of that. You can have a narrow range of political opinions on the air; anything else is unsafe, you might as well just leave it off. So that chilling effect is a real risk too. It could flip either way, and it's hard to say which way it will go, but these rules are not gonna produce the outcomes that are intended.

17:20 Matthew Feeney: And actually, something that might produce the results that the aggrieved people here are reaching for is to be found in competition and the market. I do think that a lot of people these days view Facebook, Twitter, Google, YouTube, all these companies, as monopolies, and I suppose we could dedicate a whole episode to why I don't think that's true, but our colleague Ryan Bourne just put out a good paper at Cato, which we can put in the show notes, highlighting how many times in the past we have assumed that big tech companies are monopolies and it's turned out not to be the case.

17:54 Matthew Feeney: And what you've seen is the emergence of social media sites that are trying to use free speech as their lodestar. So Gab, which, again, I think is mostly a place for white supremacists to hang out, but you have the Canadian psychologist Jordan Peterson apparently trying to build a site where they'll only take content down if ordered to by a US court, and my response to this as a liberal is: good. Let's see what works out in the market. You're competing for users, and let's see what happens.

18:28 Matthew Feeney: My guess, and I could be wrong, I've been wrong about a lot of predictions in the past, but my guess is that actually most people who seek out social media services aren't there because they like the fact that people they don't know can say what they want. They're there because they want to share photos of their children's birthdays, to find out what their family and friends are up to, to share interesting political news, to find information about restaurants and whatnot. I don't think there are enough people out there who want to go to a site because its comparative advantage is some sort of First Amendment standard. Now, maybe there are enough people out there that they will go and it will frighten some of these so-called tech giants out on the West Coast, but I'm skeptical.

[music]

19:29 Paul Matzko: And I think this is a reminder that natural market forces are pushing against the kind of corporate dominance that you have from some of the largest social media platforms. So Facebook is the 600-pound gorilla when it comes to what you're describing, posting pictures of your kids and making connections with family. Twitter plays a similar role for conversation among, I don't know, the kind of liberal elite and…

19:53 Matthew Feeney: The chattering classes.

[chuckle]

19:54 Paul Matzko: Chattering classes, people like us. The literati. And so they're very dominant in those spaces, even though there are alternatives, but there's a natural market mechanism pushing against that, which is that different communities of people are going to… There is no one-size-fits-all content moderation system that's gonna make everyone happy, and because of that, it makes sense, in a natural market state, for there to be regional versions doing what Facebook does but more closely tweaked and aligned with the interests of the people in that region.

20:35 Paul Matzko: So just to give one illustration, again, remember "Hitler's bad and so is not wearing pants." That makes sense when you just hear it, it's common sensical to us here in the States, but take nudity as an example. Facebook's standard is to obscure female nipples except for breastfeeding and artistic nudes. And that kind of makes sense to us: okay, I get that a sculpture of Diana shouldn't be pulled off of Facebook, and I get that someone's mother just breastfeeding her kid shouldn't be. But that makes sense to us, in a country that's… That standard would not be the same in a more conservative religious country with taboos on public breastfeeding; they wouldn't be so blasé about it.

21:16 Paul Matzko: They're not going to be happy with Facebook's decision. There's room there for a local competitor whose standard is gonna be different; they might not allow that depiction of breastfeeding. Or even, how do you decide what counts as artistic nudes? What about Muslim countries with a tradition of representational art in general being taboo, let alone artistic nudes? I mean, the cultural gap, we're talking about policy spanning the whole globe, people from a range of faiths and traditions and politics, and it's essentially an impossible task that they're being given. And whatever you think of how they're doing with that task, I don't know how sustainable that is in the long run. I think the natural push is going to be for Facebook knock-offs or clones or regional versions that are gonna be successful at competing on the mores of particular countries or areas. That's my prediction.

22:15 Matthew Feeney: Well, yeah, I think there's a lot to that, and it's very difficult, of course, to understand satire versus serious commentary if you're not actually fluent in the culture of the person doing the posting. Oftentimes, comedy especially relies on a lot of assumed cultural knowledge within the audience, and there's just so much content to consider that it's inevitable that there will be failures, that there will be misunderstandings and stuff will be taken down when it shouldn't be, even in situations where you might take an aggrieved party at their word.

22:53 Matthew Feeney: Going back to what you started with, the Crowder incident is rather interesting because Maza, in his Twitter thread, wasn't complaining just about Crowder's content; he was also complaining about some of the harassment he would get from people who he suspected were Crowder devotees. For argument's sake, let's give Crowder the benefit of the doubt. Is it not fair for him to then say, "Look, I can't decide who visits my YouTube page, and I can't just tell them… I have no control over what they text or tweet. Why am I being punished for this?" And of course that wasn't the only thing Maza was complaining about, and Crowder's behavior was hardly laudable, but that's a difficult issue, where your content is being demonetized or taken down because of what your audience is doing, instead of you.

23:53 Paul Matzko: We generally, in a pre-digital sense, have a pretty strict regime when it comes to incitement. You can't incite someone; you can be held criminally or civilly liable for inciting someone to violence. So like if I hire you to go put out a hit on someone, I am liable. I mean, I paid you to do it, it's not just incitement. Or if I'm a political candidate and in a speech I say, "Hey, after this rally, you should go out and rough up someone from the other side."

24:23 Paul Matzko: And actually, in the 2016 campaign there were a couple of times where Trump came close but stopped just shy. But you can be held liable for inciting someone to violence if you explicitly encourage them to do that. That's not what typically goes on in a lot of these online cases. I mean, it can be, but most of the time it's that unpleasant people attract other unpleasant people, and then they'll criticize someone and their unpleasant followers will jump on, but there's no actual legal incitement happening.

24:54 Matthew Feeney: Well, exactly. If I tweeted, "God, I hate Paul, he is a menace," that might be the end of it for me, but what if one of my fans decided to dox you, to figure out where you and your wife live and to post that information publicly? Now you're into a pretty scary area, right?

25:15 Paul Matzko: Yeah.

25:16 Matthew Feeney: But my content didn't dox you. Someone saw it, and maybe I inspired them, and this gets to your point of: what's the point at which there's civil and criminal incitement? Now, that's an interesting question, but if you're Facebook or Twitter, you don't care what the law says, you're just trying to worry about, "Is this content we want to allow? We don't care about legal this or that, we don't need to get the lawyers involved, we need to figure out, is this a good user experience for people who are on the platform?"

25:44 Paul Matzko: Yeah, yeah. It wasn’t for me, when you called… “I hate Paul.” [chuckle]

25:46 Matthew Feeney: Right. And that's something, like I said earlier, that I wish more people would consider: Twitter and Facebook are not interested in a legal standard, because they are not a government, they are profit-seeking companies with different incentives.

[music]

26:22 Paul Matzko: So even if we can figure out what counts as hateful and what doesn't, this can get really messy. You can imagine, let's say the FTC had the powers that Josh Hawley wanted to give them over deciding what counts as politically neutral or what counts as biased; well, some of that bias, and even hate speech, can be in the eye of the beholder. What one person finds just beyond the pale, and we don't have to go full Nazi here, we don't have to go Godwin's law, but even in lesser cases: was it fair, was it politically neutral, for Facebook to allow an altered video of Nancy Pelosi looking drunk or ill on their platform?

27:10 Paul Matzko: Well, it's satire. It kind of is, but if you pull it, you're saying, "We don't want to allow these kinds of parody or satirical critiques of one political party," but how about if it was the same video of Donald Trump? I mean, it gets really messy really quickly, what different folks will see as irredeemably biased or offensive. You don't have to go to the full extreme.

27:41 Matthew Feeney: Of course. Something that's been fascinating and beguiling to observe in the last couple of years in this debate is who views what kind of periphery as within their tribe. It's been sort of strange, and I don't think it's a particularly controversial thing to say, but I'll say it anyway: people like Alex Jones are sort of just beyond the pale of rationality, this is crazy stuff, and it's funny that a lot of the conversation about what some of these platforms have done to Alex Jones' accounts tends to come from people sort of right of center who are complaining about this. And it's sort of odd to think that they don't view themselves as Alex Jones, but on the slippery slope they certainly think they're closer to the summit than the Communist Party or the socialists are.

28:36 Matthew Feeney: And that's been a very interesting thing to observe, but I want to stress that I think a big part of the conversation we're having now, and the only reason we might be having this conversation today, is that there's a perception about what's going on in these companies that I think is compounded by ideological filters: people are only hearing grievances from people within their broad ideological tribe, when actually, if you look, there are people on the left who have complained about similar things in the past.

29:06 Matthew Feeney: When I was doing an event with our colleague John Samples on the Hill, I mentioned a letter that a socialist organization had written to Google saying, "Google is an anti-left company that is trying to stifle leftist speech and this is an outrage." There's been a lot of criticism about how social media companies deal with transgender people, people who are changing their gender, and anyone engaged in this debate, I think, has to take seriously the complaints from across the political spectrum.

29:38 Paul Matzko: Cultural mores, the mean or median of a cultural more, are constantly shifting. If you go back 10 years, the question of whether or not someone criticizing… Like the definition of homophobia in online discourse has shifted over the last 10 years, and I would say for the better, personally, but in general, there were things that would have been said 10 years ago that probably most Americans would not have thought of as homophobic but that today would be. The problem is that if you set up government arbiters, in Hawley's case the FTC, they're the ones who get to decide whether or not this is hateful, whether it's politically biased, etcetera. You're giving someone an awful lot of power over something that's inherently unstable and changing.

30:29 Matthew Feeney: I do think that reasonable people can disagree about things like homophobia or racism. So take the statement, "I'm not against homosexuals, I think they're… fine people, but I'm against gay marriage." Is that homophobic? Is saying…

30:50 Paul Matzko: And to be clear, this isn’t your statement, you’re just…

30:51 Matthew Feeney: This is not my statement, no.

30:53 Paul Matzko: An abstract look.

[chuckle]

30:55 Matthew Feeney: But on social issues, these are as slippery as they come, right?

30:58 Paul Matzko: Yeah.

31:00 Matthew Feeney: So this isn’t my position but imagine someone saying something like that or someone saying something like, “I support gay marriage but homosexuality just kind of weirds me out. It’s gross.” Is that homophobic? It’s a very…

31:13 Paul Matzko: Is that hate speech to the point that it should be removed or is it just unlikeable speech, hate speech that shouldn’t be removed, that’s just…

31:20 Matthew Feeney: Yeah. And look, whenever I think about these kind of questions, I think well, thank God I have the job I do ’cause I would not…

[chuckle]

31:28 Matthew Feeney: Anyone who's pretending that these kinds of decisions are easy is, I think, just kidding themselves.

31:34 Paul Matzko: Maybe the framing device for our listeners is: imagine all the terrible things that your crazy uncle or aunt says at the Thanksgiving dinner table. Do you think they shouldn't have the right, the fundamental right, to say those things? That's the kind of question we're having. You might not like what they're saying, but should they have the right to speak those things online or not? And who gets to decide that? Should it be up to private companies, in which case competitors can arise that will allow them to say it? Should the government get to have a say in whether or not they get to say that? It gets really messy really quickly.

32:11 Paul Matzko: So I think that kind of puts a personal point on it. There are lots of people who you're gonna come in contact with who you disagree with in very vehement ways, but we live in a rights-based society that's pluralist, and people are going to disagree, and disagree substantively, and while there's limits on what… There's limits on… Each of us has to decide what those limits are for ourselves. There's lots of people who I wouldn't befriend in real life because their views are hateful, but do we wanna live in a society where the government makes that choice, or where the government forces platforms to make a choice for us? That gets messy really quickly.

[music]

32:56 Paul Matzko: I think it's important to bear in mind that they are trying really hard right now. So some of the numbers I pulled: Facebook claims its algorithm detects or removes close to 100% of spam, 99.5% of terrorism-related content, 98.5% of fake accounts, 96% of nudity and 86% of graphic violence. That's just the algorithm, no human moderator, it just pulls it. That's pretty good, but then again, we're talking about billions of posts per day from more than a billion users worldwide on something like Facebook, and hundreds of millions for something like Twitter. So even 0.5% of a billion means 5 million missed posts.

33:41 Paul Matzko: That means there's some poor schlub somewhere who has to wade through those millions of missed terrorism-related posts and decide whether this is a criticism of terrorism, a defense of terrorism, a historical document about terrorism, and they have these complicated flow charts they have to go through. I was actually impressed going through what they do. Again, I don't think it's gonna work in the long run, but it's not for want of trying.
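As a rough back-of-envelope sketch of that scale problem, here is the arithmetic in code; the volume and detection figures are illustrative assumptions for the sake of the example, not reported platform numbers:

```python
# Back-of-envelope: content that slips past an automated filter at scale.
# Both inputs are illustrative assumptions, not official platform figures.
daily_posts = 1_000_000_000   # assume roughly a billion posts per day
detection_rate = 0.995        # assume the algorithm catches 99.5% automatically

missed_per_day = daily_posts * (1 - detection_rate)
print(f"Posts left over per day: {missed_per_day:,.0f}")  # ~5,000,000
```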

34:08 Matthew Feeney: Oh, goodness, no. I was watching yesterday a documentary that The Verge put together where they interviewed humans involved in content moderation, and the amount of mental trauma these people are going through for $15 an hour, I just think is really rough. They have night terrors, they have trouble functioning day-to-day, and you would too. If you're watching hundreds of videos a day of animals being beaten to death or children being strangled, torture, war crimes, the most horrific stuff you can imagine, and just watching it day after day, hundreds and hundreds, it would stay with you.

34:51 Matthew Feeney: But a content moderation policy requires humans at some level to do a lot of this, and it's reassuring, of course, that algorithms seem to be doing a good job at the heavy lifting, but I think we shouldn't forget that there's a degree of mental strain for a lot of the humans involved with this.

35:12 Paul Matzko: And it is… When you think of your own experience, and I'm sure it gets worse as you become more famous. If you have lots of followers on Twitter, you have a blue checkmark, you attract trolls to an extent that the more middling, small-fry accounts like us, Matthew, don't quite to the same extent. So this isn't to discount the trauma, even, that comes from a Twitter outrage mob or a troll-fest on your account; that can be really horrible. Both left and right, whether it's Carlos Maza or David French, describing just the vitriol and hate and death threats they receive from online Twitter trolls.

35:55 Paul Matzko: Not to discount any of that, but I think on a day-to-day basis, except for typically public figures, most people… It's not like lots of beheading videos show up in my Facebook feed, or lots of horrible images of child pornography show up on my Twitter feed, or Holocaust denial stuff. Now, again, if you're famous, you might get more of that, pictures of ovens and things. But it works pretty well for most users, and in the cases where it doesn't work so well, it's usually your crazy great-aunt re-sharing some terrible post glorifying a political figure in simplistic terms. You're like, "Oh, goodness, Aunt Gertrude, not again. That's the 50th one of those. You should be able to see through this by now."

36:44 Paul Matzko: For most users, as a daily experience, the algorithms, the content moderation, work plausibly well. It feels like, relative to that real-world experience, there's a bit of a moral panic right now about, "Oh no, Facebook's gonna destroy American democracy," "Oh no, Twitter is going to…" You understand what I'm saying, where I'm going with that? It feels disproportionate to the actual lived experience of most users.

37:11 Matthew Feeney: It is important to remember, it's a cliché but it's true, Twitter's not real life, and everyone who's worried about what's going on tends to be on these platforms. I, of course, have a bias, given my professional interest here, but I'm pretty sure, my track record of predictions notwithstanding, that a long-term consequence of the Trump administration, however long it lasts, will be some kind of tech regulation or tech war, because these big Silicon Valley companies are under bipartisan attack at the moment. We've been talking today about the content moderation debate, but that's only a part of it.

37:51 Matthew Feeney: There's also concern about antitrust, and of course concerns about these platforms not doing enough to tackle foreign meddling, especially during election time, concerns about bots and so on. I think this is one ingredient in a pretty disgusting pie that's forming a lot of the debates today.

38:10 Paul Matzko: And they didn't do themselves any favors. A lot of the social media companies were caught up short a few years ago by their slow response to controversies, even their content moderation process. They've been reaching out to tech people over the last year or so, trying to say, "Hey, look, here's what this process looks like. Why don't you sit in? Why don't you try doing some of this yourself and see how complicated it is?" They weren't doing that a few years ago, so they did fall on their faces to some extent, and they've responded by trying to be more transparent, more open and more rigorous in their content moderation policies.

38:49 Paul Matzko: If I can pull one historical example: this idea of private institutions, private organizations, having a court-like system, a way of deciding, of being an arbiter over what is acceptable and what isn't, might feel a bit alien in 21st-century America. We're used to a formal government-run judicial court system, but there is actually significant historical precedent for this kind of private, institutionally run alternative adjudication system, and the clearest example is the ecclesiastical courts.

39:36 Paul Matzko: So even in US history, up until the mid-19th century, the overwhelming majority of stuff that today you would take to a court system, whether it's custody disputes, inheritance disputes, just tort lawsuits… While today it would be you had a car crash, back then your cow wandered into someone else's pasture and broke their fence. Just basic civil liability stuff. The overwhelming majority of those decisions were handled by church courts.

40:08 Paul Matzko: And so, you're in a place like Massachusetts, you're both members… By law, you're part of a Congregationalist Church, or you're contributing to a Congregationalist Church system, and so the church would literally convene a court of clergy and parishioners, and they would hear between these two parishioners: "Okay, tell us your side. Tell us your side. Here's the ruling." You could appeal it up to a higher ecclesiastical court system. It was this whole formal adjudication process that was done by private institutions. We have something like that today with Sharia law courts, which, despite what right-wing people think, aren't super scary.

40:48 Matthew Feeney: I think even today there are some parts of the United States where there are Jewish courts that adjudicate similar kinds of disputes, and that's one of the more fascinating things, I think, long-term, about the current debate we're having about social media: what kind of private government or private governance institutions emerge. It may well end up looking something like that.

41:12 Paul Matzko: And the advantage that they have is that it encourages people… People have buy-in to them, and part of the problem of taking everything to a federal court system can be that folks… It's fine as long as people buy the authority of the system and the authority of the court, and the advantage of keeping that court system small, local, decentralized, is that participants are more likely to have buy-in. They're more likely to trust it. They might not be happy with the decision, but they're more likely to abide by and trust in the authority of the court.

41:48 Paul Matzko: That's the advantage of… And that's the kind of difficult territory someone like Facebook's gonna have. They don't have the advantage of saying, "We are the State. The reason why you should trust the ruling of this process is because the State says so." That comes with a certain amount of authority in most citizens' minds, maybe less so in our minds as libertarians, but it comes with authority for most folks.

42:09 Paul Matzko: Facebook doesn't have that, but they're also this multinational mega-corporation, so it's not at all local or decentralized, and they're trying to create one-size-fits-all rules that they've only recently been transparent about. So you don't have that decentralized, local… the advantages of that private kind of system I just described with ecclesiastical courts or with Jewish courts or Sharia courts.

42:34 Paul Matzko: And so they’re neither fish nor fowl and I think that they’re in an awkward position. So while I fully appreciate the difficulty of what they’re doing and I wouldn’t fault them for a lack of trying, I just don’t know how sustainable it is in the long run.

42:47 Matthew Feeney: I guess time will tell.

42:49 Paul Matzko: Time will tell and time will tell that we are done with today’s episode. Thank you for listening and until next week, be well.

[music]

43:00 Paul Matzko: Thanks for listening. Building Tomorrow is produced by Tess Terrible. If you enjoyed Building Tomorrow, please subscribe to us on iTunes or wherever you get your podcasts. If you'd like to learn more about libertarianism, find us on the web at www.libertarianism.org.