Welcome to Pop & Locke! For our first episode we welcome Peter Suderman and Paul Matzko to discuss the many Black Mirror dystopias.
On the show today are Peter Suderman and Paul Matzko with our hosts, Natalie Dowzicky & Landry Ayres. Throughout this episode they discuss five distinct Black Mirror episodes and what is or is not scary about each dystopian thriller. One common theme is that the creators of Black Mirror seem to assume the worst will happen, when many of the technologies introduced in this world could have positive effects as well.
Transcript
[music]
0:00:00 Natalie Dowzicky: Hi, and welcome to Pop & Locke. I’m Natalie Dowzicky.
0:00:06 Landry Ayres: And I’m Landry Ayres.
0:00:07 Natalie Dowzicky: We’re going to dive into none other than Black Mirror. Charlie Brooker, the science fiction anthology series’ creator, asked: if technology is a drug, and it does feel like a drug, then what precisely are the side effects? Today we are going to pick the five scariest episodes for libertarians from the series, starting with the most realistic and moving to the downright eerie. In the studio to help us is Peter Suderman, Features Editor at Reason, and Paul Matzko, Technology and Innovation Editor as well as Building Tomorrow podcast host here at libertarianism.org. Thanks for joining us, guys.
0:00:39 Peter Suderman: Thanks for having me.
0:00:41 Landry Ayres: Now, before we get started counting down which episodes we think are the most particularly scary for libertarians, what makes these dystopian scenarios scary for libertarians in particular, do you think?
0:01:00 Peter Suderman: I guess what I find frightening for libertarians is that all of the episodes we’re gonna talk about today, at least, end up suggesting that technology is a means of control and a vehicle for loss of personal choice, and that technology, in some cases decoupled from government, in other cases being used by government, is on its own something that robs us of what I think is the fundamental libertarian value, which is individual and personal choice. And these episodes are all built around the idea that technology is inescapable and that once it has you in its grip, you can’t get away from it.
0:01:48 Paul Matzko: It’s fundamentally pessimistic. Whether or not you’re a tech and innovation person like myself, libertarians tend to be optimists. We’re optimists about human beings. We’re optimists that if you free people to make personal choices, they’ll do cool, fascinating, good stuff with it. On net, you’re always gonna have bad apples, but on net we tend to be optimists. And specifically when it comes to tech and innovation, each of these episodes takes a tech or a set of technologies and says, “We’re going to imagine the most nightmarish possible vision of it being used to the absolute umpteenth degree in a horrific way.” It’s the most negative possible imagination, with very little positive imagination work going on. We know, as libertarians, that the history of economic growth tends to make things better. We have more and better things every generation because of technologically powered innovation. This is a world in which that’s not true. Technology is gonna destroy the good stuff we have. Watch out.
0:02:50 Natalie Dowzicky: Well, yeah. I definitely think that Black Mirror is feeding into our tech panic, our “robots are taking over the world” fear, and taking that to an extreme, like Paul just said. So, what kind of criteria are we gonna be using to evaluate these episodes? What specific things are we gonna look for that make some worse than others?
0:03:07 Landry Ayres: Well, I think, as Peter mentioned, the loss of the ability to choose, there being sort of that top-down level of control, and the pervasiveness of the technology. I think one of the strongest ways that technology is challenged to grow is when it is decentralized and there are a lot of options; it forces competition. And we don’t see a lot of that in the technology that’s examined in Black Mirror. Everything tends to be very monolithic in the way it’s portrayed, and I think that is part of what makes the technology involved so fear-inducing.
0:03:45 Peter Suderman: It’s a show in which there appears to always be only one technology company making any given product, in which everything is a monopoly and no one ever seems to think that that’s weird. Now, even I get that impression. There’s a sense in which Google and Facebook and some of these companies really are sort of like that: you assume that basically everyone you meet is using Google, that most people are on Facebook, though I should note I’m not. It’s a reasonable assumption, but it is a show that takes that to an extreme and just assumes that technology is inherently monopolistic. And again, not just that everyone uses it, but that it is inescapable and unavoidable.
0:04:31 Landry Ayres: And the show is always framed in a way that gives you just enough information to suggest that perhaps a company is involved in this, or maybe it’s state-mandated and there’s some higher power, and people can’t opt out of it in some way. But they never give you enough reassurance to know whether, in the world of the show, this is as big a problem as this one storyline that uses it suggests. So, I think that’s another reason it can be kinda scary: it doesn’t give you quite enough to make a completely qualified evaluation of the whole world of the show.
0:05:08 Paul Matzko: Lest I sound overly critical of the show, I love Black Mirror; I’ve watched every episode. So, to point out that it’s overly pessimistic and not as optimistic as I would be about tech and innovation isn’t saying that the show isn’t good at what it’s doing. It’s actually often quite good at what it’s doing. In fact, that’s true of the horror genre as a whole. Horror takes a kernel of legitimate fear, the fear that, “Oh no, someone might break into my house and do things to me and my family.” Now, that’s unlikely; it’s possible, but it’s unlikely. But let’s magnify that: “They’re gonna break into our house, they’re gonna terrorize us wearing animal masks, maybe they’re actually doppelgangers of us that live in secret underground chambers.” It takes legitimate kernels of fear and heightens them to make a point. Black Mirror is doing the same thing with tech. So, what’s interesting to me, as we analyze whether something is libertarian or not, is to think, first of all, is that kernel a legitimate thing we should be worried about, regardless of whether or not they’re heightening it? And second, thinking through who they decide to make the agent of the problem.
[music]
0:06:19 Landry Ayres: So, the fifth scariest Black Mirror episode that we decided on, and obviously we can talk about it if you disagree with the way we ranked these, but the fifth scariest episode: The Entire History of You.
0:06:35 Natalie Dowzicky: Yeah, so this episode centers around a grain implanted in your skull that records everything you do, every conversation you have, every small encounter you have with everyone else. But the scarier part is that, very much like with our old VCR tapes and DVD players, you can rewind and replay your memories, as well as project them onto screens in your living room for everyone to see. So, Liam is the main character in this episode. It starts off with him having a job interview, and then he immediately gets in the car, replays the memory, and watches and over-analyzes his entire job interview. But that was just to introduce us to the context of what a grain is. Later on in the episode, Liam sits down with his wife, Ffion, and replays the times that she’s been unfaithful. So, he basically calls her out on her infidelity and her lying. And…
0:07:26 Peter Suderman: Well, he starts by being suspicious that she is and she denies it. And then…
0:07:30 Natalie Dowzicky: And then he’s like, “Let’s pull up the grain.” [chuckle]
0:07:32 Peter Suderman: Right, and then they go through replays and it comes out that, first, she was lying about a relationship with somebody who they met, and then it turns out that there’s much more going on and it sort of slowly unfolds. It’s not just that he replays these, it’s that it is a kind of slow, almost detective-like revelation, although he’s a drunk jerk as he is pressing her on this.
0:07:54 Natalie Dowzicky: It’s also interesting because, if you think about the virtual paper trail that our phones leave, there’s this whole idea that our memories can now be replayed and there’s no more he said, she said in a relationship, because you can be like, “Oh well, did you actually say it? Let’s go back and look.” That’s essentially what ends up happening in the episode, and they end up parting ways over the infidelity. And so, what are our thoughts on this? Would we wanna live in a world like this? What are some upsides to being able to replay your memories? Which the episode didn’t touch on at all.
0:08:25 Paul Matzko: Yes, I want to live in that world. And here’s why: because, again, this is the most pessimistic possible scenario, involving someone who’s essentially peak-level obsessive-compulsive. He’s not a nice person. He’s not the person you wanna be married to, let’s be frank. And those people exist in this world, and they still constantly interrogate their partners: “Have you been faithful?” I do wanna live in this world, because there is an optimistic version of the story. I don’t wanna live in his house with him, I wanna live in another house with my…
0:08:56 Peter Suderman: His house was pretty nice, though.
0:08:57 Paul Matzko: It was nice.
0:08:58 Peter Suderman: I would not mind living in that house, maybe without him.
0:09:00 Paul Matzko: In his house without him. [chuckle] That’s right, it was a pretty nice house.
0:09:02 Peter Suderman: Yeah, if I could live in the world of the Black Mirror architecture, that would be great.
[chuckle]
0:09:06 Paul Matzko: It’d be ideal. Yeah, it’s very minimalist, and yeah, it’s pretty nice.
0:09:10 Natalie Dowzicky: Yeah, it’s very modern.
0:09:11 Paul Matzko: But there’s an optimistic alternate version of this story. Imagine a world of perfect recall where he has a car accident, which he does in this episode. But in this version, the car accident is the inciting incident. His own recent memories of his family are wiped because his grain is destroyed, but he’s able to reconstruct his past, reconstruct past memories of his wife and kid, because of their grains. And so, rather than losing through amnesia his entire past, who he is, he can recapture part of that. That’s the optimistic scenario. This allows you to re-forge and protect connections that you have with your loved ones. So there’s the heightened, scary version, but hey, think about the optimistic scenarios. I wanna live in that world where people have that kind of recall.
0:09:56 Natalie Dowzicky: Do you think people would be more honest in this world or more upfront, knowing that someone could pull the card on them like, “Oh, let’s take it to the grain?”
0:10:04 Paul Matzko: [chuckle] Well, we can ask that right now. Are people more honest in Britain, ’cause there’s CCTVs everywhere versus the US? In a sense, there’s a social recall because of the prevalence of surveillance in the UK that we don’t have in other countries. It’s an interesting question, I don’t know, I don’t know.
0:10:21 Peter Suderman: As a journalist, I definitely wanna live in this world, because I’m constantly trying to record things and it’s a pain. And also, if you could, say go confront politicians by just saying, “Here’s what you said.” “No, no.” “This is what you said to me. We’ve got it on the grain, it’s all recorded right there.” I mean, I think it would change a lot of my job, a lot of the world of journalism very specifically, but also, just generally, it would change how businesses operate. If there was never any… I should say not never any, but if it dramatically reduced the potential for a lack of clarity about instructions, you wouldn’t have all of these meeting notes that are like, “Wait, what did we actually decide there? Who said that? Did we really mean that? Is this just a mistake in the notes?” And so, I think in a lot of ways that sort of recall would actually be pretty valuable. And the other thing that this episode doesn’t get into is that social conventions develop around technologies to mitigate the likely problems.
0:11:26 Peter Suderman: I know they have, to some extent, suggested that there is a social convention of, when people get together, almost like when people get together now and end up playing YouTube clips for each other, they just end up going over stuff that’s locked away in their grains, in their heads. But the argument that develops between Liam and his wife is presented like the kind of argument that no one has ever had before, as if this has never happened in the history of a world where nearly everyone has a grain and all memories are recorded perfectly. It’s as if no one has ever dealt with this problem before. And that’s just a weird thing, because we even have this now with social media, which is that there are a lot of interactions that people don’t like. And we haven’t solved all of them. On the other hand, we do have a lot of conventions, and a lot of people have ways of dealing with them. They’re expected; people aren’t like, “Oh, this guy was a jerk to me on Twitter,” or, “I got piled on today. I’ve never heard of that happening before. What do I do? This is a hellscape.”
0:12:26 Peter Suderman: Instead, people are like, “Yeah, I had to block a lot of people, it was pretty annoying. I’m thinking maybe I’ll switch my account to private,” something like that. There are responses that develop. And one of the things that Black Mirror does is it always presents these technological problems as essentially novel, even in worlds where they shouldn’t be.
0:12:45 Paul Matzko: Think of the evolution process that you’re describing on social media. I joined Facebook in 2007, and the first generation of Facebook users didn’t guard what they put on it at all, and I count myself in this: pictures of yourself drinking or goofing around. Well, society, people, developed defense mechanisms, responses, social norms in response to this technology. People are a lot more guarded than they used to be. We’d expect the same in this world, but it’s imagining this technology dropping like Facebook did in 2007, with no evolutionary process afterwards. It’s as if it’s the first time anyone’s encountered the grain, the first time anyone’s encountered social media. So it’s not a very realistic world in that regard.
0:13:32 Natalie Dowzicky: Or perhaps they’re still pretty early on in the adoption, so they’re just learning everything that comes from it?
0:13:37 Paul Matzko: Yeah, but you’d expect… These grains are pretty advanced. We’re not talking prototypes, first attempts. Everyone has this, so this is a product that is cheap and ubiquitous. There have been multiple generations of grains in this world, so…
0:13:51 Landry Ayres: Yeah, ’cause you can see different iterations of it throughout different episodes of Black Mirror. There’s similar technology that is consistent throughout, or at least tangentially related, and there are theories that the episodes all exist in the same universe. So, I think that’s also something to consider. For me, it’s scary in how it could be used. It’s emblematic of a lot of technology panic, I think. So, that’s why it’s, for me, the fifth scariest. Yeah, there’s potential, but it’s not spooky for libertarians, specifically, because of a certain value that we hold or anything like that.
0:14:29 Natalie Dowzicky: Yeah. Of the ones we chose today, I also thought it was the least scary; I didn’t really think it was scary at all in this sense, because I could easily see us being able to, like Paul just said, adapt to this environment where everyone has grains. But I also thought the moment at the very, very end, when he pulled his grain out, was an interesting way to end the episode, partially because he was basically admitting to himself he’d rather be in our world, so to speak, than in his world where everyone can see his memories. I thought that was the scariest, if I’m using the word scary, or the most thought-provoking part of the episode. But then again, it didn’t really have an effect on the overall trajectory of the episode.
0:15:13 Peter Suderman: No episode where we all said, like, “That’s a nice house,” could be all that…
[chuckle]
0:15:18 Paul Matzko: I feel like I said that several times. All of the houses. The dwellings in that episode were really just like, “Oh, man.”
[chuckle]
0:15:27 Landry Ayres: All right. The next episode, a little bit scarier, moving up on the scare scale is White Christmas.
0:15:34 Natalie Dowzicky: All right, so this episode is best explained in three parts. Matt is our main character; he’s in all three parts of the episode. And we are in a world where everyone has a Z-Eye implant, so here we are with another implant that goes into our head. It allows others to see what you’re seeing, as well as allowing you to control what you see. So Matt, our main character, runs a seedy operation of online pervs who like to watch other people go on dates and like to… [chuckle] He’s running a side business. [chuckle] And that ends up getting him in a little bit of trouble, because on one of the dates that everyone is watching, one of the men in their group gets murdered by someone who is believed to be schizophrenic. And then it cuts to part two of the episode, where Matt gets a woman to copy her consciousness and download it into what is referred to as a cookie, which looks conveniently like a very small Amazon Alexa. [chuckle]
0:16:44 Natalie Dowzicky: And basically, this woman has effectively trapped herself inside the cookie. Her consciousness exists inside the cookie, and she can control her outer body that’s outside the cookie, but she feels trapped. And Matt can also control time within the cookie. He tortures her a bit, and when she doesn’t wanna do something specific, he says, “All right, that will be six months solitary.” And then there’s the third part of the episode, which is the juicy stuff. We get a different set of characters, Joe and Beth, who are in a bit of an abusive relationship. And Beth literally blocks Joe on her Z-Eye, which means he now shows up as a gray blob: he can’t talk to her, he can’t see any pictures of her, and all his memories of her are of a gray blob. She ends up having a child, which Joe believes is his. He goes on to stalk her, even though he only sees her and the child as gray blobs. Beth ends up dying tragically, and Joe goes to try to see the baby and ends up killing the guy who was taking care of the baby. All this to say, Matt gets tied back in because Matt is having ongoing conversations with Joe about what happened: “We’ll go through the whole story of you and your love interest,” and I don’t think they were married.
0:18:06 Natalie Dowzicky: And Joe eventually confesses to killing the guy. And then we realize that we were inside Joe’s cookie, essentially, and that Matt manipulated Joe into eliciting a confession. So then we get zoomed back out into the real world, where Joe is effectively punished for his confession to murdering someone.
0:18:31 Paul Matzko: So, it wasn’t even his confession. It was his Sim’s, his copy’s confession.
0:18:34 Natalie Dowzicky: His cookie, his cookie.
0:18:35 Paul Matzko: And so, what they did was, they convicted somebody based on pressure tactics in a simulated world against a simulated copy of that person. And so, the original human being never admitted to doing the crime that the original human being did. And it’s a somewhat scary vision of a criminal justice system in which your self can be turned against you without your knowledge or your acceptance.
0:19:03 Natalie Dowzicky: Absolutely. And it’s also interesting from another standpoint: the idea of manipulating time with your consciousness, or your not-self, inside the cookie. Because what we saw as a few minutes was, I think Matt said, five years to Joe. So, it took five years to get that confession out of Joe’s cookie self; he just manipulated time. Whereas if we think about how we interrogate people now, we’re in for 17 hours or so. To Joe, that felt like five years. So, I think that was also a very interesting case.
0:19:44 Paul Matzko: There’s a critique there. I think it’s a little more subtle, but there’s a critique of isolation in criminal justice. We put people in solitary confinement, and there’s all kinds of research which suggests our perception of time is relative; if you put someone in solitary confinement, it itself can be a form of torture if you leave them there long enough. This just amps that up. That’s a real issue: you can drive someone crazy, get them to confess, manipulate them. It’s a power play using solitary confinement for a few days or weeks, or even hours. Here, what happens if you could ramp that up to eons for these cookies in that space? So, there’s a critique there of our criminal justice system as well. It also points to, again, a societal evolution, or lack thereof, problem, which is: imagine, in this world, people are creating digital slave clones of themselves to be their personal Alexa, which is what happens to the one lady, or to extract confessions. Don’t you think people would have noticed? Again, it’s a world in which it’s novel to these people that this is an ethics issue of some kind. She seems unaware of what she’s just done, the lady who’s, you know…
0:20:58 Natalie Dowzicky: Oh shit, the one who was originally in the cookie was completely unaware.
0:21:00 Paul Matzko: Completely unaware.
0:21:01 Natalie Dowzicky: When she woke up in the cookie, she was like, “What’s going on?” And one of the reasons she was tortured for six months in confinement was because she didn’t necessarily understand why she was there.
0:21:11 Paul Matzko: Right. You would expect the clone to know everything she knows, so we know she’s completely unaware that this is a thing that even exists. Same with the criminal; he’s like, “What? You mean you cloned me and got a confession? I’ve never heard of that before.”
0:21:24 Landry Ayres: Well, I think it’s interesting, because at one point they do show the woman after she has installed her cookie, and she seems to have no idea that what is going on inside the cookie is actually personified as her personality. So it’s almost as if, the way we’re viewing the technology, from a sort of… Magical realism is, I think, the wrong term, but it’s as if that’s not what the technology was created to do, to create a personification of this person that will function and do these tasks. Rather, if we take this part of your brain out and we put it in there, then it can do all of these things, but on some plane, in a way that we don’t realize, that’s actually creating another, almost sentient, being inside the cookie.
0:22:16 Peter Suderman: Yeah, the implication I got from it was just that the companies know and are hiding it from the consumers, as if Apple’s Siri was actually a real… Effectively, a real-life person, an intelligence on par with human beings, but Apple, of course, has presented Siri just as a robotic assistant who we don’t have to worry about. You don’t have to worry about insulting your Siri…
0:22:40 Natalie Dowzicky: Siri doesn’t have rights.
0:22:41 Peter Suderman: It doesn’t have rights, it doesn’t have feelings, it’s not a thing that you have to worry about. And the suggestion here is, actually, what if she did?
0:22:48 Paul Matzko: It asks us to suspend disbelief, and I suppose, in this case, I’m struggling to suspend disbelief that you could have a world in which these detectives are like, “Yeah, yeah, crank it up,” and this wouldn’t get out, that that’s what’s going on here. There are a lot of people involved in these companies and these processes. Word would have gotten out, and you would expect in this world there would be an AI rights movement. People would recognize it. There would be a push for laws limiting their use, when you can use it and when you can’t, what these simulacra are and aren’t allowed to be used for. This is a world in which none of that’s happened. No societal evolution at all. It just got hidden, covered up. It requires me to suspend too much of my own disbelief.
0:23:31 Landry Ayres: I also think it brings up the topic of isolation as punishment that you brought up, Paul. It raises that a little bit more towards the end, when we see that you can block people not just as individuals, but that perhaps some people can commit crimes so egregious that they’ll receive an almost literal scarlet letter that isolates them and allows everyone to see them as a blocked-out blob. Even though they can somewhat interact, that sort of isolation is another extreme that can be used as punishment.
0:24:05 Natalie Dowzicky: And so, it’s isolation where, essentially, you know what you’re missing. You’re walking around and you see everyone else interacting. You’re a blob to them, no one respects you, no one says hi to you, none of that interaction, but you see it all happening, almost like you’re watching it on TV and people are just walking on by. Which happens to Matt at the end, because even though he helped what we’re assuming are detectives elicit this confession, he is still being punished for his earlier involvement. He thinks that he’s gonna get off for helping the detectives out, and they’re like, “Oh, no, sorry, everyone’s blocking you. This is your new normal.” So, what we didn’t touch on is: would we live in this world? What are good things that could come out of any of these…
0:24:52 Paul Matzko: Slavery’s great for people who aren’t slaves.
[chuckle]
0:24:55 Landry Ayres: But I was gonna say, if I existed in this world…
0:24:57 Peter Suderman: No, I feel like it’s bad for whatever we think a soul is or…
[chuckle]
0:25:03 Peter Suderman: Even if you’re not a believer in souls, it’s bad for you if you’re… It’s not great.
0:25:08 Landry Ayres: I wonder, if I was a normal person with no knowledge of Black Mirror and this viewpoint into what happens in the cookies, would I have someone take out a part of my brain that can do all of this assistant stuff for me? Maybe. But knowing what I know, I’m like, “Oh, of course not.” But hindsight is 20/20.
0:25:28 Paul Matzko: You have two scenarios. One, the confession, the torture: if that’s truly intelligent, you’re torturing a human being, just a digital human being. That’s horrific. The other, you can imagine if you tweak that scenario. It assumes the AIs have to be like people. In theory, in some future society, you could create something that’s intelligent. And again, this is a permeable line. What counts as full artificial intelligence? What passes the Turing test? But we assume that intelligence must look human, that it must be driven by the same desires as human beings, and we know that’s the assumption because that’s how they get AI in this world: it’s extracted from human beings. Isn’t there a possibility for AIs that enjoy doing what that one does, controlling the house? I mean, one extracted from a person is one thing, but is a super-intelligent Siri inherently, necessarily unethical? I don’t know about that.
0:26:24 Peter Suderman: I also think they kind of overlooked opportunities that are suggested within the world to create a nicer and better life for these AIs. On the one hand, I certainly would not subject a copy of myself to unlimited slavery, where the only thing they can do all day long, with no rest and no break, is manage my life and be my personal assistant. That seems awful. I wouldn’t do that to me. On the other hand, we know that you can create virtual environments. We know that you can give them something to do that isn’t just managing a person’s house or their life. And why couldn’t you have six or 10 of you, a team of yourselves, all of which live in a beautiful Black Mirror-style mansion and have access to whatever they want and work an hour a day, and then the rest of the time they can write novels, or they can play video games, or do whatever it is you would be doing during your leisure time, and actually perhaps even have a more enjoyable life than you might. I think I might subject myself to that.
0:27:31 Landry Ayres: I’d be down.
[chuckle]
0:27:32 Natalie Dowzicky: But it is like, “I’m signing myself up.”
0:27:34 Peter Suderman: Knowing that they would perhaps be even happier than I am.
0:27:37 Paul Matzko: Again, it’s like anything is possible in this space, but because it’s Black Mirror, we’re just going to imagine the most horrific possibilities, not the good potential.
0:27:45 Natalie Dowzicky: Well, because the beings that are in the cookies are… Well, one, she’s in just basically a control room that’s all white, and there’s nothing there but a white bed. And then, the other scenario with Matt and Joe is, they’re in an old run-down cabin, and it doesn’t seem to have much electricity… It’s just not an ideal Black Mirror architectural beauty. [chuckle] They’re in very bleak scenarios.
0:28:11 Peter Suderman: But why not create that idealized world? One problem you would have with it is that you might get lonely. Now, you would have, perhaps, many of yourselves to talk to, but there appears to be no internet in the traditional sense, no connection between all of these simulated beings. So why can’t they talk to each other and be friends with each other? And again, to bring up a different science fiction work, Neal Stephenson’s new book, Fall, imagines the development of a whole world of virtual people who build a society, essentially, inside a bunch of servers as humanity migrates into the server afterlife. And it’s a fascinating and much more… It’s also a 1,000-page novel, [chuckle] but it’s a fascinating and much more detailed look at how that might work. And he doesn’t suggest that there would be no problems or that everybody would be perfectly happy all the time. What he does suggest is that society would develop, people would have friends and family and sort of normal lives and expectations, and there would be ways that it would resemble, not perfectly, the world that we live in now.
0:29:20 Paul Matzko: Can I offer a non-tech-related reason why this is a world I wanna live in?
0:29:23 Natalie Dowzicky: Absolutely.
0:29:24 Paul Matzko: It’s a world that has Jon Hamm in it, and I’m a Jon Hamm stan. Call me Hammster.
[chuckle]
0:29:28 Peter Suderman: Our world has Jon Hamm. Hammster.
0:29:32 Paul Matzko: But in this one, Jon Hamm will whisper into my ear if I’m a cookie.
0:29:36 Peter Suderman: Can we get Jon Hamm in my cookie?
[chuckle]
0:29:39 Peter Suderman: Yeah, that’s what it… Can’t you get Jon Hamm to… Is this a market opportunity? Maybe you want yourself to be your digital assistant, but what if you want Jon Hamm to be your digital assistant? Couldn’t you get rich being Jon Hamm and being like, “Take my copies”? Right. This is a premium digital assistant: it’s Jon Hamm, and he’s gonna be Jon Hamm in your ear all the time, except Jon Hamm, the real one, just gets to live in his Black Mirror mansion.
0:30:04 Natalie Dowzicky: If you think about all the people that want special voices for their GPS now: you can get Morgan Freeman to voice your GPS. It’s only a few steps until everyone gets a little Jon Hamm cookie. [chuckle]
0:30:16 Landry Ayres: Hamm cookies.
[chuckle]
0:30:21 Natalie Dowzicky: Alright, so let’s move on to the next episode. Next on the list is going to be 15 Million Merits, and this is from the first season.
0:30:31 Landry Ayres: Correct. This is only the second episode that came out, originally released in the UK. In a world where most of society, as far as we can see, must cycle on exercise bikes in order to earn a currency called merits, and perhaps to power their world, we meet Bing, who meets Abi, and he convinces her to take part in an American Idol, X Factor-like talent show in order to escape the world around them. However, after doing so, and after essentially being drugged, the only offer of escape that she’s given is to appear in an adult entertainment show run by one of the producers. So, in order to seek revenge, or something like that, Bing then devotes all of his time to earning more merits so that he can appear on the show again, and he performs a dance, but in the middle of doing so, he stops, pulls out a shard of glass that he’s hidden up his sleeve, and threatens to hurt himself on air unless they listen to his demands. And he goes on this rant about the phoniness of it all and how fake and disingenuous everything is. But rather than actually seeming to accomplish much, it then flashes forward to him ranting again, only to end with an advertisement and a pan out showing him in much more lavish living quarters with a view of what looks to be the outdoors and greenery. And seemingly, very little has changed for the masses.
0:32:13 Natalie Dowzicky: So, he’s a sell-out.
0:32:14 Landry Ayres: Yes, essentially. What is particularly spooky, or not spooky, about this episode?
0:32:21 Paul Matzko: We just described the life of Logan Paul. It’s “every day, bro,” the grind of the treadmill, the life of an Instagram influencer. It’s all fake, it’s phony, it’s performative outrage, performative anger. It trains you to be a fake and a phony. It’s just a heightening of that.
0:32:36 Natalie Dowzicky: This was the first episode of Black Mirror I watched, because I was told by lots of friends not to watch the first episode of the first season, just because it would turn you off from the rest of the show. [chuckle] So, I watched this one first. And honestly, the first thing I thought was, “Do these people who are feeding into this machine, as we’re calling it, have any experience with any other type of life?” You would think Bing knows what other life is like, whether he had other experiences or this was a new change or what have you, but it didn’t seem like any of the other characters were really questioning why they were doing what they were doing, and I was like, “Do they not have a 2019 experience of real life? What kind of quality of life is going on here?” That’s what struck me as the most odd. And I guess, going back to suspending disbelief, I would have liked to see more of them using experiences from a previous life, or some understanding of how they got to this point.
0:33:42 Paul Matzko: There’s another funhouse-mirror, still dystopian, version of this story, which is WALL-E. Think of the people in WALL-E. In the imagined world of WALL-E, it’s the flip: entertainment and comfort are used to keep people literally fat and happy and floating; they’re mindless consumers. So, what’s funny to me is that both of these are heightened versions of current concerns that actually are very similar, but end up in very different places. One in which people are constantly on the treadmill to produce entertainment for some other group, and the other where we’re in a post-scarcity world where you can have all the Big Gulps you want, and everyone’s literally fat and happy. Both, though, for all you fans of Italian and continental philosophy, are Antonio Gramsci: it’s Gramscian hegemony. You can’t escape it, even if you want to. Even attempts at resistance ultimately buttress the strength of the machine holding you in check, until a plucky little robot comes along and defies the system.
0:34:47 Natalie Dowzicky: Finds a plant.
0:34:47 Paul Matzko: Yeah, finds a plant.
0:34:48 Peter Suderman: But Natalie, to your point, I think part of what makes this episode interesting and makes the metaphor work in certain ways… And I agree, it’s weird how contained their lives are and how little they question what else is going on. On the other hand, I think the argument is, that’s all of us going to work every day, just to build up credits in this system so that we can have mindless entertainment in the evening, so that we can go back and do more of it. And there’s never anything real, and there’s no purpose to any of it. We’re just riding the bike or running the treadmill for all of our lives. I don’t think that’s what you’re doing at Cato, and I don’t think that’s what I’m doing at Reason.
[chuckle]
0:35:27 Peter Suderman: But then, I would think that, if I’m part of the system. I think the argument of that episode, and part of what makes that part of the metaphor work, is that it is trying to say that people don’t question and don’t know what else there is. And maybe there even isn’t anything else, because what he has at the end is that he’s still participating in the system; he just has a nicer apartment and a view of some greenery.
0:35:53 Landry Ayres: Yeah, and you see, once again, going back to the theory… It’s almost confirmed now that they all exist in the same universe. There’s another episode where you see the talent show that they are on actually playing in the background for someone who seemingly lives a life very much like we would live in 2019. So, there is the element of, okay, maybe the entire world isn’t like this scenario that Bing lives in. So, how did they end up there? Did they opt in somehow? And by doing so, are they provided at least a steady trickle of consistent living quarters or something like that? So, once again, it makes me wish that we got a little bit more of a pan-out to see how these technologies are interacting with one another. But that is obviously way too much to tackle, and part of the appeal of Black Mirror is that you get these self-contained, mysterious worlds that don’t necessarily get fully explained to us.
0:37:01 Peter Suderman: The other connection that we should probably bring up, since we’re talking about both episodes, is that the song the woman sings in 15 Million Merits, in this American Idol-like show, is “Anyone Who Knows What Love Is (Will Understand),” an Irma Thomas song that is also the karaoke song sung in White Christmas. And then, I think, it also plays in the background on the radio in another episode.
0:37:32 Landry Ayres: I believe it’s in Be Right Back, the one with the doll. They listen to it in the car, and they have romantic memories attached to it; it’s integral to their relationship. So, it’s something that obviously carries a lot of weight in the series.
0:37:48 Natalie Dowzicky: But they do a lot of small things like that, where they hint at connections, and people start creating all these elaborate theories of how characters might be related or how the tech might be related, just from small hints like that, where the producers are probably on the other side like, “Oh no, these worlds are just here, they’re just little pockets.”
0:38:03 Landry Ayres: It’s a fun little thing for the viewers to enjoy and point out even if they didn’t intend for them to all exist in the same timeline or something like that.
0:38:12 Natalie Dowzicky: So, what do we think about this little pocket of the world? Are we gonna go on a bike for hours to get our merits? Are we already in this world?
0:38:21 Paul Matzko: Yeah. On the treadmill of productivity.
0:38:24 Peter Suderman: I will say that as somebody… Maybe… Do any of you here play online video games at all, where you have to spend a lot of time doing what’s called grinding, where you have to level up your… World of Warcraft, even stuff like Anthem, like a bunch of these games. That’s exactly this. You go and you log in and you do the same thing. Destiny is a very popular version of this. You log in and you do the exact same thing basically, every day. And sometimes there are some new levels that come out, your character virtually levels up, and the point of leveling up is so that you can get new weapons and new stuff for your avatar that will then allow you to take on more powerful enemies that are, basically exactly as difficult as the previous enemies now that you… They’re a little more powerful, you’re a little more powerful. And the cycle just continues, and continues. And the end point of this is that you start another game, or you restart this. And there’s something… There’s times when I do it and I’m like, “Oh man, I really… That was eight hours I will not get back.” And there’s other times when I’m like, “Actually. This is kind of fun and fascinating.” And these games are really smart and complex and interesting, and also, surprisingly social.
0:39:37 Peter Suderman: And that’s one of the things that this episode overlooks or decides to eliminate, ’cause these people don’t have any relationships to speak of, and human connection is just totally absent. And if you look at Fortnite and how a lot of the most hard-core fans play Fortnite, sure, it’s a game that they wanna do well at, and sure, it’s a game they log in and spend a lot of time playing just so that their avatar can have a new hat or a new weapon. On the other hand, they also spend a lot of time talking to their friends there. And for a certain subset of players, it’s basically a social network, and perhaps an even better social network than Facebook or Twitter in terms of how it encourages users to interact together, to achieve goals, to actually talk to each other rather than just post nonsense about politics. There is politics that sometimes comes up, but yeah, there are parts of our lives that are already like this. And what’s weird is… Maybe not weird. We’re choosing to do that. No one is forcing anyone to play Fortnite, as far as I can tell.
0:40:39 Natalie Dowzicky: I’m sure someone out there is. [laughter]
0:40:39 Peter Suderman: And there’s about a…
[overlapping conversation]
0:40:41 Peter Suderman: Right. That’s an episode of Black Mirror: is someone being forced to play Fortnite?
0:40:47 Paul Matzko: Yeah.
0:40:47 Peter Suderman: No, but for the most part, when we engage in those experiences, they’re ones that we actually seek out and choose as forms of relaxation.
0:40:56 Paul Matzko: Yeah, or even the treadmill of work. Most of us could actually work less. You could survive on a part-time job. It would come with consequences, you’d get less stuff, but there are people who opt out of working. They’re beach bums; they’re climbing bums, like the dude who climbed Yosemite. There are people who opt out of work and they’re fine with that. But most of us choose to work, and we choose to work as much as we work. Why do we climb on the treadmill? Well, because of the relationships: relationships at work with co-workers, relationships with family. We’re able to support a family using the money. We do it because we choose to do it and because we enjoy it; we find meaning at work. Actually, when people stop working, the rates of mental illness and depression go up. The elderly who work live longer. So they’re framing it as, “Here’s this meaningless, empty treadmill.” And yes, we’re all on it, but they take out all the good stuff about the treadmill.
0:41:56 Landry Ayres: That’s what makes it so spooky.
0:41:58 Natalie Dowzicky: Yeah. Alright, so let’s move on to our next episode. We’re gonna hit on White Bear. We have two episodes left, so White Bear will be next.
0:42:07 Landry Ayres: This one for me… I think there’s something about it. It’s one of my favorite episodes, because I just didn’t anticipate the sort of twists that came along. I don’t know if that was just me. But in White Bear, our main character Victoria wakes with no memory but sees a photo of her with a young girl, which we can assume is most likely her daughter, and then…
0:42:29 Paul Matzko: That’s what she is.
0:42:30 Landry Ayres: Exactly.
0:42:31 Paul Matzko: It’s meant for us to empathize with her and follow her and all that.
0:42:35 Landry Ayres: Exactly. We go along with her on this journey, where she’s wandering through an almost abandoned cityscape and begins to be hunted by these sadistic and violent hunters, and no one will assist her except for a very select few people. Most people just stand far away and view her through their cellphone cameras. And eventually, after a long, long scenario where she’s almost able to defend herself, we learn that Victoria is actually a convicted criminal who pleaded guilty to aiding her fiance in the abduction, torture, and murder of a young girl. And so, as her punishment, she is basically sentenced to have her memory wiped every night and relive this hunting scenario, which people can then come visit and take part in as part of the White Bear Justice Park. What did you think?
0:43:41 Peter Suderman: I would say, just in terms of sheer intensity, this is the most intense and most viscerally frightening of the episodes, even including the one that we’re gonna talk about next. It’s a fascinating kind of thought experiment. At first, you think it’s a fairly straightforward critique of social media, because people are following her with their phones. She walks outside and she sees all these people in the windows, and they’ve all got their phones and they’re taping her, but they won’t talk. And then there are, of course, the hunters that are coming after her as well. The people taping are taping awful acts and seem to be excited when the violence gets worse. And at one point she’s told by somebody else that there was this signal that went across the television, and after it, most people snapped, and people realized that they could just do anything they wanted because there were no rules anymore. It is very similar to a description we’ve heard a lot of how people act online, which is, “Oh wait, you’re online, you’re anonymous, there are no rules anymore.
0:44:50 Peter Suderman: Now, people are just going to be horrible to each other, not just jerks; they’re going to commit a kind of verbal violence.” And then, of course, what we learn is that she is someone who has done this herself, but forgotten it. The episode establishes her as a sympathetic character, as somebody you relate to; it spends its first, what, three-quarters doing that. In the third act, what you find out is that the person who you’re supposed to feel like, “Oh, this is me, and I can relate to being horrified by all these things people are doing to me online,” no, actually, you did it first and you’ve just forgotten, and you are part of the problem. This system has been set up to punish you for being just as bad, arguably much worse, than all of them. And there’s a kind of circularity to it, and the twist is really fascinating: the twist implicates the viewer. It’s like, “Oh, you think social media is horrible? It’s horrible because you’re horrible on social media.”
[laughter]
0:45:48 Paul Matzko: Well, and you also have a certain… Having an amusement park as the venue… We bay for blood. The rise of online outrage culture: we get a visceral thrill off of hunting people down. Sometimes people do outrageous things worthy of outrage, but sometimes someone makes a mistake, and we define that person by the mistake and then go after them for digital blood. And so it’s critiquing that, that people would turn that into an amusement park. You buy a ticket to go to the White Bear Corrections Amusement Park, whatever they call it, the way you’d buy a ticket to go to Disney World. So, there’s a critique there as well of internet outrage culture. But then, too, I think they are criticizing… Not criticizing. They are depicting a kind of perfect tech. They’re able to erase her mind, stage this whole process: perfect tech, perfect process, helping courts administer the perfect punishment that you can calibrate for the individual. I don’t know, there’s something there that is off-putting to me, the assumption that you can individualize punishment in this way and then invite the public to come in and participate in this kind of perfect mob justice, technologically enabled perfect mob justice.
0:47:13 Natalie Dowzicky: I think what was most terrifying to me was more what they were saying about groupthink. This is essentially an entire episode on groupthink, in the sense that these people are videoing and they’re enjoying it, right? They’re not there just because their neighbors are, or because that’s how you fit in. They’re really buying into this type of system, which is scary. I would have loved to see, after we got that twist, someone standing up and being like, “Oh, maybe this is bad,” or “Maybe this is off-putting,” like Paul just said. But I think the entire episode is more of a larger conversation about groupthink and how it ties to what we were saying about ostracizing someone on social media for something they tweeted, and then the mob comes after them just because someone famous, or with a blue check, or whatever, did.
0:48:03 Paul Matzko: Did either of you guys… As I was watching it, I felt a little uncomfortable in a post-Me Too moment, ’cause I kept wondering if she had been gaslit or controlled by her fiance. I don’t know if they meant to hint in that direction, but that’s what it felt like to me. And that might be because of the lens I’m looking through. I don’t remember picking up on that when I watched it years ago, when it first came out, but now I almost feel bad for her, not just because she was supposed to be the point-of-view character, but because I kept wondering: she’s complicit, sure, but was she being controlled into participating in this crime?
0:48:39 Landry Ayres: I think that’s really interesting, and I think that’s what makes it a little bit scarier, perhaps, that hindsight that we have. I know that her case is very much based on a real case that happened in the UK, of a couple that did this to a child. So I can’t speak to it perfectly, but I don’t think that was the scenario there. Still, it brings up some of the issues: if this is something that can be used and tailored to specific people, what’s the prevalence of it? Is it only reserved for these very, very violent crimes? And who gets… Is it really just this one judge that they identify who gets to determine this punishment and who gets it?
0:49:25 Landry Ayres: And I was thinking about it. Not necessarily a libertarian thinker, but someone libertarians borrow a lot from, Foucault, calls it the spectacle of the scaffold: the mob and the execution and public display are a way that the powerful shore up power, by providing that spectacle to the masses. But usually there was also some resistance to that, and people would try to set prisoners free and would retaliate against it, because they saw it as this hegemonic power and unjust. But in this world it’s very, very much accepted, at least from what we can see, which to me is what makes it a little bit scarier. And the fact that, as Peter pointed out, it flips it and implicates the viewer makes you realize that you’re as much a part of it, basically, on both sides. Really, there is no one who becomes a just character that you can follow and empathize with. It’s very pessimistic in that way, which is, of course, Black Mirror.
[laughter]
0:50:31 Natalie Dowzicky: So I can’t think of a scenario where I would wanna live in this world, but are there any positive aspects that could come out of this world that maybe Black Mirror missed?
0:50:44 Peter Suderman: I’m struggling.
0:50:46 Natalie Dowzicky: So am I.
[laughter]
0:50:46 Landry Ayres: I thought of one thing: when they fire the gun and then she’s in the chair and it spins around and the auditorium swoops in, I was like, “The stage manager, that is amazing. They need that for Broadway.” ’Cause I would be like, “This is amazingly well done. The production value.”
[laughter]
0:51:06 Natalie Dowzicky: Oh yeah.
0:51:08 Peter Suderman: There are times when I would like to have somebody wipe my memory at the end of the day and just start over. I don’t know if I would necessarily want to live in… I certainly wouldn’t wanna live in a world where that was the criminal justice system, or part of it at all. It does seem really hard to scale. Literally, how many of these pods exist? Is there just the one? Is it a monument? It’s very odd; it doesn’t think through the implications. For a show that is all about systemic effects, it just doesn’t seem to consider that this is a system that doesn’t scale and doesn’t replicate easily.
0:51:45 Paul Matzko: I want a world in which they do this for all the crimes. So you got pulled over for a speeding ticket. So you get your memory wiped and you have to go five miles over the speed limit permanently and then…
[overlapping conversation]
[laughter]
0:51:58 Peter Suderman: You’re driving behind a slow horse-drawn trailer for the rest of your life. It’s sort of Harrison Bergeron, but yeah. I did find fascinating the way this episode inverted The Truman Show, which, if you know anything about that movie, is a reality show where there’s a single real person in the show, and everybody around him in his life is an actor, and they follow him from birth to death. But the original script for that movie was dystopian. The movie itself is very sunny and beautiful; it was shot just a few miles from where I grew up in Florida, in this kind of weirdly super-planned town. But the original script took place in a city, and it was very dark and dreary, and the whole thing was just miserable all the time. And then the director, Peter Weir, switched that around. But this episode really seems to channel The Truman Show in the sense that there is one person who doesn’t know what’s going on, is being filmed by everyone for the entertainment of the masses, and is kept in the dark until the very end.
0:53:03 Natalie Dowzicky: I didn’t even think about that comparison till now, but that definitely… Yeah, that definitely makes sense. Alright, moving on to our scariest, Men Against Fire.
0:53:14 Landry Ayres: This episode follows Stripe, a soldier whose job is to hunt mutant human creatures they call Roaches, with the aid of a neural implant called a MASS implant. After a series of encounters, it begins to malfunction, and he discovers that these Roaches are actually just other humans, perhaps of a differing ethnic group than the one he is a part of, and that the implant actually alters his vision to see them as dehumanized. And also that he had previously agreed to have his memory wiped, just like in White Bear, in order to kill them without remorse. So for me, this is particularly scary, and they mentioned this before, not just because you have this very, very powerful, seemingly government-sponsored force (unless it’s some sort of private security force, which I don’t see as being the case) sponsoring essentially one of the steps in genocide: dehumanizing and killing a specific ethnic group. But that’s not the scariest part for me. The scariest part is that we can assume there has been so much propaganda, or some other way they have convinced people without this MASS implant that the Roaches are subhuman, because there are people without these implants who still consider this ethnic group to be essentially something that needs to be eliminated.
0:54:50 Paul Matzko: So what I found profoundly scary about this episode is that it is incorrect. The episode is Men Against Fire, which is a reference to a book of the same name by a World War I veteran and military historian named Samuel Marshall. And basically, all the ideas from his book… The actor who plays Douglas Stamper in House of Cards, I forget his name.
0:55:14 Landry Ayres: Michael Kelly.
0:55:15 Paul Matzko: Michael Kelly. He sits them down and says, "Here's why we had to do this, we had to put this implant in: because we found that only a very small percentage of soldiers are willing to shoot the enemy. So we had to put this in to make the enemy look horrifying. We had to dehumanize the enemy so you'd be willing to kill." This was the real-world Marshall's argument in his book, Men Against Fire. Later on, researchers, military historians, and others went back and found that, in fact, he made all that up. All those numbers, all the statistics were fabricated. It turns out soldiers are perfectly willing to kill other people.
0:55:50 Natalie Dowzicky: And that's why I wasn't surprised that he opted in to not feeling remorse, because I was thinking while I was watching it, "Well, in theory, many of our soldiers opt into that all the time, or maybe they don't intentionally opt in, but they know that's going to be a consequence, or what have you." So it didn't surprise me that he opted in, although it's sad, and it wouldn't be something I would opt into. But I also think it was striking that the show is just hyper-elevating tendencies that soldiers in our world may already have. And we shouldn't forget that they can see through drones and use all this different technology to stalk these Roaches, so that's another added aspect. But part of it is that I didn't see it as all that different from how we currently act, in a ravaging wartime that is.
0:56:49 Peter Suderman: Yeah. I mean, I found this episode fairly frightening. I did not realize that all of the stats were wrong. I meant to look them up and hadn't gotten to it, but that's fascinating, because it obviously came from somewhere, from a source that, I guess, turns out to be inaccurate. What struck me about this episode was just how malevolent Arquette, the Michael Kelly character, was, in a way that seemed exaggerated to a point where it didn't need to be. On the one hand, he sort of thinks, "Okay, we're protecting the bloodline," but he's given all of these mustache-twirling villain lines. And there is, I think, a better version of this episode in which the Michael Kelly character is vaguely sympathetic, making an argument that is not designed to repulse you, but is more like, "Look, we have to protect our home, we have to protect our people from these, what we might call 'invaders,' for example," which still resorts to the kind of loaded language that has obvious political overtones. But if you had done that a few years ago, it would have been different.
0:58:17 Peter Suderman: We're meant to think that he's serious about this and thinks it's a real problem. On the other hand, we're also just meant to find him utterly horrific. And the thing is, you talk to people in the military, people who think that it's their job to kill for the state, and many of them are sincere, earnest, and in many ways… Virtually always, quite decent people who you can get along with. And they have reasons for doing what they do that are not just, "Oh, I'm a monster." Kelly's character is a monster in a way that, I think, is meant to heighten the metaphorical impact of the episode. But it actually ends up weakening it, because he's such a cartoon character.
0:59:10 Natalie Dowzicky: So, do we think this world from Men Against Fire is somewhere we wouldn't wanna live, or are we already kind of living in it but don't know it, since none of us are active military members or active in that world? What do we think?
0:59:26 Peter Suderman: I don’t wanna live in this world at all, nope.
0:59:28 Natalie Dowzicky: No? [chuckle]
0:59:31 Peter Suderman: It’s terrifying for every reason. There’s not anything in this that I am interested in being part of. I don’t want the possibility of the government erasing my agreement to do something, I don’t want the possibility of soldiers who are manipulated by government authorities in ways that sort of change what they see in order to get them to kill people who are living amongst us. I don’t want… I don’t want ethnic cleansing of any kind for any reason.
1:00:07 Natalie Dowzicky: Nope.
1:00:07 Peter Suderman: Whether it is aided by technology or not, I'm pretty against all of this. It all just seems sort of horrible. I think it's a good episode; it does what it needs to do. It's a little bit simplistic and on the nose in some ways, but no, I don't wanna live in…
1:00:23 Landry Ayres: Right. So, it's spooky for libertarians en masse, as an idea; it's spooky for the philosophy, for the ideals that we have. But it's not spooky to me as a person, because, like you said, it's so on the nose, it is very much cartoony. I could tell that there was something coming, and I feel like I almost predicted it when I watched it the first time.
1:00:56 Natalie Dowzicky: I also think that, yes, we labeled it the scariest in this sense, but it wasn't my favorite episode. It wasn't an episode I necessarily enjoyed. I thought White Bear was a much more engaging episode that had me really gripped the whole time, whereas watching Men Against Fire, I was like, "Oh, I wanna predict where they are going with this." And two, I didn't think it was as engaging to an audience, partially due to its simplistic nature.
1:01:21 Landry Ayres: I guess we should ask: are there any honorable mentions, episodes you find particularly spooky, that you would have included in your top five, or…
1:01:34 Natalie Dowzicky: Or a favorite one we didn’t include?
1:01:35 Landry Ayres: Yeah. Or even things that you think are great for libertarians, that you’d be like, “This is a really great world for libertarians.”
1:01:45 Peter Suderman: Nothing comes to mind. Though I would almost say that… I know it's not Black Mirror, but the antithesis of Black Mirror is the Philip K. Dick show on Amazon, Electric Dreams, which is another science fiction anthology that, I think, does more to portray the positive aspects of technology. Not always; it's not just a relentlessly happy and positive show.
1:02:09 Natalie Dowzicky: Right.
1:02:09 Peter Suderman: I don’t wanna portray it that way. But it’s a little more interested in the ways the technology makes our lives better.
1:02:15 Natalie Dowzicky: I think… Well, my… One of my favorite episodes was the most recent. Oh my gosh, the social media one. What…
1:02:22 Landry Ayres: Nosedive?
1:02:23 Natalie Dowzicky: Nosedive. We didn't talk about it today; I was just more drawn to the social media angle, and I really like the actress in it. But beside the point, I think they did an awesome job with the movie, Bandersnatch, which we didn't touch on. It was a Black Mirror spin-off, one of those interactive films where you experience it and create your own movie, deciding what the characters do. So it was a larger discussion on self-worth, self-evaluation, and being able to control the scenario the characters are in. I thought that was interesting and would have liked to talk about it a lot more.
1:03:03 Landry Ayres: Thanks for listening to Pop N Locke. There are a lot of great episodes of Black Mirror we didn't get to cover, some of which you might even find spookier than the ones we did. Let us know what you thought on Twitter. You can follow the show @popnlockepod. That's pop, the letter "N", lock with an "E", pod. Make sure to subscribe to us on Apple Podcasts, Spotify, or wherever you get your podcasts. We look forward to unraveling your favorite show or movie next time. Pop N Locke is a production of Libertarianism.org. To learn more, visit us on the web at www.libertarianism.org.