E01 -

Welcome to Building Tomorrow! A new podcast by Libertarianism.org and the Cato Institute.

Hosts
Paul Matzko
Tech & Innovation Editor
Guests
Aaron Ross Powell
Director and Editor

Aaron Ross Powell was the director and editor of Libertarianism.org, a project of the Cato Institute.

Matthew Feeney is head of technology and innovation at the Centre for Policy Studies. He was previously the director of Cato’s Project on Emerging Technologies, where he worked on issues concerning the intersection of new technologies and civil liberties, and before that, he was assistant editor of Reason.com. Matthew is a dual British/American citizen and received both his BA and MA in philosophy from the University of Reading in England.

Will Duffield is a research assistant at Cato’s First Amendment Project.

Building Tomorrow explores the ways technology, innovation, and entrepreneurship are creating a freer, wealthier, and more peaceful world. In our first episode, we survey how major recent advances in tech have made it harder for the State to “read” citizens, deepened networks of trust between activists, expanded ownership of our bodies, and created new sharing economies.

Further Readings/References:

Yes, an augmented reality cocktail bar is absolutely the best use of this exciting new technology.

Philosopher gnomes for the gardens of those with discerning taste.

A Building Tomorrow review of Michael Munger’s book, Tomorrow 3.0.

The first article of Will Duffield’s “Prototype” project on creating an uncensorable internet.

Transcript

[music]

00:05 Paul Matzko: Welcome to Building Tomorrow, a show where we explore the ways technology, innovation, and entrepreneurship are enabling people to build a free, wealthy, and peaceful world. For today, we’re gonna open with a discussion about kinda the big picture of where we see tech and innovation helping us advance freedom in America and across the world. My name is Paul Matzko, I’m the Assistant Editor for Tech and Innovation at libertarianism.org, and with me are…

00:32 Will Duffield: I’m Will Duffield, a research assistant focusing on speech and technology, and the editor of Prototype, a project of libertarianism.org dedicated to charting a course for liberalism in the information age.

00:46 Matthew Feeney: I’m Matthew Feeney, the director of Cato’s Project on Emerging Technologies, which handles all policy areas related to new technologies such as blockchain, artificial intelligence, and drones. My own personal research deals with the intersection of civil liberties and new and emerging technology.

01:05 Aaron Ross Powell: And I am Aaron Powell. I am director and editor of libertarianism.org at the Cato Institute, of which Building Tomorrow is a part, and libertarianism.org is Cato’s resource on the history and theory and ideas of libertarianism.

01:18 Paul Matzko: Well, welcome to the show, guys. And for this first roundtable, I thought we’d explore particular avenues in which we see technology changing people’s lives for the better, some over the next year or two, some over the next generation, during our lifetimes. We’ll explore some of these, explore the potential, and maybe a little bit of the pitfalls, of these areas, just to give a sense of the kinds of things we see this show highlighting in each episode. To start, how about I throw out something that’s very buzzy right now: it involves encryption, it involves blockchain technology. Really, it’s a whole category of technologies that makes citizens less legible, in that it makes citizens harder to track, harder to read. When we use the word “legible,” we’re talking about reading people: reading who they are, their identities, what they do. And a lot of these new technologies make us less legible to big organizations or institutions, like the state, like the government, but also to corporations. It makes it harder for these large institutions to track what you’re doing. And ultimately, if you can track what people are doing, you can control what they’re doing. So where do you guys see these technologies heading in the next couple of years?

02:48 Aaron Ross Powell: Well, I’m probably, I think, in this room, the biggest booster of these technologies, or at least the most optimistic about them. Of all the kinds of tech and innovation that we’ll talk about on Building Tomorrow, I think this encryption, moving communications and economics into realms where our privacy is protected from each other, but particularly from the state, has the greatest potential to radically change the world in a much, much freer direction. And so everything from the basics of encrypted chat, where the text messages you’re sending can’t be read by the government even if they want to, even if they get a warrant, they’re just utterly inaccessible to them, to moving our voice calls into that arena as well. And then, I think most excitingly, we can move our economic transactions, if not all of them, at least most of them, into a realm where the transactions aren’t tracked, where they aren’t accessible, where the amount of money that we have can’t be seen, where the money came from and who we’re giving it to can’t be seen. I think this stuff is phenomenally exciting from a libertarian, from a pro-liberty perspective.

04:11 Aaron Ross Powell: I mean, yes, the fear is that this enables all sorts of untoward things, black markets and bad actors doing things that we hope they wouldn’t. But that stuff happens now. I think the more encouraging aspect is for those of us the state would maybe like to control, would maybe like to stamp down on, not just in the United States, where we’re pretty free as it is, we do okay, but in third world countries, in much more totalitarian regimes: activists, anti-government people, the kind of people whose lives could be really threatened if they could be identified. That these people can operate in ways that are inaccessible to the state really can advance human liberty, really can make us better off. And in ways that don’t involve the off-the-gridness that you used to have to embrace. We don’t have to disappear from the economic scene, we don’t have to cut ourselves off in order to be inaccessible. We can participate in technologically sophisticated networks that have benefits above and beyond, while getting that privacy too.

05:36 Matthew Feeney: Right. So I don’t want to disagree with any of that, and I share the optimistic vision, but I do think that there are gonna be difficult conversations ahead. Anyone who takes even a fleeting glimpse at the history of American law enforcement or the American government knows that there are serious concerns about government being able to track people, their communications, and their economic activity. Those are legitimate and well-founded concerns. The annoying thing, however, is that the state will not stop trying to gather all of this information and to track it, because there are what even people inside Cato would consider legitimate law enforcement practices: going after people who commit violent crimes, property crimes. And the world that Aaron has outlined is, I think, a net positive. The benefits certainly are gonna outweigh the costs, but I think we need to be ready for a time when the emphasis is on the costs, in the media and in politics, because it will be very easy for spokespeople for law enforcement to stand up and say, “It is actually impossible for us to investigate quite serious crimes.” And that’s one of the challenges that I’m excited about, actually, with projects like these: to explore how we can get ready for those discussions, and I say “get ready”, but they’re actually already happening, so at least to play a role in them.

06:55 Will Duffield: I’m certainly excited about these developments, but I don’t want to make too much of them, or oversell the capacity of tools that allow people to speak privately or anonymously in particular. Because when that expectation is developed and fails to bear fruit, when someone believes that they are speaking in a fashion that can’t be understood or recorded by state actors, and it turns out that they can be, the consequences are often fatal. We also need to think about our position within these emerging technologies, how we use them, and how widely adopted they are. I think it’s easy for us, who are all nerds about Bitcoin, about Tor, about VPNs, to imagine that their use is much more widespread than it is. Now, nonetheless, it does allow individuals to opt out without dropping out, and that’s a good in its own right. But I don’t wanna make too much of them, particularly as more and more people come online and, for the most part, use the rest of the internet as it were: an internet in which you can be tracked, in which, through using it, you are made legible to a host of actors that you may not even be aware of.

08:27 Matthew Feeney: Yeah. I think, unfortunately, there’s a tendency for people not to educate themselves about certain tools and techniques they can use; they just engage in self-censorship. Yesterday, I was doing one of the parts of the job that I really enjoy, which is talking to students, and I was highlighting, I think it was one of the Pew surveys after Snowden, asking people about their behavior. And it turned out that there were people in these surveys saying, “Well, I don’t discuss my private life much anymore. I don’t say things I used to.” So these people haven’t turned to Tor or VPNs. What’s quite scary is they’ve just engaged in some degree of self-censorship, because the barriers to actually getting into these encryption tools, anonymity tools, are not insurmountable, but they’re evidently more than what a lot of people are willing to put out…

09:21 Paul Matzko: Well, there’s a basic wrinkle: it’s a younger generation using Snapchat, or something ephemeral, rather than Facebook.

09:29 Matthew Feeney: Yeah.

09:30 Paul Matzko: That’s a form of self-censorship, of adjusting to expectations. They’re like, “Oh yeah. What I put on here is not gonna be… ”

09:38 Will Duffield: And it also would seem to deal more effectively with the threat models that younger people are concerned about. If you’re sending something on Snapchat rather than sending it in Facebook Messenger, you’re concerned that you might not always be able to trust the significant other you’re sending a photo to. And that, rather than some threat of state coercion, is what you’re trying to guard against.

10:02 Aaron Ross Powell: I think the concerns, though, that Will and Matthew, you have raised are less about the tech, or maybe about misplaced optimism in the tech, and more about just where we are in the timeline of this tech. Because with all of this, with the encryption and the cryptocurrencies and everything else we’ve just discussed, I think it is very easy to underestimate how early days we currently are. This technology looks the way that the internet did when most online communication was still dial-up BBSs. It was something that nerds could use, and it was very hard to use; if you tried to figure it out yourself and you weren’t a nerd, it was frustrating. But we don’t say that that time was wasted or that the enthusiasm then was misplaced. It was that all that enthusiasm was being channeled into building the tools that would eventually go mainstream.

10:58 Aaron Ross Powell: So, yes, people aren’t installing Tor right now, but everyone is taking advantage of HTTPS without even knowing it. And so I think that the people who really worry about their privacy are gonna be invested in figuring out the right way to do it. They’re gonna know that Signal is better than Facebook Messenger for communicating stuff that they don’t want law enforcement, the state, to find out about. But over time, especially as this technology and these protocols become more embedded, quality privacy, quality encryption, quality crypto-economics will just be baked into the everyday software that everyone uses. I think that 10 years, 20 years from now, most people will be using this stuff without even really being aware that they’re using it, the same way that they’re not aware that they’re using encrypted connections when they punch their credit card into Amazon.

11:52 Paul Matzko: And this dovetails with something you said earlier, Aaron: some of the most exciting applications of this will happen, and potentially happen more easily, in places other than where the innovation is actually being created. Not America, not Silicon Valley, but places that can kind of skip ahead. It’s not unlike how cell phone adoption actually took off more rapidly, the adoption rates were higher, in the third world than in the first world, in large part because we already had this infrastructure of landlines and whatnot that held people back. But think about that not just as literal infrastructure but as a way of thinking. In places where, because of a lack of economic development and state development, people are currently illegible, and there are lots of downsides to that, like how not being legible makes it hard to build a civil society and an economic order, they can actually leapfrog and build systems, both technological systems and cultural systems, that incorporate these new technologies and skip the kind of intervening step. So it’s actually a cool way in which these technologies will not only have the greatest potential benefit in some of these places because of their regimes, but could also actually be adopted there more fully first.

13:15 Will Duffield: Oh, just to push back on that legibility point a bit, I think it’s perfectly reasonable to build a society that is illegible. Most historic societies have been pretty illegible. However, it’s difficult to hook that society up to an integrated global supply chain when it can’t be understood, and there are both benefits and pitfalls to that.

13:38 Paul Matzko: Right. Well, part of legibility is not only are you readable, but who gets to decide who reads you. So the ability to make yourself a citizen but then choose how your citizenship gets transferred, right, across a kind of boundary… But it’s an interesting topic. I put down here that we had a disagreement about how James C. Scott’s Seeing Like a State would apply. And I guess we don’t wanna get too far down into the details, but Scott is saying, “Look, the nation-state has an interest in making citizens legible so it can control them and do these big transformative projects top-down, impose them on them.” And I saw that as, well, we can break that chain by making citizens less legible to the nation-state. But you saw in that a word of caution, right?

14:28 Will Duffield: Well, legibility has always required simplification, the use of second-order, easier-to-comprehend metrics to track something, and with it, a shift in the thing being studied or put under the microscope. The internet often does a very good job of that. It provides for this recording; it can shift communication into more standardized forms. If you’re using Outlook to write an email, you’re communicating as Outlook would have you speak, and that allows everything you send through it to be collected, collated, and appraised in ways that otherwise wouldn’t be possible if you were having one-off physical conversations with people, or whatever else you might use in its absence.

15:28 Paul Matzko: Well, this is good. So that this conversation isn’t all about the issue of legibility, let’s move on to our next kind of avenue in which we see tech advancing, or at least having the potential to advance, freedom: the way in which tech can build new networks of trust, or deepen networks of trust, between private actors. The classic example of this, one that was arguably a little bit overhyped, was back during the Arab Spring in North Africa and the Middle East, where people were using Twitter and other social media outlets to organize resistance against a series of regimes, from Tunisia to Egypt, wherever. And whatever you think about the ultimate outcome of some of the Arab Spring movements, like in Egypt, where they ended up with a guy who arguably was not all that much better than the previous dictator, it does show the ability of these new forms of media and of technology to allow movements to coalesce. Basically, the barrier to entry for movement formation was lowered because of this technology. It allowed people who were disconnected in time and across space to find each other and build an organization. And so, because of that, you’re building a new network of trust.

16:49 Matthew Feeney: Yeah, I think something you just said really resonates with some of the work I’ve done, which is about the ability to find other people. That’s not only great if you have niche interests, because there are people all over the world who have minority interests and can now build communities online, whether political, religious, or whatever. But when we’re talking about finding other people, that’s also been really great for commerce, and for finding really new, interesting ways to do very old kinds of things. And I think you’ve seen that in an exciting way with the rise of what’s come to be, annoyingly, called the sharing economy. The reason why there were taxis is that when people arrived in strange cities, it was not worth their time to knock on strange doors in a strange city asking if someone had a spare car and saying, “Look, I’ll pay you some money if you drive me.” And likewise, it wasn’t worth knocking on doors asking if there were spare bedrooms to sleep in. So: hotels and taxis.

17:51 Matthew Feeney: But then it turns out that the internet is a great way of finding people who do have cars that they’re willing to drive around for extra money, and who do have spare bedrooms that they don’t mind renting out. And you have experiences like ridesharing and like Airbnb, and these sorts of companies, which I don’t think are just providing a similar service to taxis and hotels; they’re providing a much more exciting kind of service, where people are actually becoming more acquainted with the communities they’re visiting and meeting more interesting people. I think that’s a really interesting development that’s also a feature of this finding of other people.

18:31 Paul Matzko: Yeah, no one ever said, “Oh, I visited this city and, man, the people at the La Quinta Inn, they just… It blew my mind really getting to know the local culture through the desk clerk at… ”

18:42 Matthew Feeney: Right. Yeah.

18:42 Paul Matzko: But if you go to an Airbnb, maybe you go out for dinner with your host, or maybe you chat with them in the evenings; you actually get to know a local person. Think of the number of times I’ve taken advantage of local knowledge, staying with someone in Spanish Harlem and learning the best taco place and going there with my host. I wouldn’t have gotten that but for the fact that I didn’t just have a mere transactional relationship; I was actually building a relationship with another private person, a real person on the other end of that exchange.

19:16 Will Duffield: I think the internet as a solution to the matching problem when it comes to bringing together people with goods to sell and those who’d like to buy them goes far beyond the gig economy as well. You see artisanal specialties that simply wouldn’t be viable if you only had access to your localized market in the past, that now people can really work at and specialize in. This past Christmas, I was looking for a gift for a philosophy-inclined friend and found someone on Etsy who made small clay sculptures of famous historic philosophers. Now, if you’re living in a village or you just have access to the market in your city, it’s pretty hard to make a living doing that. But if you can ship them all over the world to folks with strange friends like me, that suddenly becomes a viable market niche, where previously, it just wasn’t.

20:12 Aaron Ross Powell: This discussion is absolutely correct, but we can look at it at a higher, broader level. One of the questions that comes up, one of the issues that flows behind a lot of libertarian political thinking, is: we’re advocates for liberty, for increasing freedom, for scaling back the state, and so on. But what’s the good of that? Why does it matter that we have more liberty, that we have more choices, that fewer of our actions are prescribed by the government? A large part of it, I think, is this idea that we wanna be able to be the authors of our own lives, that we want to be able to construct the narrative of our lives, to build it the way that we see fit, to pursue our interests. And so the economic specialization, like Will just described, is a huge part of that. Instead of getting a job in the local warehouse, which might be the thing… There might be people for whom that’s what they wanna do, but some people really wanna make clay models of dead philosophers.

21:22 Will Duffield: Rather than just garden gnomes or something, which is where you’d be otherwise.

21:24 Aaron Ross Powell: Sure. And so now that opportunity exists, and it would not have existed without the technology, and the entrepreneurs who built that technology, enabling it. But that ties in, too, to these networks, because a huge part of authoring your life is choosing your peers: who do I wanna associate with, who do I wanna call my friends, my extended family? And that’s decreasingly tied to the accidents of geography; you can pick your peer group. Sometimes that can be very toxic. There are lots of toxic corners on the internet where very toxic people, who would have just languished in loneliness and obscurity, now can find each other and stir up these brush fires of incels, or whatever else. But I think that remains a minority part of it. And again, like I said before, I think we underestimate how early days we are in all of this. We’re still figuring out what the social norms are, what the right behavior is, how these networks should function. So I think that’s gonna get better over time. But all of this technology, to an astonishing degree, enables us to be both more autonomous in our self-authorship and more powerful in how much authorial control we have over the shape of our lives.

22:50 Paul Matzko: To some extent, being a techno-optimist, seeing the transformative potential of innovation, means being an optimist about what it means to be a human. We’re seeing the potential in unleashing more agency for human beings. Now, because there are good people and bad people, and in all of us there’s some combination of the two, you’re gonna get bad consequences of that. But on net, do you believe that people unleashed will do more good than harm? I’m an optimist in that regard.

23:26 Aaron Ross Powell: I just think people have more strange, positive hobbies that they’re looking for fellow enthusiasts to engage in with than weird grievances that they’re willing to form a community and an identity around. That level of dislocation, of alienation and angst, is just fairly rare.

23:46 Paul Matzko: Yeah, yeah. Well, there’s that sense that we’re weirding… a weirder society. As people are freer to express themselves, we are less homogenized, less mass-produced; you get your philosopher garden gnomes instead of the same mass-produced ones everyone else gets.

24:03 Matthew Feeney: Yeah, I’m sure there’s probably a PhD thesis somewhere out there that’s been written on this. A question I don’t know the answer to, that’s just occurred to me in this conversation, is whether this nerd takeover of a lot of popular culture has to do with this. Growing up, and I like to think I’m not that old, when I was into board games and Magic: The Gathering and whatever, I’d walk up after high school to the local card game store, which was above a Burger King, and it was all bearded, older guys that I would hang out with. It was a lot of fun, and it was great, but it was really somewhat difficult, something passed along by word of mouth; I had to find the place, and it felt a little off the beaten path.

24:50 Matthew Feeney: And it turns out that now it’s just so much easier for people who are interested in these things to actually find out that they’re not actually niche, and some of these hobbies or pursuits that are usually associated with nerds are actually flourishing and growing. I’m sure Aaron can tell me more about Games Workshop and some of the stuff I was never particularly into… Dungeons and Dragons, comic books, and all this stuff.

25:16 Paul Matzko: It is funny, ’cause I grew up in a fundamentalist Protestant household hearing about the occult panic of the ’80s and ’90s; it was all about how Dungeons and Dragons was gonna turn us all into Satanists. So…

25:28 Matthew Feeney: How right they were.

[laughter]

25:30 Aaron Ross Powell: Inevitably, those descriptions of D&D by the satanic panic folks were frustrating, because they sounded so much cooler than any D&D game I ever played in.

[laughter]

25:42 Paul Matzko: I know. The reality was a bunch of troglodyte types in basements, you know, just… Yeah, yeah.

25:48 Will Duffield: I do think you can see a downside to this, in that these communities become gentrified, as it were. It’s much harder to police the boundaries of your strange nerd niche when anyone can come in and join, and that does have an effect on the character of that community. Now, on the whole, when we’re talking about net benefits, it’s a good thing, because more people are enjoying the hobby. But for those who felt alienated and found a place there, and now maybe can’t hack it as all of the normies come in and the character of the place changes a bit, they do lose out. And that is something to keep an eye on, or be aware of, as we celebrate this process.

26:36 Matthew Feeney: And I think that’s a risk that actually gets compounded by another exciting technology, which is virtual reality, and the fact that it will become, I think, increasingly easy for people to spend the vast majority of their lives in different, non-physical worlds, worlds that will become much more immersive than they already are. That’s something we should keep an eye on.

27:00 Paul Matzko: With augmented reality, I just saw that one of the first applications is the ability to measure things. I don’t know how you trigger the command, but you can tell just by looking, and your glasses will tell you, “Okay, that’s nine inches long.” So there’s the AR overlay before we get to full VR.

27:20 Will Duffield: We’re figuring out how to use it. I went to America’s first AR cocktail bar this Monday, and they’re still working out the kinks.

27:30 Paul Matzko: So it identifies what’s in your cup or?

27:32 Will Duffield: Gives you visuals around your drink that accompany the spirits that have been mixed together. Whether we really need that, I don’t know, but maybe we’ll enjoy it.

[background conversation]

[laughter]

27:51 Paul Matzko: I want your augmented cocktails, Will, damn it, and I want your philosopher gnomes.

27:55 Will Duffield: It was part of a show at ARTECHOUSE, which is a gallery in DC that does modern, tech-centric installation pieces. Worth a visit for everyone.

28:07 Paul Matzko: Cool. Why don’t we do one more kind of big-picture theme here. I had down that some of this new technology is allowing us to expand ownership and control of physical reality, and that includes both our own bodies and nature. This is a little more sciencey, it’s biotech, rather than, you know, what a lot of folks think of when they think of tech. But here I’m thinking of technologies like CRISPR, technologies like prosthetics, pushing people, on the philosophical side, towards transhumanism, augmenting people.

28:44 Paul Matzko: If I had to pick any one suite of technologies that I think would be most transformative, so if Aaron’s picking blockchain and encryption tech as the next frontier, for me it’s this biotech. It’s stuff like CRISPR, where the potential’s there, over the next generation, to eradicate heritable diseases through selective gene editing, and to combat certain infectious diseases as well. There are pitfalls here, but also to transform invasive species: we could essentially destroy disease-carrying mosquitoes, ones that carry malaria and the Zika virus, though the ecological ramifications of that are unsure at this point. But as far as transformative technologies that have the potential to really transform what it means to be human, to me there’s a lot of exciting potential there.

29:46 Will Duffield: So firstly, on prosthetics: it’s not changing someone’s DNA or their biology, but advances in prosthetics are under-celebrated for what they’ve delivered. Fifty years ago, if you lost your hand, you got a hook. Now you get something that might not be quite as good as the hand you had before, but you can pick things up, you can ride a bicycle, you can go out into the world and behave largely as though you still have your hand. And for those who have lost limbs, that’s incredible, really transformative. Now, CRISPR is incredibly exciting. It’s also pretty scary, because it makes imminent a lot of questions we have about what a disease is. Autistic folks, for instance: people often think of autism as both a disease and integral to their identity. They’re living flourishing lives while being autistic, and they find the idea that they would be erased, in a sense, through something like CRISPR to be very concerning. So how we end up utilizing these technologies, and what we identify as problems to be solved by them, matters a great deal for human liberty and identity formation.

31:00 Matthew Feeney: I think this brings it back to one of the concerns that I think we should always have, especially here at Cato, which is: this is exciting technology, but think about the state and the regulation of this stuff. The last people you really want defining mental illness are the folks up on Capitol Hill or the people regulating this kind of stuff. And so…

31:21 Will Duffield: Oliver Wendell Holmes?

31:23 Matthew Feeney: Right, of course, yeah. Too many imbeciles around. I do think that if you have a state that’s powerful enough to actually dictate what kinds of conditions people are even allowed to have, that’s a worrying place to be. But I share the view that, generally, from a bird’s-eye view, this is really cool and exciting stuff that will certainly, if guided by the right policies, allow people to live freer, healthier, and more prosperous lives.

31:53 Aaron Ross Powell: I think this is an area, though, where the legibility concern comes in as well. Because one of the really exciting things about, say, CRISPR is how much potential it has to democratize this kind of technology; it’s something that almost individuals can do. And if those individuals can, at the same time, be illegible, and if the economic transactions for me paying Will, the CRISPR doc, to do these things to me can be hidden from the state, then the state has less of an ability to regulate and decide what we get to do with it. And all of this biotech ties back into that self-authorship; nothing speaks more to self-authorship than the ability to manipulate one’s own…

32:42 Will Duffield: What do you guys think about the potential for backlash, for regulation? ’Cause I can’t help but see a moral panic on the horizon, especially given how much certain people freak out about folks changing their gender, for instance. If you can make yourself into a catgirl, that would seem to be much more concerning to that crew.

33:05 Aaron Ross Powell: People freaked out about transsexuals and then they freaked out less. We always have moral panics, but the moral panics tend to go away and the underlying thing that caused them sticks around and seems to eventually become accepted.

33:18 Paul Matzko: Generally, we see this arc where, when a new innovation is proposed, whether it’s a cultural innovation or a medical innovation or a technological innovation, at first the only people who know about it are the people who are really doing it; it’s this niche community. And they’re excited about the potential, very bullish on the prospects. Then there comes a point where you get wide exposure. CRISPR is now not just this thing that people in a few labs at research universities know about. It’s on the big screen: they use CRISPR to create giant monkeys for Dwayne “The Rock” Johnson to be friends with and beat up giant reptiles, for those of you unfortunate enough to have seen one of the blockbusters this year. But people don’t really understand the technology at that point, and then, often, the expectations of the original community, those benefits, eventually get widely accepted as well. So there’s always that arc, and I think you’ll see the same thing with technologies like CRISPR.

34:23 Matthew Feeney: You will see something similar to the trans issue, I bet, because what you’ll hear from a lot of people are accusations that people who are trans are actually suffering from a mental illness, which brings me back to what I started with in this part of the conversation: you don’t want that kind of thing being regulated by Congress. [chuckle]

34:45 Will Duffield: Largely because these technologies can be so transformative, a small moral panic and a small regulatory backlash can have effects that echo for decades. We, back in the Bush years, hamstrung our stem cell research as a result of an evangelical moral panic about it. And we still don’t know how far that set us back, especially relative to other places with, at least in that respect, more liberal regimes.

35:14 Paul Matzko: A lot of the CRISPR research that’s being done right now, the practical research of actually implementing it, is actually in China. Well, this is a thing that’s true of every new technology: there is this window in which regulators have a lot of ability to delay implementation or innovation, potentially by decades.

35:40 Paul Matzko: So, that’s something we’ll talk about on the show. We’re gonna keep a focus on both: we’re gonna talk about the transformative potential of these technologies, while also sounding notes of caution along the way, like, here’s where this could go wrong. At the end of this episode, let’s spend a few minutes talking about something we’ve kind of assumed throughout the conversation: that, “Hey, it’s okay to disrupt the established situation. It’s okay to make yourself less legible to the state. It’s okay to just say: the government says you should use taxis, we’re gonna start using Uber or Lyft.” We’ve made this assumption. Maybe we should turn it from an assumption into something we’ve actually discussed. Why don’t we start with you, Aaron. What are the ethics of circumventing the state via private action, using tech and innovation?

36:34 Aaron Ross Powell: Well, that’s an awfully large question that we certainly don’t have enough time to fully explore here.

36:42 Paul Matzko: You have 30 seconds. [laughter]

36:42 Aaron Ross Powell: Sure. But by and large, from a libertarian perspective, most of the rules and regulations promulgated by the state, most of the commands it gives us, are not legitimate, in the sense that the state lacks the authority, the moral authority, to make those claims on us in the first place, whether because they’re just unconstitutional, and clearly a lot, if not most, of what the federal government does is unconstitutional, or because, from a more fundamental moral philosophy standpoint, governments don’t get to do these things in the first place, no matter what their pieces of paper happen to tell them. So, in that regard, there’s clearly no fundamental principle of justice that says we must have a taxi medallion system. In fact, under most reasonable theories of justice, I think, giving that kind of monopoly to a small group of politically connected people would not hold up to scrutiny. And so violating unjust laws is perfectly fine. In some cases, it’s probably obligatory.

38:01 Aaron Ross Powell: I think one of the things that really interests me about what we talk about on Building Tomorrow is this notion that human liberty really matters. And it doesn’t just matter because it’s nice to have, or because, in the long run, it makes us wealthier; human liberty matters in a very acute way to people right here and right now, and often to the people who are the most disadvantaged. But the state tends to be incredibly slow-moving, which is itself a harm. Even if the state gets it right, it usually takes a long time to get it right. And so, while it’s taking its time, people are hurting, people are dying, people are not living as good of lives as they might. And if these are the only lives they get, that’s a pretty enormous cost. And at other times, the state has all sorts of incentives not to expand the sphere of liberty for its citizens, because it disempowers the state. It makes it harder for the state to do what it wants to do, states like to have control, and so on.

39:06 Aaron Ross Powell: And so what’s really exciting to me about all of the technologies we’re talking about here, even with the hiccups that we’ve discussed or the possible setbacks that might exist, is the capacity for us, as autonomous individuals with moral standing of our own and rights of our own, to not wait for these men and women who wear the label of the state, legitimately or illegitimately, to decide it’s okay for us to live the kind of life we wanna live. These technologies, these people out there building these businesses or creating these protocols or making these scientific breakthroughs, are giving us the tools we need to simply say, “I want my life to be freer now, and I’m gonna make it freer now.”

39:50 Aaron Ross Powell: And the state can catch up. If it wants to, the state can try to stop me, but I don’t need to pay much attention to it or I don’t need to pay as much attention to it as I did in the past. I think the ethics of that are pretty clear. We can quibble about where we draw the lines, but at the extremes, if the people of North Korea had a way to suddenly make themselves radically freer, even though the Kim regime didn’t want them to, I think that very few of us, except for maybe our president, would say, “No, you gotta wait for the government to decide to let you be more free.”

40:33 Paul Matzko: Well, there’s certainly this constant tug and pull, or maybe the better metaphor is a bit of an arms race, where private actors find exciting ways to innovate around natural hurdles, around scarcity, around problems of geographic dislocation, and around the state, but because the state is often such a slow mover, it takes it time to catch up. And maybe catch up it will. It’s not like the Chinese government is the true innovator in blockchain tech, but they’re going to find ways to use it to control their citizens, and by the time they do, there will probably be, in 20 years, another innovation that allows citizens to escape that control. So there’s that constant back and forth, and I think we’ll see that surface in a lot of these technological areas as we go on with the show. But thank you to all of our listeners for tuning in to the first episode of Building Tomorrow, and thank you to Matthew, Aaron, and Will for joining us in this inaugural episode. Until next week, be well.

[music]

41:40 Paul Matzko: Building Tomorrow is produced by Tess Terrible. If you enjoy our show, please rate, review, and subscribe to us on iTunes, or wherever you get your podcasts. To learn about Building Tomorrow or to discover other great podcasts, visit us on the web at libertarianism.org.