E440

Privacy may seem simple, but it’s much harder to define than it appears.

Hosts
Trevor Burrus
Research Fellow, Constitutional Studies
Aaron Ross Powell
Director and Editor
Guests

Neil Richards is one of the world’s leading experts in privacy law, information law, and freedom of expression. Professor Richards holds the Koch Distinguished Professorship in Law at Washington University School of Law, where he co-directs the Cordell Institute for Policy in Medicine & Law. He is also an affiliate scholar with the Stanford Center for Internet and Society and the Yale Information Society Project, a Fellow at the Center for Democracy and Technology, and a consultant and expert in privacy cases. Professor Richards serves on the board of the Future of Privacy Forum and is a member of the American Law Institute.

SUMMARY:

Neil Richards thinks privacy is much more than being left alone. Whatever it is, how do we achieve it in our modern, surveillance-filled world, and, just as importantly, should we? Is the difference between our revealed and real preferences significant enough to justify companies watching us around the clock?

FURTHER READING:

Why Privacy Matters, by Neil Richards

Transcript

0:00:07.6 Trevor Burrus: Welcome to Free Thoughts. I’m Trevor Burrus. Joining me today is Neil Richards, who holds the Koch Distinguished Professorship at Washington University School of Law, not those Koch brothers, where he also co-directs the Cordell Institute for Policy in Medicine and Law. His new book is Why Privacy Matters. Welcome to Free Thoughts, Neil.

0:00:24.6 Neil Richards: Thank you for having me, Trevor.

0:00:27.5 Trevor Burrus: So since the dawn of the internet, and probably social media more specifically, privacy gets a lot more attention, but it seems that attention comes because people think privacy is mostly dead.

0:00:38.7 Neil Richards: They do. And as I explain in the book, I think that’s a dangerous fallacy, but it’s certainly a reasonable perspective for people to take. It was certainly what the Uber driver who took me through Silicon Valley from my hotel to give a book talk at Stanford a few years ago said, and she unwittingly subjected herself to a seven-minute tirade, only seven minutes because the Uber ride was short, thankfully for her, about why privacy continues to be important, why it’s about power, why human information confers power over human beings. And I realized it was a conversation I’d been having not just with Uber drivers, but with friends and family, students and colleagues, bartenders, waiters, the woman who cuts our hair, and I realized that it was an important enough misconception that it was worth writing down. And so the book is a representation of that seven-minute tirade, hopefully in calmer, more measured academic language.

0:01:41.1 Trevor Burrus: Now, we could make the mistake of talking about privacy for an hour here and think that we know we’re talking about the same thing, when actually people don’t really have a good agreed-upon definition of privacy, and so you give a working definition of privacy for the book.

0:01:53.6 Neil Richards: Yeah, right. As your question suggests, and as I’m sure people listening will be all too aware, scholars, judges, legislators, and lawyers have struggled for generations to come up with a definition of privacy that everybody can agree on, and I certainly don’t have the hubris to purport to answer that question for the ages in the book. But I think when we’re talking about something it’s important to define the terms that we’re talking about, so I offer a working definition in the book, which can serve for our conversation today as well: privacy is the extent to which human information is neither known nor used.

0:02:36.5 Trevor Burrus: So it’s definitely about information, and it has to be, but it’s not about secrecy, that’s another interesting point you make, privacy and secrecy are not the same thing.

0:02:46.1 Neil Richards: Right, right. I think there are a couple of things that are important in the definition. First, it’s about information rather than decisions or autonomy or other things we’ve wrapped up with privacy. It’s about humans; it directs us to information about humans. Many of these technologies can be used to monitor crops, they can be used to monitor climate change, they can be used to monitor pets. Pets may have privacy, but this book is not about that, it’s about information about humans, and it also directs us to the human values that I think should drive our privacy policy.

0:03:25.8 Neil Richards: It’s also about the use of information as well as the collection, and the fact that it’s the degree to which that information is used or known. Privacy is a matter of degree. Most of the time, for most people, for most of human history, most information has existed in the intermediate states between things that are known only to me, like what I dreamed about last night and haven’t told anybody yet, and facts that are known to everybody, like that John Lennon and Paul McCartney were in The Beatles.

0:03:54.2 Neil Richards: I think our lived experience as human beings is that information exists in those intermediate spaces most of the time, and I think our thinking about privacy and our legal rules should meet information where it is, rather than perhaps where it’s easiest to code.

0:04:09.6 Trevor Burrus: Now, early on in the book, and throughout the book, you bring up a story that is somewhat notorious, especially in privacy circles, about Target knowing when a woman might be pregnant. Give us the lowdown on that, and then I want to get into what might be wrong with it.

0:04:25.0 Neil Richards: Yeah, yeah, so I really agonized about whether to include this story in the book, because, you’re right, Trevor, it’s become infamous in privacy policy circles. But the basic version of the story is that about 10 years ago, the New York Times reported that Target was using big data analytics on the purchase history of its customers, and it could use that to determine when women became pregnant from signals like changes in purchases of unscented moisturizer and folic acid. Now, importantly, it wasn’t diapers and strollers and all that stuff, it was earlier, and there were a couple of incidents in which women were told, “Congratulations, you’re pregnant, here’s a coupon for formula,” and they freaked out.
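
To make the mechanics concrete, here is a minimal sketch of the kind of purchase-signal scoring the story describes. The products, weights, and threshold are invented for illustration; Target’s actual model was never made public.

```python
# Hypothetical sketch of purchase-signal scoring, in the spirit of the
# Target story. Product weights and the decision threshold are invented;
# the real model is not public.

PREGNANCY_SIGNALS = {
    "unscented_lotion": 2.0,       # switching away from scented products
    "folic_acid_supplement": 3.0,  # commonly purchased in early pregnancy
    "unscented_soap": 1.5,
    "cotton_balls_bulk": 1.0,
}

def pregnancy_score(purchase_history: list[str]) -> float:
    """Sum the weights of signal products found in a customer's history."""
    return sum(PREGNANCY_SIGNALS.get(item, 0.0) for item in purchase_history)

def likely_pregnant(purchase_history: list[str], threshold: float = 4.0) -> bool:
    """Flag customers whose combined signals cross the (invented) threshold."""
    return pregnancy_score(purchase_history) >= threshold

# Two individually unremarkable purchases combine into a confident flag.
history = ["unscented_lotion", "folic_acid_supplement", "milk"]
print(pregnancy_score(history))  # 5.0
print(likely_pregnant(history))  # True
```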

0:05:18.2 Neil Richards: And the reporting at the time was either big data, isn’t it powerful, or big data, isn’t it creepy. And I wanted to push past that, and I think what’s actually really interesting about the story, and why I think it’s still relevant to us in 2022, is that the reason Target was collecting the information was not to invade people’s privacy by finding out whether someone was pregnant, or to flex their big data muscles, maybe a little of the latter, but they were doing it in pursuit of a real goal. They wanted to find out when people became pregnant because they knew that that was a time when human behavior is particularly susceptible to change, that we’re creatures of habit, and…

0:06:04.2 Neil Richards: All the advertising in the world can have difficulty changing those habits, but there are certain known inflection points, starting college, moving to a new city, and critically for this example, having a child. And so they knew that if they could get somebody hooked, give them 50 bucks, or even just 5, on a coupon, you could hook them. By becoming Target customers for formula, they could become Target customers for strollers and furniture and power tools and groceries and toiletries more generally, and makeup and socks, everything they sell.

0:06:40.6 Neil Richards: And so it’s about the power of the union of data science and behavioral science, knowing when someone’s behavior can change, that’s the behavioral science, and using the data about them to manipulate or nudge or coerce their behavior, that’s the data science. And what’s interesting, ’cause there are so many interesting things about the story, the story has tremendous hidden depths, I think. But what struck me is, when women received the “Congratulations, you’re pregnant, here’s some formula” coupon ads, they freaked out, they didn’t like it, and the awareness that they were being tracked made them feel less favorable towards Target, not more.

0:07:26.5 Neil Richards: So what did Target do? It hid the coupon for formula in with wine glasses and lawnmower blades and power tools. In other words, the things they could think of that are the furthest away from having a baby, but the formula coupon was right there where they were most likely to see it. And I thought that that was a form of trickery and a form of coercion that illustrated the main theme of the book, which builds on Francis Bacon’s aphorism that information is power.

0:07:58.0 Neil Richards: Information is power, and human information, information about you and me, confers power over humans, and that, in a nutshell, is why privacy matters. If our listeners take only one thing away from this, it’s that because information is power, struggles over privacy are in reality struggles over power: political power, economic power, social power, and personal power.

0:08:21.4 Trevor Burrus: It’s interesting, ’cause a couple of times in your remarks just now you said nudge or coerce, and you said coerce again, and that’s a very important word for me as a libertarian. Nudging is one thing, and it’s hard to define good and bad nudges. I’ve read Thaler and Sunstein; I know the libertarian paternalism idea, which you bring up in the book. But when Netflix says, hey, do you want to watch this show? We know from big data that you might like this show, and then lo and behold, I do like that show, and I’m like, thank you, Netflix, thank you for telling me something that I want. It’s not coercion, it’s a nudge.

0:08:55.8 Trevor Burrus: And why is that different than what they did with pregnant women? It doesn’t seem to be categorically different. If they bought this and then enjoyed the products from Target, then the total amount of happiness has gone up, by definition, ’cause they bought it and they weren’t forced to. They were not coerced in an important way, they were nudged, and they could have not done it.

0:09:14.4 Neil Richards: Yeah, okay, so there’s a lot in there, but let me take a couple of pieces and find some common ground here. I agree, I like Netflix too. I like Netflix’s preference engine, and I think there is undoubtedly a continuum on which we can arrange nudging and persuasion and coercion and manipulation. I think we can agree that the continuum is there; we might disagree about where the line falls on it, depending on how we weight individual freedom and fears of manipulation.

0:09:54.2 Neil Richards: So I think there are a couple of differences when it comes to Netflix. One is they’re regulated by a federal law called the Video Privacy Protection Act, which limits the ways they can use that information. They’re also different because they are not an a la carte model, the way Target is, and they’re not an engagement model, the way Facebook is, or, for that matter, the way free, commercial-driven television is. The reason Friends was on the air for so many years is that they could sell so many eyeballs during the commercial breaks. Netflix is a sort of all-you-can-eat model, and I think that calls for a different set of considerations.

0:10:43.6 Neil Richards: On the Target example, to go back to that, I don’t think it’s just a nudge, because I believe that Target deliberately targeted its deployment of behavioral science and data science to undermine the free will of the people making the choices, so that they were making choices at the margins, and I think significantly at the edges of the margins, not the choices they wanted to make but the ones Target wanted them to make. Target was deploying choice architecture in a way that undermined free will and free choice.

0:11:29.5 Neil Richards: The key insight, I think, from Thaler and Sunstein is this idea of libertarian paternalism, right, and as I talk about in the book, when Thaler signed copies of the book at book signings, he would always inscribe it with a message: nudge for good. And I think the lesson we’ve taken from Sunstein and Thaler, sort of like the Target example, is that nudges are powerful. To a lesser extent, nudges are creepy, but mostly nudges are powerful. And I think that’s right, but if we accept that premise, or that conclusion, it’s important for us to focus on what choice architecture looks like, because it reveals the vulnerability of human choice to those who are in a position to set up what Sunstein and Thaler called choice architecture: the interfaces, the design, the incentives, all of that.

0:12:28.7 Trevor Burrus: I think it’s a good point, and I don’t disagree about there being a continuum, and there’s obviously some point at which I would be creeped out. I like how you said creepy, because that’s definitely one of the things, and you have a great [0:12:38.2] ____ in the book…

0:12:38.9 Neil Richards: I want to talk about creepiness in a second, yeah.

0:12:41.5 Trevor Burrus: Yeah, but at the same time… You bring up The Hidden Persuaders briefly, and this kind of anti-advertising motif that has been pretty common, especially in more left-wing, more anti-corporate circles, the idea that there’s some sort of power in subliminal messaging. The Hidden Persuaders brought up this subliminal messaging idea, and it all ended up being kind of BS, ultimately, right? The ad industry loved that book, because it basically told all these people that advertisers have immense power, and it made them a huge amount of money, even though they actually don’t have an immense amount of power.

0:13:18.9 Trevor Burrus: And it’s very different when they target ads to me, and I think this is true of things that are not Netflix. Amazon says, we’ve seen you bought this before, and therefore we think that you might want this. It’s very different, and I also know that it’s not some person at Amazon doing this. Insofar as that information about me is known, it’s known by an algorithm in a computer, it’s not super accessible, it’s just an automatic process that more often than not tells me things that I want to buy.

0:13:49.2 Trevor Burrus: So the other side of this on libertarian paternalism is that it comes very close to saying that what the person actually ends up doing, the revealed preference, is not their real preference. That’s a difficult claim to make, that the people who ended up buying various things related to maternity and pregnancy at Target had real preferences that were otherwise than what they ended up doing.

0:14:11.9 Neil Richards: I just came back from an eight-day work trip on which I had an awful lot of french fries, and I really, really like french fries, particularly, as an Englishman, the ones that are made from real potatoes. But I don’t think my real preference is french fries at every meal, it’s french fries in moderation, and the ways that menus are set up have an effect on the choices that I can make. Things like, do they disclose the number of calories, do they disclose whether they’re actually made from fresh potatoes versus frozen ones from a bag dropped in the deep fryer? Are there other options in addition to french fries that will be at least moderately as delicious, but a lot better for my cholesterol levels and waistline?

0:14:57.6 Neil Richards: It’s interesting to think about advertising. You implied before that there’s a meaningful difference between an algorithm knowing something and a human being knowing something. This is the point I think Richard Posner made when he weighed in on the debates after 9/11, or maybe after Snowden. His point was that it’s not as bad if a non-human sees you naked; it’s particularly bad when another human being does. And we could argue about that, I think there might be something to it, but the point I want to make in the book is I don’t really care about the human/non-human distinction. I’m less interested in the emotions of the observer and much more interested in the power that the information allows people to exert.

0:15:54.4 Trevor Burrus: And you bring that up, in particular…

0:15:58.6 Neil Richards: Cambridge Analytica is a good example here, where…

0:15:58.6 Trevor Burrus: I was just going to bring it up, because you complained a bit about the Target one, but you complained more about Cambridge Analytica.

0:16:01.8 Neil Richards: Absolutely. We experience commercial manipulation more than, thankfully, I think, we experience political manipulation. But the point is, just as computers don’t care if you’re naked, they don’t care about the uses to which these techniques are being put, and the exact same techniques being developed for commercial persuasion are being deployed for political misinformation, fake news, Cambridge Analytica-style, real-name, personalized political ads targeting known psychological vulnerabilities.

0:16:44.0 Neil Richards: I think back to a time in my life when, like most people of my generation going to graduate school in the ’90s, I was wowed by the internet and flirted with this notion of the end of history and techno-libertarianism. You see this in the overlap between the civil liberties community and the libertarian community, in organizations like EFF, to this day, for example. If we’d had a conversation in 1995 or 1998 about what the internet was going to do over the next 25 years, we were being promised, sincerely, a realm of human empowerment, of individual choice, of meeting like-minded people, this sort of libertarian paradise, and it was undeniably attractive.

0:17:38.8 Neil Richards: But what we got instead was, as Lina Khan put it last week in her speech at the IAPP, quoting my book, the greatest realm of human surveillance ever known to humankind. And in 1998, always-on, complete surveillance of everything we did was not just technically unfeasible, because of the limitations of the TCP/IP protocol to enable that kind of surveillance, we could talk about whether those are bugs or features, but it was politically unfeasible too, on the left, on the right, in the middle, at all points.

0:18:16.5 Neil Richards: The idea that the government should be able to read everything you write, or listen to everything you say, or know everybody you talk to, was politically unfeasible. And I think the Snowden revelations struck a chord, interestingly, across non-traditional ideological lines, where you have the Wydens and the Ron Pauls of the world standing shoulder-to-shoulder, when most of the time they don’t agree on anything. But this is one of those issues, I think, because we all care about power, we all care about our place in society, and we particularly care about our place in society and our autonomy when it comes to political power.

0:19:00.6 Trevor Burrus: But it’s another question of degree, because everyone seemed to very much like Obama’s highly strategic digital plan that he rolled out in 2008 and 2012 to identify, say, someone who had been a voter in the past and maybe wasn’t going to vote because they weren’t aware of where their polling place was, or various things like this, so we nudge them, and we go, okay, maybe this is using a nudge for good. I find the Cambridge Analytica thing fascinating because for many of my friends on the left, one of their biggest complaints about politics is voters’ lack of political engagement, both in the sense of voter ignorance and also just not getting involved, either in grassroots things or in voting. And so maybe one of the solutions to this is to let organizations like Cambridge Analytica, or some variation of that, tell us things that will nudge us towards more political activity. And that is not a bad thing, that’s a good thing.

0:20:00.1 Neil Richards: Yeah, this is super interesting. I talk about, as you know, the Obama example in the book. I was at a conference sponsored by the Future of Privacy Forum, which I don’t mention in the book, they’re a great organization, and what I’m about to say is not their fault. But Obama’s chief data scientist was there and talked about the way they had used publicly available voter rolls to get out the vote, and the crowd that day was sort of left-leaning and pro-data-use leaning, and so most of the people applauded it in raptures.

0:20:36.9 Neil Richards: And I can remember talking to my colleague, the co-author I was there with, and he said to me, all this means is our elections just come down to who has the best data scientists. Because of course, while that sounds great in principle, neither the Obama campaign, nor the Trump campaign eight years later, nor Cambridge Analytica, was doing it even-handedly. They weren’t nudging get-out-the-vote equally; they were targeting which districts they wanted to get their people out in, and which districts they wanted to suppress.

0:21:17.8 Neil Richards: And as we’ve seen, our elections can come down to dozens of votes, in 2000, or hundreds of votes in several jurisdictions simultaneously, in 2020. Maybe I’m naive, but I’d like to think I can share my naivety with an organization that is committed to ideas: I don’t think our elections should come down to a battle of the best data scientists. They should come down to a battle of ideas, a battle of our competing visions, not just of the good; I think particularly in this conversation, we can maybe agree on a relatively shared notion of the good, but disagree on the right strategy to get there.

0:21:56.6 Neil Richards: That’s what I think democracy at its best is about. Yes, there are pollsters and there are professional campaign officials, but if we also bring in the quants and our elections come down to who has the best data scientists, I’m not sure we have something we can really call a democracy, at least in the sense that we’ve understood it in the past, and in the ways we tell ourselves it works, and should work, to have the kind of legitimacy we want it to have.

0:22:32.3 Trevor Burrus: Oh, I definitely don’t disagree. I do find it much more difficult to figure out how to get voters engaged. And one of the ways we could do it is with something that some people would call manipulation and other people would call influence. And that’s a very interesting line that I deal with, ’cause I do campaign finance policy at Cato, for example, and I get people telling me, well, they’re manipulating, or they’re influencing, and it kind of depends on which side you’re coming from in terms of this big data…

[overlapping conversation]

0:23:04.6 Neil Richards: Right. If it’s our guy it’s nudging, if it’s the other guy it’s manipulation.

0:23:07.4 Trevor Burrus: Exactly.

0:23:08.2 Neil Richards: It’s interesting. Obviously, I would imagine that a Cato audience would not find a Scandinavian mandatory voting option to be palatable…

0:23:14.7 Trevor Burrus: Definitely not.

0:23:15.5 Neil Richards: But I think you probably could do it with tax incentives, right. And I think that if you’re doing it at a society-wide level, and you’re doing it with at least some transparency around its commitment to the general good, then we may be on to something. But if it’s purely left to private ordering, and remember, of course, these techniques are just as good at suppression as they are at getting out the vote, then I think we’ll essentially have 1915 trench warfare on voter suppression, and we’ll just spend more money on campaigns, and it’ll still be the political equivalent of a bloody stalemate.

0:24:01.9 Trevor Burrus: Yeah, no, I think that’s a good point. You just said a word, transparency, which is an interesting word, because that’s sometimes our solution to privacy concerns. So aside from big data creeping on us in good or bad ways, you do a very good job in the book of talking about the right conception of privacy and, as the book is called, why it actually matters, but one of the things is this idea of transparency. If we’re just told, via a little prompt where we can click yes or no, whether an app wants to share our information, like on iOS, if we’re just told what data is being collected, isn’t that the ultimate goal? Because some people want more data collected and some people want less, so we should just know, and transparency is the key.

0:24:46.8 Neil Richards: Yeah, that’s the argument. Notice that I said transparency and not notice, which is the word we normally use in DC policy circles to talk about the regime of notice and choice that de facto governs privacy policy in the United States. And under notice and choice, very briefly, the basic rule is that as long as companies give you notice of their privacy practices and choice about whether to accept them, that’s sufficient, even when in practice notice can mean dense, vague language, and actually, I would say dense and vague at the same time is quite a trick to pull off, but the attorneys…

0:25:27.4 Trevor Burrus: Oh, yeah, but we’re lawyers, we can do it easily.

0:25:30.9 Neil Richards: The attorneys at DC law firms, and in full disclosure, I used to be one of them, writing these privacy policies, really pull that off with aplomb. But there, notice is nothing more than dense, vague language you can’t understand, hard to find, buried in the bowels of the website, and choice is, as Representative Sensenbrenner said when the FCC’s late Obama-era broadband privacy rules were rolled back, ain’t nobody got to use the internet. And I think that’s a problem.

0:26:02.7 Neil Richards: So I think it is important to have transparency, and I think it is important to have choice as consumers among commercial options, but what has happened in practice is that the actual level of choice consumers have is minimal, and we don’t have the right choices, the ones we’d actually want. I don’t want surveillance-based ads, and I’ve searched for 20 years to find that option on the internet, and it’s never been offered to me. But I can go into these menus and sub-menus, and in principle, choice is empowering.

0:26:50.3 Neil Richards: And in many areas of human life, let me be clear, particularly because I’m talking with and to a libertarian audience, I am not an enemy of choice, or of human flourishing through individual and private choices. But it doesn’t scale beyond the important things: who do I marry, what job do I take, what brand of car do I buy, what do I have for dinner tonight, what do I name my children, do I have sex with this person? Those are choices. But we’ve taken that language, that tremendously important concept, which is central to a wide variety of notions of human freedom and flourishing, and we’ve stripped it of its context and dropped it into one of dozens, if not hundreds, of platforms and services and websites and accounts that we use.

0:27:50.6 Neil Richards: We can’t even remember our passwords for all of the sites we use. How can we possibly be expected to make dozens of individual privacy choices for dozens or hundreds of platforms and accounts and services and individual websites, with things like the e-privacy directive? It just doesn’t scale. So what do we do? We just click I Agree, because we are overwhelmed by the choice. And because the companies setting up the choice architecture know we’re going to click I Agree and give up in resignation, they can set the defaults in ways that maximize their preferences and not ours.
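
As a toy illustration of that last point, the sketch below shows how whoever sets the default controls the aggregate outcome when most users click through in resignation. The engagement and preference rates are invented parameters, not empirical figures.

```python
# Toy model of consent-dialog defaults. Rates are invented assumptions:
# most users keep whatever is pre-selected; the few who engage mostly
# prefer not to be tracked.

import random

def share_tracked(default_opt_in: bool,
                  n_users: int = 100_000,
                  engage_rate: float = 0.05,
                  prefer_tracking: float = 0.10) -> float:
    """Fraction of users who end up tracked under a given default."""
    tracked = 0
    for _ in range(n_users):
        if random.random() < engage_rate:
            # Engaged users choose according to their actual preference.
            tracked += random.random() < prefer_tracking
        else:
            # Resigned users just keep the default.
            tracked += default_opt_in
    return tracked / n_users

random.seed(0)
print(share_tracked(default_opt_in=True))   # ~0.955: nearly everyone tracked
print(share_tracked(default_opt_in=False))  # ~0.005: almost no one tracked
```

Identical users, identical preferences; only the default changes, and the outcome flips, which is the sense in which the choice architect, not the chooser, holds the power here.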

0:28:31.9 Neil Richards: And here’s the real magic of this state of affairs from a manipulation standpoint. I click I Agree, even though I do want to protect my privacy, I care very deeply about it, I’ve written books and articles about it, because ultimately I just want to read the article, I just want to order the bagels. These are weirdly specific but real examples. And afterwards, though, I think to myself, even though I should know better, well, they did give me a chance to protect my privacy and I didn’t take it, so maybe it’s my fault. And that, as I argue in the book, completes the trap of the nudge.

0:29:16.9 Neil Richards: And let me say one final thing. I testified at the FTC a couple of years ago, and one of my co-panelists was a lovely, intelligent, you can see a but coming, highly skilled, well-meaning lawyer who worked for one of the major cable companies, and she said, here at company X, we want to develop a deeper relationship with our customer base, and so that’s why we have all these dashboards and choices you can make for your cable privacy. Then and now, I would say no, I don’t want a deeper relationship with my cable company. I want my cable service to work, and I want to be able to make choices in the marketplace knowing that I’m not going to be manipulated or exposed.

0:29:58.7 Neil Richards: And so I think in DC privacy policy circles, we have reified notice and choice and privacy harm as the game in itself, when really, consumers, in my experience, just want to make choices about what to buy and what to do. We don’t want to engage in sort of a privacy lube job every time we want to order bagels. I had a particularly bad encounter with one of the bagel…

0:30:32.0 Trevor Burrus: Sounds like bagels are…

[overlapping conversation]

0:30:34.4 Neil Richards: Yeah, it was a local bagel place here in St. Louis, but it’s a national chain, and they had said, why don’t you sign up for our discount plan, and you can get free sauce, or I guess, sorry, not sauce, free cream cheese. And I said, I just want to buy my bagels and go home, ’cause I’ve got children who are hungry. And no, you should do it, and the guy behind me was like, yeah, you should do it. I’m like, okay, fine, I’ll sign up. It took 10 minutes. Now, at this point, the guy behind me has changed his mind and he’s saying, no, I just want to buy my bagels, and so I was under a lot of pressure to just sort of click through these various options.

0:31:09.0 Neil Richards: And this is something I talk about towards the end of the book. We often make our privacy policy and commercial policy and consumer protection decisions with an idealized consumer in mind, the sort of homo economicus of unreconstructed law and economics or classical economic thought. I’ve yet to meet that person. The typical consumer, as I argue in the book, is distracted, harried, confused, and limited in financial and technical and legal support. We don’t all carry Cravath, Swaine & Moore in our back pocket when we encounter a privacy policy.

0:32:00.8 Neil Richards: Sometimes the typical consumer is even a bit drunk, and I think we should make our rules and we should set our policies for ordinary consumers as we find them, so that those ordinary consumers can make choices in the commercial marketplace, can make choices… I hope they’re not drunk when they’re voting, but can make choices when they’re electing representatives in Washington or in the state capitols, free from the fear that they could be betrayed or manipulated. When you take that fear, when you take those risks off the table, then I think you do start to not just have more meaningful choice, but I think closer alignment and a closer vision of the common good between, say, progressives and libertarians.

0:32:53.4 Trevor Burrus: But in some important sense, and you push back on this I think very persuasively in the book, privacy is itself a type of consumer good, in the sense that different people tolerate different levels of risk in how much they want to keep something private. Sitting here looking at my computer, I have a little webcam blocker that I put on; some people want that, some people don’t. And it doesn’t seem terribly different than the fact that some people will go skydiving and some people won’t, and I have a roommate who gets everything delivered to his house under pseudonyms because he’s extremely privacy-conscious, and he takes advantage of that. So one point of transparency and choice should be to let people craft their privacy to their desires and how much they care about keeping certain information about them hidden, and so we should be pushing towards, not a one-size-fits-all, but some ability of people to make those choices.

0:33:52.2 Neil Richards: Yeah, I think that’s right. But what we don’t have is a set of baseline rules that takes manipulation, that takes voter suppression, that takes consequences people do not or cannot understand off the table, so they can make those choices freely.

0:34:15.7 Trevor Burrus: That’s a good point.

0:34:18.3 Neil Richards: A good example would be meat. We can’t go to the butcher’s shop, my uncle could, because he was a butcher, but we can’t go to the butcher’s shop or the butcher’s counter at the grocery store and assess whether there are worms in the meat, or whether it’s safe, or whether it’s been handled properly, or whether there’s listeria or dangerous bacteria on it. We have a set of rules, and we can disagree about how expansive or searching those rules should be, but I think post-Upton Sinclair, we all agree: no worms in the sausage, please.

0:34:53.0 Neil Richards: And I think that’s really important. We lack the capacity to assess our own meat, and so we have rules that take the really sort of Russian roulette aspects of meat purchasing off the table. Now, we can then go and buy steak or processed hot dogs with lots of nitrates, we can make “bad choices” in the supermarket, but the really dangerous ones are taken off the table. And I think we should do something similar when it comes to information products. The problem is, because we don’t have a comprehensive privacy law, we’re the only advanced economy in the world that doesn’t, and because we don’t have any sufficient constraint on really bad choices, the…

0:35:43.0 Trevor Burrus: A backdrop, essentially a backdrop that [0:35:46.4] ____ below which you can’t go, yeah.

0:35:46.4 Neil Richards: So I’m sure your roommate… I’ve not met your roommate, but I would imagine your roommate, in his privacy-selecting choices, is still happy that he can’t buy heroin from Amazon. There are certain kinds of choices.

0:36:02.2 Trevor Burrus: Now you’re getting into… Now you’re getting into one of my other issue areas. I do think Amazon should be selling heroin. But that’s the libertarian view.

0:36:07.0 Neil Richards: I think we can stipulate that that is a minority position among Americans.

0:36:11.6 Trevor Burrus: Probably, yes. I do want to ask you, ’cause you do a very good job on this: my biggest personal pet peeve in the privacy conversation is “if you don’t have anything to hide, you shouldn’t worry about privacy.” Everyone has encountered it at some point, and a lot of people sometimes use it, I think unwittingly, when they think something is happening, like a huge threat of terrorism, and suddenly privacy doesn’t seem to matter so much anymore. What’s wrong with this idea that you only need privacy if you have something to hide?

0:36:46.5 Neil Richards: Absolutely right. It’s advanced by organizations like the NSA and by their defenders; one prominent defender of the NSA in particular is fond of articulating this argument. But you’re right, it does function as a kind of coping mechanism for people who are mainstream or majoritarian on particular avenues of difference. Well, they may have my data, but I’ve got nothing to hide, right, I’m okay. So there are a number of problems with the nothing to hide argument.

0:37:20.5 Neil Richards: But very briefly, and putting aside the fact that it was coined by a literal Nazi, a literal German Nazi working for Hitler, putting that aside, let’s take ideas seriously regardless of their provenance.

0:37:32.7 Trevor Burrus: Important point, nevertheless.

0:37:37.3 Neil Richards: There are a few absolute moral lodestars, and I think that’s a pretty good one. So first of all, nothing to hide is wrong in its own terms. We all have something to hide: we all wear clothes, we all lock our homes, we all have doors on our bedrooms and on our bathrooms. There are certain activities, even Judge Posner in the [0:38:00.1] ____ case is willing to countenance this mysterious fact about humans, that our sexual and [0:38:05.0] ____ activities are considered to be private. In addition, though, all of us have facts about ourselves that we don’t want shared, disclosed, or broadcast indiscriminately.

0:38:15.9 Neil Richards: We confide medical and legal information to doctors and lawyers, and we talk to our loved ones, our spouses, our partners, our friends, our confidants, about a whole range of ideas, whether we like Bridgerton, or whether we’re thinking about being libertarians, or we are libertarians…

0:38:41.1 Trevor Burrus: Don’t worry, Neil, I won’t tell anyone, you can say you’re a libertarian.

0:38:42.5 Neil Richards: Or we are libertarians, but we think that Atlas Shrugged is a ponderous book.

0:38:44.0 Trevor Burrus: That’s [0:38:47.4] ____. We can’t stand it.

0:38:51.0 Neil Richards: In other words, there are occasions when all of us have a need, one that furthers both our individual values and our participation in society, to not have information about us broadcast indiscriminately. So nothing to hide is just wrong in its own terms: we all have something to hide. Second, nothing to hide misunderstands why privacy matters. As we talked about, privacy is about power, it’s not about hiding secrets. Secrets are part of privacy, but if we think about privacy from a social, economic, political, and personal power perspective, the nothing to hide argument becomes beside the point.

0:39:29.8 Neil Richards: Why are advertisers collecting so much information about us? Not because they want to engage in totalitarian domination. We can put Clearview AI to one side for a moment here, but the ordinary NAI, IAB, ad tech Chamber of Commerce types want to persuade us, they want to control us, let’s say, to get us to purchase the products their advertisers pay them money for, and not the products of their non-advertisers.

0:40:05.1 Neil Richards: And then third, I think the nothing to hide argument focuses narrowly on privacy as an individual matter rather than as a social value. I think we all benefit from living in a society, no matter how boring, bland or mainstream we might think our views are, and if we’re that boring we should get out more. But I think we all benefit from living in a society in which other people are able to engage in private activity, expression, politics, in a number of ways. One, across a democracy, we all benefit from people generating new ideas, like radical ideas that people have literally died for, like the equality of all people regardless of race or sex or sexual expression.

0:40:50.0 Neil Richards: We benefited from the intellectual privacy that Dr. King and the leaders of the Civil Rights Movement had, despite the attempts by the FBI to disrupt them through widespread surveillance and, in a number of cases, blackmail. There are things we are doing in our society that are wrong, that history will judge us for, harshly. The problem is we just don’t know what they are yet. And so I think privacy around political views enables us to figure those sorts of things out.

0:41:25.1 Neil Richards: And finally, in a democracy, in a society in which we acknowledge some sense of shared purpose and some sense of common good, if only because we don’t all kill each other, I think we benefit from not knowing everything that our neighbors, our friends are thinking. I think privacy enables civility. We don’t have to know what each other’s sexual fantasies or religious views or political views or other things are, in order to engage with them most of the time in a civil, constructive and socially and individually beneficial way. Privacy enables that.

0:42:11.0 Trevor Burrus: It also seems to enable… And you touch on it in the book that it enables different types of relationships, as defined by privacy. That’s almost what the word intimacy means, that you’ve decided to let someone know things about you that not everyone knows, and if everyone knew it, then you wouldn’t be able to make that decision to let someone into your inner circle and change that relationship with that person, so it is constitutive of our relationships with people.

0:42:41.4 Neil Richards: Absolutely, absolutely. One final point on the civility point. I think much of our political polarization has come from technological models based upon engagement, jamming things that outrage us into our feeds, sharing facts about people that make us think less of them and make us angry and make us want to show that libertarian/conservative/progressive blow-hard exactly what we think about them and prove to them on Facebook or Twitter exactly why they’re wrong. And I think some of that anger comes from forcing information about our views out of its intimate context and into collision, not for political engagement, but for economic engagement with ads that make the Facebooks and Twitters of the world more money.

0:43:42.7 Neil Richards: By contrast, you mentioned intimacy a moment ago, intimacy is an example where we have privacy and we have choice, and we bring people in as our confidants, as our friends, as our intimates into what we really think. But notice what happens too, where there is intimacy, or where there are lawyers and doctors, and where there is trust, we actually share more information. The better you get to know someone, the more you learn about them, and the more they’re willing, maybe even socially encouraged by the process of grooming that intimacy is, to say what they really think about politics or about your mutual colleague.

0:44:29.0 Neil Richards: And that’s one of the real wonderful ironies of privacy, of good privacy protections, of good privacy rules: they actually enable more information-sharing and better information-sharing in the context of relationships. And I think recognizing that fact, recognizing the importance of relationships to privacy and the need to safeguard against the power that information, and particularly information shared in confidence, confers, is something that’s missing from the ways in which we talk about privacy in this country, but I think also in Europe, in discussions around the GDPR.

0:45:16.7 Neil Richards: Those intimates that you mentioned a little while ago, they can destroy us, right? We’ve shared so much information with them, and we have to trust them, because otherwise they can disclose and they can destroy. That’s another indication of the power that human information confers, and why it’s important to think about it in constructive and thoughtful and nuanced ways, beyond nothing to hide.

0:45:51.4 Trevor Burrus: So we’ve touched on it in different ways, but in the prescriptive part of the book, the idea is that there should be communal, shared values that guide how we address these issues. You have some more specific legislative points, that we need a better baseline, as you said, but ultimately it’s about what sort of values we should be thinking about when crafting privacy policies.

0:46:20.6 Neil Richards: Exactly, right. So I probably came to privacy thinking it was intrinsically valuable, and I tend to attract a variety of students in my privacy courses at Washington University, from across the political spectrum, who believe in the intrinsic value of privacy, but not everybody agrees on that. I think once we recognize that privacy is about power, some kind of privacy rules are inevitable; even a laissez faire regime, let the information flow, do what you want, is a set of rules for our society, and deciding to do that is a choice.

0:46:58.9 Neil Richards: We should ask what values should guide our decisions about privacy rules, and I think we should think about privacy instrumentally, in terms of the human values that good privacy rules serve. In the book, I talk about three of them: development of our identities, protection of us in our political activity, and protection of us as consumers, essentially capturing the three basic roles we fill in society, as humans, as citizens, and as consumers. But I’m certainly happy to consider, today or in the future, other values that good privacy rules might serve.

0:47:46.0 Neil Richards: Equality is another one that I think we’re only just beginning to appreciate, but there could be others entirely. But basically, as long as our conversation about privacy rules is less about the intricacies of idiosyncratic definitions of privacy as intrinsically valuable, and much more about the human values that should animate our privacy policy, to guide and shape the exercise of social power in ways that benefit human flourishing, or the good, or however we define that, then I think not only will the book have been successful, but, more importantly, our thinking and the way we talk about privacy at this critical point in human history are going to be a lot better off.

0:48:50.0 Aaron Powell: Thanks for listening. If you enjoy Free Thoughts, make sure to rate and review us in Apple Podcasts or in your favorite podcast app. Free Thoughts is produced by Landry Ayres. If you’d like to learn more about libertarianism, visit us on the web at libertarianism.org.