Fighting for privacy
Transcript:
BOB SAFIAN: Hey everyone, it’s Bob here. Today we’ve got another special episode for you, recorded live at the invite-only Masters of Scale Summit in San Francisco. In fact, it was the finale session. My guest is Meredith Whittaker, president of Signal, the fast-growing encrypted messaging app. Our topic is privacy, trust, and digital surveillance, from governments and from companies, and Meredith does not disappoint. She comes on stage after former President Bill Clinton’s conversation with Reid Hoffman, a chat you’ll hear in the Masters of Scale podcast feed in the coming days. But Meredith is anything but intimidated. She talks about how she navigates a platform that helps millions of oppressed people communicate but is also used by bad actors. She explains why she helped organize protests at Google, the philosophy that inspired Signal’s founder, Moxie Marlinspike, and more. The audience loved it, so let’s roll into it. I’m Bob Safian, and this is Rapid Response.
[THEME MUSIC]
SAFIAN: All right. We get to close it out.
MEREDITH WHITTAKER: Yeah, we do. Hi, Bob.
SAFIAN: Yeah, we do. You guys doing okay out there?
The philosophy behind Signal
Love that energy. So, Meredith is the president of the Signal Foundation, which, among other things, runs the Signal messaging app, an encrypted app if you haven’t used it. It is growing rapidly, with as many as a hundred million active users a month.
She was also a co-founder of the AI Now Institute, founded Google’s Open Research group, served as an advisor to Lina Khan at the FTC, which maybe we’ll get to, and helped organize a walkout by thousands of Google employees to protest the handling of sexual harassment claims, among other acts of dissent while she was there.
Many topics to touch on. We’ll see what we get to. One of the biggest questions facing our global society, and a key factor for tech platforms, is the interplay between trust, privacy, and surveillance: surveillance by governments and surveillance by companies tracking our digital activity.
Can you explain Signal’s purpose and philosophy around that?
WHITTAKER: Bob, great to be here. Well, look. For hundreds of thousands of years of human history, the norm for communicating with each other, with the people we loved, with the people we dealt with, with our world, was privacy. We walk down the street, we’re having a conversation. We don’t assume that’s going into some database owned by a company in Mountain View.
We assume that that is ephemeral, and that if I change my mind, maybe you don’t have a record of that that can be brought up in 12 years, right? 14 years. Now, the internet has obviously put network computation at the center of human communications. And it is now the norm that almost everything we do, where we are, who we talk to, what we buy, is not private.
I think when I think about Signal, Signal is there to preserve that norm of private, intimate communications against a trend that really has crept up in the last 20, 30 years without, I believe, clear social consent. A handful of private companies somehow have access to more intimate data and dossiers about all of us than has ever existed in human history.
So I think of us as maintaining a status quo that we really should have checked in on over the past few decades, not as being a heterodox actor who’s bucking a normative trend.
Balancing core and mainstream users
SAFIAN: Signal started as kind of an outsider platform, really. Like a hacker sort of mentality, right? Now, it started to attract more mainstream users. And I’m curious how you balance sort of that core community with the broader group you might want to reach?
WHITTAKER: So, I mean, I really have to hand it to Moxie and the original Signal team, because I think the hacker ethos is really this ethos of ‘let’s sit around a kitchen table, let’s have an idea, and let’s not just talk about it, let’s try it,’ right?
‘Let’s begin. Let’s see what we can do.’ So there’s an initiative, there’s creativity, there’s a drive, and there’s a willingness to kind of suspend disbelief and try it. But that shouldn’t be conflated with shabbiness or a lack of vision or a lack of broad scope.
Because what Signal was doing back in the day was really building novel technology. The Signal protocol solved huge questions in cryptographic research that made private communications on mobile devices possible in a way they hadn’t been before. Moxie and the team insisted that Signal be easy to use, pleasant, and not look like a command line or some complicated PGP rigmarole, but that it actually put first the needs of the people who picked up Signal, not because they wanted to be private, but because they wanted to text their dad and say the laundry was done, right?
That’s the impetus. So I think there’s a care for actual human beings and what they use Signal for at the heart of it. That was sort of wrapped in virtuosic programming and cryptographic novelty with the bow of that hacker ethos where it’s like, ‘yeah, you said it’s impossible, but we’re gonna try it anyway, because it’s fun, because maybe we can figure out a possibility in the cracks of that impossibility,’ and here we are more than 10 years later, and Signal is a major platform, the biggest and most widely used private messenger in the world, with that hacker community still being vigilant, still checking our open-source code, still testing the implementation of our cryptographic algorithm to make sure you don’t have to trust me, you can actually verify it. It’s right there.
Freedom versus surveillance in digital communication
SAFIAN: One of the things that I struggle with, we all want to avoid surveillance, right? We want to have the freedom to do and be and act the way we want. On the other hand, that same freedom can sometimes support more nefarious or dangerous activities.
Signal is a great resource for people who are persecuted in many parts of the world, but it was also used by the organizers of January 6th, right? So I guess I just think, how do you think about that trade-off between freedom and bad actors and how the technology might be used?
WHITTAKER: Well, let’s break that down, right? The roads were also used by all of those actors, right? The goods and the bads. The grocery stores.
And there’s an analogy I like: Okay, I’m law enforcement. There’s a crime that was committed. I go into the house of this purported criminal, and I find a pen, like a Bic pen they used, and I’m like, oh, this is the tool with which they wrote down the scheming plans for the crime.
I go to Bic Incorporated, knock on the door and say, “Excuse me, this pen was used to communicate about a crime. I need you to reverse-engineer this pen to tell me everything that was ever written with it.” And the CEO would be like, “Are you out of your mind? That’s not how pens work, right? Go try many of the other surveillance tools in your toolbox, the massive budget that you just got from Eric Adams, whatever it is, do that.”
But like, obviously, we’re not going to put a gyroscope in a pen and make you have to charge it, and then the OS doesn’t update every year and it doesn’t work, because that’s not how pens work, right? So what are we actually asking here? Are we asking that every single artifact, every single tool we touch, somehow record our presence?
And then who’s watching? Because I just saw a president on stage talking about some really grim futures. So what are we actually talking about? Are we just sort of buying into this going dark narrative, because any corner of the world in which we have the ability to truly communicate privately, as we have for hundreds of thousands of years, is somehow unacceptable to a state or corporate apparatus that feels that surveillance is now the norm, even though we haven’t had social consent for any of that, in my view?
SAFIAN: I got you. I just want to be sure, because really what you’re saying is that we designed our digital activity in a way where it could be tracked.
WHITTAKER: Well, yes.
Rebuilding the digital ecosystem without surveillance
WHITTAKER: We designed our digital activity. If we look at post-World War II investment in computational infrastructure and technology, it was command and control infrastructure to try to win the Cold War, right?
And then in the ’90s, the president who warmed this seat for me saw to it that this network computation infrastructure was privatized. Even though there were a lot of warnings around privacy, and there’s no fault here, there was an understanding back in the ’70s and before that, hey, this stuff isn’t private.
Right. And there was an industry that grew up around that, that monetized surveillance, right? The advertising-supported surveillance industry came out of the ’90s. There were two decisions made in the Clinton-era framework. One was that there were no restrictions on surveillance, so there was no privacy law.
We still don’t have a federal privacy law in the U.S. And two, that advertising would be the economic engine of the tech industry. Why are those two things important together? Well, of course, advertising requires that you know your customer, and how do you know them? Well, you collect data on them. So there was an impetus for surveillance baked into the paradigm we’re talking about, and it’s in no way natural.
This is in no way the way that tech works. Look, Signal’s rebuilding the stack to show we can do it differently. And by the way, that’s all open source. If you want to use it, we can raise that bar. But we need to change these incentives, and we need to change, I think, the articles of faith around surveillance and privacy, and the way safety gets deployed to demonize private activity while valorizing centralized surveillance in a way that’s often uncritical.
SAFIAN: Nothing like throwing a little shade at a former president, just after he’s left the stage! But I’ll be honest, I hadn’t really thought that much about how our digital engagements are designed to enable tracking. We’ll go deeper into that topic, plus how she became a firebrand at Google. After the break. Stay with us.
[AD BREAK]
SAFIAN: Before the break, we heard Meredith Whittaker of Signal talk about the digital surveillance that so many of us have gotten used to. Now she talks about how Signal hopes to remake the ecosystem, plus what sparked her dissent at Google. Let’s jump back in.
I noticed that your undergraduate degree isn’t in engineering or computer science or policy. It’s in rhetoric and, I think, English.
WHITTAKER: Yeah, and it helps also to be right.
Principles over profit in technology platforms
SAFIAN: No, but I guess what I’m asking is: how important is the way we talk about these issues versus the technical specifics? In other words, how much of Signal as a platform is about raising these issues, versus being the platform for how we actually operate globally?
WHITTAKER: I think we need to do both, right? You know, oftentimes when you build, you hit the real contingencies, and I have to deal with all of them, you know.
In the middle of the night, if something goes off, I have to call someone, right? And I have to make sure they call Amazon so that we figure out why the API just went dark. Then we’re worried about the people who rely on Signal in some part of the world where we’ve been shut out, right?
The theory needs to be drawn from practice, and the practice needs to be informed by theory, and that’s praxis. And I think that’s part of what Signal is doing. We’re kind of a keystone species in the tech ecosystem, setting that bar for privacy and saying again, this is not natural, right?
This is a product of a certain economic system, a certain set of incentives, a certain narrative and historiographic flow of the world, but we can change it. We can build it differently. Signal’s a nonprofit, and we’re a nonprofit because we don’t want to be pushed by those incentives which so highly prioritize surveillance as the way of garnering revenue that we don’t think that’s safe in this ecosystem.
We’re also trying to change the ecosystem, right? We don’t want to be a single pine tree in the desert. We want to, to use my friends Maria and Robin’s term, rewild that desert so a lot of pine trees can grow. Again, we’re talking about massive change. But again, that hacker ethos says, let’s try.
Challenging authority and status quo in tech
SAFIAN: I mean, I don’t want to say you don’t mind, but your experience is that you have poked at a lot of things in different ways, where you looked at something and said, this just doesn’t make sense, or this isn’t right. And you had a good career going at Google, and unlike a lot of people, you were like, “I don’t care.”
WHITTAKER: Yeah. I mean, I’m a really sincere person, right? And I came into Google, and I learned tech. I’m good at reading, coming from English literature, so I could read the textbooks and I could learn it.
But I didn’t know. I was kind of believing it. I was like, okay, well, if this is that, then what about that? But then why doesn’t that match this? I’m putting together my little mind palace, and I also really, I live in a consciousness where I’m very aware I have one life, and ultimately, why not make it, why not go to sleep at night in a way where I feel peace with my own integrity?
And that’s just little steps every day. So I don’t relish blowing things up, but I do see if someone’s a real friend, they’re going to check you if you’re messing up. And I felt that as I saw this primitive accumulation phase of tech through the 2000s when I was at Google kind of calcify into these massive monopolies with a kind of power that is unprecedented.
I felt the need to check that, right? I felt like I’m on the inside. What would I want to have done if I was looking back at this? And then it wasn’t just me. There were thousands and thousands of people who were also like, “This is weird. What are we doing? Wait. Why are we building that? Why do we add that silly little feature? What are OKRs telling us to do?”
I think everyone has that in them. I still have many, many friends at Google and across the industry. I love them. Ultimately, you can change things small, you can change things big, but no one else is going to do it for you.
SAFIAN: Meredith, what I’ll say is, I love your heart and your courage because I think all of these things that you do, while you make it sound very matter-of-fact, they’re very hard to do, and I appreciate it. And thank you so much for being here.
WHITTAKER: Thank you.
SAFIAN: Meredith Whittaker, everybody. Meredith makes it sound simple and logical to go toe-to-toe with big tech, the government, or anyone else. We could probably use more of her no-BS attitude across business leadership, to just say it like it is.
I totally get the appeal of Signal. Privacy and surveillance can seem relatively unimportant until suddenly, they infect everything. At the same time, enabling bad actors to have a free hand, that’s troubling too. But maybe that risk isn’t quite as acute as I’ve been led to believe? And I guess if we’re hurtling towards a future where privacy is under attack, it’s good to have someone as strong as Meredith protecting the ramparts. Keep checking this feed for more highlights from the Masters of Scale Summit. I’m Bob Safian. Thanks for listening.