How Wikipedia will survive AI
Table of Contents:
- How Wikipedia works
- What the Wikimedia CEO learned from speaking with Wikipedia contributors
- Managing the public & political perception of Wikipedia
- How Wikipedia handles complex and controversial topics
- How Wikipedia deems what sources are reliable
- Is AI Wikipedia's competition?
- How Wikipedia uses AI
- Leading a diverse contributor community
- Responding to a rapidly changing tech environment
- Building trust and policing bad actors
- What's at stake for Wikimedia right now?
Transcript:
How Wikipedia will survive AI
MARYANA ISKANDER: As the internet becomes flooded with more and more machine-generated content, we’re seeing that Wikipedia is becoming more trusted, right?
There are real societal concerns about how technology impacts our lives, and to have a source created by human beings, run by a nonprofit organization that has grown from a joke to the last best place left on the internet, which was kind of the more recent headline, is reassuring.
After 23 years, this is a digital project unlike anything humanity has ever seen. And our volunteers are realistic, pragmatic, and optimistic about what this next chapter is.
BOB SAFIAN: That’s Maryana Iskander, CEO of Wikimedia, the nonprofit behind Wikipedia. I wanted to talk to Maryana to hear how Wikipedia is faring in an era of media mistrust, AI-generated misinformation, and the fierce cultural debate over truth and neutrality — heated more than ever in the run-up to the presidential election. Maryana shares how Wikipedia has surprisingly thrived in the face of ChatGPT and other AI competitors, and what’s keeping her up at night at the dawn of our new technological age. She reveals how the platform battles bad actors, including recent threats from both the Pakistani and Russian governments, and shares key lessons about leading through unprecedented turbulence. So let’s get to it. I’m Bob Safian, and this is Rapid Response.
[THEME MUSIC]
I’m Bob Safian. I’m here with Maryana Iskander, CEO of Wikimedia, the nonprofit supporting Wikipedia. Maryana, welcome to Rapid Response. Thanks for joining us.
ISKANDER: I am so delighted to be here, Bob. Thanks for having me.
How Wikipedia works
SAFIAN: So, Wikipedia is in many ways a bellwether of today’s kind of fraught relationship with facts and news and bias and key cultural issues. How does Wikimedia manage or engage with those issues?
ISKANDER: My experience is that almost everybody uses Wikipedia, but very few people really understand how it works and what happens behind the screen. This is one of the top ten, even top five, most visited sites, and in some countries literally the most visited site. Wikipedia gets 15 billion device visits every month, which is twice the population of the globe.
It’s operated by a nonprofit organization called the Wikimedia Foundation. It relies mostly on the generosity of small donations, which for me signals its value and use to millions and millions of people. And the content is created by volunteers all over the world. Hundreds of thousands of them, on topics across the globe.
In 330 languages, those volunteers follow core pillars that were the founding basis of Wikipedia, and lots of policies and guidelines to ensure that the content is accurate, verified, and uses cited sources. It’s not a place for people’s opinions. It’s a place to try to provide a neutral and verified set of information for the world.
SAFIAN: And so how does the Wikimedia Foundation, which you run, interact with the content, the things that we read and see on Wikipedia?
ISKANDER: We predominantly provide the technology infrastructure, but the content itself is really written, managed, and moderated by volunteer communities all over the world. The foundation has a critical role to play in legal and regulatory matters and community support, but we really partner with and support communities and the content creation itself.
What the Wikimedia CEO learned from speaking with Wikipedia contributors
SAFIAN: You joined Wikimedia as CEO in 2022, and I understand you started with a listening tour of Wikipedia contributors. Why, what did you want to learn? What did you learn?
ISKANDER: Maybe some of your listeners are familiar with a book called “The Starfish and the Spider,” which talks about how organizations and institutions around the world are becoming more decentralized and have more stakeholders to manage. I just think this old notion that we live within the four walls of an organization isn’t the case anymore. And that book uses Wikipedia as an example of a starfish kind of organization. So I knew I was entering a very decentralized world with a lot of stakeholders, literally in every country of the world.
And since I hadn’t come from inside the Wikipedia world, it was critical to listen first. And so I did. I spent about three months just listening, talking to folks in about 55 countries. And I would say that I heard things that made me hopeful that after 23 years, this is a digital project unlike anything humanity’s ever seen. Our volunteers are realistic, pragmatic, and optimistic about what this next chapter is.
SAFIAN: What about their relationship to the issues around bias, and what to trust and what not to trust? Are there things that you heard back from them around this particular moment?
ISKANDER: I would say that after 23 years, it takes a lot for our volunteers to get agitated, if I’m candid with you, because I think they’ve seen a lot of things. They’ve seen a lot of headlines announcing the death of Wikipedia, you know what I mean?
The policies they use to ensure that the content can be trusted have really endured the test of time. So how we think about reliable sources, how we think about verified facts, how we think about a neutral point of view.
Managing the public & political perception of Wikipedia
SAFIAN: Wikipedia’s co-founder, Larry Sanger, made headlines earlier this year by going after your predecessor, Katherine Maher, who now runs NPR, calling for her to resign. She’d made comments about white bias at Wikipedia, which he called ‘woke.’
Was that something you saw coming, or is that something that you just sort of see scroll across your phone?
ISKANDER: Because Wikipedia is, in some sense, a reflection of what’s happening in the world, everyday stuff happens in the world that comes across my screen. It could be something specific to us, or the content that’s being written about.
So I would say whatever’s happening in the world might somehow get reflected in our universe. It depends on how closely it impacts the work itself.
SAFIAN: Does your role as the head of Wikimedia Foundation involve managing how people think about Wikipedia since you’re managing that brand at the same time?
ISKANDER: I would say at a human, personal level, it’s hard. For the foundation itself, it’s about calculating the right responses. What do we actively engage with that supports our volunteers, particularly if they’re taking the right stand on difficult issues? You raised the question of trust, and what I would say is that as the internet gets flooded with more machine-generated content, we’re seeing that Wikipedia becomes more trusted, right?
There are real societal concerns about how technology impacts our lives, and to have a source that operates at the same level of global traffic as a lot of these other platforms, created by human beings doing it in the interest of free knowledge for all, run by a not-for-profit organization that has grown from a joke to the last best place left on the internet, which was kind of the more recent headline, is reassuring.
We haven’t seen a drop in our traffic since ChatGPT. In fact, we’ve kind of seen the opposite, with Wikipedia becoming more reliable, more critical, and typically the largest source of training data for large language models. I can’t predict where the future’s going on any of this, but I do think that a lot of the fundamentals are in place for this resource, this asset, this public good, this free-for-all service. It’s almost impossible to imagine.
They say that Wikipedia only exists in practice and not in theory, because when I say all of that, you’re like, how can that possibly be in 2024? How can that continue to be true? But it is.
How Wikipedia handles complex and controversial topics
SAFIAN: As someone who’s trying to provide trusted information myself, obviously in a very different way than Wikipedia does, I worry that people feel like they don’t know what to believe. Because people don’t necessarily understand how the information gets onto Wikipedia, do you worry that the trusted independence you’ve built over 20-some years might start to dissipate?
ISKANDER: I spoke at an event, and a person made a comment: How could Wikipedia possibly be trusted on a particular issue? I said, well, let’s go look at the page about that topic. They were flabbergasted at how accurate it actually was. It presented all the different points of view.
There’s a talk page that tells you who edited it and what changes they made. There’s an opportunity to post questions and ask about it. So, I think the radical transparency of everything being out in the open serves us well. I want to be clear, there are mistakes on Wikipedia, but the point is somebody fixes them and somebody can edit them if they see them.
When I engage with people and they express disbelief, getting into the details often helps them understand.
But it relies on people like you. It relies on reliable sources and journalism, and university presses, and other things.
How Wikipedia deems what sources are reliable
SAFIAN: The Israeli-Palestinian conflict has been challenging for many to navigate. Wikipedia deemed the Anti-Defamation League an unreliable source about Gaza this summer. Do you hear about that before it’s announced and have a role in managing the message? How does that come to you and out from there?
ISKANDER: That particular issue has quite a lot of complexity, and the foundation put out an extensive statement. We have had debates around reliable sources before; there’s a long litany of them, particularly around topics where organizations might engage in research and advocacy as well as content creation and gathering. The Gaza-Israel war is one example; you would see the same if you looked at the article about the Russian invasion of Ukraine. There’s lots and lots of engagement, because in some cases points of view evolve and change over time as well. The key, going back to the point I wanted to make about neutrality, is that if you go to Wikipedia and look at the reliable sources policy, it’s all there, long and extensive and clear. How the foundation responds varies, but we make clear what the editors have done and why they’ve done it, which was the case here.
SAFIAN: So you’re explaining, not taking a position one way or the other, but you’re supporting, in general, your community about the decisions it’s making.
ISKANDER: I guess the point I want to make here is, everything’s out in the open, you know what I mean? Everyone can read everything and come to their own conclusions. We point people back to the talk pages themselves, back to the articles themselves, because the whole system relies on editors following these policies.
We often get asked by governments to take down content. We had a case where we were temporarily blocked in Pakistan for a weekend.
We had gotten an early indication from the Telecommunications Authority. As a new CEO, I was freaking out.
We communicated to our volunteers that this was likely to happen. The block happened on a Friday night, and two days later it was lifted. I think that even governments of very large nations can appreciate that this is a pretty radical decision. I’m glad to say that was my only experience with something like that. We also have an ongoing legal case with the Russian government; we’ve been fined by the court system there. It’s trickling along, and we are observing very carefully how the government might respond, who it’s blocking, and what the demands are.
The stakes feel very high and very anxiety-producing for us as a leadership team because the mission is to provide access to free knowledge and free information, giving people not only the freedom to read it but the freedom to contribute to it as well.
SAFIAN: Wikipedia really is an amazing, counterintuitive model. It’s easy to take something so ubiquitous for granted, but the fact that it even exists and is thriving, in the face of government opposition and relying only on donations, shows the broad range of possibilities for business. So how is Wikipedia responding to the threat of ChatGPT and other AI? We’ll dig into that after the break. Stay with us.
[AD BREAK]
Before the break, Wikipedia CEO Maryana Iskander talked about the resilience of Wikipedia and how it deals with threats and controversy. Now, she talks about the emerging competition from ChatGPT and other AI, plus how she operates as a CEO in an organization based on unpaid volunteers. Let’s jump back in.
SAFIAN: In the past, I’d Google something, and Wikipedia was often the first result. Now, I see Google’s Gemini AI first, or sometimes I’ll just ask ChatGPT.
ISKANDER: Yeah.
Is AI Wikipedia’s competition?
SAFIAN: Is AI in some ways your competition?
ISKANDER: Bob, I worry and think about that all the time. We’re seeing a huge shift from what I would call a link-based internet to a chat-based internet.
I think for Wikipedia, it has kind of two forks. The first is, will people scroll down far enough and click on the link and come to the Wikipedia page? And, in some ways, in the short term, that really matters to us. It matters for our revenue model because that’s how people find us and make their donations.
It matters for how our volunteers understand what they’re doing and how it’s visible. The other side is, maybe you’re not going to scroll down and click on the Wikipedia page, but the answer that Gemini or AI is giving you is coming from Wikipedia, because it is the largest source of data for most of these models. The struggle is whether you know that or not. Attribution is probably one of the most important things we’re trying to focus on as AI evolves.
It’s about motivating people to continue to do this and to contribute, right? Attribution matters as a way of thinking about what is going to be the human motivation to continue to create things that the machines can, you know what I mean, suck up and feed back to you in these various chatbots. So I think Wikipedia is becoming more and more vital, even if it’s becoming less and less visible. Do teenagers come to a page and read a long article? My nephew searches the internet through YouTube.
But as I said earlier, we have not yet seen a drop in page views on the Wikipedia platform since ChatGPT launched. We’re on it, we’re paying close attention, and we’re engaging, but also not freaking out, I would say.
SAFIAN: If Wikipedia is such a consistent source of training data for AI engines, is that a good thing or a bad thing? I mean, it’s a good thing in terms of improving the quality of what those AI agents put out, but do you wish you got paid for it?
ISKANDER: I think that’s not really in the model. I mean, we’ve thought about that and talked about that. I think we have a different role to play. How are we going to use our voice and our place in this ecosystem to talk about making the models more open? If you look at the AI models that our teams have built, they’re all in the open. They provide communities with all the data to assess whether they’re working or not. So when people say that can’t be done, I mean, we’re doing it. I recognize it’s a different business model, but it’s an important data point.
How Wikipedia uses AI
SAFIAN: How are you using AI today, and how might that be shifting?
ISKANDER: If you go back to the very early days of Wikipedia, machines have always been in the equation: using tools to deal with repetitive tasks, always with a human in the loop. That remains the case to this day. There’s no human missing from any of the loops around our AI models.
Our own use of AI comes up a lot in supporting communities, potentially around translation. I know we’re in English, but a ton of the world is operating in other languages, and the internet does not reflect that. Then there’s the question of how we see Wikipedia content being used by others. There was a brief period when ChatGPT was really focused on plugins, and we created a Wikipedia plugin that essentially allowed ChatGPT to pull only from Wikipedia when it answered your questions, and again, we tried to learn and understand from that kind of experiment. As I said, we’re trying to be a data point for the world that all these things people often say are not possible are being done.
Leading a diverse contributor community
SAFIAN: I mean, your business model is so unconventional — no ads and unpaid contributors. And yet you’re still dominant in this age of trillion-dollar tech giants. It’s an unexpected paradox, right?
ISKANDER: I know. I know. It’s astounding, actually. That’s really the point. It’s astounding and almost unbelievable. If you ask our contributors why they do this, they’re not doing it to get paid. There’s something else going on here that speaks to human motivation: trying to be part of an information ecosystem with other people who care about accurate information, who care about an internet that gives us something we can trust. That’s the game we’re in, and I think finding other allies to be in that with us has also been really important.
SAFIAN: Your customers, in some ways, are not the end users of the product, right? But are the contributors. It’s got to be very different leading volunteers as opposed to paid staff. How do you think about that differently?
ISKANDER: I referenced the book “The Starfish and the Spider” earlier. There’s a section that talks about the role of the CEO as a catalyst as opposed to the role of the CEO as the titular head. The quick premise is that if you have a spider and you cut off its head, all the legs fall away, and the whole thing dies.
Whereas if you have a starfish and you cut off a leg, it just regenerates the leg, and the rest stays intact. I think that analogy is good for being part of this very diffuse system. I do have paid staff; the Wikimedia Foundation, as a nonprofit organization, has about 700 people, but we have hundreds of thousands of volunteers. There is no directing; you have to be in partnership. You have to be influencing. There are very few things I get to wake up and decide on my own any day of the week, right?
I live in a system of stakeholders and communities. It’s a very different leadership paradigm than in many traditional organizations.
Responding to a rapidly changing tech environment
SAFIAN: Many tech businesses talk about pivoting, and yet it feels like at Wikimedia, there’s sort of a trust that the basic premise behind it is just sound, and we can long-term stick to what makes it work.
ISKANDER: Yes and no. I think that things that endure do so because the fundamentals are strong and make sense. The day-to-day work involves changing and evolving our technology. How do we respond to the introduction of voice assistants when Siri, Alexa, and others came out? How do we now respond to generative AI? I would say that there’s an evolution on a daily, weekly, and quarterly basis within the organization to understand external trends in a rapidly changing environment and context.
I feel very privileged to be part of this massive community experiment of humanity in the digital age, and I feel confident that things will continue to change. But those basic fundamentals are holding us in good stead.
Building trust and policing bad actors
SAFIAN: You’re relying on your community to police itself. You could see bad actors wanting to influence that. Is that a new layer of challenge for your organization?
ISKANDER: You are absolutely right. There are ways bad actors can find their way in.
People vandalize pages, but we’ve kind of cracked the code on that; bots can often be deployed to revert vandalism, usually within seconds. At the foundation, we’ve built a disinformation team that works with volunteers to track and monitor threats.
In this election year, we’ve seen that our community policies work, acting as an antidote to some of those threats, but there’s no autopilot.
SAFIAN: Don’t you worry that the Russian government or the Chinese government might try to infiltrate your community of contributors?
ISKANDER: Do I worry about that?
I worry about that all the time. Creating a very healthy, large, diverse contributor community is the way to ensure that all points of view are represented and not hijacked by a small group.
SAFIAN: With all these things swirling around, how do you keep yourself personally level-headed amid all the turbulence?
ISKANDER: I really appreciate that question. I think building a team you can trust is a prerequisite to not losing your mind.
When I started this job, I was wide-eyed about what’s happening in the world and how it impacts what we’re doing. The fact that the system works, and has been working for almost two and a half decades, just gives you comfort. Our servers really never go down, even when we get huge spikes in traffic, typically when celebrities die. The day Wikipedia got the most hits in our history was in September 2022, when Queen Elizabeth died.
But it’s a time for leaders where just trying to get it right is hard.
What’s at stake for Wikimedia right now?
SAFIAN: What’s at stake for Wikimedia right now?
ISKANDER: There are the above-the-line issues we see playing out. The below-the-surface stuff that I worry about is the strength of the institutions we rely on: a free press, independent sources, research from universities that can be cited as sources of information.
So there’s an infrastructure around information integrity that is critical for Wikipedia. Issues around journalism, censorship, and how people produce and disseminate knowledge can shift without making headlines. I think our sense of our role vis-a-vis this broader world is critical for our own integrity and survival.
SAFIAN: Maryana, this has been great.
ISKANDER: Thank you so much for doing it.
I really appreciate it.
SAFIAN: As a journalist, I’m inspired by the resilience of Wikipedia in the face of AI. The still-strong demand shows there’s more appetite for reliable information than we’re often led to believe. While not everything about Wikipedia is perfect, the model remains powerful: hundreds of thousands of volunteers motivated by the goal of free, accessible information. If there’s a concern, it’s that Wikipedia relies on trusted journalism. As those sources come under threat, everything gets weakened: Wikipedia, the AI relying on Wikipedia, and our larger society. Here’s hoping that in the race for a better, faster, more automated digital future, we don’t lose sight of the painstaking and often thankless human contributions at the root of it all. I’m Bob Safian. Thanks for listening.