Wikipedia is a perfect example of a site that relies on the immunities afforded by Section 230. The website—the seventh most popular site in the world—is a completely user-generated online encyclopedia that is freely available in hundreds of different languages.
We spoke with Michelle Paulson, legal counsel at the Wikimedia Foundation, the nonprofit that operates Wikipedia and its sibling sites. Michelle, who deals with a host of issues including intermediary liability, had lots to say about the importance of Section 230 to Wikipedia.
What types of complaints or legal threats have you encountered regarding user content? What are the most common legal issues you encounter?
Defamation claims are the most common by far—most of the legal threats against the Wikimedia Foundation either directly invoke claims of defamation or defamation is the motivation behind the claims that are actually made. For example, a celebrity or politician may claim that their right to privacy has been violated because they did not authorize a negative statement to be published about them on Wikipedia. While the claim may not be defamation in name, it is in spirit.
As you know, all Wikimedia Projects are educational, wiki-based, freely-licensed resources that are collaboratively created and curated by a global community of volunteer contributors and editors. Anyone in the world can read or contribute to the Wikimedia Projects. In fact, the Wikimedia Foundation does not write or edit any of the content found on the Projects.
One of the benefits (and, depending on how you look at it, consequences) of free speech and open platforms like Wikipedia or Wikinews is that there is a lot more unfiltered and uncensored information out there—about places, things, and people—than there ever was before. And people don’t always like what is said about them, particularly on the Internet. They frequently wish to regain control over what is said about them, and to do this, they threaten or actually file suit.
But against whom? Traditionally, plaintiffs went after those with editorial control… newspaper publishers, TV stations, radio stations… but in an age where they might not necessarily know the identity of the person who allegedly defamed them, they now come after the companies that host these open platforms.
Now, I’m not saying that anonymity is to blame—quite the opposite—anonymity is essential for free speech to exist, grow, and thrive, both online and off. I’m simply saying that I understand the frustration that someone can feel when they read something unflattering about themselves, and I understand why they think suing hosting providers can fix their problems. But suing organizations that merely provide arenas for speech, ideas, and knowledge to be shared is not the right way to address concerns over content.
How do you respond to such complaints or threats? (e.g., what if someone complains about allegedly defamatory content?)
Initially, we try to explain what it is that we do—the role of the Wikimedia Foundation versus the role of the community of users. Frequently, the people who tend to be upset about what is being said about them on the Internet are the ones who don't really understand the open nature of the Internet—and definitely don't understand the nature of the Wikimedia Projects.
Because Wikipedia, as our largest and most popular Project, is the subject of most of the legal threats lodged against us, I will focus on it for the purposes of this discussion. However, it should be noted that legal threats have concerned all of our Projects at some point or another.
Many people have the misconception that Wikipedia is something that the Wikimedia Foundation tirelessly writes and curates itself, but it's actually very much a community-written and community-controlled labor of love. We have to explain to complainants that the users are the ones who write Wikipedia, the ones who update it, the ones who develop the policies that govern what should and shouldn't be in the articles, and the ones who enforce those policies and decisions.
After we explain how Wikipedia works generally, we try to explain the relevant policies and dispute procedures for their particular complaint and encourage them to use established community processes to resolve their concerns. The community has a system of processes and policies that can sometimes be a little hard to navigate, especially for people who are new to the Wikipedia world, but the community is generally happy to help, explain, and guide people through the processes.
The complaining parties come to understand that the article is never going to perfectly reflect what they want, nor will they ever have control over the contents of the article, but the article will at least reflect Wikipedia values and policies and will therefore serve the Wikipedia community, as well as the person who initially brought the complaint.
This strategy has been successful in many cases. In fact, some people who initially came to us with legal threats later became Wikipedia editors themselves after learning more about Wikipedia during the course of resolving their complaint.
But those who do not wish to utilize community processes to resolve their complaint will continue to threaten suit or actually file suit. Some are dissuaded from filing once we inform them about the CDA. Others are not. Thankfully, at least within the US, we have the CDA protecting the Projects.
In what situations do you take down content or block access, if ever?
It's extremely rare. The only situations where we've ever tried to intercede have been when there have been threats to life or limb or where we have received a valid DMCA takedown notice. Otherwise we're pretty firm about using community processes.
What kind of staff do you have dedicated to reviewing complaints? How much time, resources, or energy do you spend reviewing these legal claims about what third parties do?
I've been in charge of reviewing these kinds of complaints for the past several years, but I definitely could not have done it without the support I received from other attorneys on the team, outside counsel, and the legal research performed by our extremely dedicated legal interns. Up until recently, we had a very small legal team, which meant that the time spent on reviewing and responding to these legal claims took away significant legal resources that could have been spent working on innumerable legal initiatives.
Even now, with a small (rather than very small) legal team, a lot of time and donor money must be dedicated to evaluating and resolving these issues—more than would be the case if the CDA were expanded in scope and if more countries had equivalent protections.
We are relatively well protected in the US, but it still takes considerable resources to defend ourselves in foreign litigation—some of which could drag on for more than a decade.
What would it mean to Wikipedia and Wikipedia's community if you could be held legally responsible for all of your users' actions/posts?
We probably wouldn't exist anymore. Simple as that. Lawsuits are costly when you win, but they are even more costly when you lose. If the Wikimedia Foundation could be held legally liable every time there was a good faith inaccuracy on its Projects, we would have most likely been sued out of existence pretty early on. And we would have never had the chance to grow into what we've become, which is the largest repository of free knowledge in the world, accessible to all.
Furthermore, the Wikimedia community would be without a platform to share their ideas, their experience, and their knowledge with others. And they would likely have a hard time finding an alternate platform to use, as other hosting providers would meet the same challenges and financial difficulties that the Foundation would face if this were the case.
How has CDA 230 been important to Wikipedia?
It is the shield that prevents people from being able to hold the Wikimedia Foundation liable for the sometimes inaccurate or unflattering statements that are contributed by some of our users.
Do you think that the protections of CDA 230 should be narrowed? Increased?
The protections of the CDA absolutely should not be narrowed. I think if they were narrowed, not only would the Wikimedia Projects be in incredible jeopardy, but most of the crowdsourced websites that many people have come to rely on would be threatened as well, even those that have significantly greater financial resources than we do.
I also think that clarifying the extent of some of the CDA’s protections would be helpful. For example, if there are state causes of action relating to intermediary provider liability for user-generated content and it is not clear that the CDA's protections have preempted those state causes of action, then that becomes a potential liability for us and anyone else that the CDA protects.
What other changes could be made to help protect online intermediaries like Wikipedia?
I absolutely support the extension of anti-SLAPP protections to a federal level. Such an extension would better protect our users against frivolous lawsuits they may face for simply reporting truthful (but sometimes unflattering) information about particularly litigious individuals and entities. Our users are usually young, enthusiastic individuals who want to contribute to the largest repository of free knowledge to ever exist. It is unreasonable to expect them to take on the financial burden of defending against lawsuits solely meant to inhibit free speech, initiated by those of greater financial means. Extending anti-SLAPP will go a long way in not just protecting such users once a suit is initiated, but also in discouraging people from filing these kinds of malicious suits in the first place.
Do you think a CDA-230-like immunity should be expanded to offline intermediaries?
Yes. I think an intermediary is an intermediary, whether it's online or off. If you're taking the time to provide a platform for others to have the ability to speak and share, I don't think it matters if it's online or offline. You should be encouraging and fostering these kinds of environments in any form they can take, not creating additional barriers for them.
Is there anything else in particular about intermediary liability or how CDA 230 affects Wikipedia that you want to mention?
Section 230 of the CDA has had, and will continue to have, a significant and positive impact on online free speech, for US companies and their users both within the US and abroad. The CDA has helped the Wikimedia Projects become what they are today, but until all countries provide equivalent and consistently applied protections to intermediaries, the Internet will never reach its full potential.
The Wikimedia Foundation is based in the United States, but we serve a global community of users. At this point, most major websites serve the world, not a specific country. Unfortunately, this means that when other countries do not have clear protections for intermediaries, we are put into a position where we have to spend quite a lot of time and money litigating around the world. And we cannot accurately assess our risk when we do litigate because of the uncertainty that comes from not having these kinds of protections in place, either through legislation or case law.
How do you respond to those sorts of claims—from other countries?
Well, during the initial legal threat, the same way we would anywhere else—by trying to explain what we are and encouraging them to utilize the community processes that are available. But in the unfortunate cases that escalate to litigation, we frequently choose to defend ourselves in those other jurisdictions.
Thankfully, explaining how Wikipedia works, within the local legal framework—which is never as simple as invoking the CDA—has managed to prevent the Foundation from being found liable for user-generated content. But it's never a sure thing. It's always a major risk appearing in a foreign court and litigating where intermediary liability and protections are not as clear cut.