Moderating your readers’ comments does not make your news organization “liable” for those comments
News organizations around the country have different approaches to handling comments on their web sites. One question that has long plagued news managers, and which still informs corporate policy-making, is whether moderating the comments posted on your site by third parties increases your organization’s civil liability for the damage caused by those comments. The short answer is that it usually does not.
Techdirt posted a strongly worded reminder today about the broad immunity granted to web site owners and the fact that moderating comments does not remove that immunity. You can read the Techdirt piece here and follow its internal links to their own excellent coverage and analysis of this issue in general.
Simply put, Section 230 of the Communications Decency Act creates broad immunity for a web publisher for the comments posted by third parties. The entire statute appears here, but here is the relevant language from subsection (c):
(c) Protection for “good samaritan” blocking and screening of offensive material
(1) Treatment of publisher or speaker
No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.
(2) Civil liability
No provider or user of an interactive computer service shall be held liable on account of–
(A) any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected; or
(B) any action taken to enable or make available to information content providers or others the technical means to restrict access to material described in paragraph (1).
The bottom line is that the removal of offensive comments or the organization of third-party content on your site probably does not take your immunity away. Techdirt gets to the heart of it, but I thought I would add some case law to flesh out the reasoning and to highlight some parameters that are still being shaped:
First, here is the Fourth Circuit on the reason Section 230 was enacted:
Congress enacted § 230 to remove the disincentives to self regulation created by the Stratton Oakmont decision. Under that court’s holding, computer service providers who regulated the dissemination of offensive material on their services risked subjecting themselves to liability, because such regulation cast the service provider in the role of a publisher. Fearing that the specter of liability would therefore deter service providers from blocking and screening offensive material, Congress enacted § 230’s broad immunity “to remove disincentives for the development and utilization of blocking and filtering technologies that empower parents to restrict their children’s access to objectionable or inappropriate online material.” 47 U.S.C. § 230(b)(4). In line with this purpose, § 230 forbids the imposition of publisher liability on a service provider for the exercise of its editorial and self-regulatory functions.
Zeran v. Am. Online, Inc., 129 F.3d 327, 331 (4th Cir. 1997)
Now, some food for thought:
A District Court in Kentucky, recognizing that the Sixth Circuit has not yet weighed in on this, held that “a service provider is ‘responsible’ for the development of offensive content only if it in some way specifically encourages the development of what is offensive about the content.” Jones v. Dirty World Entm’t Recordings, LLC, 840 F. Supp. 2d 1008, 1010-11 (E.D. Ky. 2012).
This case is currently on appeal at the Sixth Circuit.
The District Court judge’s reasoning, though, is interesting. This Kentucky federal court identified a number of examples where the immunity ostensibly goes away. In one example, a web host lost its immunity because it “required subscribers to the site as prospective landlords or tenants to include information that was illegal under the Fair Housing Act.” In another, a web site operator lost its immunity because it sold various personal data, including telephone records, obtained in violation of federal confidentiality regulations. The most interesting part of this opinion is the analysis of the case itself, which involved allegedly defamatory remarks about a female high school teacher posted on a web site called thedirty.com.
This Court holds that, under the principles of Roommates.com and Accusearch, the defendants here, through the activities of defendant Richie, “specifically encourage development of what is offensive about the content” of “the dirty.com” web site. First, the name of the site in and of itself encourages the posting only of “dirt,” that is material which is potentially defamatory or an invasion of the subject’s privacy. Richie’s activities as described in his deposition also require the conclusion that he “specifically develops what is offensive” about the content of the site. Richie acts as editor of the site and selects a small percentage of submissions to be posted. He adds a “tagline.” He reviews the postings but does not verify their accuracy. If someone objects to a posting, he decides if it should be removed. It is undisputed that Richie refused to remove the postings about plaintiff that are alleged to be defamatory or an invasion of privacy. Most significantly, Richie adds his own comments to many postings, including several of those concerning the plaintiff. In these comments, he refers to “the fans of the site” as “the Dirty Army.” He also adds his own opinions as to what he thinks of postings. Richie’s goal in establishing the site was to bring reality TV to the Internet. He wants everybody to log on to “the dirty.com” and check it out. In his opinion, “you can say whatever you want on the internet.” One of Richie’s comments posted concerning the plaintiff was “Why are all high school teachers freaks in the sack,” which a jury could certainly interpret as adopting the preceding allegedly defamatory comments concerning her alleged sexual activities. 
When asked about this comment, he stated: “[i]t was my opinion, you know, watching the news and seeing all these teachers sleeping with their students and, you know, just my opinion on all teachers just from, like, what I see in the media.” Richie also posted his own comment addressed directly to the plaintiff, stating in part: “If you know the truth, then why do you care? With all the media attention this is only going to get worse for you … You dug your own grave here, Sarah.” He further posted: “I think they all need to be kicked off [the Bengals’ cheerleading squad] and the Cincinnati Bengals should start over. Note to self. Never try to battle the Dirty Army. Nik.” And, perhaps most significantly: “I love how the Dirty Army has war mentality. Why go after one ugly cheerleader when you can go after all the brown baggers.”
This Court holds by reason of the very name of the site, the manner in which it is managed, and the personal comments of defendant Richie, the defendants have specifically encouraged development of what is offensive about the content of the site. One could hardly be more encouraging of the posting of such content than by saying to one’s fans (known not coincidentally as “the Dirty Army”): “I love how the Dirty Army has war mentality.”
Jones v. Dirty World Entm’t Recordings, LLC, 840 F. Supp. 2d 1008, 1012-13 (E.D. Ky. 2012)
Remember, this case is on appeal, and a number of stakeholders have filed briefs with the court. It is being closely watched.
What is the takeaway if you manage a news site? The general rule still controls: you are probably safe from liability even if you manage or modify the comments posted by your users, readers or viewers. This case is a warning, though, that at least one court has tried to remove that immunity when the web site “in some way specifically encourages the development of what is offensive about the content.”