Perspectives: A Magazine for and about Women Lawyers
The AutoAdmit Scandal and Legal Remedies for Online Victimization
Winter 2009
By Kathryn E. Swisher
Kathryn E. Swisher is an associate at the Washington, D.C., law firm of Oldaker Belair & Wittie LLP. She focuses primarily on federal procurement law and related administrative matters.

On March 7, 2007, the Washington Post ran a front-page article featuring interviews with three women—students at Yale Law School. These women, speaking on the condition of anonymity, described demeaning, defamatory, and misogynistic comments posted about them by members of AutoAdmit's online law school discussion board, which bills itself as "the most prestigious law school discussion board in the world." (Ellen Nakashima, Harsh Words Die Hard on the Web, Wash. Post, Mar. 7, 2007, at A01).

Three months later, two of these women filed a civil action in the U.S. District Court for the District of Connecticut against their pseudonymous attackers in Doe I & Doe II v. Individuals, Whose True Names Are Unknown, No. 3:07-cv-00909-CFD, 2007 WL 1988159 (D. Conn. filed June 8, 2007).

The AutoAdmit plaintiffs' complaint provides a detailed narrative of the facts, including examples of the salacious posts with descriptions of the plaintiffs' body parts, vulgar names, and suggestions that the plaintiffs should be raped and sodomized. Unfortunately, none of the details, while certainly deplorable, are particularly shocking. A quick scan of topics on the AutoAdmit law school discussion board reveals several casually racist or sexist threads (among other pressing issues for law students, such as "How to Ace Contracts" and a debate regarding J. Crew versus Burberry suits). The phenomenon of slinging insults at others while hiding comfortably behind a wall of anonymity is not new. A sad reality of life online is that even the most innocuous discussions can devolve quickly into name-calling or worse. More perplexing is what can be done about this kind of reputation maligning.

Sue the Web Site or the User?

An individual who is victimized by online attacks cannot easily sue the operator of the Web site on which the attacks were posted, nor the user who posted them.

Section 230 of Title 47 of the U.S. Code was passed as part of the Communications Decency Act of 1996 (CDA), which was Congress's first attempt to regulate pornography on the Internet. While the Supreme Court has held certain provisions of the CDA to be unconstitutional restrictions on free speech (see Reno v. ACLU, 521 U.S. 844 (1997)), section 230 has thus far survived.

Section 230 provides that

No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.

It goes on to provide that

No provider or user of an interactive computer service shall be held liable on account of . . . any action taken to enable or make available to information content providers or others the technical means to restrict access to [information provided by another information content provider].

47 U.S.C. § 230.

Although primarily intended by its congressional sponsors to protect Internet service providers (ISPs) such as AOL and Yahoo!, the immunity granted by this section has been extended by the courts to a variety of "providers" and "users." In practice, the provisions mean that a provider or user—such as a blogger or the administrator of a Web site such as AutoAdmit—cannot be sued over objectionable content posted on the site if the provider/user did not personally create the content, i.e., if he or she was not the "information content provider."

Theoretically, a blogger would not be liable for entries written by guest bloggers or comments posted by readers of the blog, and administrators of Web sites such as AutoAdmit would not be liable for comments posted by members of the law school discussion board. Further, the provider/user may edit or remove third-party-created content without losing immunity under section 230, although the courts have not clarified the line between acceptable editing and the point at which a provider/user becomes the information content provider so as to fall outside the scope of the immunity—i.e., when the provider/user has sufficiently contributed to the content to make it his or her own.

In a recent case, however, the U.S. District Court for the District of New Hampshire found that the administrators of a Web site, AdultFriendFinder, could be held liable for a user's creation of a fake profile notwithstanding the immunity provisions of section 230. In Doe v. Friendfinder Network, Inc., 540 F. Supp. 2d 288 (D.N.H. 2008), the female plaintiff sued the defendant Web site administrators on several claims arising out of the placement of a phony online dating profile by a user of the Web site. The plaintiff did not know the true identity of the user who created the profile—only that he or she accessed the AdultFriendFinder Web site through the Dartmouth College computer network using an e-mail address provided by Yahoo!. The profile included biographical and other identifying data, as well as a nude photo, purportedly of the plaintiff. The plaintiff claimed that she had nothing to do with the creation of the profile and that she did not learn of its existence until more than a year after its creation.

The plaintiff's complaint included multiple counts sounding in tort (defamation, dangerous instrumentality/product liability, intentional infliction of emotional distress), as well as claims under state law for violation of the plaintiff's intellectual property rights, under the federal Lanham Act for trademark violations, and under the New Hampshire Consumer Protection Act.

The court dismissed all the claims in reliance on section 230, except for the state law intellectual property claim "for violation of the plaintiff's right of publicity" and the federal Lanham Act trademark claim. Although the CDA provides that it "shall [not] be construed to limit or expand any law pertaining to intellectual property," the defendants moved to dismiss the state law claim on the grounds that this provision applies only to intellectual property rights granted under federal, as opposed to state, law. The court disagreed, however, following dicta from a decision by the First Circuit Court of Appeals. The plaintiff's claim that the defendants appropriated her identity for their own benefit or advantage constituted a right of publicity claim, which is a widely recognized intellectual property right. The court therefore determined that the claim arose out of a "law pertaining to intellectual property" within the meaning of section 230, so the defendants were not immune from suit on that basis.

Interestingly, one of the AutoAdmit plaintiffs, Doe II, brought a copyright infringement claim under 17 U.S.C. § 501 (in addition to the tort claims discussed below), claiming that the defendants had, without her authorization, copied or otherwise reproduced photographs in which she owned valid copyrights. This claim has not yet been adjudicated, but thus far it appears that intellectual property claims may be a way to pierce the veil of section 230 immunity.

Unlike the Web site operators who were defendants in the AdultFriendFinder litigation, however, AutoAdmit's operators and administrators are not currently parties to the AutoAdmit litigation.

Tort Remedies Against Individual Users

In addition to the copyright infringement claim, the AutoAdmit plaintiffs brought claims under three of the four categories of privacy claims defined by the Restatement (Second) of Torts (§ 652A):

  • Appropriation of the other's name or likeness;
  • Unreasonable publicity given to the other's private life; and
  • Publicity that unreasonably places the other in a false light before the public.

The plaintiffs also asserted claims for intentional and negligent infliction of emotional distress and libel.

Anonymity (or pseudonymity) poses a significant obstacle for any plaintiff pursuing these tort claims against individuals. If, at the outset of the action, only the user name of the person who defamed the plaintiff (or intruded on the plaintiff's privacy, or caused the plaintiff emotional distress) is known, the plaintiff must use the discovery process—issuing subpoenas to the Web site on which the allegedly defamatory content was posted, or to the ISPs and other intermediaries—to learn the legal identity of the pseudonymous defendant(s).

This is a substantial burden on the plaintiffs, but it may afford some measure of relief. In August 2008, the AutoAdmit plaintiffs filed a second amended complaint that set out the legal identity of one of the anonymous defendants named in the original complaint. In their most recent filing, the plaintiffs state that they have successfully identified a number of the anonymous defendants and have engaged in settlement negotiations with several of them. As a result of these negotiations, the claims against "Whamo" and "vincimus" (online pseudonyms) were dismissed. It is not clear what, if any, damages were paid by the dismissed defendants, nor is it clear whether they were in fact "information content providers" who would fall outside the immunity provisions of section 230.

Additional Remedies

There may be other, more limited remedies for online abuse. For example, Professor Daniel J. Solove of the George Washington University Law School argues that section 230 should be reinterpreted "to grant immunity [for Web site operators and administrators] only before the operator of a website is alerted that something posted there by another violates somebody's privacy or defames her. If the operator of a website becomes aware of the problematic material on the site, yet doesn't remove it, then the operator could be liable." Daniel J. Solove, The Future of Reputation: Gossip, Rumor, and Privacy on the Internet (2007) (emphasis added).

However, Marc Randazza, a First Amendment lawyer and counsel for one of the named defendants in the AutoAdmit litigation, argues that this interpretation of section 230 is problematic. Randazza asked, in an e-mail to the author, "How should the Web site operator determine whether the information is indeed false? Simply upon a statement from the aggrieved party? Should online publishers be required to conduct a mini-private-defamation trial?"

One compromise might be to require administrators to delete personally identifying information from discussion threads upon request. It is unclear, however, whether removal would undo the effects of any "Google bombing" that occurred before the material was taken down. When posters Google bomb someone, they take steps to make threads containing defamatory or harassing statements appear among the first search results Google returns when that individual's name is entered as a search term.

Another approach, suggested by writer Mattathias Schwartz in a New York Times Magazine article, is "disemvoweling"—having message board administrators remove the vowels from trollish comments, which gives trolls the visibility they crave while muddying their message. (Mattathias Schwartz, The Trolls Among Us, N.Y. Times Magazine, Aug. 3, 2008.) A troll is someone who intentionally posts inflammatory comments to disrupt online communities. Some of the pseudonymous AutoAdmit defendants may be trolls, while others may simply be law students posting thoughtlessly.
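For readers curious about the mechanics, disemvoweling is a simple text transformation. The short sketch below, written in Python purely as a hypothetical illustration (it is not drawn from AutoAdmit's software or any real message board's tools), shows how a moderator's utility might strip the vowels from a comment that has been flagged as trolling:

    import re

    def disemvowel(comment: str) -> str:
        # Remove every vowel so the post stays visible but becomes hard to read.
        # A minimal sketch of the "disemvoweling" technique described above;
        # real forum software would apply it only to posts a moderator flags.
        return re.sub(r"[aeiouAEIOU]", "", comment)

    # Example: the troll keeps the visibility he craves, but the message is muddied.
    print(disemvowel("You are all idiots and should quit law school"))
    # prints: Y r ll dts nd shld qt lw schl

The design choice mirrors what Schwartz describes: rather than deleting the post outright, the administrator leaves it in place in garbled form.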

No Easy Answers

In general, the problem is how to balance legal redress for victims against the First Amendment rights of everyone else—troll and legitimate poster alike.

Perhaps a more appropriate—and troubling—question is "what should be done" about online reputation maligning. Would it have been better if Doe I and Doe II had left well enough alone? Googling "AutoAdmit" today will result in pages of hits regarding the litigation, and many of these hits are opinionated—and self-perpetuating—responses from the blogosphere. Certainly more people are currently aware of the AutoAdmit scandal than ever would have been had the matter been ignored by the AutoAdmit plaintiffs and left to die a natural death. Then again, why should these intelligent, capable young women—or anyone for that matter—have to put up with the malicious gossip of immature individuals when legal remedies may be available? And why shouldn't such pseudonymous posters be held responsible for their injurious actions?

There are no easy answers to these questions. (Note that Schwartz interviewed a troll who argued that the "willingness of trolling 'victims' to be hurt by words . . . makes them complicit, and trolling will end as soon as we all get over it.") Perhaps one solution is for law students, and people in general, to act with a little more decorum and a lot more decency toward one another.

