
The Business Lawyer

Winter 2023/2024 | Volume 79, Issue 1

Liability of Social Media Platforms, Marketplaces, and Advertisers

Chase Jeremiah Edwards and Brent Baker

Summary

  • The developments addressed in this survey relate to business law and marketing topics implicated by Section 230. 
  • Part II addresses the inherent dangers of online activity (Part II.A), publicity rights in advertising (Part II.B), and potential liability of online sales (Part II.C).
  • Part III discusses FOSTA and its statutory exclusion from immunity, which courts have interpreted to impose a high burden on plaintiffs. 
  • Part IV concludes with an outlook for upcoming challenges to the online regulatory landscape. 


I. Introduction

The Communications Decency Act of 1996, among other things, created certain immunity protections for service providers to control the onslaught of lawsuits that threatened to snuff out the Internet Age before it started. Colloquially known by its section number in Title 47 of the U.S. Code, Section 230 allowed the Internet to grow into its current form by shielding platforms from most federal claims and preempting conflicting state claims. It provides an “interactive computer service” (“ICS”) with immunity against lawsuits stemming from information provided by third parties. While Section 230 has enabled the generation of trillions of dollars in global wealth and commerce, each year it is tested in courtrooms throughout the country.

Courts steadfastly affirm that Section 230’s immunity clause covers virtually all forms of third-party content published by platforms of all types (e.g., social media, e-commerce), even if that posted content is blatantly false, unlawful, or intended to deceive or manipulate others. Given the wide-ranging application of Section 230’s immunity clause to companies that constitute a significant portion of the economy, business lawyers should monitor its current applications and its potential future.

Despite unsuccessful attempts by states to weaken or circumvent these protections, and despite the stark differences between the Internet of 1996 and 2023, Section 230 remains essentially unchanged from when it was enacted. In 2018, however, Congress amended Section 230 when it passed the Allow States and Victims to Fight Online Sex Trafficking Act (FOSTA), which combined a bill in the House of Representatives with a bill in the Senate, and which created an additional exclusion from the immunity otherwise provided by Section 230.

The developments addressed in this survey relate to business law and marketing topics implicated by Section 230. Part II addresses the inherent dangers of online activity (Part II.A), publicity rights in advertising (Part II.B), and potential liability of online sales (Part II.C). Part III discusses FOSTA and its statutory exclusion from immunity, which courts have interpreted to impose a high burden on plaintiffs. Part IV concludes with an outlook for upcoming challenges to the online regulatory landscape.

II. Recent Jurisprudence

A. Inherent Dangers of Online Activity

Snap, the company behind Snapchat, has been the target of numerous suits alleging that the app’s negligent design encourages reckless behavior and enables cyberbullying. In some cases, the courts have denied Section 230 protection to the company because the injuries were not caused by third-party speech but by the integrated features of the app. Nevertheless, those courts often found that Snap’s design was not the proximate cause of the injury or that Snap had no duty to prevent the deaths or injuries sustained. In at least one case, a plaintiff’s negligent design claim survived a motion to dismiss, meaning that there is precedent for judicial review of app design as it affects the users of an ICS platform.

In a similar suit, Section 230 provided a dispositive defense. The plaintiffs in L.W. v. Snap, Inc. claimed that Snapchat is “an inherently dangerous software product that [Snap, Inc.] deceptively advertise[d] and promote[d] in a way that facilitate[d] sex crimes against children.” They alleged that child predators routinely misuse Snapchat to groom minors for illicit activities and that Snap knowingly failed to monitor and remove offensive content. The court straightforwardly applied Section 230 because the offensive material was plainly content supplied by third-party users. The plaintiffs attempted to work around Section 230 by citing or imitating the arguments of other partially successful suits, but the court dismissed each of those claims because each was “predicated on the theory that the Defendants violated various state laws by failing to adequately monitor and regulate end-users’ harmful messages.”

B. Publicity Rights in Internet Advertising

In recent years, companies that accumulate publicly available information in order to sell access to their databases have incurred significant litigation expenses in failed attempts to use Section 230 to shield themselves from liability under state publicity statutes. Ancestry.com and PeopleConnect, which operates Classmates.com, have been denied Section 230 immunity because they, not third parties, are responsible for posting the collected data to the public.

This commercialization of personal information remains under heavy scrutiny in two new cases that revolve around the use of yearbook excerpts (pictures, in particular) from records collected by Ancestry. The company uses school yearbook photos and names of people in targeted ads to entice potential customers into subscribing to its site and paying for access to other records. Ancestry, however, lacked the consent of those pictured in its advertisements. In Wilson v. Ancestry.com, LLC and Fry v. Ancestry.com Operations Inc., federal courts in Ohio and Indiana found that the company knowingly published data in advertising campaigns by using the personas of unwitting spokesmodels and, thus, was not immunized by Section 230. This line of cases poses a serious impediment to the advertising models of personal data aggregators.

C. Intermediary Liability for Online Sales Without Section 230 Protection

Armslist, an online marketplace for firearms and associated products, has become a focal point of the national debate regarding the sale of guns. In prior surveys, we discussed a variety of cases against Armslist in which the company successfully used Section 230 to defeat claims by those harmed by weapons whose purchase was facilitated by the site. Even in cases where, for example, a weapon was illegally purchased by an ineligible buyer and used to shoot a policeman, Section 230 provided an effective shield for the company. In other cases, however, including Webber v. Armslist LLC, federal courts have refused to extend Section 230 immunity to the company.

In Webber, an Armslist customer murdered his estranged wife and committed suicide using a gun that he would not have been able to purchase from a firearms dealer due to a court order. After finding that Section 230 immunity did not apply to claims of negligence and public nuisance based on a defendant’s affirmative conduct, the court conducted a standard negligence analysis and ruled in favor of Armslist, concluding that Wisconsin public policy precludes imposing liability on Armslist for the victims’ deaths. On appeal, the Seventh Circuit, after thoroughly reviewing Section 230 jurisprudence within the circuit, upheld the ruling that Section 230 did not apply. Webber and similar cases make clear that online marketplaces cannot simply rely on Section 230 to rebuff all claims for harm done by their online listings.

III. Raising the Bar for FOSTA

Congress enacted FOSTA to combat sex trafficking by rolling back Section 230 immunity for websites that aid in the marketing of trafficking victims by intentionally or recklessly hosting user-generated advertisements. FOSTA states explicitly that Section 230 “does not prohibit the enforcement against providers and users of interactive computer services of Federal and State criminal and civil law relating to sexual exploitation of children or sex trafficking.”

Unfortunately, ambiguity surrounding the interface of state and other federal statutes, including the remainder of Section 230, has prevented most victims from obtaining justice through the federal civil remedies provided by FOSTA. In Does No. 1-6 v. Reddit, Inc., the Ninth Circuit raised the bar for claims to a potentially impossible level. The court held that, “for a plaintiff to invoke FOSTA’s [Section 230] immunity exception, she must plausibly allege that the website’s own conduct violated section 1591,” which section criminalizes child sex trafficking. Consequently, a plaintiff must plead and prove that the defendant company, not its users, knowingly participated in sex trafficking.

This new jurisprudential standard in the Ninth Circuit, home court for the vast majority of ICS companies, was later embraced by two other decisions of that court. In cases against Craigslist and Twitter, the appellate court reiterated the requirement that plaintiffs must plead and prove that the ICS knowingly participated in sex trafficking to prevail in a FOSTA case.

While this jurisprudence certainly will bar most FOSTA claims against complex ICS platforms, the plaintiff in A.M. v. Omegle.com LLC survived a motion to dismiss by overcoming that high pleading standard. Omegle is a blind chat-matching service that pairs anyone who accesses its webpage with a stranger. The court held that the

Defendant could make changes that would minimize predators’ access to children. For example, Omegle could require age verification and forbid minors from use, separate minors and adults, or more thoroughly track and monitor its users. It does not. Given the very structure of the platform, and Omegle’s business model, I find that Plaintiff has sufficiently alleged that Omegle knew or recklessly disregarded the fact that it was receiving compensation from advertisers on account of the sex acts taking place on its website, some of which involved minors.

With this high bar met, the court denied Omegle’s motion to dismiss on five of the six counts alleged.

IV. Future Concerns

The tide of state-level legislative efforts to regulate online speech has ebbed during the past year following the failure of laws in Texas and Florida to bring about meaningful change. However, the upcoming presidential election likely will reignite social discourse over content moderation. Certain online harms remain largely unaddressed, including cyberbullying, child sex abuse material, the psychological effects of social media on young people, and algorithmic targeting of individuals. As with every area of business, the law, and society, the effects of artificial intelligence on intermediary liability remain unclear. Courts will soon have to address various permutations of AI-generated content across all legacy ICS platforms and determine how the use of AI will be treated under Section 230’s requirement that protected content be provided by a third party.
