So, what is section 230? What is the issue with it a quarter of a century after its enactment? What is its current status? Where might this controversy end up? Let’s dive in.
Setting the Stage with the Telephone Platform
Shortly after Alexander Graham Bell’s first call to Watson in 1876, subscribers likely began using their new telephones to harass others and commit crimes and torts. No specific evidence as to when such nefarious practices started is offered here. But there is a presumption based upon our flawed human nature that callers quickly tumbled to the idea of using the telephone as a means of threat, extortion, theft, gossip, and harassment.4
In any event, the early telephone companies were treated as “common carriers.”5 Thus, as Professors Stuart Minor Benjamin and James B. Speta explain, telephone companies have been exempt from liability (for, e.g., defamation) for their customers’ miscreant deeds.6 The basic idea was that the telephone companies did not control or monitor the customers’ content.7 This fact has traditionally distinguished telephone companies from newspapers or television broadcasters, which have been treated as “speakers or publishers” due to editorial control over what appears in their type of media/platform.8 As Tarleton Gillespie observed, telephone companies traditionally have been “trusted interpersonal information conduits,” as the service is the commodity, not the information it conveys.9 This contrasts with media content producers such as television and newspapers, where the entertainment is the commodity and we expect some content moderation.
Social media platforms, the focus of this article, are perhaps a new category, “a hybrid between mere information conduits and media content providers.”10 Some argue that these social media platforms (a product of technological convergence) are “enjoying the privileges of common carriers without the responsibilities”11 such as the obligation to serve all users in a nondiscriminatory manner.
A tour of some of the more interesting section 230 cases may help flesh out this topic.
The “Wolf of Wall Street” Gives Birth to Section 230
Fast-forward a century or so. In 1996 the internet platform was beginning to take shape, and its growth coincided with the first major rewrite of telecommunications law since 1934. The federal Telecommunications Act of 1996 (FTA 96), described as “revolutionary legislation” by President Bill Clinton,12 was primarily focused on three big themes: facilitating local exchange competition, increasing competition in the long-distance telephony market, and reforming the century-old policy of universal service. But as Professor Jeff Kosseff explained in his must-read “biographical” book on section 230,13 this under-the-radar provision worked its way into the FTA 96. Section 230 flew under the banner of the Communications Decency Act, which was added to Title V of the FTA 96.14 Today, “Section 230” has its own Wikipedia page!15
As is often the case with the enactment of legislation, a bill is a by-product of catching up to prior, real-life events. In this case, the firm of Stratton Oakmont (yes, that Stratton Oakmont16) had a legal battle with Prodigy (which now, like Blockbuster and Radio Shack, lives only in our memories). The firm sued Prodigy over content on the latter’s online “bulletin boards” that it deemed defamatory. The posts of one Prodigy user described the head of Stratton as a “criminal” and the company as a “fraud,” among other such invectives.17 The court in Stratton Oakmont, Inc. v. Prodigy Services Co. eventually held Prodigy to the “strict liability” standard of a publisher of defamatory statements because it had actively advertised its practice of controlling content and screening/editing messages posted on its bulletin boards.18
Congress (at least those members who were aware of the implications of section 230) swooped in within a year of the decision and provided statutory “immunity” (albeit this term is not used in section 230) from tort-based lawsuits to “interactive computer services” and their millions of users. Such burdensome litigation posed an imminent and substantial threat to the relatively new internet platform and its providers. It must be noted that this law was passed before there was a Facebook or Twitter or most of the social media platforms that currently occupy large parts of our daily lives. The related policy position was to encourage such providers to self-regulate the dissemination of offensive material on the internet without being subject to liability as a “publisher” for exercising these “editorial” functions. In short, as Milton Mueller posited, section 230 was intended both to immunize providers that did nothing to restrict users’ communications and to immunize providers that took efforts to discourage or restrict undesirable content.19
So, What Are the 26 Words?
Section 23020 has far more than 26 words, but this article focuses on the 26 words that constitute the key “publisher or speaker” provisions in 47 U.S.C. § 230(c)(1): “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”
The companion provision is 47 U.S.C. § 230(c)(2), which provides:
(2) Civil liability
No provider or user of an interactive computer service shall be held liable on account of—
(A) any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected; or
(B) any action taken to enable or make available to information content providers or others the technical means to restrict access to material described in paragraph (1).
Taken together, these two subsections are known as the “Good Samaritan” section21 relating to the blocking and screening of offensive material. Their intended effect: the provider is not liable for bad information posted by another, and the provider is also not liable whether it moderates or edits content or decides not to do so (it is not required to do so). The underlying ideas were to encourage the development of the internet and keep government interference in this new platform/medium to a minimum.22 While wanting to protect children from indecent material, Congress also envisioned encouraging the exchange of “intellectual activity”23 and promoting commerce via the internet.
As with other best intentions, we’ll turn to how these objectives have fared over the past quarter century and how courts applied or misapplied (depending upon your viewpoint) this “simple” little addition to the FTA 96. We’ll then highlight the current debate over section 230 and discuss what, if anything, may be done about it in Washington.
Just How Far Does Section 230 Go?
With section 230 now “on the books” (at least in the United States),24 the next phase was the interpretation and application of the law by various courts. There have been numerous decisions with a variety of opinions.
Zeran v. AOL
One such early case was Zeran v. AOL,25 which had an odd connection to the tragic Oklahoma City bombing of 1995. An unidentified person posted advertisements on AOL’s (yes, it is still around)26 bulletin board offering offensive T-shirts related to the bombing, instructing interested buyers to call the home telephone number of Kenneth Zeran. Zeran was immediately deluged with calls and death threats. Zeran called AOL for help and was told that the posting would be removed from the bulletin board but that AOL would not print a retraction. Nonetheless, additional postings on AOL continued for several days. Zeran repeatedly called AOL for assistance and was told that the offending account would be closed.27 But by this time, a local radio station had relayed the first posting on air and attributed it to “Ken” at Zeran’s number. The harassing and threatening calls increased.
Zeran sued AOL, arguing that AOL should be liable for defamatory speech initiated by a third party. AOL pleaded section 230 as an affirmative defense, and the U.S. Court of Appeals for the Fourth Circuit upheld the district court’s finding of immunity. The court specifically rejected Zeran’s argument that section 230 eliminates only “publisher” liability and not “distributor” liability.28 The court found that distributor liability is merely a “subset” of publisher liability.29 The court also rebuffed the argument that liability should be imposed on service providers that have actual knowledge of defamatory content (due to notice), finding that such liability would be impractical to administer and would defeat the fundamental purposes of section 230.
Zeran illustrates the difficult balance that the section 230 framework strikes between individual harm30 and society’s benefit from an online platform.
Fair Housing Council of San Fernando Valley v. Roommates.com
Fair Housing Council of San Fernando Valley v. Roommates.com involved another unusual set of facts, likely not contemplated by the authors of section 230.31 Roommate.com (Roommate)32 operated a website to match people who had rooms to rent with would-be renters, and the website obtained its revenues from advertisers and subscribers. Roommate created a series of profile questions, including questions about sexual orientation, and encouraged users to provide “additional comments.”33 The underlying litigation was a complaint by the Fair Housing Council that Roommate’s business violated the Fair Housing Act (FHA). Roommate won dismissal at the district court level, relying on section 230 immunity.
The U.S. Court of Appeals for the Ninth Circuit explained that a website operator can be both a “service” provider, i.e., one who “passively” displays content of third parties, and a “content” provider who creates content.34 Thus, the operator may be liable for some content and have immunity for other content. The court thought that section 230 was meant to immunize the removal of content, not the creation of content. Here, Roommate was found to have “created the questions and choice of answers, and designed [the] website registration process around them” and thus was an “information content provider” as to that material, in the court’s opinion.35 As to the “additional comments” section, however, Roommate “passively” displayed the content provided by the subscriber, per the majority,36 and the Ninth Circuit thus affirmed Roommate’s “immunity” (again, not a term in section 230) under section 230 for that content.37
The dissent argued that the majority opinion expanded liability for internet service providers and would “chill the robust development of the Internet” and “chill speech on the Internet.”38 The dissent further argued that the users were providers of the content and that the majority had blurred the definition of “development.” More than a decade later, arguments about the chilling of speech on the internet abound—but, as we will see, from a different viewpoint.
Section 230 and “Hard” Cases
Is there a limit to section 230’s immunity force field of protection for online platforms?
Doe v. Backpage.com, LLC
In Doe v. Backpage.com, LLC, Judge Selya wrote that “[t]his is a hard case . . . in the sense that the law requires that we . . . deny relief to plaintiffs whose circumstances evoke outrage.”39 Backpage.com provided an online classified advertising service that included the categories of “Adult Entertainment” and “Escorts.” Three young women who had been minors during the relevant time period brought suit against Backpage.com for facilitating sex trafficking. The suit claimed that the website’s rules and processes helped encourage this despicable practice (by, for example, failing to require phone or email verification). The question presented was whether section 230 shielded Backpage from liability. The district court found that section 230 shielded conduct if the defendant “is a ‘provider or user of an interactive computer service’; . . . the claim is based on ‘information provided by another information content provider’; and . . . the claim would treat [the defendant] ‘as the publisher or speaker’ of the information.”40
The First Circuit found that the essential claim of the website facilitating the illegal conduct necessarily treated the website as a publisher or speaker of the content, and thus, Backpage was entitled to section 230(c)(1) protection! The court was not amenable to any argument that Backpage had gone beyond the behavior of Prodigy and AOL in the cases discussed above. The court pointed the appellants toward Congress to seek legislation, and remedial legislation was subsequently passed.41 But this case serves as a possible precursor to other hard cases that have arisen or will arise. What, then, will be made of such outrageous cases?
Batzel v. Smith
Another odd set of facts became the subject of section 230 litigation. In Batzel v. Smith, a handyman apparently had some issue with his customer, a lawyer named Batzel.42 The handyman overheard an (alleged) conversation in which Batzel said that she had had connections with Hitler’s staff and that she possessed a significant amount of old art that she said she had inherited. The handyman-turned-war-crime-solver crafted an email outlining his concerns to a website related to stolen art investigations. The website, which is used by art theft investigators and operated out of the Netherlands by Ton Cremers, published the email. The handyman later said that he would not have sent the email to Cremers’s website if he had known that it would be blasted around the internet. Batzel sued all parties involved, including some advertisers on the website.
Cremers raised section 230 in his defense, arguing that the handyman’s email was “information provided by another information content provider.”43 Hence, Cremers claimed that he could not be sued for “publishing” it on the internet under section 230. Judge Berzon and the majority agreed, explaining that Cremers did no more than select and make insignificant changes to the email in question. Simply put, the majority read the “26 words” literally.
The dissent argued that the majority went far beyond what Congress intended and that people would now be able to “spread vicious falsehoods” on the internet with impunity.44 Judge Gould in his dissent explained that
Congress understood that entities that facilitate communication on the Internet—particularly entities that operate e-mail networks, “chat rooms,” “bulletin boards,” and “listservs”—have special needs. The amount of information communicated through such services is staggering. Millions of communications are sent daily. It would be impossible to screen all such communications for libelous or offensive content.45
Judge Gould would implement section 230 under the following test:
Similarly, the owner, operator, organizer, or moderator of an Internet bulletin board, chat room, or listserv would be immune from libel suits arising out of messages distributed using that technology, provided that the person does not actively select particular messages for publication.
On the other hand, a person who receives a libelous communication and makes the decision to disseminate that message to others—whether via e-mail, a bulletin board, a chat room, or a listserv—would not be immune.46
As the majority noted at the outset of the opinion, Congress has chosen to treat liability for defamation and obscenity differently in “cyberspace” than in the “brick and mortar world.”47 This policy decision can present some seemingly odd results, whereby someone may be liable for defamation for mailing a stamped letter to numerous people but have immunity if they communicate the same information via the internet. This raises the questions addressed in the next section.
Publisher Liability Versus Distributor Liability
Malwarebytes, Inc. v. Enigma Software Group USA, LLC
Typically, an individual statement regarding the denial of a petition for certiorari does not receive much attention. But Justice Thomas’s statement in Malwarebytes, Inc. v. Enigma Software Group USA, LLC in 2020 warrants close review here.48 Justice Thomas suggested that courts in section 230 cases have mistakenly confused publisher liability with distributor liability. He explained: “Traditionally, laws governing illegal content distinguished between publishers or speakers (like newspapers) and distributors (like newsstands and libraries). Publishers . . . could be strictly liable for transmitting illegal content. But distributors were . . . liable only when they knew (or constructively knew) that content was illegal.”49
Justice Thomas’s discussion of Stratton Oakmont, Inc. v. Prodigy Services Co.50 and the legislative history surrounding Congress’s use (or lack thereof) of the terms “publisher” and “distributor” in section 230 and other Communications Decency Act provisions is quite provocative.51 He raises concerns about “extending § 230 immunity beyond the natural reading of the text” and comments that the court should decide the “correct interpretation of § 230” in the future.52
In re Facebook
The Texas Supreme Court later noticed Justice Thomas’s statement and discussed it at length in its In re Facebook, Inc. opinion issued during the summer of 2021.53 This case included a set of facts that are unfortunately reminiscent of the horrible facts in the Backpage.com case discussed above. Facebook sought dismissal of three separate cases brought by alleged victims of sex trafficking who were minors at the relevant time. The victims became ensnared in the perpetrators’ traps via the tools of Facebook and Instagram (owned by Facebook). Facebook, as relator, relied on section 230 in seeking dismissal. The Texas Supreme Court denied this request after engaging in a lengthy review of Justice Thomas’s statement in the Malwarebytes case.54
Facebook had moved to dismiss, citing 47 U.S.C. § 230(e)(3), which provides that “[n]o cause of action may be brought, and no liability may be imposed under any State or local law that is inconsistent with this section.” Facebook argued that the plaintiffs’ claims are “inconsistent with” the primary provision under discussion in this article, section 230(c)(1).55
The court strongly rejected this argument, saying “We do not understand section 230 to ‘create a lawless no-man’s-land on the Internet’ in which states are powerless to impose liability on websites that knowingly or intentionally participate in the evil of online human trafficking.”56 The Texas Supreme Court relied in part on the Roommates.com decision discussed above, stating:
Holding internet platforms accountable for the words or actions of their users is one thing, and the federal precedent uniformly dictates that section 230 does not allow it. Holding internet platforms accountable for their own misdeeds is quite another thing. This is particularly the case for human trafficking. Congress recently amended section 230 to indicate that civil liability may be imposed on websites that violate state and federal human-trafficking laws.57
Furthermore, the court quoted Roommates.com for the proposition that “[a] defendant that operates an internet platform ‘in a manner that contributes to,’ or is otherwise ‘directly involved in,’ ‘the alleged illegality’ of third parties’ communication on its platform is ‘not immune.’”58
So, Is It Time to Review Section 230?
While these cases pose interesting dilemmas for litigants and courts, is it time to review section 230? Given the amount of discussion in the media and in D.C. and state capitols, the answer appears to many to be a resounding yes! But is it? Does the sample of cases summarized above warrant such further review? Or is this current debate motivated by other reasons (or both)?
The above cases presented novel situations, but any candid observer would find that the current controversy generally centers on the cause célèbre of Big Tech and its control over what content appears on its platforms. This debate takes on a strongly political flavor, as some conservatives state that the internet is slanted against their views and liberals argue that platforms are protecting society from incorrect and/or inciting messaging.
As noted above, there is no real debate that Big Tech is big.59 It is big in many ways.60 This article assumes this to be the case (since this is not an antitrust complaint/brief). For example, in terms of market capitalization, Amazon, Apple, Google, and Microsoft easily exceed $1 trillion each.61 A bit more startling is the fact that the “Big Five” of Big Tech—Apple, Amazon, Alphabet (Google), Facebook, and Microsoft—make up about 20 percent of the total value of the stock market!62
But these financial facts may be a bit esoteric. More practical tests of size include the following.
- During COVID, where do millions of consumers “go” to buy something every day? Amazon.
- How do billions of people stay connected to Grandma or old high school classmates in another city? Facebook.63
- Where do millions of people go to vent their opinions in a few words? Twitter.64
- From what company do millions buy multiple smartphones, AirPods, PCs, desktops, notebooks, and smartwatches year after year? Apple.
- If you are going to create a document or presentation for school or work, what software do you use? Microsoft.
- How do millions search online for the latest information on the pandemic or the bio of the star of your favorite show to binge? Google.65
Indeed, some have come to call these platforms “digital nation states.”66
On top of this size issue, section 230 raises constitutional arguments, and some claim that the provision “is the most important law protecting free speech.”67 But the issue really became inflamed in the context of political speech and social media moderation decisions. Several months before the tragic events of January 6, 2021, President Trump and other conservatives had raised issues about unfair “censorship”68 of their views by social media platforms such as Twitter.69 Trump even signed an executive order directing federal agencies to review “social media censorship,” which stated in part:70
Twitter, Facebook, Instagram, and YouTube wield immense, if not unprecedented, power to shape the interpretation of public events; to censor, delete, or disappear information; and to control what people see or do not see.
As President, I have made clear my commitment to free and open debate on the internet. Such debate is just as important online as it is in our universities, our town halls, and our homes. It is essential to sustaining our democracy.
Online platforms are engaging in selective censorship that is harming our national discourse. Tens of thousands of Americans have reported, among other troubling behaviors, online platforms “flagging” content as inappropriate, even though it does not violate any stated terms of service; making unannounced and unexplained changes to company policies that have the effect of disfavoring certain viewpoints; and deleting content and entire accounts with no warning, no rationale, and no recourse.71
A few weeks later, Representative Devin Nunes’s lawsuit against Twitter was dismissed due to section 230 “immunity.”72 Nunes had claimed that Twitter had orchestrated a nefarious scheme to silence his voice and assassinate his character by enabling the publication of several false and defamatory statements against him via satirical anonymous accounts.
Another episode in this controversy was a “workshop”73 conducted by Attorney General William Barr’s Department of Justice (DOJ) in February 2020, which was followed by a report with recommendations to Congress as to section 230.74 Of course, with the change in the presidential administration, this report may not carry as much (if any) weight. But it is still somewhat instructive as to ideas about what to do with section 230.
It is also worth noting here that then-candidate75 Joe Biden called for the repeal of section 230, telling the New York Times Editorial Board that “Section 230 should be revoked, immediately should be revoked, number one.”76 There was also this question-and-answer exchange:
CW: That’s a pretty foundational laws[sic] of the modern internet.
[Biden:] That’s right. Exactly right. And it should be revoked. It should be revoked because it is not merely an internet company. It is propagating falsehoods they know to be false, and we should be setting standards not unlike the Europeans are doing relative to privacy. You guys still have editors. I’m sitting with them. Not a joke. There is no editorial impact at all on Facebook. None. None whatsoever. It’s irresponsible. It’s totally irresponsible.77
Academics can be found on all sides of the issue of whether to reboot section 230 and, if so, how. Some argue “that there is a growing consensus that we need to update Section 230.”78 In his book, Gillespie highlights three considerations for the calls to review section 230.
- The “safe harbor” law was not designed for the social media platforms that benefit from it today.
- Section 230 laws are limited to the United States, and platforms are international.
- Terrorism and hate speech are placing higher stakes on the debate.79
Other Issues Stoking the Section 230 Debate
Two huge events have added even more fuel to this fire: the January 6, 2021, Capitol riot and the COVID-19 pandemic.
Section 230 Flash Point: The Mob
All of this served as a prelude to the decision of Twitter, Facebook, and YouTube to suspend/revoke (i.e., “deplatform”) former President Trump’s accounts over election-related and other claims (regarding, e.g., COVID-19). This development, along with the horrible day in January 2021, amplified the debate over section 230. Congress has held a series of hearings with the relevant CEOs. Bills in various states (to be discussed below) started to appear regarding digital platforms, social media platforms, and censorship. And Facebook’s review board issued a report on the company’s actions regarding Trump. The board upheld Facebook’s decision to restrict Trump’s access to posting content on his Facebook page and Instagram account, but the board also found that it was not appropriate for Facebook to impose the indeterminate and standardless penalty of indefinite suspension.80 (Facebook later modified the suspension to two years.)81
Just two weeks after the riot at the Capitol, the Congressional Research Service issued a report, Social Media: Misinformation and Content Moderation Issues for Congress.82 The report concluded that if Congress decides to address the issue of misinformation or moderation, it might consider:
- the “scope of proposed actions, under what conditions they would be applied, and the range of . . . legal, social, and economic consequences”;
- “costs . . . that further entrench[] the market power of incumbent[s]”; and
- “how U.S. actions . . . fit within an international legal framework.”83
Following the deplatforming, Trump filed three class action lawsuits in July 2021 against Twitter, Facebook,84 and Google/YouTube, respectively.85 Trump’s two basic complaints against each company are as follows.
- The defendant reacts to “coercive pressure from the federal government to regulate specific speech,” which amounts to “state action” and violates the class members’ First Amendment rights to participate in a “public forum.”86
- Section 230 is “unconstitutional on its face” because Congress cannot “induce, encourage . . . private persons to accomplish what it is constitutionally forbidden to accomplish.”87
Like many Trump-related issues, the merits and possible success of Trump’s lawsuits (beyond attracting even more attention to the issue) have generated polar viewpoints.88
The Pandemic and Section 230
COVID has wreaked havoc on all of us in so many ways. One issue that has arisen relative to section 230 is the censorship of COVID-19 misinformation by social media platforms. Information censored from the internet has ranged from theories on the origin of the disease to the severity of treatments (e.g., medicines) to possible cures.
This censorship has taken place in a very volatile situation where theories and government-recommended approaches to the disease change as events unfold. As the Congressional Research Service report noted:
[p]art of the difficulty addressing COVID-19 misinformation is that the scientific consensus about a novel virus, its transmission pathways, and effective mitigation measures is constantly evolving as new evidence becomes available. During the pandemic, the amount and frequency of social media consumption increased. Information about COVID-19 spread rapidly on social media platforms, including inaccurate and misleading information, potentially complicating the public health response to the pandemic.89
There have even been reports of government coordination with platforms on these important issues and finger-pointing between the two entities.90 Senator Amy Klobuchar filed a bill that would penalize platforms for “spreading lies” about COVID-19.91 One would think it would be a weighty proposition for a company to decide (whether by assigned moderators/people or by algorithms92) what is or is not accurate as to complex diseases, much less the multitude of other issues that appear on their platforms daily.
State Action
But not all the section 230 action is in Washington, D.C., or in the courts. The agendas at state capitol buildings around the nation have been filled with legislation relevant to the section 230 debate, arising from perceived censorship and the power of the major digital/social media platforms. As will be seen, these efforts generally have not proceeded without controversy. This article focuses on two battleground states: Florida and Texas.93
Florida
Florida passed Senate Bill 7072,94 which was supposed to take effect on July 1, 2021:
- The bill establishes a violation for social media deplatforming95 of a political candidate or journalistic enterprise and requires a social media platform to meet certain requirements when it restricts speech by users. The bill prohibits a social media platform from willfully deplatforming a candidate for political office and allows the Florida Elections Commission to fine a social media platform $250,000 per day for deplatforming a candidate for statewide office and $25,000 per day for deplatforming any other candidate, in addition to the remedies provided in chapter 106 of the Florida Statutes. If a social media platform willfully provides free advertisements for a candidate, such advertisement is deemed an in-kind contribution, and the candidate must be notified.
- The bill provides that a social media platform that fails to comply with the requirements under the bill may be found in violation of the Florida Deceptive and Unfair Trade Practices Act by the Department of Legal Affairs (Attorney General).
- The bill permits a user of a social media platform to bring a private cause of action against a social media platform for failing to apply certain standards consistently and for censoring or deplatforming without proper notice.
The bill was met with criticism (reflective of this controversy)96 and litigation. Before the bill was even able to take effect, Judge Hinkle issued a preliminary injunction in NetChoice v. Moody.97 Judge Hinkle identified many legal deficiencies in SB 7072, ruling that “the plaintiffs are likely to prevail on their challenge to the preempted provisions—to those applicable to a social media platform’s restriction of access to posted material.”98
Hinkle made other observations, such as: “The First Amendment does not restrict the rights of private entities not performing traditional, exclusive public functions.”99 He then applied strict scrutiny in his review of the First Amendment claims, finding that SB 7072 is content-based legislation, writing:
To survive strict scrutiny, an infringement on speech must further a compelling state interest and must be narrowly tailored to achieve that interest. See, e.g., Reed, 576 U.S. at 171. These statutes come nowhere close. Indeed, the State has advanced no argument suggesting the statutes can survive strict scrutiny. They plainly cannot.100
Texas
My home state of Texas is also a setting for section 230–related legislation. Texas had at least “two bites at the apple” before finally adopting House Bill 20.101 The bill’s general purpose is to establish complaint procedures and disclosure requirements for social media platforms regarding the censorship of users’ expressions by an interactive computer service. The bill includes requirements such as publication of “transparency” reports regarding the platform’s moderation efforts.102 The bill also focuses on “viewpoint discrimination.”103 A key section in the bill on censorship provides:
Sec. 143A.002. CENSORSHIP PROHIBITED. (a) A social media platform may not censor a user, a user’s expression, or a user’s ability to receive the expression of another person based on:
(1) the viewpoint of the user or another person;
(2) the viewpoint represented in the user’s expression or another person’s expression; or
(3) a user’s geographic location in this state or any part of this state.
(b) This section applies regardless of whether the viewpoint is expressed on a social media platform or through any other medium.104
NetChoice also challenged the Texas law, filing a complaint in September 2021 in federal district court in Austin.105 The complaint points to Judge Hinkle’s ruling on the Florida law for support. The complaint also alleges that H.B. 20 violates the First Amendment, is void for vagueness under the Due Process Clause of the Fourteenth Amendment, violates the Commerce Clause, is preempted under the Supremacy Clause and section 230, and violates the Equal Protection Clause of the Fourteenth Amendment.
The Texas social media law met the same fate as the Florida bill when Judge Pitman issued a preliminary injunction blocking the law from taking effect on December 2, 2021.106 The judge cited the Florida ruling. Judge Pitman found that H.B. 20 violated the First Amendment, that many terms in the bill were “vague,” and that it discriminated against Big Tech social media platforms. The court also rejected the state’s “common carrier” argument and ruled that the severability clause did not save other provisions in the bill. One of the court’s observations concerned the impracticality of the transparency and user-appeals provisions given the enormous amount of traffic that flows on these platforms every day. This premise and these rulings present serious challenges to legislators seeking to impose some sort of restrictions on these platforms. The state has indicated it will appeal the ruling to the Fifth Circuit Court of Appeals.
The Future of State Initiatives
If these various initiatives fail, it would not be surprising to witness their return in future state sessions as controversial bills often take more than one session to pass or finally die. If passed, as demonstrated in Florida and Texas, subsequent litigation is all but assured.
What Changes Should Be Made to Section 230?
Setting aside the (in my view, unlikely) nuclear option of striking section 230 from the U.S. Code, what are some possible changes that could be made to section 230 in light of the above considerations?
In September 2020, the Barr DOJ Report107 mentioned above recommended draft legislation that:
- “has a series of reforms to promote transparency and open discourse and ensure that platforms are fairer to the public when removing lawful speech from their services”;108
- “[e]xplicitly overrule[s] Stratton Oakmont to [a]void [m]oderator’s [d]ilemma . . . [by] clarifying that a platform’s removal of content pursuant to Section 230(c)(2) or consistent with its terms of service does not, on its own, render the platform a publisher or speaker for all other content on its service”;109
- outlines a “category of amendments aimed at incentivizing platforms to address the growing amount of illicit content online, while preserving the core of Section 230’s immunity for defamation claims”;110 and
- “proposes carving out certain categories of civil claims that are far outside Section 230’s core objective, including offenses involving child sexual abuse, terrorism, and cyberstalking.”111
Danielle Keats Citron and Benjamin Wittes believe that 47 U.S.C. § 230(c)(1) immunity is “too sweeping,” and they have suggested this new language (in italics):
No provider or user of an interactive computer service that takes reasonable steps to prevent or address unlawful uses of its services shall be treated as the publisher or speaker of any information provided by another information content provider in any action arising out of the publication of content provided by that information content provider.112
Mark Zuckerberg, praising section 230 for its promotion of the internet, has offered some suggestions to modify section 230 in testimony before Congress while defending Facebook’s “misinformation” and “hate speech” identification efforts:
We believe Congress should consider making platforms’ intermediary liability protection for certain types of unlawful content conditional on companies’ ability to meet best practices to combat the spread of this content. Instead of being granted immunity, platforms should be required to demonstrate that they have systems in place for identifying unlawful content and removing it. Platforms should not be held liable if a particular piece of content evades its detection—that would be impractical for platforms with billions of posts per day—but they should be required to have adequate systems in place to address unlawful content. Definitions of an adequate system could be proportionate to platform size and set by a third-party. That body should work to ensure that the practices are fair and clear for companies to understand and implement, and that best practices don’t include unrelated issues like encryption or privacy changes that deserve a full debate in their own right. In addition to concerns about unlawful content, Congress should act to bring more transparency, accountability, and oversight to the processes by which companies make and enforce their rules about content that is harmful but legal. While this approach would not provide a clear answer to where to draw the line on difficult questions of harmful content, it would improve trust in and accountability of the systems and address concerns about the opacity of process and decision-making within companies.113
Michael D. Smith and Marshall Van Alstyne describe such language as a “duty of care” standard.114 Neil Fried argues that section 230 removed the ordinary business standard to act with a duty of care toward customers/users.
Ordinarily, businesses have a common law duty to take reasonable steps to not cause harm to their customers, as well as to take reasonable steps to prevent harm to their customers. That duty also creates an affirmative obligation in certain circumstances for a business to prevent one party using the business’s services from harming another party. Thus, platforms could potentially be held culpable under common law if they unreasonably created an unsafe environment, as well as if they unreasonably failed to prevent one user from harming another user or the public.
Section 230(c)(1), however, states that “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” Courts have concluded that this provision “creates a federal immunity to any cause of action that would make service providers liable for information originating with a third-party user of the service.”115
In short, Fried posits that section 230 has created a disincentive for platforms to moderate content and recommends that Congress:
- “amend Section 230 to require that platforms take reasonable steps to curb unlawful conduct as a condition of receiving the section’s liability protections”; and
- “create transparency provisions requiring platforms to adopt and disclose content moderation policies addressing (1) what content the platforms will take down and leave up; (2) how people can file complaints about deviations from those policies; (3) how people can appeal the platforms’ decisions under those policies; and (4) disclosure of aggregated data regarding complaints, takedowns, denial of takedown requests, and appeals.”116
As if there were a need to bring any further attention to this issue, a “whistleblower” came forward in the fall of 2021 and provided internal Facebook documents to the Wall Street Journal, which published an intensive series of articles—called “The Facebook Files”—critical of Facebook’s practices, business model, and impact on society.117 The whistleblower then appeared before congressional committees.118 This kept, if not brightened, the spotlight on section 230.
It remains to be seen what Congress will actually do on this important national issue.
What Next?
Assuming that Congress has not addressed section 230 further by the time of publication, I suspect that the section 230 controversies will continue to thrive at both the federal and state levels and in various legislative, judicial, and political forums. A couple of years ago, I published an article in this publication stating that “it may be difficult to move NN (Net Neutrality) off its perch at the top of the regulatory box office.”119 I think that it is probably fair to say that there is a new Number One in this box office—section 230 (perhaps with the privacy issue not far behind).
In the meantime, in its In re Facebook opinion, the Texas Supreme Court fairly summarized some of the basic considerations going forward.
The internet today looks nothing like it did in 1996, when Congress enacted section 230. The Constitution, however, entrusts to Congress, not the courts, the responsibility to decide whether and how to modernize outdated statutes. Perhaps advances in technology now allow online platforms to more easily police their users’ posts, such that the costs of subjecting platforms like Facebook to heightened liability for failing to protect users from each other would be outweighed by the benefits of such a reform. On the other hand, perhaps subjecting online platforms to greater liability for their users’ injurious activity would reduce freedom of speech on the internet by encouraging platforms to censor “dangerous” content to avoid lawsuits. Judges are poorly equipped to make such judgments, and even were it otherwise, “[i]t is for Congress, not this Court, to amend the statute if it believes” it to be outdated.120
Congress, what say you?
Endnotes
1. Section 230 was first offered as an amendment by Representatives Christopher Cox (R-Cal.) and Ron Wyden (D-Or.). See 141 Cong. Rec. H4860 (Aug. 4, 1995).
2. Note that another hot issue with Big Tech is, of course, antitrust. The term “Big Tech” as used herein means Google (i.e., Alphabet, which owns YouTube), Facebook (which owns Instagram/WhatsApp), Amazon, Apple, Microsoft, and Twitter. Twitter is not as large as the other companies but is a significant tech company that is embroiled in the section 230 debate. These companies are referred to at times in various combinations, such as “The Big Four,” “GAFA,” “FAANG” (including Netflix), “GAFAM,” or “FAAMG.” There are several antitrust actions past and pending in the United States and Europe. See, e.g., United States v. Google, No. 1:20-cv-03010 (D.D.C. filed 2020); Google Shopping Case, CASE AT.39740 (Eur. Comm’n 2017). Such efforts to date have met with mixed success; for example, Judge Boasberg promptly dismissed FTC v. Facebook and State of New York v. Facebook for failure to state claims in No. 20-3590 and No. 20-3589 (D.D.C. 2021), respectively. The Federal Trade Commission (FTC) promptly refiled. Cat Zakrzewski, FTC Refiles Antitrust Case Against Facebook, Says No Other Social Network Comes Close to Its Scale, Wash. Post (Aug. 19, 2021), https://www.washingtonpost.com/technology/2021/08/19/ftc-refiles-facebook-lawsuit-lina-khan. This article does not attempt to analyze these antitrust issues. One article on such issues addresses, for example, the difference between “information platforms” and “transaction platforms.” Donald J. Baker & William S. Comanor, A U.S. Antitrust Agenda for the Dominant Information Platforms, 35 Antitrust (Summer 2021), https://www.americanbar.org/digital-asset-abstract.html/content/dam/aba/publishing/antitrust_magazine/atmag-summer-2021/baker.pdf. The focus here is more modest: section 230.
3. Ryan Tracy & John D. McKinnon, Tech CEOs Square Off with Senators in Hearing over Online Speech, Wall St. J. (Oct. 28, 2020), https://www.wsj.com/articles/senate-tech-hearing-facebook-twitter-google-11603849274.
4. Teenage boys filled early operators’ positions. The practice had to be quickly discontinued as the boys did not properly complete connections and had fun prematurely disconnecting conversations.
5. The “common carrier” designation is not a legal concept that started with telephone companies. This legal classification can be traced back to early England with shippers and trains. Some have raised the idea that Big Tech companies should be treated as common carriers with the attendant responsibilities and obligations, which is a topic relevant to the section 230 debate. See Tunku Varadarajan, The “Common Carrier” Solution to Social-Media Censorship, Wall St. J. (Jan. 15, 2021), https://www.wsj.com/articles/the-common-carrier-solution-to-social-media-censorship-11610732343?reflink=desktopwebshare_twitter. But see Rick White, Laws Governing Online Speech Need Reform, Not Repeal, Wall St. J. (Dec. 22, 2020, 6:16 pm ET), https://www.wsj.com/articles/laws-governing-online-speech-need-reform-not-repeal-11608678984?reflink=desktopwebshare_twitter. An opinion from 2006 rejected such an argument as to Google: Kinderstart argued without success that Google is a common carrier performing a public function and/or is a private space dedicated to public use. Kinderstart.com LLC v. Google, No. C06-2057, 2006 WL 3246596 (N.D. Cal. July 13, 2006) (designated as not for citation), https://volokh.com/files/kinderstart.pdf. A fuller treatment of this issue is likely worthy of its own article. See, e.g., Adam Thierer, The Perils of Classifying Social Media Platforms as Public Utilities, 21 CommLaw Conspectus 249 (2013), https://scholarship.law.edu/commlaw/vol21/iss2/2.
6. Stuart Minor Benjamin & James B. Speta, Chapter 15: Internet Platform Regulation, in Internet and Telecommunications Regulation 868 (Carolina Academic Press 2019).
7. This is true in the routine course of business. Clearly, with court orders and documents, telephone companies are involved in wiretapping activities in conjunction with law enforcement on occasion.
8. Newspapers, for example, decide which “letters to the editor” to include in their medium.
9. Tarleton Gillespie, Platforms Are Not Intermediaries, 2 Geo. L. Tech. Rev. 198, 209 (2018).
10. Id. at 211.
11. Philip Hamburger & Clare Morell, The First Amendment Doesn’t Protect Big Tech’s Censorship, Wall St. J. (July 31, 2021), https://www.wsj.com/articles/big-tech-twitter-facebook-google-youtube-sec-230-common-carrier-11627656722?reflink=desktopwebshare_permalink. In this piece, Hamburger and Morell opine that large tech platforms and services function as common carriers and that states and the federal government have the power to regulate common carriers.
12. Bill Clinton, President, United States, Remarks by the President in Signing Ceremony for the Telecommunications Act Conference Report (Feb. 8, 1996), https://clintonwhitehouse3.archives.gov/WH/EOP/OP/telecom/release.html.
13. Jeff Kosseff, The Twenty-Six Words That Created the Internet (Cornell Univ. Press 2019).
14. See Pub. L. No. 104-104, tit. V (1996); see also H.R. Rep. No. 104-458, at 81–91 (1996); S. Rep. No. 104-230, at 187–93 (1996); S. Rep. No. 104-23, at 9 (1995). While this provision is often referred to as section 230 of the Communications Decency Act of 1996 (Pub. L. No. 104-104), it was enacted as section 509 of the Telecommunications Act of 1996, which amended section 230 of the Communications Act of 1934. The primary goal of the Communications Decency Act was to limit the exposure of minors to indecent material. The “framers” of section 230 are former Representative Chris Cox and former representative and current Senator Ron Wyden. Interestingly, the Communications Decency Act was promptly found unconstitutional, but section 230 was not part of that case. Reno v. Am. C.L. Union, 521 U.S. 844 (1997).
15. Section 230, Wikipedia (July 27, 2021), https://en.wikipedia.org/wiki/Section_230.
16. I digress here to observe that Leonardo DiCaprio has a very indirect connection to two significant developments in the history of telecommunications-related law. DiCaprio starred in Titanic, and the actual Titanic tragedy led to the enactment of the Radio Act of 1912. Furthermore, DiCaprio, as noted, later starred in The Wolf of Wall Street, the underlying story of which helped birth section 230.
17. Stratton Oakmont, Inc. v. Prodigy Servs. Co., 1995 WL 323710 (N.Y. Sup. Ct. May 24, 1995). See Note, Section 230 as First Amendment Rule, 131 Harv. L. Rev. 2027 (2018), https://harvardlawreview.org/wp-content/uploads/2018/05/2027-2048_Online.pdf.
18. Id.
19. Milton Mueller, Hyper-Transparency and Social Control: Social Media as Magnets for Regulation, 39 Telecomm. Pol’y 804 (2015).
20. Excluding the provisions related to “findings, policy and effect on other laws,” the remaining substantive provisions of section 230 provide in 47 U.S.C. § 230(c) and (d):
(c) Protection for “Good Samaritan” blocking and screening of offensive material
. . . .
(2) Civil liability
No provider or user of an interactive computer service shall be held liable on account of—
(A) any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected; or
(B) any action taken to enable or make available to information content providers or others the technical means to restrict access to material described in paragraph (1).
(d) Obligations of interactive computer service
A provider of interactive computer service shall, at the time of entering an agreement with a customer for the provision of interactive computer service and in a manner deemed appropriate by the provider, notify such customer that parental control protections (such as computer hardware, software, or filtering services) are commercially available that may assist the customer in limiting access to material that is harmful to minors. Such notice shall identify, or provide the customer with access to information identifying, current providers of such protections.
21. The original accounting of the Good Samaritan lesson is found in Luke 10:25-37 (King James Version). Of course, other Good Samaritan laws have been enacted to encourage people to help others in distress without fear of liability, such as section 74.151 of the Texas Civil Practice and Remedies Code. Some call these provisions “safe harbor” provisions.
22. Congress’s findings and the public policy behind section 230 are clear in 47 U.S.C. § 230(a), (b):
(a) Findings
The Congress finds the following:
(1) The rapidly developing array of Internet and other interactive computer services available to individual Americans represent an extraordinary advance in the availability of educational and informational resources to our citizens.
(2) These services offer users a great degree of control over the information that they receive, as well as the potential for even greater control in the future as technology develops.
(3) The Internet and other interactive computer services offer a forum for a true diversity of political discourse, unique opportunities for cultural development, and myriad avenues for intellectual activity.
(4) The Internet and other interactive computer services have flourished, to the benefit of all Americans, with a minimum of government regulation.
(5) Increasingly Americans are relying on interactive media for a variety of political, educational, cultural, and entertainment services.
(b) Policy
It is the policy of the United States—
(1) to promote the continued development of the Internet and other interactive computer services and other interactive media;
(2) to preserve the vibrant and competitive free market that presently exists for the Internet and other interactive computer services, unfettered by Federal or State regulation;
(3) to encourage the development of technologies which maximize user control over what information is received by individuals, families, and schools who use the Internet and other interactive computer services;
(4) to remove disincentives for the development and utilization of blocking and filtering technologies that empower parents to restrict their children’s access to objectionable or inappropriate online material; and
(5) to ensure vigorous enforcement of Federal criminal laws to deter and punish trafficking in obscenity, stalking, and harassment by means of computer.
23. 47 U.S.C. § 230(a).
24. Section 230 is unique to the United States in the world of internet regulation.
25. Zeran v. AOL, 129 F.3d 327 (4th Cir. 1997).
26. Jordan Valinsky, Verizon Offloads Yahoo and AOL in $5 Billion Deal, CNN (May 3, 2021, 10:14 AM EDT), https://www.cnn.com/2021/05/03/media/verizon-yahoo-sold-apollo/index.html.
27. Would a platform of today have been more active in content moderation of these postings? Section 230 does indicate that publisher protection is not lost with some moderation of user-generated content.
28. Zeran, 129 F.3d at 331.
29. Id. at 332.
30. The idea that an individual like Zeran may be able to somehow track down the “unknown” poster and bring a defamation lawsuit does not offer a serious opportunity for meaningful redress.
31. Fair Hous. Council of San Fernando Valley v. Roommates.com, LLC, 521 F.3d 1157 (9th Cir. 2008) (en banc).
32. Id. at 1161 n.2 (noting that although the web address is plural, “the company goes by the singular name ‘Roommate.com, LLC’”).
33. Id. at 1161.
34. Id. at 1162.
35. Id. at 1164.
36. Id. at 1174.
37. The opinion proceeded to remand the case for further review to determine whether Roommate’s conduct violated the Fair Housing Act. The Ninth Circuit ultimately held that the FHA did not apply in this case as to “shared dwellings.” Fair Hous. Council of San Fernando Valley v. Roommate.com, LLC, 666 F.3d 1216 (9th Cir. 2012).
38. Roommates.com, 521 F.3d at 1176, 1188.
39. Doe v. Backpage, 817 F.3d 12, 15 (1st Cir. 2016).
40. Id. at 19 (quoting Universal Commc’n Sys., Inc. v. Lycos, 478 F.3d 413, 418 (1st Cir. 2007) (quoting 47 U.S.C. § 230(c)(1))).
41. Fortunately, Congress later did pass the Allow States and Victims to Fight Online Sex Trafficking Act of 2017, which President Trump signed into law. It makes clear that Congress did not intend for section 230 to provide protection to websites that facilitate traffickers in advertising the sale of unlawful sex acts, and it creates a crime and civil remedies for such services. Pub. L. No. 115-164, 132 Stat. 1253 (2018).
42. Batzel v. Smith, 333 F.3d 1018 (9th Cir. 2003), reh’g denied & reh’g en banc denied, 351 F.3d 904 (9th Cir. 2003). The case also discussed the California Anti-SLAPP law, which is beyond the scope of this article.
43. Id. at 1026.
44. Id. at 1036.
45. Id. at 1039.
46. Id.
47. Id. at 1020.
48. Malwarebytes, Inc. v. Enigma Software Grp. USA, LLC, 141 S. Ct. 13 (2020) (statement of Thomas, J., respecting denial of certiorari), https://www.supremecourt.gov/opinions/20pdf/19-1284_869d.pdf.
49. Id. at 14. In a discussion of Domen v. Vimeo, 991 F.3d 66 (2d Cir. 2021), one commenter pointed to Justice Thomas’s comments in response to a question asking whether there are limits to section 230 immunity. Allysia Finley, Does Section 230 Have Limits?, Wall St. J. (July 21, 2021, 12:48 pm ET), https://www.wsj.com/articles/vimeo-domen-section-230-religious-liberty-first-amendment-11626875989?st=tzn2tmf625afk58&reflink=article_email_share; Domen v. VIMEO, Inc., No. 20-616 (2d Cir. 2021), Justia L., https://law.justia.com/cases/federal/appellate-courts/ca2/20-616/20-616-2021-03-11.html (last visited Nov. 20, 2021).
50. 1995 WL 323710 (N.Y. Sup. Ct. May 24, 1995).
51. Recall that the Zeran decision did discuss the distributor issue.
52. Malwarebytes, 141 S. Ct. at 18.
53. In re Facebook, Inc., 625 S.W.3d 80 (Tex. June 25, 2021).
54. To be accurate, the court determined that the plaintiffs’ statutory human-trafficking claims should be allowed to move forward. But the plaintiffs’ common-law claims for negligence, gross negligence, negligent undertaking, and products liability were dismissed.
55. In re Facebook, 625 S.W.3d at 83.
56. Id.
57. Id. at 83. The court cited the Allow States and Victims to Fight Online Sex Trafficking Act (FOSTA), Pub. L. No. 115-164, 132 Stat. 1253 (2018). It is worth noting that not only the Texas decision but also FOSTA itself has been criticized; some have pointed out that the law has rarely been utilized as intended, citing a June 2021 GAO report. Elizabeth Nolan Brown, FOSTA’s Failure: The 2018 Sex Trafficking Law Has Been Worse Than Useless So Far, Reason (June 30, 2021) (citing U.S. Gov’t Accountability Off., GAO-21-385, Sex Trafficking: Online Platforms and Federal Prosecutions (2021)), https://reason.com/2021/06/30/fostas-failure-the-2018-sex-trafficking-law-has-been-worse-than-useless-so-far.
58. In re Facebook, 625 S.W.3d at 98 (quoting Roommates.com, 521 F.3d at 1169).
59. Indeed, bigness is found in more and more industry segments with almost daily consolidation among fewer and fewer participants. This fact led President Biden to consider an executive order commanding federal agencies to review measures to address these issues. Jacob M. Schlesinger, Biden Weighs New Executive Order Restraining Big Business, Wall St. J. (June 30, 2021), https://www.wsj.com/articles/biden-weighs-new-executive-order-restraining-big-business-11625007804?st=e9e03mlyi81hgr3&reflink=desktopwebshare_permalink.
60. Congress is laser-focused on Big Tech, as demonstrated by a flurry of bills filed in the 117th Congress.
- The American Innovation and Choice Online Act prohibits dominant companies from discriminating against smaller competitors and prioritizing their own products ahead of others. H.R. 3816.
- The Platform Competition and Opportunity Act empowers regulators to block dominant companies from acquiring would-be competitors. H.R. 3826.
- The Ending Platform Monopolies Act prohibits dominant platforms from owning other lines of business that create conflicts of interest they could use to squeeze out smaller competitors, undermining fair and free online competition. H.R. 3825.
- The Augmenting Compatibility and Competition by Enabling Service Switching (ACCESS) Act eases market entry for new companies by requiring data portability and interoperability, lowering the cost for users of switching platforms. H.R. 3849.
- The Merger Filing Fee Modernization Act updates filing fees for mergers; it is designed to help regulators better enforce antitrust laws. S. 228.
61. Amy Klobuchar, Modern-Day Antitrust Challenges, in Antitrust: Taking on Monopoly Power from the Gilded Age to the Digital Age 215–80 (Alfred A. Knopf 2021); Rick Whiting, IT Vendor Market Cap 2019 Winners and Losers, CRN (Mar. 2, 2020), https://www.crn.com/slide-shows/cloud/it-vendor-market-cap-2019-winners-and-losers.
62. The rapid growth of these companies is quite remarkable. Facebook was launched in 2004; Twitter was launched in 2006; Google is an “old-timer,” having been launched in 1998 (YouTube in 2005); Microsoft is an “ancient” company, having been founded in 1975; and Apple was no April Fool’s Day joke when it was founded on April 1, 1976 (the iPhone did not come along until 2007, which helped grow the recently born Facebook, Twitter, and YouTube).
63. In September 2021, the FTC argued in court that Facebook controls 85 percent of the personal social networking market with its acquisitions of Instagram and WhatsApp, claiming that it is an unlawful monopoly. Nihal Krishan, Facebook Has 85% of Personal Social Networking Market and Must Be Broken Up, FTC Says, Wash. Examiner (Sept. 9, 2021), www.washingtonexaminer.com/news/facebook-85-personal-social-networking-market-must-broken-up-ftc.
64. More importantly, more and more people go to Facebook and Twitter first to obtain their news.
65. Some facts for amplification: In 2019, Facebook had over 2.4 billion monthly active users; Google has over 90 percent of the search engine market worldwide (and owns YouTube, which has one billion daily views); Apple has about 45 percent of the smartphone market; Microsoft has 35 percent of the operating system market worldwide; and Amazon, among other mind-numbing statistics, has sold over 100 million Alexa devices. The Top 10 Valuable Facebook Statistics—Q2 2021, Zephoria, https://zephoria.com/top-15-valuable-facebook-statistics (last visited Nov. 20, 2021); Aleksandria, 63 Fascinating Google Search Statistics, SEO Tribunal (Sept. 26, 2018), https://seotribunal.com/blog/google-stats-and-facts; Dieter Bohn, Amazon Says 100 Million Alexa Devices Have Been Sold—What’s Next?, Verge (Jan. 4, 2019), https://www.theverge.com/2019/1/4/18168565/amazon-alexa-devices-how-many-sold-number-100-million-dave-limp; Operating System Market Share Worldwide, StatCounter GlobalStats, https://gs.statcounter.com/os-market-share (last visited Nov. 20, 2021). Obviously, these figures are no substitute for a formal antitrust analysis, but this is not the forum for one.
66. Alexis Wichowski, Net States Rule the World; We Need to Recognize Their Power, Wired (Nov. 4, 2017), https://www.wired.com/story/net-states-rule-the-world-we-need-to-recognize-their-power; Danny Crichton, The Nation-State of the Internet, TechCrunch (Dec. 8, 2018), https://techcrunch.com/2018/12/08/the-nation-state-of-the-internet.
67. Cory Doctorow et al., Section 230 of the Communications Decency Act, Elec. Frontier Found., https://www.eff.org/issues/cda230 (last visited Nov. 20, 2021).
68. For example, Sen. Josh Hawley filed a bill in 2019, S. 1914, the “Ending Support for Internet Censorship Act.” The bill did not move, but its stated purpose was to remove section 230 immunity unless a social media company obtains certification from the FTC that it does not moderate information on its platform in a manner that is biased against a political party, candidate, or viewpoint.
69. This article makes no attempt to resolve the censorship debate; the information is included because it is part of the section 230 controversy.
70. Brian Fung, White House Proposal Would Have FCC and FTC Police Alleged Social Media Censorship, CNN Bus. (Aug. 10, 2019, 8:15 AM EDT), https://www.cnn.com/2019/08/09/tech/white-house-social-media-executive-order-fcc-ftc; Allan Smith & Rebecca Shabad, Trump Signs Executive Order Aimed at Social Media Companies After Fuming over Fact-Check, NBCNews.com (May 29, 2020, 9:44 PM EDT), https://www.nbcnews.com/politics/white-house/angry-over-how-social-media-platforms-are-treating-him-trump-n1216401.
71. NBC News, Full Text of President Trump’s Executive Order Aimed at Preventing Online Censorship, NBCNews.com (May 28, 2020, 6:17 PM EDT), https://www.nbcnews.com/politics/donald-trump/full-text-president-trump-s-executive-order-aimed-preventing-online-n1217126.
72. Nunes v. Twitter, Inc., Global Freedom of Expression (Jan. 21, 2021), https://globalfreedomofexpression.columbia.edu/cases/nunes-v-twitter-inc; Letter from John Marshall, Judge, Cir. Ct. of Henrico Cty., to Steven S. Bliss et al., Esqs. (June 24, 2020), https://globalfreedomofexpression.columbia.edu/wp-content/uploads/2020/06/Nunes-v.-Twitter.pdf. The court relied on Zeran v. AOL and Nemet Chevrolet, Ltd. v. Consumeraffairs.com, Inc., 591 F.3d 250 (4th Cir. 2009); see also Nemet Chevrolet, Ltd. v. Consumeraffairs.com, Inc., Elec. Frontier Found., https://www.eff.org/issues/cda230/cases/nemet-chevrolet-ltd-v-consumeraffairscom-inc (last visited Nov. 20, 2021).
73. Adi Robertson, Five Lessons from the Justice Department’s Big Debate over Section 230, Verge (Feb. 20, 2020, 9:07 PM EST), https://www.theverge.com/2020/2/19/21144223/justice-department-section-230-debate-liability-doj.
74. U.S. Dep’t of Justice, Section 230—Nurturing Innovation or Fostering Unaccountability?: Key Takeaways and Recommendations (June 2020), https://www.justice.gov/file/1286331/download?utm_medium=email&utm_source=govdelivery.
75. Other candidates also called for review of section 230 to address platforms’ responsibility for hate speech and disinformation. Some of the candidates remain in the Senate or are part of the Biden administration. Rani Molla & Emily Stewart, Should Social Media Companies Be Legally Responsible for Misinformation and Hate Speech? 2020 Democrats Weigh In, Vox (Dec. 3, 2019), https://www.vox.com/policy-and-politics/2019/12/3/20965459/tech-2020-candidate-policies-section230-facebook-misinformation-hate-speech.
76. Editorial Bd., Joe Biden: Former Vice President of the United States, N.Y. Times, Jan. 17, 2020 (emphasis added), https://www.nytimes.com/interactive/2020/01/17/opinion/joe-biden-nytimes-interview.html?smid=nytcore-ios-share; Makena Kelly, Joe Biden Wants to Revoke Section 230, Verge (Jan. 17, 2020) (emphasis added), https://www.theverge.com/2020/1/17/21070403/joe-biden-president-election-section-230-communications-decency-act-revoke.
77. Editorial Bd., supra note 76. Biden also raises “privacy” on social media and the internet, another topic worthy of its own review.
78. Michael D. Smith & Marshall Van Alstyne, It’s Time to Update Section 230, Harv. Bus. Rev. (Aug. 12, 2021), https://hbr.org/2021/08/its-time-to-update-section-230.
79. Tarleton Gillespie, The Myth of the Neutral Platform, in Custodians of the Internet: Platforms, Content Moderation, and the Hidden Decisions That Shape Social Media 30–39 (Yale Univ. Press 2018).
80. Oversight Board Upholds Former President Trump’s Suspension, Finds Facebook Failed to Impose Proper Penalty, Oversight Bd. (May 2021), https://oversightboard.com/news/226612455899839-oversight-board-upholds-former-president-trump-s-suspension-finds-facebook-failed-to-impose-proper-penalty.
81. Shortly thereafter, Facebook stated that it would subject Trump to a two-year suspension and announced new enforcement protocols. Nick Clegg, In Response to Oversight Board, Trump Suspended for Two Years; Will Only Be Reinstated If Conditions Permit, About Facebook (June 4, 2021), https://about.fb.com/news/2021/06/facebook-response-to-oversight-board-recommendations-trump.
82. Jason A. Gallo & Clare Y. Cho, Cong. Rsch. Serv., R46662, Social Media: Misinformation and Content Moderation Issues for Congress (2021), https://crsreports.congress.gov/product/pdf/R/R46662.
83. Id. at 23–24.
84. In the complaint against Facebook, Trump reviews the decision by the Facebook Oversight Board.
85. Trump v. Twitter & Jack Dorsey, No. 1:21-cv-22441 (S.D. Fla. July 7, 2021); Trump v. Facebook & Mark Zuckerberg, No. 1:21-cv-22440 (S.D. Fla. July 7, 2021); Trump v. YouTube & Sundar Pichai, No. 21-cv-61384 (S.D. Fla. July 7, 2021).
86. Complaint at 26, ¶¶ 113, 115–16, Trump v. Twitter, No. 1:21-cv-22441, https://ia902302.us.archive.org/18/items/gov.uscourts.flsd.595801/gov.uscourts.flsd.595801.1.0.pdf.
87. Id. ¶¶ 131–132.
88. Shirin Ghaffary, Trump’s Lawsuits Against Facebook, Twitter, and Google Will Probably Go Nowhere, Vox (July 7, 2021), https://www.vox.com/recode/22566780/trump-lawsuit-facebook-twitter-google; David A. Graham, Trump’s Fantasy Legal World, Atlantic (July 8, 2021), www.theatlantic.com/ideas/archive/2021/07/trump-lawsuits-against-facebook-and-twitter-are-fantasy/619378; Vivek Ramaswamy, Trump Can Win His Case Against Tech Giants, Wall St. J. (July 11, 2021, 1:42 PM ET), https://www.wsj.com/articles/trump-can-win-his-case-against-tech-giants-11626025357. Trump later took the debate to the next level by announcing his own social media website, called “Truth Social.” Trump’s New Social Media Company Says Users Won’t Be Censored—Unless They Disrespect the Site, NBCNews.com (Oct. 22, 2021), www.nbcnews.com/politics/donald-trump/trump-s-new-social-platform-welcomes-free-speech-unless-you-n1282051.
89. Gallo & Cho, supra note 82.
90. The discovered exchange of emails between Mark Zuckerberg and Dr. Anthony Fauci and President Biden’s criticism of Facebook and COVID-19 misinformation may be worth following in this “government coercion” discussion. Ed Browne, Dr. Fauci on Redacted Mark Zuckerberg Emails: “Friendly and Innocent,” Newsweek (June 22, 2021, 7:15 AM EDT), https://www.newsweek.com/dr-fauci-mark-zuckerberg-emails-friendly-innocent-1602875; Libby Cathey, Biden Tries to Clarify Comment That Facebook Is “Killing People,” ABC News (July 19, 2021, 2:02 PM), https://abcnews.go.com/Politics/biden-clarify-comment-facebook-killing-people/story?id=78927017.
91. Klobuchar took another tack on this issue, proposing a bill to eliminate section 230 immunity for COVID-19 misinformation spread by social media platforms’ algorithms. Anna Edgerton, Klobuchar Takes Aim at Online Covid Lies with Section 230 Bill, L.A. Times, July 22, 2021, https://www.latimes.com/business/technology/story/2021-07-22/klobuchar-takes-aim-at-online-covid-lies-with-section-230-bill.
92. Algorithms are computer rules/processes used by social media platforms to manage data. Algorithms help operators organize and prioritize content and guide what platform users may see on their screens. Some argue that algorithms are “not neutral.” Instead, they argue that these processes are really just “filter bubbles” identifying information based solely on the user’s past and the user’s preferences. Roger McNamee, Move Fast and Break Things, in Zucked: Waking up to the Facebook Catastrophe 66–69 (Penguin Press 2019).
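A toy illustration may help make this concrete. The following minimal Python sketch (all names, weights, and data are hypothetical, not any platform’s actual system) shows how ranking posts by engagement amplified by a user’s past topic affinity surfaces content matching the user’s history ahead of more broadly popular content: the “filter bubble” dynamic critics describe.

    # Toy illustration only: a hypothetical engagement-based feed ranker,
    # not any platform's actual algorithm.
    from dataclasses import dataclass

    @dataclass
    class Post:
        topic: str
        likes: int

    def rank_feed(posts, topic_affinity):
        # Score = raw engagement amplified by the user's past interest in the
        # topic; heavy past engagement with a topic pushes similar content to
        # the top (the "filter bubble" effect).
        return sorted(
            posts,
            key=lambda p: p.likes * (1 + topic_affinity.get(p.topic, 0)),
            reverse=True,
        )

    posts = [Post("politics", 50), Post("gardening", 80), Post("politics", 40)]
    # A user whose history is mostly political clicks sees political posts
    # first, even though the gardening post is more popular overall.
    for p in rank_feed(posts, {"politics": 3}):
        print(p.topic, p.likes)

Run as written, the less popular political posts outrank the more popular gardening post solely because of the user’s history, which is the sense in which such processes are argued to be “not neutral.”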
93. Several other states have seen similar activity. For example, the Utah legislature passed S.B. 228 regarding electronic free speech, but the governor later vetoed the bill. It would have required social media corporations to provide Utah account holders clear information about their content moderation practices and to notify the account holder or the attorney general when a moderation practice is applied to a Utah account holder’s account. Electronic Free Speech Amendments, S.B. 228, 2021 Leg., Gen. Sess. (Utah 2021), https://le.utah.gov/~2021/bills/static/SB0228.html. Yet another bill that did not make it to the finish line was North Dakota’s H.B. 1144, filed in 2021 as an effort “to protect free speech from racial, religious, and viewpoint discrimination by a social media platform or interactive computer service.” The bill passed the House but was soundly rejected in the state Senate. Bill Versions for HB 1144, N.D. Legis. Branch (2021), https://www.legis.nd.gov/assembly/67-2021/bill-index/bi1144.html.
94. The law targets social media companies with annual gross revenues in excess of $100 million or at least 100 million monthly individual platform participants globally, and exempts theme parks (hey, it’s Florida). S.B. 7072: Social Media Platforms, Fla. Senate (2021), https://www.flsenate.gov/Session/Bill/2021/7072/?Tab=BillHistory; Fla. Stat. § 501.2041 (2021), https://m.flsenate.gov/Statutes/501.2041.
95. The Florida act defines “deplatform” as “the action or practice by a social media platform to permanently delete or ban a user or to temporarily delete or ban a user from the social media platform for more than 14 days.” S.B. 7072: Social Media Platforms, supra note 94. Another prohibited act is to “shadow ban” someone. “Shadow ban” means action by a social media platform, through any means, whether the action is determined by a natural person or an algorithm, to limit or eliminate the exposure of a user or content or material posted by a user to other users of the social media platform. This term includes acts of shadow banning by a social media platform that are not readily apparent to a user. Id. The bill also has about 10 pages of provisions on the establishment of an “antitrust violation vendor list” regarding procurements.
96. Florida’s New Social Media Law Has Ramifications Beyond Political Realm, Greenberg Traurig (June 22, 2021), https://www.gtlaw.com/en/insights/2021/6/floridas-new-social-media-law-has-ramifications-beyond-political-realm#_ftn1; Kurt Opsahl, The Florida Deplatforming Law Is Unconstitutional. Always Has Been, Elec. Frontier Found. (May 11, 2021), https://www.eff.org/deeplinks/2021/05/florida-deplatforming-law-unconstitutional-always-has-been; Carl Szabo, We’re Suing the State of Florida over New Social Media Law. Here’s Why: Opinion, Tallahassee Democrat (June 11, 2021), https://www.tallahassee.com/story/opinion/2021/06/11/sue-florida-over-ron-desantis-social-media-law-anti-conservative/7634445002.
97. NetChoice v. Moody, No. 4:21cv220-RH-MAF (N.D. Fla. June 30, 2021), https://law.justia.com/cases/federal/district-courts/florida/flndce/4:2021cv00220/371253/113.
98. Id. at 16.
99. Id. at 17. The judge cited Miami Herald Publishing Co. v. Tornillo, 418 U.S. 241 (1974), a case out of Florida in which the U.S. Supreme Court rejected just such an argument, striking down a Florida statute requiring a newspaper to print a candidate’s reply to the newspaper’s unfavorable assertions. NetChoice, No. 4:21cv220-RH-MAF, at 18.
100. NetChoice, No. 4:21cv220-RH-MAF, at 27. Judge Hinkle noted that the same result would have been reached under “intermediate scrutiny.” Id. He added, “It is also subject to strict scrutiny because it discriminates on its face among otherwise-identical speakers: between social-media providers that do or do not meet the legislation’s size requirements and are or are not under common ownership with a theme park. Parts also are expressly preempted by federal law.” Id. at 30.
101. H.B. 20, 2021 Tex. Gen. Laws, https://legiscan.com/TX/bill/HB20/2021/X2; https://capitol.texas.gov/tlodocs/872/billtext/pdf/HB00020F.pdf#navpanes=0. Like the heated “voting restrictions/election integrity” issue that spilled over from the regular session to special sessions, the social media platform legislation followed a similarly twisted path, passing in the second special session. The Texas legislature meets in regular session from January through May every other year. As a backup plan for major initiatives that a governor wants to see completed, he or she may call one or more special sessions, scheduled for 30 days at a time, between regular sessions. The special sessions were complicated in 2021 when Democratic House members left the state to preclude the necessary quorum.
102. Section 113.001 sets out key basic definitions (see the sketch following this note):
(1) “Social media platform” means an Internet website or application that is open to the public, allows a user to create an account, and enables users to communicate with other users for the primary purpose of posting information, comments, messages, or images.
The term does not include:
(A) an Internet service provider as defined by Section 324.055;
(B) electronic mail; or
(C) an online service, application, or website:
(i) that consists primarily of news, sports, entertainment, or other information or content that is not user generated but is preselected by the provider; and
(ii) for which any chat, comments, or interactive functionality is incidental to, directly related to, or dependent on the provision of the content described by Subparagraph (i).
(2) “User” means a person who posts, uploads, transmits, shares, or otherwise publishes or receives content through a social media platform.
Id.
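Because the definition layers a three-part inclusion test over a set of exclusions, it can be easier to parse when restated as a predicate. The short Python sketch below is offered purely as a reading aid under that assumption; the field names are hypothetical, and this is not a legal test.

    # Reading aid only: Tex. H.B. 20 section 113.001's "social media platform"
    # definition restated as a toy predicate. Field names are hypothetical.
    from dataclasses import dataclass

    @dataclass
    class Service:
        open_to_public: bool
        allows_user_accounts: bool
        primarily_user_posted_content: bool
        is_internet_service_provider: bool
        is_email: bool
        provider_preselected_content: bool     # news/sports/entertainment curated by the provider
        interactivity_merely_incidental: bool  # chat/comments incidental to that curated content

    def is_social_media_platform(s: Service) -> bool:
        # Inclusion: open to the public, account creation, and user-posted
        # content as the primary purpose -- all three must hold.
        covered = (s.open_to_public
                   and s.allows_user_accounts
                   and s.primarily_user_posted_content)
        # Exclusions: ISPs, email, and provider-curated services whose
        # interactive features are merely incidental.
        excluded = (s.is_internet_service_provider
                    or s.is_email
                    or (s.provider_preselected_content
                        and s.interactivity_merely_incidental))
        return covered and not excluded

Note how the curated-content exclusion requires both prongs: a service dominated by provider-selected content escapes the definition only if its interactive features are incidental to that content.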
103. The bill apparently relies on the premise of “common carrier” responsibility. Hamburger & Morell, supra note 11.
104. H.B. 20, 2021 Tex. Gen. Laws, https://legiscan.com/TX/bill/HB20/2021/X2.
105. Complaint for Declaratory and Injunctive Relief, NetChoice v. Paxton, No. 1:21-cv-00840 (W.D. Tex. Sept. 22, 2021), https://netchoice.org/wp-content/uploads/2021/09/1-main.pdf. Members of one or both plaintiff organizations include Airbnb, Alibaba.com, Amazon.com, AOL, DJI, DRN, eBay, Etsy, Expedia, Facebook, Fluid Truck, Google, HomeAway, Hotels.com, Lime, Lyft, Nextdoor, Oath, OfferUp, Orbitz, PayPal, Pinterest, StubHub, TikTok, Travelocity, TravelTech, Trivago, Turo, Twitter, Verisign, Vimeo, VRBO, Vigilant Solutions, VSBLTY, Waymo, Wing, and Yahoo!. Steve DelBianco, Big Tech Sues Texas over New Law Targeting Social Media Censorship, NetChoice (Sept. 24, 2021), https://netchoice.org/media-press/big-tech-sues-texas-over-new-law-targeting-social-media-censorship.
106. NetChoice, LLC v. Paxton, No. 1:21-cv-00840 (W.D. Tex. Dec. 1, 2021).
107. U.S. Dep’t of Justice, supra note 74.
108. Press Release, U.S. Dep’t of Justice, The Justice Department Unveils Proposed Section 230 Legislation (Sept. 23, 2020), https://www.justice.gov/opa/pr/justice-department-unveils-proposed-section-230-legislation.
109. Department of Justice’s Review of Section 230 of the Communications Decency Act of 1996, Justice.gov (some emphasis removed), https://www.justice.gov/archives/ag/department-justice-s-review-section-230-communications-decency-act-1996 (last visited Nov. 24, 2021).
110. Press Release, supra note 108.
111. Id.
112. Danielle Keats Citron & Benjamin Wittes, The Internet Will Not Break: Denying Bad Samaritans § 230 Immunity, 86 Fordham L. Rev. 401, 403, 419 (2017) (emphasis in original), https://ir.lawnet.fordham.edu/flr/vol86/iss2/3.
113. Hearing Before the Subcomms. on Consumer Prot. & Com. & Commc’ns & Tech. of the H. Comm. on Energy & Com., 117th Cong. (Mar. 25, 2021) (testimony of Mark Zuckerberg, Facebook CEO), https://docs.house.gov/meetings/IF/IF16/20210325/111407/HHRG-117-IF16-Wstate-ZuckerbergM-20210325-U1.pdf; Lauren Feiner, Facebook’s Suggestion to Reform Internet Law Is a “Masterful Distraction,” Says Silicon Valley Congresswoman, CNBC (Mar. 24, 2021), https://www.cnbc.com/2021/03/24/facebook-section-230-suggestion-masterful-distraction-rep-eshoo.html.
114. Smith & Van Alstyne, supra note 78. As described above, there is activity at the state level touching on such “duty of care” standards/ideas.
115. Neil Fried, The Myth of Internet Exceptionalism: Bringing Section 230 into the Real World, Am. Affs. J. (July 9, 2021) (emphasis in original), https://americanaffairsjournal.org/2021/05/the-myth-of-internet-exceptionalism-bringing-section-230-into-the-real-world.
116. Id.
117. Jeff Horwitz et al., The Facebook Files: A Wall Street Journal Investigation, Wall St. J. (Oct. 1, 2021), https://www.wsj.com/articles/the-facebook-files-11631713039?mod=searchresults_pos19&page=1.
118. John D. McKinnon & Ryan Tracy, Facebook Whistleblower’s Testimony Builds Momentum for Tougher Tech Laws, Wall St. J. (Oct. 5, 2021, 1:42 PM ET), https://www.wsj.com/articles/facebook-whistleblower-frances-haugen-set-to-appear-before-senate-panel-11633426201?mod=searchresults_pos9&page=1. During the hearing, Senator Blumenthal said, “Facebook and Big Tech are facing a Big Tobacco moment.” Shira Ovide, Spending Is Big Tech’s Superpower, N.Y. Times, Oct. 12, 2021, https://www.nytimes.com/2021/10/12/technology/big-tech-spending.html; Cecilia Kang, Lawmakers See Path to Rein in Tech, But It Isn’t Smooth, N.Y. Times, Oct. 9, 2021, https://www.nytimes.com/2021/10/09/technology/facebook-big-tobacco-regulation.html.
119. Joe Cosgrove Jr., Net Neutrality: Take 4!, 59 Infrastructure no. 2 (2020), at 1, 3–10.
120. In re Facebook, Inc., 625 S.W.3d 80, 101 (Tex. June 25, 2021) (quoting Dodd v. United States, 545 U.S. 353, 359–60 (2005)).