June 23, 2021 Feature

When “Safe at Home” Is Not Safe: Addressing the Increase of Online Child Sexual Abuse in the COVID-19 Pandemic

By Rachel Haney

At the onset of the COVID-19 pandemic, governments across the country and around the world issued “safe at home” orders, causing much of society to remain at home and moving our world further online. But at home and online is not always a safe place, and a myriad of pandemic factors have left some of the most vulnerable children at an increased risk of exploitation and abuse.

With the closure of schools, vulnerable children are isolated from the support of friends, teachers, and school counselors. Perpetrators have taken advantage of minors’ increased, often unsupervised, time online, as well as the challenges online service providers face in quickly removing content under work-from-home constraints. Some children are forced to shelter in place with their abusers, who for reasons including stress, financial instability, or even boredom have seized the opportunity to create child sexual abuse material (CSAM)1 and distribute it online.2

Even before the onset of the COVID-19 pandemic, the proliferation of CSAM had “grown exponentially” as use of online communications increased.3 For decades, there has been an increase in “the distribution of child pornography, in the number of images being shared online, and in the level of violence associated with child exploitation and sexual abuse crimes . . . the only place we’ve seen a decrease is in the age of victims.”4 In 2019 alone, the National Center for Missing & Exploited Children (NCMEC) received more than 16.9 million reports of suspected child sexual exploitation, which included 69.1 million files.5

The problem has only compounded throughout the COVID-19 pandemic. Initial reports indicate a significant increase in the sharing of CSAM online since early 2020.6 For example, NCMEC experienced a 98.66 percent increase in reports of online enticement between January and September 2020 as compared to the same time period in 2019.7 Evidence also shows an increase in self-generated CSAM8 as more online time has allowed greater opportunities for perpetrators to groom and exploit children.9

It is unclear when “offline” life will resume. Many children have either not returned to school or returned in limited capacities; law enforcement investigative capacities remain stretched to their limits; criminal court proceedings remain backlogged; and large numbers of the workforce—including content reviewers at technology companies—continue to work from home. Virtual spaces, which increasingly took a central role in our social and professional lives before the pandemic, remain at the core of nearly every aspect of our lives, with little change in sight. In the face of such uncertainty, it’s more critical than ever to double down on efforts to efficiently, effectively, and collectively combat the evils of online child sexual exploitation. The question is how?

Content Review Programs

Online service providers, including social media companies, peer-to-peer networks, search engines, and other content-hosting platforms, make use of a range of technological and human resources to combat the proliferation of CSAM. One such resource is hash matching, a process by which a mathematical algorithm generates an alphanumeric sequence unique to a specific file that can be used to detect copies of that file.10 (A file’s hash is sometimes referred to as its “digital fingerprint.”11) A number of hash protocols exist, enabling providers to detect both exact matches and “fuzzy” matches of previously identified content.12 Regardless of protocol, hash-matching technologies share a primary goal: to find images or videos previously identified as CSAM to help providers quickly and effectively remove this content from their platforms and report it to NCMEC.
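To make the mechanics concrete, the process above can be sketched in a few lines of Python. This is a minimal illustration, not any provider’s actual system: SHA-256 stands in for whatever protocol a provider runs (production tools such as PhotoDNA rely on proprietary perceptual hashing), and the hash list, file contents, and fuzzy-match threshold are all hypothetical placeholders.

```python
import hashlib


def exact_hash(data: bytes) -> str:
    """SHA-256 digest of a file's bytes -- its 'digital fingerprint.'"""
    return hashlib.sha256(data).hexdigest()


# Hypothetical list of hashes of previously identified files (placeholder data).
KNOWN_HASHES = {exact_hash(b"previously identified file")}


def is_exact_match(data: bytes) -> bool:
    # A cryptographic hash flags only byte-for-byte identical copies;
    # altering a single byte produces a completely different digest.
    return exact_hash(data) in KNOWN_HASHES


def is_fuzzy_match(fingerprint: int, known: int, max_distance: int = 4) -> bool:
    # "Fuzzy" protocols (e.g., perceptual hashes) instead compare similarity:
    # two fingerprints match if they differ in at most a few bits. The
    # integer fingerprints and threshold here are illustrative only.
    return bin(fingerprint ^ known).count("1") <= max_distance
```

The contrast between the two functions illustrates why exact matching alone is brittle: re-encoding or cropping an image changes every byte, so providers layer in similarity-based protocols to catch altered copies of previously identified files.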

Legal Obligations Regarding CSAM

Service providers13 are not required to search for CSAM on their platforms.14 However, once providers obtain “actual knowledge of any facts or circumstances . . . from which there is an apparent violation” of the enumerated statutes involving CSAM, federal law requires providers to make a report to NCMEC’s CyberTipline “as soon as reasonably possible.”15 Such reporting is mandatory “in order to reduce the proliferation of online child sexual exploitation and to prevent the online sexual exploitation of children [. . .].”16 Providers may also voluntarily make a report to NCMEC’s CyberTipline after learning that a violation of the statutes involving CSAM may be “planned or imminent.”17 This permissive reporting was added to the statute in 2018 with the passage of the CyberTipline Modernization Act,18 to address the proliferation of online “grooming”19 of children and to allow intervention before further harm can occur.20

Penalties for knowing and willful failure to report are considerable, with fines up to $150,000 for the first instance and $300,000 for any subsequent instance. “Masha’s Law,”21 codified at 18 U.S.C. § 2255, also permits victims of violations of the CSAM statutes to sue in federal court for actual or liquidated damages in the amount of $150,000.22 To date, no claim for failure to report has been successfully litigated, but a lawsuit was recently filed against one social media provider alleging just that.23

To an extent, the statutory scheme has created perverse incentives: providers that engage in content safety work thereby assume knowledge of CSAM violations on their platforms.24 Providers have no obligation to monitor their platforms or scan for CSAM and face no statutory liability for refusing to do so.25 To the contrary, providers that do elect to implement thorough content safety programs may be exposed to liability.26 Further, providers that engage in content safety verification and human review risk discovery requests from criminal defendants regarding their detection practices. In addition, content safety reviewers are repeatedly exposed to horrific imagery, impacting resiliency and retention rates for employees and contractors who work in this field. And providers themselves assume the risk of resiliency-related employment litigation.27

In effect, while service providers are well-positioned as the first line of defense to aid in thwarting the widespread distribution of CSAM, they can either refuse to do so or assume the bulk of the responsibility and potential liability if they do. The fact that many providers voluntarily assume this risk regardless shows a willingness by industry to intervene early and often. But this is not—and cannot be—solely up to technology providers: Courts and lawmakers need to get up to speed on these issues and develop the case law, legislation, and aggressive enforcement policies needed to collectively combat CSAM.

Proposed Changes and the Current Case Law

Compelled Scanning Is Not the Answer

At first glance, then, one answer may seem simple: compel service providers to monitor for CSAM. And indeed, Congress has tried; see the discussion of the EARN IT Act below. But compelled scanning programs present a host of constitutional issues, as well as practical challenges, particularly in the midst of a global pandemic.

Whether providers act in their private interests or at the direction of the government is a central question in criminal proceedings related to CSAM offenses. A number of criminal defendants throughout the country have moved to suppress evidence of CSAM violations on the theory that the government forced a provider to scan their accounts without a warrant, either through a separate arrangement or under the regulations aimed at preventing the proliferation of CSAM, and that the evidence was therefore obtained in violation of the Fourth Amendment. Courts have consistently rejected such arguments, finding that nothing in the statutory regime requires providers to scan for CSAM or converts them into agents of the government by requiring them to report CSAM once detected.28

And by compelling private companies to engage in scanning, the government would threaten the very system that enables both NCMEC and law enforcement to open and view files a provider has already reviewed without a warrant. The private search doctrine provides an exception to the Fourth Amendment’s general prohibition on warrantless searches. Where providers conduct their content-scanning programs based on their own private interests, and not at the direction of the government or NCMEC, no Fourth Amendment violation results.29 Courts have consistently recognized that providers maintain independent, private interests in keeping their platforms free of content that exploits children and harms other users who may be exposed to such content.30 As one court aptly noted, “[n]o sane person, let alone a business that values its image and reputation, wants to be publicly associated with the sexual exploitation of children.”31

Compelled scanning would also pose considerable practical difficulties for providers in the midst of a global pandemic when large numbers of the workforce—particularly at technology companies—are working from home. Many providers engage content safety personnel to review content flagged by detection technology in order to validate hashes received by external sources, train detection technology, and ensure that specific images flagged for reporting are, in fact, reportable. Compelled scanning would risk inundating content review teams already operating under greatly reduced capacity due to the pandemic.

AI vs. the Human Eye and the Impact on the Private Search Doctrine

Some may argue that the solution lies in increasing technological detection regardless of human review capacity; after all, nothing in the current reporting regime requires a human eye to view content before it is removed or reported to NCMEC. Indeed, to bridge pandemic-related gaps in human review, many service providers have ramped up technological detection.32 But problems persist. Without question, technology is a critical tool in content detection, but it is not a perfect substitute for the human eye, either practically or legally.

While technological detection might suffice for many types of violating content, most such content carries no mandatory reporting obligation. When it comes to CSAM, the nature of technological scanning arguably presents as many questions as it answers. Do provider computer logs of content detected as potential CSAM constitute actual knowledge, even if no human has confirmed the image? Does the duty to report turn on the accuracy of the detection method, i.e., should the law treat exact matches, fuzzy matches, or even proactive detection33 the same? How soon must a provider report content where content review teams are constrained by COVID-19? Is it better to make more, potentially lower-accuracy reports or fewer, higher-quality reports confirmed by human review teams? And what happens to users reported to NCMEC on a false positive because a computer thinks their lawful content is CSAM? Courts have only begun to grapple with these questions, and recent opinions show a lack of clear guidance.

The role of human review in CSAM-related investigations and prosecutions has taken center stage since the Tenth Circuit’s ruling in United States v. Ackerman.34 In Ackerman, AOL’s hash-value matching technology flagged image files Ackerman had attached to an email; AOL stopped delivery, terminated his account, and reported the files to NCMEC. On appeal, the Tenth Circuit found that NCMEC was a governmental entity and had violated Ackerman’s Fourth Amendment rights by exceeding the scope of AOL’s review when it opened and viewed the contents of the CSAM images.35 While a person at AOL had previously classified a CSAM image sharing the same hash value as one of the images attached to Ackerman’s email, neither the email nor any of its attachments at issue was ever viewed by a human at AOL.36

Since Ackerman, there’s been a flurry of litigation over what a “private search” of CSAM even entails. Courts generally agree that where a person at the company views the contents of a file concurrently with or immediately prior to making a report to NCMEC, both NCMEC and law enforcement may view the contents of the file without expanding the scope of the provider’s private search.37 However, there is no clear guidance on whether NCMEC or law enforcement may view the contents of a file if someone at the company previously viewed a version of the image bearing the same hash but did not view the image concurrent with reporting to NCMEC.38 Put plainly, it’s not clear whether the government (or NCMEC) can, without a warrant, view files detected by hash-matching technology but not viewed by a human at the time of reporting.

Practically speaking, whether a provider conducted human review matters because, given the sheer volume of reports, NCMEC and law enforcement deprioritize reports where content was not viewed by a person concurrent with the report. Given the ambiguity, numerous jurisdictions bar their agents from viewing reports where content was not re-reviewed prior to upload, and law enforcement’s inability to view apparent CSAM files prior to making an application for a search warrant may frustrate the ability to obtain one.39 Indeed, a technology-dependent regime may lead to a bizarre paradox where violators who amass and attempt to share egregious, well-known imagery escape prosecution entirely because a provider did not re-review notorious hash-matched content immediately prior to reporting.

Legislative Developments and Executive-Branch Enforcement

The need for smart, effective law-making around online child exploitation is self-evident. Yet many legislative efforts addressing child exploitation have been geared toward achieving political ends, not keeping kids safe.

For example, in March 2020, Sen. Lindsey Graham (R-SC) introduced the EARN IT Act, an attempt to target Section 230 immunity and end-to-end encryption masquerading as a bill to “establish a National Commission on Online Child Sexual Exploitation Prevention.”40 In its earliest form, the bill aimed, among other things, to require providers to “earn” their Section 230 immunity by certifying compliance with a number of “best practices” established by a commission, while also removing Section 230 immunity for civil and state criminal claims for CSAM offenses. Although technically structured as a voluntary step taken in exchange for immunity, the actual effect of the amendment would have been to require providers to comply with the best practices, including monitoring and screening for CSAM. The bill was revised in committee—removing the coerced compliance with best practices in exchange for immunity—and ultimately died on the Senate floor, but not before drawing strong bipartisan support.

Compare this to the Invest in Child Safety Act, first introduced in May 2020, which aimed to direct $5 billion in mandatory funding to significantly increase the headcount available for investigations and prosecutions, as well as to provide much-needed counseling support to prevent, detect, and treat child sexual abuse and to prosecute offenders.41 The Invest in Child Safety Act also aimed to address the prior administration’s failure to coordinate efforts across federal agencies to combat child exploitation, as well as its siphoning of $60 million from programs aimed at preventing child exploitation and supporting victims. While the bill gained little traction in May 2020, it was reintroduced in February 2021 in the new Democratic-controlled Congress.42


Technology (and tech companies) are critical to combating the proliferation of CSAM but are not the only solution. And in the era of “techlash,” putting the burden and blame squarely on service providers may be politically palatable but will not move the needle where it matters most: preventing the spread of CSAM and ratcheting up the ability to hold perpetrators accountable. Collectively we all have a role to play in keeping kids safe online. It’s beyond time we do it.


If you think you’ve seen a missing child or are aware of online child exploitation, please make a report to NCMEC’s CyberTipline, available online 24 hours a day, 7 days a week, or by calling 1-800-THE-LOST (1-800-843-5678).

NCMEC and other nonprofit organizations provide resources for parents and youth aimed at educating children and helping parents have conversations around online safety. For example, NCMEC provides interactive resources for teaching kids ages 5–17 about online safety and digital citizenship through games, videos, and lessons.

Thorn provides resources for teens and parents to empower victims of sextortion to get help and find support.


Resources are also available for people concerned about their own thoughts and behavior. For example, the Johns Hopkins Moore Center for the Prevention of Child Abuse provides a portal for individuals to seek help and support.


1. Throughout this article, the term “child sexual abuse material” is used to refer to what is termed “child pornography” under federal law. 18 U.S.C. § 2256(8). This term, along with others including child sexual abuse imagery (CSAI) and child sexual exploitation and abuse (CSEA), is more commonly used by service providers and others in recognition that the term “child pornography” may invoke the mistaken belief that children could be willing participants in the creation of CSAM. Congress has proposed amending the affected statutes to more accurately reflect what this content and conduct is: exploitation and abuse.

2. Interpol, Threats and Trends Child Sexual Exploitation and Abuse: COVID-19 Impact at 14 (Sept. 2020).

3. See Paroline v. United States, 572 U.S. 434, 440 (2014) (internal citation omitted).

4. Att’y Gen. Eric Holder Jr., speech at the National Strategy Conference on Combating Child Exploitation, San Jose, Cal. (May 19, 2011).

5. About NCMEC, Nat’l Ctr. for Missing & Exploited Children.

6. Interpol, supra note 2, at 10.

7. Brenna O’Donnell, COVID-19 and Missing & Exploited Children, NCMEC Blog (Oct. 20, 2020).

8. Id.

9. Pietro Ferrara et al., The Dark Side of the Web: A Risk for Children and Adolescents Challenged by Isolation During the Novel Coronavirus 2019 Pandemic, 228 J. Pediatrics 324 (2021).

10. Richard P. Salgado, Fourth Amendment Search and the Power of the Hash, 119 Harv. L. Rev. F. 38, 38–40 (2005).

11. See United States v. Ackerman, 831 F.3d 1292, 1294 (10th Cir. 2016) (citing Salgado, supra note 10, at 38–40).

12. See Petter C. Bjelland, Katrin Franke & André Årnes, Practical Use of Approximate Hash Based Matching in Digital Investigations, 11 Digit. Investigation S18 (2014).

13. For purposes of CSAM reporting, “provider” is a defined term referring to an electronic communication services provider or remote computing service. 18 U.S.C. § 2258E(6).

14. Id. § 2258A(f).

15. Id. § 2258A(a)(1).

16. Id. § 2258A(a)(1)(A).

17. Id. § 2258A(f).

18. CyberTipline Modernization Act of 2018, Pub. L. No. 115-395, 132 Stat. 5287.

19. See, e.g., James, Online Grooming: What It Is, How It Happens, and How to Defend Children, Thorn Blog (June 15, 2020) (“Online grooming is a term used broadly to describe the tactics abusers deploy through the internet to sexually exploit children.”).

20. Press Release, Sen. Dianne Feinstein (Cal.), Feinstein, Cornyn Bill to Modernize CyberTipline Heads to President’s Desk (Dec. 12, 2018).

21. James R. Marsh, Masha’s Law: A Federal Civil Remedy for Child Pornography Victims, 61 Syracuse L. Rev. 459 (2011).

22. Id. Masha’s Law is named for a victim of CSAM, commonly known as Masha Allen. Masha was adopted in 1998 from Russia to an American man who raped, filmed, and distributed Masha’s abuse for six years until being caught by the FBI in 2003. Law enforcement agencies report that CSAM depicting Masha is found on 80 percent of the devices of apprehended child predators worldwide.

23. John Doe v. Twitter, No. 3:21-cv-00485 (N.D. Cal. Jan. 20, 2021).

24. See United States v. Ringland, 966 F.3d 731, 736 (8th Cir. 2020) (noting “the penalties for failing to report child pornography may even discourage searches in favor of willful ignorance”).

25. 18 U.S.C. § 2258A(f).

26. While the statute does provide immunity from civil and criminal claims arising from the performance of reporting and preservation obligations under 18 U.S.C. § 2258A, this has been interpreted primarily to bar users from litigating proper disclosures of their account information to NCMEC (e.g., privacy torts or Stored Communication Act claims). See, e.g., Jurek v. Am. Tel. & Tel. Co., No. 5:13 CV 1784, 2013 WL 5298347, at *5 (N.D. Ohio Sept. 20, 2013) (invoking 18 U.S.C. § 2258B to shield AT&T from liability for disclosing plaintiff’s personal information to law enforcement after modem technician opened, viewed, and reported apparent CSAM on plaintiff’s computer to his supervisor at AT&T, who subsequently reported the incident to law enforcement).

27. For example, Facebook was sued in a class-action lawsuit by content moderators alleging PTSD, ultimately settling these claims for $52 million. Facebook to Pay $52m to Content Moderators over PTSD, BBC News (May 13, 2020).

28. See United States v. Stevenson, 727 F.3d 826, 830 (8th Cir. 2013) (“The only subsection that bears on scanning”—section 2258A(f)—“makes clear that an [ISP] is not required to monitor any user or communication, and need not affirmatively seek facts or circumstances demonstrating a violation that would trigger the reporting obligation. . . .”); United States v. DiTomasso, 81 F. Supp. 3d 304, 311 (S.D.N.Y. 2015), aff’d, 932 F.3d 58 (2d Cir. 2019) (“In light of [language clarifying providers are not required to monitor communications], I would be hard-pressed to conclude that sections 2258A and 2258B require private actors to perform law enforcement searches.”).

29. See United States v. Miller, 982 F.3d 412, 425 (6th Cir. 2020) (Google’s hash-value matching and reporting of child pornography did not implicate the Fourth Amendment); United States v. Reddick, 900 F.3d 636, 637 (5th Cir. 2018) (Microsoft conducted a private search when it discovered child pornography via hash-value matching technology); see also United States v. Stevenson, 727 F.3d 826, 831 (8th Cir. 2013); United States v. Cameron, 699 F.3d 621, 638 (1st Cir. 2012); United States v. Richardson, 607 F.3d 357, 366 (4th Cir. 2010).

30. United States v. Rosenow, No. 17-CR-3430-WQH, 2018 WL 6064949, at *8 (S.D. Cal. Nov. 20, 2018) (“Yahoo has a business interest in enforcing its terms of service and ensuring that its products are free of illegal conduct, in particular, child sexual abuse material.”).

31. See Decision and Order Denying Motion to Suppress, United States v. Bebris, Case No. 19-CR-00002 (E.D. Wis. Mar. 9, 2020).

32. YouTube Team, Inside YouTube: Responsible Policy Enforcement During COVID-19, YouTube Official Blog (Aug. 25, 2020).

33. Some detection technology does not rely on hashing. For example, some providers employ technology to proactively detect child nudity, previously unknown CSAM, or inappropriate interactions with children. See Press Release, Antigone Davis, Global Head of Safety, Facebook App: New Technology to Fight Child Exploitation (Oct. 24, 2018).

34. 831 F.3d 1292 (10th Cir. 2016).

35. Id. at 1298.

36. Id. at 1305–06.

37. See, e.g., United States v. Ringland, 966 F.3d 731, 737 (8th Cir. 2020); United States v. Drivdahl, CR 13-18-H-DLC, 2014 WL 896734, at *4 (D. Mont. Mar. 6, 2014).

38. See United States v. Miller, 982 F.3d 412, 430 (6th Cir. 2020) (no Fourth Amendment violation where law enforcement viewed contents of files Google did not review concurrently with or immediately prior to reporting to NCMEC where the files matched hash values to images a person at Google had previously viewed and confirmed to be apparent CSAM); but see United States v. Keith, 980 F. Supp. 2d 33, 43 (D. Mass. 2013). A case involving similar issues is currently pending before the Ninth Circuit (United States v. Wilson, No. 18-50440).

39. For example, a number of circuits require a magistrate judge to independently review images alleged to satisfy the federal definition of CSAM under the “lascivious exhibition” category of 18 U.S.C. § 2256(2)(A). See, e.g., United States v. Perkins, 850 F.3d 1109, 1116 (9th Cir. 2017); United States v. Brunette, 256 F.3d 14, 18 (1st Cir. 2001).

40. Riana Pfefferkorn, The EARN IT Act: How to Ban End-to-End Encryption Without Actually Banning It, Ctr. for Internet & Soc’y Blog (Jan. 30, 2020, 12:42 PM).

41. Invest in Child Safety Act (summary).

42. Press Release, Sen. Ron Wyden (Or.), Wyden, Gillibrand, Brown, Hirono and Eshoo Reintroduce Invest in Child Safety Act to Protect Children from Online Exploitation (Feb. 4, 2021).

The material in all ABA publications is copyrighted and may be reprinted by permission only. Request reprint permission here.

By Rachel Haney

Rachel Haney is an associate at Perkins Coie, where she defends companies in privacy and security litigation on a range of matters, including government investigations and data breach reporting. Her practice includes counseling technology companies on compliance with the Electronic Communications Privacy Act and the CyberTipline Modernization Act, among other laws. Rachel is a member of the firm’s Tech Amicus Practice and has represented technology companies in a number of amicus curiae filings related to child safety issues. The author thanks Perkins Coie partner Ryan Mrazik for his thoughtful editing and contributions to this piece. The views expressed here are those of the author and do not necessarily represent the views of the American Bar Association, Perkins Coie, or any of its clients.