Summary
- This Survey presents seven law review articles intended to offer a useful summary of the cutting-edge issues at the intersection of law and technology, selected for their relevance to readers of The Business Lawyer.
The 2024–2025 Survey of the Law of Cyberspace (Survey) continues in the mode set by its predecessors since its introduction in 1997. This year’s collection of seven pieces sets forth the major legal developments that technology has brought about and the new fields of endeavor in legal, commercial, scientific, and creative pursuits that we had no idea would emerge just thirty years ago. These new technologies, and the new laws and regulations adopted to address them, have animated the minds of the clever lawyers who have given their time to the Business Law Section’s Cyberspace Law Committee, helping the rest of us meet our ethical duties to stay current and competent as we advise clients on their uses of technology in their businesses and use it ourselves.
This year’s Survey contains articles on subjects ranging from the emerging regulation of artificial intelligence (AI), to technology-assisted electronic payments and financial services, to technology’s role in market manipulation, and, finally, to intermediary liability and challenges under Section 230 of the federal Communications Act of 1934. In a departure from past Surveys, this year’s Survey includes four articles focused on AI: two on regulatory developments since June 1, 2023, and two on the implications of Colorado’s enactment of its Artificial Intelligence statute, the first state law to regulate AI in the United States.
The two AI regulatory analyses come from two pairs of authors: Michael Simon and Andrew Pery, and Candace M. Jones and Roland L. Trope. All four contributed AI articles featured in the 2023–2024 Survey. Messrs. Simon and Pery focus on the European Union’s Artificial Intelligence Act, which received final approval in May 2024, and on efforts by federal agencies in the United States and by Colorado following the explosive introduction in late 2022 of products such as ChatGPT and the uses for AI subsequently developed in many fields of endeavor here and abroad. In the second AI survey, Ms. Jones and Mr. Trope discuss examples of emergent “guardrails” for Generative Artificial Intelligence (GenAI) drawn from three federal agencies, from a creative trade union’s deal with major film studios and television networks, and from guidance issued by courts in England and Wales and in New Zealand.
You might ask why we included four articles on AI in this year’s Survey instead of two, as we did in the 2023–2024 Survey. The answer is simple: AI has seen exponential growth during the Survey year, and its uses have drawn enormous attention and new regulation from the European Union, the Biden Administration, the National Institute of Standards and Technology, the Federal Trade Commission, Congress, and, so far, one state. AI certainly seems to be the hot topic in technology for the 2024–2025 Survey year.
The two more sector-specific articles are by Jon M. Garon and Reena Bajowala. Professor Garon is a regular contributor to the Survey, including an article on AI in the 2023–2024 Survey. His 2024–2025 article focuses on efforts by the federal Equal Employment Opportunity Commission (EEOC) and the State of Colorado to regulate algorithmic bias in high-risk use cases. The piece evaluates the EEOC’s enforcement actions, which seek to clarify its jurisdiction over those who use, and those who provide, tools that employ algorithms in employment contexts. Professor Garon’s identification of the theories of liability the EEOC has employed is particularly useful. His analysis of Colorado’s 2024 AI legislation delves into the statute’s two-pronged approach: first, requiring that the public receive notice when a company is using AI for communications or other purposes, and, second, limiting algorithmic bias or discrimination. To focus the law, the statute excludes many common uses of AI, including fraud protection, cybersecurity, data storage, calculators, and similar uses, from its definition of “algorithmic discrimination.” Professor Garon’s insights will help every reader of this Survey appreciate the human toll of ever-wider uses of AI in every arena.
Ms. Bajowala’s article focuses on the paradigm shifts evident in the European Union’s Artificial Intelligence Act and Colorado’s Artificial Intelligence statute, showing that there is a call for “components of a governance program that require efforts far above and beyond the prior disclosure-and-consent mechanisms.” These requirements extend compliance obligations beyond deployers of AI and call for the implementation of risk-management and other policies, the development of governance programs, and the testing and auditing of AI systems. Ms. Bajowala also points to the insurance industry’s AI practices as examples of “best-in-class AI governance” and identifies the features of the best AI-governance programs.
Following this quartet of articles on AI is an article on uses of technology in payments and financial services, a longstanding feature of the Cyberspace Law Survey throughout its first twenty-six years. Stephen T. Middlebrook, Tom Kierner, and I are the authors. Our survey centers on federal and state enforcement actions against fintechs for engaging in unfair or deceptive acts or practices, such as their use of dark patterns, negative options, and solicitations of “tips” in lieu of fees. It also covers banks’ duties to supervise and manage third parties, including the fintech companies with which banks partner. In addition, the survey evaluates efforts by the Consumer Financial Protection Bureau (CFPB) and the states to regulate newer payment products, such as buy-now-pay-later (BNPL) and earned-wage-access (EWA) payments and international remittance payments. Finally, it calls for renewed attention to misleading advertising of FDIC insurance in the wake of fintech bankruptcies and to the first enforcement action brought under New York State’s Exempt Income Protection Act for failures to protect consumers from garnishment of income from protected sources.
We conclude this year’s Survey with two articles—one on technology’s role in market manipulation by Professors John Bagby and Nizan Geslevich Packin, and the other on intermediary liability and challenges for Section 230 of the federal Communications Act of 1934 by Professors Chase Edwards and Brent Baker.
Professors Bagby and Packin explain how technology has contributed new tools for addressing those who manipulate the markets for agricultural commodities, securities, derivatives, swaps, currencies, energy, credit, and interest rates. Among their survey’s contributions are an analysis of whether countermeasures against manipulation conducted through electronic communications and connectivity are sufficiently robust and a discussion of the influence of financial technology and cybersecurity measures.
Professors Edwards and Baker’s survey focuses on federal immunity under Section 230 and the litigation costs nevertheless imposed on providers of “interactive computer services.” Section 230 not only shields providers of “interactive computer services” from federal claims relating to content provided by third parties but also preempts most state claims relating to such content. The developments included in their survey relate to the use of surreptitious terms and conditions, liability for hijacked accounts, and the liability of print-on-demand services for intellectual property infringement. Professors Edwards and Baker conclude with an analysis of future challenges to Section 230 immunity and its “efficacy and survival as a cornerstone of Internet policy.”
As the curators and editors of this Survey, Jon Garon and I see the increasing importance for lawyers of keeping up with the many divergent laws and practices that affect privacy in health-care services, digital commerce, financial services (including insurance), the creative arts in film and television, online communications services that include materials created by third parties, and attempts to manipulate markets using private, online communications services that are harder for supervisors and regulators to monitor.
Many individuals contributed to this Survey in addition to those mentioned by the authors in their introductions. Chief among them is Diane Babal, the Production Manager for The Business Lawyer; Diane’s grace in dealing with authors is unparalleled, and her advice is superb. Next in line for thanks are two Indiana University Maurer School of Law students, Ms. Shunyo Morgan, Class of 2024, and Ms. Apurva Swami, Class of 2025, who assisted the authors. Finally, we thank the Cyberspace Law Committee members who have collectively guided the Committee’s Surveys over most of the past twenty years—Professors John Rothchild of Wayne State University, Edward Morse of Creighton University, and Juliet M. Moringiello of Widener School of Law—and Professor Gregory Duhl, Co-Editor-in-Chief of The Business Lawyer. Their meticulous editing and encouragement helped everyone whose drafts they shepherded and taught all who contributed how to condense important developments into more readily digested prose.