At the same time, the dynamism of algorithmic pricing and its ability to respond to the market at lightning speed have ignited questions about whether algorithms could facilitate illegal collusion while concealing any evidence of a price-fixing agreement. Because these algorithmic tools can adjust prices and strategies in a manner that mirrors coordinated behavior, even without explicit communication, some posit that algorithms could form agreements with competing pricing algorithms. Considering such arrangements in the context of existing antitrust law rightly raises some questions. Parallel action—that is, action taken by competitors simultaneously without an agreement—is generally permissible. It is not permissible, however, when technology is knowingly used to reach or effectuate an agreement between competitors. Identifying and understanding such misconduct is complicated by the constant use and improvement of technology, particularly where an algorithm can plausibly teach itself to respond to market conditions in conjunction with other systems or market players. These emerging technologies pose new challenges for antitrust authorities and companies alike, both in understanding the technology and in applying traditional legal frameworks to the possibility of algorithm-driven market manipulation. As technology evolves, understanding and addressing algorithmic collusion are essential to ensure fair competition and protect consumer interests.
Recent Legislative Efforts and Antitrust Enforcement by the DOJ
Increased Legislation Targeting AI
In recent months, US legislators have expressed concern that companies' use of AI pricing algorithms could facilitate collusion. As a result, legislative officials have announced that they will be prioritizing regulation.
Legislative interest in AI and pricing algorithms has grown exponentially, with numerous bills introduced in recent years. Eight years ago, only a single bill related to AI and pricing algorithms was introduced in state legislatures. That number increased to 37 in 2022 and 190 in 2023, representing at least 25 different states plus Puerto Rico and the District of Columbia. Artificial Intelligence 2023 Legislation, NCSL (last updated Jan. 12, 2024).
The US Congress also has stepped up its efforts to regulate AI, devoting more resources to understanding the potential risks posed. The Senate Judiciary Committee, at a December 13, 2023, hearing, discussed the potential for antitrust violations and other illegal conduct stemming from the use of AI. The New Invisible Hand? The Impact of Algorithms on Competition and Consumer Rights: Hearing Before the S. Comm. on Judiciary, Subcomm. on Competition Pol’y, Antitrust & Consumer Rts., 118th Cong. (Dec. 13, 2023). Notably, Chair Amy Klobuchar cautioned that when competitors decide to “delegate their independent pricing decisions to an algorithm, the result is little more than a sophisticated cartel hiding in code.” Id. In February 2024, Klobuchar, along with Senators Ron Wyden (D-Or.), Dick Durbin (D-Ill.), Peter Welch (D-Vt.), Mazie Hirono (D-Haw.), and Richard Blumenthal (D-Conn.), introduced the Preventing Algorithmic Collusion Act to bar companies from using algorithms to collude to set higher prices. News Release, US Sen. Amy Klobuchar, Klobuchar, Brown Introduce Legislation to Combat Anticompetitive Behavior in the Residential Housing Market (July 12, 2024).
The US House of Representatives similarly announced the establishment of a bipartisan task force on AI that will “explore how Congress can ensure America continues to lead the world in AI innovation while considering guardrails that may be appropriate to safeguard the nation against current and emerging threats.” Press Release, US House Rep. Ted Lieu, House Launches Bipartisan Task Force on Artificial Intelligence (Feb. 20, 2024).
Heightened DOJ Enforcement of AI Algorithms
Within the Executive Branch, the DOJ has increased its scrutiny of AI. Acknowledging the rapid pace at which AI tools are used to make pricing decisions, the Assistant Attorney General for the Antitrust Division of the US Department of Justice, Jonathan Kanter, last year pushed for competition authorities to “invest in technological and data science expertise to understand, detect, investigate, and ultimately prosecute crimes involving algorithms and AI.” Khushita Vasant, US DOJ’s Kanter Warns Companies over Algorithmic Price-Fixing, Calls for Corporate Compliance, MLex (May 4, 2022). Shortly thereafter, Kanter revealed “Project Gretzky,” an effort to better understand AI and recruit data scientists. The initiative is designed to develop the Antitrust Division’s know-how and expertise in advanced technological tools. Ashley Gold, DOJ Has Eyes on AI, Antitrust Chief Tells SXSW Crowd, Axios (Mar. 13, 2023). Kanter also suggested that corporate compliance should be expanded to address pricing technologies. Vasant, supra. Kanter made clear that the DOJ will hold companies liable irrespective of whether conduct occurs in a smoke-filled room or through AI; from the government’s perspective, “it’s the same thing.” Id.
Echoing the call for increased scrutiny of pricing algorithms and related tools, Principal Deputy Assistant Attorney General of the Antitrust Division Doha Mekki stated that “[i]n some industries, high-speed, complex algorithms can ingest massive quantities of ‘stale,’ ‘aggregated’ data from buyers and sellers to glean insights about the strategies of a competitor.” Doha Mekki, Principal Deputy Assistant Att’y Gen., Remarks at GCR Live: Law Leaders Global 2023 (Feb. 2, 2023). Mekki concluded that the DOJ is “experiencing an inflection point in the use of algorithms, data at scale, and cloud computing.” Id.
As part of its efforts to increase scrutiny of AI algorithms, the DOJ announced it would pursue “stiffer sentences—for individual and corporate defendants alike”—for crimes that are made “significantly more serious” through the use of AI. Lisa Monaco, Deputy Att’y Gen., Keynote Remarks at the American Bar Association’s 39th National Institute on White Collar Crime (Mar. 7, 2024). On March 7, Deputy Attorney General Lisa Monaco explained that the DOJ will consider how well corporate compliance programs manage AI-related risks. Id. Additionally, to facilitate the DOJ’s increased scrutiny of AI, a new initiative was created—“Justice AI”—to research and address the impacts of AI. Id.
These new AI initiatives follow President Biden’s Executive Order on the Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence last October. The White House, Executive Order on the Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence, Briefing Room (Oct. 30, 2023). This Executive Order established eight guiding principles and priorities for AI development and use, emphasizing that “Artificial Intelligence must be safe and secure.” Id. It calls for the federal government to “increase its internal capacity to regulate, govern, and support responsible use of AI to deliver better results for Americans.” Id. Additionally, the Executive Order tasks the DOJ with submitting a report to the US president addressing the anticipated impact of AI in the criminal justice system. Id.
How Recent Case Developments Provide Guidance for Compliance
The seminal DOJ case involving algorithms occurred nearly nine years ago, involving a conspiracy to use an algorithm to coordinate and fix the prices of wall posters sold on a large e-commerce platform. United States v. Topkins, No. 3:15-CR-00201, 2015 WL 1522300 (N.D. Cal. Apr. 6, 2015).
The DOJ has not brought another AI-related criminal antitrust prosecution since Topkins. However, the DOJ recently filed a civil action against an algorithmic pricing aggregator and has filed multiple statements of interest in other civil cases brought by class action plaintiffs. The DOJ continues to signal that it has renewed interest in aggressively prosecuting collusion and cartel behavior facilitated by AI algorithms both criminally and civilly.
A few interesting trends have emerged in the recent cases, which we discuss below.
First, while the DOJ typically has the upper hand in identifying and pursuing potentially illegal antitrust conduct, litigation over pricing algorithms has been led by civil plaintiffs.
This trend flips the script—with the DOJ in the fairly novel position of pursuing follow-on criminal or civil enforcement, rather than in its typical position of leading the way. This may be surprising given the DOJ’s advantages in terms of available investigatory tools, mainly the Antitrust Leniency Policy, which offers criminal amnesty to antitrust violators in return for full cooperation in the DOJ’s investigation. See Frequently Asked Questions About the Antitrust Division’s Leniency Program, US Dep’t of Just. (Jan. 3, 2023).
A number of civil antitrust lawsuits seeking treble damages have originated with traditional plaintiffs’ firms. These include:
- In re RealPage Rental Software Antitrust Litig. (No. II) (RealPage): Here, renters asserted that multiple landlords separately provided their nonpublic business information to the same price-setting algorithm, then used it to set rental prices. In October 2022, ProPublica published a report, investigating the use of this algorithmic model. Following the report, more than 30 class-action suits were filed. These suits were later consolidated in a multidistrict litigation in the Middle District of Tennessee. See In re RealPage, Inc., Rental Software Antitrust Litig., 709 F. Supp. 3d 478 (M.D. Tenn. 2023).
- Duffy v. Yardi Systems, Inc. (Yardi): Filed in September 2023, this case involves allegations that the defendant provided common algorithmic pricing software in the multifamily property market, through which managers and owners employed a “centralized pricing mechanism” known as the “RENTmaximizer” that civil plaintiffs characterized as facilitating collusive agreements to raise prices on renters. See Complaint, Duffy v. Yardi Sys., Inc., et al., No. 2:23-cv-01391 (W.D. Wash.), Dkt. No. 1.
- Karen Cornish-Adebiyi, et al. v. Caesars Entertainment, Inc. & Gibson v. Cendyn Group LLC: In these cases, civil plaintiffs representing classes of hotel guests alleged that hotels and casinos in Las Vegas (Cendyn) and Atlantic City (Caesars) employed software to stabilize and raise room rates in these tourist markets. In Cendyn, plaintiffs alleged that another company, acquired by Cendyn, supplied a pricing algorithm that “90% of the hotels on the Las Vegas Strip” allegedly used. Plaintiffs allege that the tool provided pricing recommendations designed to maximize revenues and profits after analyzing data from competitors in that hotel market. See Complaint, Gibson v. Cendyn Group LLC, et al., No. 2:23-cv-00140 (D. Nev.), Dkt. No. 1.
Second, while the DOJ has not led the charge in filing initial cases involving AI algorithms, it has certainly been watching closely and getting more involved.
In March 2024, the Antitrust Division reportedly opened a criminal investigation into companies using rental pricing software. As part of the DOJ’s investigation, on May 22, 2024, the Federal Bureau of Investigation (FBI) conducted an unannounced raid at the Atlanta headquarters of apartment owner and manager Cortland Management. Khushita Vasant, FBI Raids Cortland Management in Atlanta as Part of Antitrust Probe of Rental Housing Market, MLex (May 29, 2024).
In August 2024, the DOJ and several states filed a civil antitrust lawsuit against RealPage. The DOJ alleges that the company’s pricing software allows landlords to share sensitive information and make coordinated pricing decisions, leading to higher rents. The DOJ claims this practice violates the Sherman Act by enabling unlawful information sharing and monopolization. The DOJ asserts only a civil rule of reason claim under Sections 1 and 2 of the Sherman Act. As noted, this case follows previous private litigation. Currently, no charges have been filed against the landlords that use the software; however, cases against these landlords may be forthcoming.
The DOJ also has filed statements of interest in several civil lawsuits clarifying its view of the law. These include:
- A Statement of Interest filed on November 15, 2023, in the RealPage case (Docket No. 3:23-MD-3071, Middle Dist. of Tennessee). The DOJ referred to algorithms as the “new frontier” of price-fixing enforcement, which “poses an even greater anticompetitive threat than the last.” Id., Statement of Interest of the United States at 2. The DOJ clarified that “not every use of an algorithm to set price qualifies as a per se violation of Section 1 of the Sherman Act,” but such use can constitute a violation when, in the opinion of the DOJ, “competitors knowingly combine their sensitive, non-public pricing and supply information in an algorithm that they rely upon in making pricing decisions, with the knowledge and expectation that other competitors will do the same.” Memorandum of Law in Support of Statement of Interest of the United States, In re RealPage, Case No. 3:23-MD-3071 (M.D. Tenn. Nov. 15, 2023).
- A Statement of Interest filed on March 1, 2024, in the Yardi case (Docket No. 2:23-cv-01391-RSL, Western Dist. of Washington). The DOJ clarified its view that an agreement between competitors to fix end prices is per se unlawful, regardless of whether the competitors adhere to or enforce compliance with those fixed prices. Statement of Interest of the United States, McKenna Duffy v. Yardi Systems, Inc., et al., No. 2:23-cv-01391-RSL (W.D. Wash. Mar. 1, 2024). In other words, “competitors’ jointly delegating key aspects of their decisionmaking to a common algorithm” constitutes the concerted action necessary to violate the antitrust laws because doing so “deprives the marketplace of independent centers of decisionmaking.” Id. at 2–3. The DOJ reiterated its view that the Sherman Act’s prohibition against price fixing “includes agreements to use the same pricing formula—analogous to agreements to use the same pricing algorithm.” Id. at 3. The DOJ also explained that “it is irrelevant to the scheme whether landlords share confidential information among themselves or with only the pricing agent; the alleged scheme is designed to obviate the need for competitors to share information directly with each other.” Id. at 7, n.4.
- A Statement of Interest filed on March 28, 2024, in the Caesars case (Docket No. 1:23-cv-02536-KMW-EAP, Dist. of New Jersey). The DOJ reiterated that “an agreement among competitors to fix the starting point of pricing is per se unlawful, no matter what prices the competitors ultimately charge.” Statement of Interest of the United States at 3, Karen Cornish-Adebiyi, et al. v. Caesars Ent., Inc., et al., No. 1:23-cv-02536-KMW-EAP (D.N.J. Mar. 28, 2024) (emphasis in original).
Given the DOJ’s strongly stated positions and increased scrutiny, it likely will take a more aggressive role in leading future cases related to algorithmic pricing.
Third, the DOJ appears to be signaling an aggressive enforcement agenda aimed at the use of common pricing algorithms.
In the Cendyn case in Las Vegas, a Nevada federal court dismissed the putative class action with prejudice after plaintiffs had already filed a second amended complaint (SAC). See Order at 5, Gibson v. Cendyn Grp., Case No. 2:23-cv-00140-MMD-DJA, Dkt. No. 183 (May 8, 2024).
In the SAC, plaintiffs claimed that Las Vegas hotels charged supra-competitive prices for rooms through GuestRev and GroupRev, shared revenue-management systems licensed by the Cendyn Group. They claimed that Cendyn spearheaded a hub-and-spoke conspiracy using an algorithm that recommended room rates based on price and occupancy data.
The dismissal highlighted that Cendyn’s pricing recommendations were not based on nonpublic, competitively sensitive information, but only on public information available from online listings and online travel agencies. Defendants argued, and the court agreed, that where confidential information was not shared, the allegations were missing a key input needed to prove the alleged hub-and-spoke conspiracy. The judge characterized this deficiency as a “fatal defect with their first claim because it too compels the conclusion that there is no rim” to the hub-and-spoke conspiracy. Id. at 9.
Further, the plaintiffs failed to allege that hotel operators “agreed to be bound by [Cendyn’s] pricing recommendations, much less that they all agreed to charge the same prices.” Id. at 6.
The decision is on appeal to the Ninth Circuit, where the DOJ has already weighed in with an amicus brief. See Brief for the DOJ as Amicus Curiae, Gibson v. Cendyn Group LLC, No. 24-3576 (9th Cir. filed Oct. 24, 2024), Dkt. No. 28.1 (the “Amicus Brief”). The Amicus Brief supported the plaintiffs’ appeal and argued that the district court made at least one legal error in finding that non-binding prices cannot be a “restraint of trade.” Id. at 31. In the DOJ’s view, defendants can take concerted action where there is “an invitation for collective action followed by conduct showing acceptance,” such as “joint use” of the algorithm. Id. at 18, 22. This concerted action can be a per se violation if the algorithm or formula sets a “default or starting-point price,” even if it is “non-binding” or the “end prices ultimately vary.” Id. at 23–24.
The DOJ also believes that the use of a pricing algorithm can support a vertical rule of reason claim at the pleading stage where plaintiffs make a prima facie showing sufficient to infer “injury to competition” or “an anticompetitive effect.” Id. at 28–29.
Finally, the DOJ’s Amicus Brief and the appeal demonstrate that the prevailing law remains unsettled, and the risk attendant to competitors agreeing to use a common algorithm remains uncertain. Caution is advised.
Thus far, civil plaintiffs have struggled to apply the per se rule. As noted, Judge Du has already dismissed one of these cases, Cendyn, with prejudice, finding that there were insufficient allegations of an anticompetitive agreement. Many of the ongoing civil lawsuits brought by renters also raise doubt as to whether the per se rule will apply to this conduct.
From one perspective, the conduct is not a traditional horizontal conspiracy because it involves a vertically situated third-party software that is not a traditional competitor. To receive per se treatment, plaintiffs must allege that the AI algorithm company is serving as a hub of a hub-and-spoke conspiracy. But those conspiracies come with some challenging legal requirements. Often, plaintiffs will have to prove that the members knowingly entered into these agreements among themselves—sometimes referred to as the “rim” agreement—to prove a Sherman Act violation.
Already in some of the ongoing civil lawsuits, defendants have succeeded in arguing that the per se rule is inappropriate because of the vertical nature of the agreements with the pricing algorithm and the lack of allegations of any rim agreements. Accordingly, the rule of reason ought to apply instead, which comes with a much higher evidentiary burden on plaintiffs (and the government). To prove a rule of reason case, plaintiffs will be required to define relevant markets, establish that defendants had market power within those markets, and prove an anticompetitive effect that is not outweighed by the potential procompetitive effects of the alleged conduct.
Only with more precedent, and further insight into the DOJ’s future enforcement plans, will we be able to ascertain whether these AI algorithm cases will continue to be handled under the rule of reason or whether they may be subject to per se treatment and criminal enforcement.
Key Takeaways and Compliance Considerations
Despite potential antitrust challenges, there are substantial procompetitive benefits to using pricing algorithms, which will continue to drive their growth. For example, the use of pricing algorithms has been recognized as improving market efficiency. Robert M. Weiss & Ajay K. Mehrotra, Online Dynamic Pricing: Efficiency, Equity and the Future of E-commerce, 6 Va. J.L. & Tech. 11 (2001). Algorithms also can help improve the customer experience by increasing price transparency, improving current products, driving innovation for new products, promoting consumer welfare, and identifying customer demand. Marco Bertini & Oded Koenigsberg, The Pitfalls of Pricing Algorithms, Harv. Bus. Rev. (Sept.–Oct. 2021). By addressing customers’ needs and expectations in a more precise way, pricing algorithms also can benefit the competitive process by reacting to market changes, thereby achieving procompetitive effects. Anne-Sophie Thoby, Pricing Algorithms & Competition Law: How to Think Optimally the European Competition Law Framework for Pricing Algorithms?, Competition F., art. no. 0009 (Dec. 17, 2020).
It is unrealistic to presume that businesses will shy away from utilizing algorithmic tools to optimize pricing and enhance their commercial strategy. Instead, companies should analyze the competitive impact of these tools and manage risks as part of their antitrust compliance. For instance, industries that employ algorithms to aggregate relevant supply and demand data should proceed with caution to ensure that the exchange of data does not spiral into anticompetitive conduct. During the 2023 American Bar Association Antitrust Spring Meeting, Leslie Wulff, Chief of the DOJ Antitrust Division’s San Francisco office, explained that companies “have a responsibility to check in on their algorithms, see what they’re learning, see how they’re adjusting to current market realities.” Mike Swift, Companies Using Pricing Algorithms Can’t Just “Set It and Forget It,” US DOJ Antitrust Official Says, MLex (Mar. 30, 2023).
While some companies may have limited resources to devote to compliance, the costs of antitrust lawsuits or DOJ investigations are quite significant and likely more than justify the investment. Investigating agencies will not take kindly to companies that fail to monitor their algorithms and neglect implementing basic compliance steps.
Certainly, the recent cases related to pricing algorithms suggest that there is more antitrust risk where companies subscribing to pricing algorithms contribute their nonpublic confidential data. The risk spikes if an algorithm is making pricing recommendations based on competitively sensitive data such as granular transactional data, including pricing and capacity information, real-time discounts, or other commercially sensitive information that would typically not be publicly available to competitors.
The antitrust risk also increases if subscribers to an algorithm are bound by the pricing recommendations of the algorithm. Judges also will consider how often subscribers use the recommended prices as a proxy for an agreement to accept those recommendations. Algorithms, thus, raise compliance risks where their recommendations are mandatory or frequently accepted—suggesting a common commitment or agreement to jointly set prices.
Of course, the DOJ’s view remains that any agreement among competitors to use a common algorithm that recommends default or starting-point prices can constitute a violation of the antitrust laws, even if final prices diverge or the use of those prices is non-binding.
Third-party software companies offering or developing algorithmic and pricing data services should be aware that they may face charges for violating the antitrust laws if their tools facilitate illegal price-setting among competitors, particularly where the tools rely on nonpublic competitor data and require compliance with their pricing recommendations. Even smaller companies with limited access to AI have a responsibility to monitor their algorithms to ensure they are adjusting for market realities.
While algorithms and their applications are novel and rapidly evolving, the analysis of anticompetitive impact remains unchanged. The antitrust risks surrounding algorithms and data tools can be mitigated through compliance. When using pricing algorithms, businesses and individual users should consider the following principles:
- Refinement of Compliance Programs: Design and tailor antitrust compliance programs to address the antitrust risks associated with AI and algorithms, as applicable to the company’s business and industry. Most of the ongoing challenges involve third-party algorithms run by independent companies, but similar principles apply to in-house algorithms developed by individual companies.
- Information Exchange Procedures: Incorporate compliance policies that address information exchanges, including the proper use of data that are collected and aggregated by AI tools. It is imperative that employees understand the types of commercially sensitive information that may be disclosed using pricing software and the risks associated with handling that information.
- Risk Assessment: Evaluate whether compliance policies include measures to mitigate antitrust risks associated with these types of technologies, including reporting and monitoring mechanisms. Companies may not simply “set it and forget it.” Id.
- Auditing: Conduct regular audits and periodic reviews of any pricing algorithms or other AI tools used to inform pricing decisions to ensure appropriate data collection and dissemination.
- Training: Ensure all employees receive proper training and communication about the antitrust laws and corresponding compliance obligations. This includes employees responsible for, among other things, developing, designing, or utilizing pricing algorithms or pricing systems, as well as employees who have access to the information from these systems, who analyze the data or pricing, or who have contact with competitors.
- Internal Safeguards: Create internal company safeguards to protect commercially sensitive information and customer data, such as setting up firewalls or encryptions to prevent data access, and other key risk mitigation measures.
- Third-Party Management: Perform due diligence with all third-party relationships. When contracting for AI software, conduct proper training and communications with third parties to ensure technological or contractual safeguards are in place to protect users’ competitively sensitive information and to prevent confidential user data from being disclosed, reverse engineered, or accessed by others. AI developers, software companies, and platform operators should be aware of the antitrust risks that can arise from the exchange of confidential information among competitors. Companies should continue to monitor and audit their third-party relationships and periodically perform risk assessments and due diligence.
Conclusion
Companies utilizing AI algorithms for pricing must scrutinize how those algorithms work and implement robust compliance programs to mitigate the potential risks of antitrust lawsuits and investigations. Two key concerns are whether companies are feeding the algorithm nonpublic, competitively sensitive information (or whether their competitors are doing so) and whether companies are agreeing to delegate their pricing authority to these algorithms. Both concerns substantially raise the antitrust risk of using such software.
Given the possibility of rule of reason treatment, companies also should document the purpose for implementing AI, including any efficiency-enhancing justifications. This documentation can be invaluable if legal challenges arise.
Basic compliance steps and review by specialized antitrust counsel can help to spot these risk factors. Companies are advised to refine their compliance programs, adopt safeguards around sharing sensitive confidential information, conduct risk assessments and audits, and provide employee antitrust training to help identify and mitigate these risks.
The DOJ is prepared to pursue more aggressive antitrust challenges against companies using pricing algorithms to maintain supra-competitive prices. With the possibility of increased scrutiny, criminal and civil penalties, and even prison sentences, companies are well advised to ensure that their compliance programs are tailored to these emerging technological risks. Close monitoring of pricing algorithms and AI-driven software is essential to navigate this complex landscape.