December 18, 2020 Notes

Bias in, Bias out: Why Legislation Placing Requirements on the Procurement of Commercialized Facial Recognition Technology Must Be Passed to Protect People of Color

Rachel S. Fleischer


Facial recognition technology is increasingly present in today’s society, shaping and redefining integral aspects of human life. While this ubiquitous technology was designed to be objective and neutral in its application, it is not immune to discriminatory biases. These biases have produced a deeply troubling situation: facial recognition technology is used disproportionately on People of Color and, at the same time, disproportionately misidentifies these individuals as criminals. Meanwhile, law enforcement agencies continue to procure commercial facial recognition technology for policing and intelligence purposes.

This Note argues that, to protect People of Color, Congress must pass legislation amending the Federal Acquisition Regulation to place requirements on the procurement of commercial facial recognition technology. This Note also proposes language for that legislation. Ultimately, the solution this Note proposes is vital to help mitigate the disparate impact that the use of biased facial recognition technology will have on People of Color.

I. Introduction

On the morning of April 25, 2019, Brown University student Amara K. Majeed awoke to death threats.1 Majeed’s photo had been associated with the name of a suspected terrorist, tied to an attack in Sri Lanka that killed more than 250 people.2 Because of an error in the facial recognition software used to investigate the attack, Majeed’s photo was connected to the suspected terrorist’s name, ultimately putting both Majeed and her family in danger for a crime she never committed.3
