On June 4, 2019, the Board of Supervisors of the City and County of San Francisco passed the Surveillance Technology Policy Ordinance. The ordinance outlined new approval and reporting procedures for government entities seeking to begin or continue using surveillance technology, with the goal of enhancing transparency and oversight of law enforcement’s routine use of such technology. The most headline-grabbing component of the ordinance was its wholesale prohibition on the use of facial-recognition technology within the county, which made San Francisco the first city in the nation to announce such a prohibition.
The stated rationale for the ordinance is rooted in federal and state constitutional protections for civil rights and liberties. In addition, the discussion surrounding the ordinance’s passage carried a distinct equal-protection undertone. For example, the board of supervisors expressed concern over the potential abuse of surveillance and facial-recognition technology in the “intimidation and oppression” of persons in protected classes. S.F., Cal., Ordinance on Acquisition of Surveillance Technology (May 21, 2019). In specifically prohibiting the use of facial-recognition technology, the board concluded that the danger of unlawful discrimination outweighed the security benefits the technology offered.
Given that counties and states are important laboratories for potential national policy, the implementation of and reaction to San Francisco’s regulation may be an important harbinger of privacy legislation to come. The new regulation also raises many potential areas for litigation, both in defense of law-enforcement activity and by individuals challenging the technology’s use for prosecutorial purposes. For example:
- Definition of “surveillance technology.” The ordinance outlines both an affirmative definition of surveillance technology and specific exclusions. With technology constantly developing, whether particular innovations fall within or outside the ordinance’s scope will be an ever-evolving question.
- Exclusion of the district attorney and sheriff. Under certain conditions, the district attorney (DA) and sheriff are not subject to the same restrictions as other governmental agencies. This exclusion arguably provides an avenue for the DA and sheriff to more liberally use this technology in their law-enforcement practice. On the other hand, interpretation of the exclusion could become a method for defense counsel to exclude evidence in criminal cases.
- Exigent-circumstance exception. In a narrow exception, the ordinance permits government entities to use surveillance technology in emergency situations without obtaining prior approval from the board of supervisors. The exception is limited in duration and requires the return of evidence after the exigent circumstance has passed. As in other areas of the law governed by exigent circumstances, what constitutes an emergency, and how far the exception reaches, could both become litigated questions.
- Standard for approving surveillance. Reflecting the rationale for passing the ordinance, the standard of approval for use of surveillance technology is that (a) the benefits must outweigh the costs, (b) the use must safeguard civil liberties and rights, and (c) the use must not be based on discriminatory factors with a “disparate impact on a protected class.” S.F., Cal., Ordinance on Acquisition of Surveillance Technology (May 21, 2019). Interpretation of this standard could generate arguments from parties on both sides, those seeking approval and those opposing it, over how frequently the board approves or disapproves use of this technology and the circumstances surrounding each decision.
The ban on facial-recognition technology is also a contentious aspect of the new ordinance, with potential national implications. The ordinance defines facial-recognition technology as “an automated or semi-automated process that assists in identifying or verifying an individual based on an individual’s face.” S.F., Cal., Ordinance on Acquisition of Surveillance Technology (May 21, 2019). While the board can approve other surveillance-technology uses, facial-recognition technology is prohibited outright in the county. The prohibition applies whether the technology is used intentionally or unintentionally and extends even to uses that assist city departments with internal investigations.
It is likely that the facial-recognition-technology ban will raise interpretive issues similar to those in the broader surveillance-technology context, such as how to define facial recognition in a rapidly developing marketplace, whether any exclusions apply, and the scope of any such exclusions. For example, while the stated intent of the ordinance is a complete prohibition on facial-recognition technology, the ban applies only to “any Department.” “Department” is defined elsewhere in the ordinance to exclude the DA’s and sheriff’s departments under certain conditions. Whether this is an oversight or an express exclusion allowing the DA’s and sheriff’s offices to use facial-recognition technology is an issue yet to be decided.
Initial national implications of the ordinance could materialize quickly. Two days after the vote on the San Francisco ordinance, U.S. Representative Jim Jordan, a Republican from Ohio on the House Oversight Committee, called for a temporary ban on facial-recognition technology while Congress determined how best to regulate this new law-enforcement tool. A hearing was convened in early June to address the national-security and privacy implications of the technology’s potential use by agencies such as the Federal Bureau of Investigation and the Transportation Security Administration. Countervailing considerations in this ongoing discussion are the potential for civil-liberties infringement and the need for law enforcement to keep pace with and employ technology for lawful security efforts.
Whether at the county, state, or national level, the thorny constitutional questions involved in interpreting surveillance-technology regulations will contribute to the ongoing tension between privacy and security. How courts choose to interpret these regulations, and how law enforcement and the public respond, will shape constitutional law and law-enforcement practices in the technology age.
Madison Conkel is a J.D. candidate at the University of Georgia School of Law and Eileen H. Rumfelt is a member with Miller & Martin, PLLC in Atlanta, Georgia.