Ghosts in the Machine: How Past and Present Biases Haunt Algorithmic Tenant Screening Systems
By Gary Rhoades
Human Rights, June 3, 2024

The Civil Rights Act of 1968, also known as the Fair Housing Act (FHA), banned housing discrimination nationwide on the basis of race, religion, national origin, and color. One key finding that persuaded Dr. Martin Luther King Jr., President Lyndon Johnson, and others to fight for years for the passage of this landmark law was that many Americans were being denied rental housing because of their race. Black families were especially hard hit by these discriminatory rejections. Forced to move on, they spent more time and money to find housing and often had to settle for substandard units in unsafe neighborhoods and poor school districts to avoid homelessness.
April 2024 marked the 56th anniversary of the FHA and its attempt to end such unfair treatment. Despite the law’s broadly stated protections, its numerous state and local counterparts, and decades of enforcement, landlords’ use of high-tech algorithms for tenant screening threatens to erase the progress made. While employing algorithms to mine data such as criminal records, credit reports, and civil court records to make predictions about prospective tenants might partially remove the fallible human element, old and new biases, especially regarding race and source of income, still plague the screening results.
For example, in Malden, Massachusetts, tenant Mary Louis learned of an apartment in one of the city’s safer neighborhoods that had two full bathrooms and an in-unit washer and dryer. It was a housing opportunity that checked every box on her wish list. She submitted an application to Granada Highlands, a housing provider that used a tenant screening program called SafeRent, made by SafeRent Solutions.
Mary, who is Black and uses a housing voucher to pay for approximately 69 percent of her rent, had proof that she had paid her rent on time for 16 years. However, according to the class action lawsuit filed on her and others’ behalf, her application was denied because SafeRent’s algorithm gave her a low score that did not consider the value of her voucher or her tenancy record. According to the lawsuit, the algorithm “assigns disproportionately lower SafeRent Scores to Black and Hispanic rental applicants compared to white rental applicants” and also discriminates against voucher holders by refusing to include the voucher’s value.
Exactly what the SafeRent algorithm weighed in calculating that home-denying score was a mystery to Mary and, allegedly, even to those who used it to deny her a home—a kind of double-blind arrangement. What Mary did know was that she ended up settling for a home with fewer bathrooms and no in-unit washer and dryer, in a less safe neighborhood, at a higher rent.
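The complaint does not disclose SafeRent’s actual formula, but a deliberately simplified, hypothetical sketch—with assumed numbers apart from the roughly 69 percent voucher share noted above—shows how merely refusing to count a voucher’s value can turn a reliable applicant into a rejection:

```python
# Hypothetical illustration only: SafeRent's actual formula is not public.
# All numbers are assumptions except the ~69 percent voucher share noted in the article.

def rent_to_income_ratio(monthly_rent: float, tenant_income: float,
                         voucher_payment: float = 0.0,
                         count_voucher: bool = True) -> float:
    """Share of the applicant's own income needed to cover the rent she owes."""
    # If the voucher is counted, the applicant owes only the uncovered portion of rent.
    rent_owed = monthly_rent - (voucher_payment if count_voucher else 0.0)
    return rent_owed / tenant_income

APPROVAL_CUTOFF = 0.5   # assumed screen: fail if rent takes more than half of income

rent, income, voucher = 2000.0, 1800.0, 1380.0   # voucher covers ~69% of the rent

for label, counted in (("counting voucher", True), ("ignoring voucher", False)):
    ratio = rent_to_income_ratio(rent, income, voucher, count_voucher=counted)
    verdict = "pass" if ratio <= APPROVAL_CUTOFF else "fail"
    print(f"{label}: rent-to-income ratio {ratio:.2f} -> {verdict}")
```

Under these assumed numbers, the same applicant easily passes a common rent-to-income screen when the voucher is counted (a ratio of about 0.34) and fails it badly when the voucher is ignored (about 1.11).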
Another lawsuit has been filed against SafeRent involving a tenant screening product with an algorithm that combs criminal records, allegedly without regard to the stages of criminal filings, inaccuracies in the data, or the dates of the offenses. Several dozen more lawsuits and administrative claims have been filed across the country over software with similar algorithms. The question for the FHA is whether it is strong enough to stop this new type of discrimination, spun from inside programs whose inner workings are often unknown to the user and sometimes even to the creator—the so-called “black box” problem.
Finding out what happens inside an algorithm and holding the responsible parties accountable under the FHA both come with challenges. While sales teams emphasize the magic of the black box’s secret formula for finding so-called “safe” tenants, the attorneys for the software’s purveyors in such cases have shrugged, often disingenuously, as if to say, “Who knows what happens in there? Not my client.”
The Algorithm in Culture and Its Recent Dramatic Rise in Prominence
The dangers of data-centric technology and especially artificial intelligence (AI) have captured our imaginations for decades. In 1968, the same year the FHA was passed, the world met a sentient AI entity named HAL 9000 on a spaceship in the film 2001: A Space Odyssey. (Recall that HAL, speaking in its iconic voice of eerie calm, also decided for the sake of the mission to close the door against someone who wanted in.) Kazuo Ishiguro’s novel Klara and the Sun (2021) explored the integration of AI into humanity and society, and, as the author said in an interview for WIRED magazine, “I think there is this issue about how we could really hardwire the prejudices and biases of our age into black boxes, and won’t be able to unpack them.” On the documentary side, the acclaimed Coded Bias (2020) presents how Joy Buolamwini, an MIT-trained computer scientist and founder of the Algorithmic Justice League, discovered coded racial bias in facial recognition systems and then fought with others to expose that bias.
Since 2022, however, both AI’s notoriety and its popularity have exploded. With the advent of ChatGPT and other new products and advances, algorithms have been writing magazine articles, researching new drugs, and predicting complicated scientific processes. In the current AI boom, algorithms have pushed further into our lives, seemingly past the point of no return.
But since 2022, lawmakers, enforcement agencies, and civil rights and consumer rights attorneys have also ramped up their examination of AI’s pitfalls, especially algorithmic bias. Algorithms’ roles in decision-making in housing, employment, consumer protection, and privacy are subject to lawsuits, new laws, and advisory documents. Congress has conducted hearings, the Biden administration has issued an executive order, and attorneys general (AGs) from 15 states have jointly sent a letter to the Federal Trade Commission.
The AGs’ letter focused on tenant screening programs. The U.S. Department of Justice, which filed an amicus brief in June 2022 advising the court in Mary Louis’s case that SafeRent’s software might violate the FHA, also settled a case first advanced in 2019 by the U.S. Department of Housing and Urban Development in which Meta—then Facebook—had been charged with housing discrimination over its advertising practices. The Meta matter involved another type of double-blind scheme: allegedly, neither the housing providers nor those seeking housing knew that Facebook’s advertising algorithm had, based on information about Facebook members’ race, gender, and other protected classes, effectively blocked those looking for housing from seeing providers’ housing advertisements. (See Gary Rhoades, “Facebook and the Fair Housing Act,” Los Angeles Daily Journal, April 11, 2019.)
The Creation of an Algorithmic Tenant Screening Product
For those of us in the legal field who are not computer scientists, data-centric technology and algorithms can be difficult to understand. First, it’s notable that the central mission seems to be to create a product that will be profitable and alluring for housing providers. A hypothetical software team is told that residential landlords want a simple, easy-to-use product that tells them who will be good or bad tenants. The sales team is on standby, ready to switch out good or bad for the more provocative safe or unsafe.
After research and consultation with someone who has tenant screening experience, the team comes up with a set of instructions—the algorithm—for the computer to predict the “safety” of any prospective tenant. The team might then expose those instructions to historical data to train the algorithm to generate more useful predictions. For example, a sweep of civil court data might teach the algorithm that applicants from certain zip codes are more likely to end up in some level of dispute or litigation with their housing provider.

Whether the algorithm receives this machine training or not, the sales team might use puffery to imply that high-tech AI is at work, armed with a secret and infallible formula for predicting who will be an unsafe tenant. They emphasize that the program will comb government and private repositories holding information about an applicant’s arrests, convictions, bankruptcies, credit scores, and eviction litigation and then apply that magic formula to the applicant’s data. Finally, the product is sold and downloaded by landlords, who then often discard more hands-on, holistic approaches to tenant screening in exchange for a simple score or label. After all, the work has ostensibly already been done by a secret formula applied to everything known about the tenant. However, the formula’s criteria and the applicant’s data come with deep flaws, as the simplified sketch below illustrates.
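The following is a deliberately simplified, hypothetical “instruction set” of the kind described above—not any vendor’s real formula—showing how zip codes, raw filing counts, and undifferentiated criminal records can quietly stand in for race and poverty:

```python
# Hypothetical "instruction set" only; it does not reflect any real vendor's formula.

HIGH_DISPUTE_ZIPS = {"11111", "22222"}   # assumed zip codes flagged from civil court data

def tenant_safety_score(applicant: dict) -> int:
    """Return a 0-100 score; assume the vendor labels anything below 60 'unsafe'."""
    score = 100
    if applicant["zip_code"] in HIGH_DISPUTE_ZIPS:
        score -= 30                                  # neighborhood proxy, often for race and income
    score -= 20 * applicant["eviction_filings"]      # filings counted even if the tenant won
    score -= 25 * applicant["criminal_records"]      # arrests and convictions lumped together, any age
    score -= max(0, (700 - applicant["credit_score"]) // 10)
    return max(0, score)

applicant = {
    "zip_code": "11111",
    "eviction_filings": 1,        # a single filing that was dismissed
    "criminal_records": 0,
    "credit_score": 640,
}
print(tenant_safety_score(applicant))   # 44 -> labeled "unsafe"
```

Nothing in this toy score measures whether the applicant actually pays rent; every penalty falls on a proxy, which is exactly where disparate impact enters.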
“Most of the criteria used to screen tenants are inherently arbitrary and are not based on any kind of empirical evidence or studies,” says Eric Dunn, litigation director at the National Housing Law Project and one of the plaintiffs’ attorneys in Arroyo v. SafeRent Solutions. “That’s especially true for criminal history screening—it’s largely just stereotypes and racist biases repackaged as concerns about safety and security, even though the studies have found there’s no connection between criminal history and being a poor tenant or posing any kind of safety hazard.”
Disparate Impact Analysis Applies in Algorithmic Bias Cases
A key fair housing issue raised with algorithmic bias is whether a landlord or software company can be held liable under fair housing laws for discrimination if a landlord uses software with the seemingly race-neutral mission to exclude so-called “unsafe” renters, even if the software company owns no housing and did not directly deny an applicant.
Whether one actually provides housing is irrelevant under the FHA. It prohibits a wide array of discrimination, up and down the chain of housing, from the newspaper ad to the insurance company to the manager to the owner. Also, federal fair housing law has always prohibited not just outright intentional discrimination but also any policies and decisions that have a “disparate impact” or discriminatory effect on the protected classes. Fair housing advocates breathed a sigh of relief in 2015 when the Supreme Court in Texas Department of Housing and Community Affairs v. Inclusive Communities Project, Inc., 576 U.S. 519 (2015), upheld the use of disparate impact analysis in fair housing cases.
After the Inclusive Communities decision, and despite it, the Trump administration issued regulations that would have made it very difficult to prove algorithmic discrimination claims, but the Biden administration has since scuttled those regulations and issued its own executive order affirming the need to regulate algorithmic discrimination in housing and other areas. Thus, the use of a tenant screening program that has a statistically significant effect of excluding minorities should still be unlawful under the FHA, even absent any intent to discriminate.
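What a statistically significant exclusionary effect can look like is easy to illustrate. The sketch below uses hypothetical approval counts and the four-fifths rule—an EEOC employment guideline often borrowed as a rough first screen, not the FHA’s legal test—to flag a disparity that would warrant a closer disparate impact analysis:

```python
# Hypothetical approval counts; the four-fifths rule is an EEOC employment
# guideline used here only as a rough illustrative screen, not the FHA's test.

def approval_rate(approved: int, applied: int) -> float:
    return approved / applied

white_rate = approval_rate(approved=450, applied=500)   # 90.0%
black_rate = approval_rate(approved=270, applied=400)   # 67.5%

adverse_impact_ratio = black_rate / white_rate           # 0.75

print(f"white applicant approval rate: {white_rate:.1%}")
print(f"Black applicant approval rate: {black_rate:.1%}")
print(f"adverse impact ratio: {adverse_impact_ratio:.2f}")
if adverse_impact_ratio < 0.80:
    print("below 0.80 -> flag the screening policy for disparate impact review")
```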
Conclusion: Both Litigation and Reform Needed to Hold the Line Against Algorithmic Bias
The cases against SafeRent are still in litigation or on appeal, and the record is mixed, but Dunn is optimistic that the FHA and other fair housing laws will hold. He emphasizes that having an automated system apply an arbitrary, non-empirically validated screening policy just reproduces the discriminatory outcomes baked into that policy. “Maybe it adds a false veneer of objectivity to have a machine do it instead of a person,” he says, “but that’s the only real difference. And all that is exacerbated by the frequent data errors and misidentification problems that regularly arise when you have machines processing all this information.”
Reform efforts have included calls for any algorithmic tenant screening to require disclosure of any reliance on an algorithm, a report to the tenant of what data was used, an opportunity for the tenant to correct errors, and consideration of the value of any housing vouchers. The AGs’ letter also recommended audits for “race based or digital redlining resulting from biased underwriting” in all tenant screening products.
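One hypothetical way to make those reforms concrete in software is a standard disclosure record that a screening vendor would return with every decision. The sketch below is illustrative only; no statute or existing product mandates this format:

```python
# Hypothetical disclosure record; no statute or product requires this exact format.
from __future__ import annotations
from dataclasses import dataclass

@dataclass
class ScreeningDisclosure:
    algorithm_used: bool            # was an algorithmic score relied on at all?
    data_sources: list[str]         # e.g., credit bureaus, county eviction courts
    factors_considered: list[str]   # inputs that actually affected the decision
    voucher_value_included: bool    # was the housing voucher counted toward income?
    raw_report_provided: bool       # did the applicant receive the underlying data?
    dispute_contact: str            # where the applicant can correct errors

denial_notice = ScreeningDisclosure(
    algorithm_used=True,
    data_sources=["credit report", "county eviction filings"],
    factors_considered=["credit score", "eviction filings", "rent-to-income ratio"],
    voucher_value_included=True,
    raw_report_provided=True,
    dispute_contact="disputes@screening-vendor.example",   # hypothetical contact
)
print(denial_notice)
```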
Dunn also laments that what is being lost is the “human common sense filter” to catch the machine’s errors. And perhaps, at the end of the day or at the beginning of the next congressional hearing, that is what can be pursued in the all-important civil rights issue of tenant screening—what some computer scientists and much of science fiction have aspired to—a way to dynamically combine objective data, truth, and technology with our common sense.