- A 20-month investigation into Facebook’s digital targeted marketing practices by the Attorney General of Washington state. This resulted in a consent order in which Facebook agreed to cease providing advertisers with the option to (1) exclude ethnic groups from advertisements for insurance and public accommodations; or (2) otherwise utilize exclusionary advertising tools that allow advertisers with ties to employment, housing, credit, insurance and/or places of public accommodation to discriminate based on race, creed, color, national origin, veteran or military status, sexual orientation and disability status.
- A civil suit brought by the National Fair Housing Alliance, the Communications Workers of America and several other consumer groups alleging discriminatory practices in Facebook's digital targeted marketing practices. This resulted in a settlement of $5 million and an agreement by Facebook to make changes to its look-alike campaigns for housing, employment and credit-related advertisements (e.g., prohibiting attributes related to age, gender and ZIP codes).
- An ongoing Charge of Discrimination levied by the Department of Housing and Urban Development (“HUD”) alleging discriminatory housing practices in violation of the provisions of the Fair Housing Act that prohibit discrimination based on race, color, religion, sex, familial status, national origin or disability. Specifically, HUD alleges that Facebook (1) enabled advertisers of housing opportunities to target audiences using prohibited bases; and (2) used an ad-delivery algorithm that would independently discriminate based on prohibited bases even where advertisers did not use prohibited bases to target audiences.
- An ongoing civil suit in the Northern District of California alleging that Facebook's digital targeted marketing practices violated the Fair Housing Act, the Equal Credit Opportunity Act and California fair lending laws.
While enforcement and litigation have primarily focused on Facebook and its practices to date, the New York Department of Financial Services recently expressed interest in investigating financial institutions and “Facebook advertisers to examine…disturbing allegations [of discriminatory practices]…to take whatever measures necessary to make certain that all financial services providers are in compliance with New York's stringent statutory and regulatory consumer protections.” This sentiment was echoed in a recent article by the Associate Director and Counsel to the Federal Reserve Board's Division of Consumer and Community Affairs, which highlights the fair lending risk digital targeted marketing poses to financial institutions (i.e., steering and redlining) and notes that the “growing prevalence of AI-based technologies and vast amounts of available consumer data raises the risk that technology could effectively turbocharge or automate bias.” The commentators further note that it is “important to understand whether a platform employs algorithms — such as the ones HUD alleges in its charge against Facebook — that could result in advertisements being targeted based on prohibited characteristics or proxies for these characteristics, even if that is not what the lender intends.”
With this in mind, financial institutions must evaluate and mitigate not only the risks associated with their own digital targeted marketing activities, but also the activities of the platforms with which they associate. In doing so, they should consider taking the following actions:
- Evaluating the importance of digital targeted marketing to the financial institution and the institution's risk tolerance for such marketing.
- Attempting to obtain as much information as possible about the potential presence of prohibited bases or close proxies in digital-marketing algorithms.
- Requiring indemnification in digital targeted marketing agreements, especially where platforms use proprietary black-box analytics.
- Where available, using “special ad audience” programs intended for industries subject to anti-discrimination laws (e.g., housing, credit, employment, etc.).
- Considering using self-selected attribute criteria that avoid prohibited bases or close proxies in lieu of a platform's look-alike program.
- Analyzing and testing responses to digital marketing campaigns for potentially disparate outcomes.
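The last recommendation, outcome testing, can be sketched as a simple statistical screen. The Python example below is illustrative only: the group labels, response counts, and the 0.8 threshold (borrowed from the "four-fifths" rule of thumb used in employment-discrimination screening) are assumptions for demonstration, not a prescribed compliance methodology, and any real analysis would use the institution's own campaign data and legally appropriate demographic estimates.

```python
# Illustrative screen for potentially disparate outcomes in a digital
# marketing campaign. All group names and counts below are hypothetical.

# Audience reached and responses received per (hypothetical) demographic group.
campaign_results = {
    "group_a": {"reached": 10_000, "responded": 820},
    "group_b": {"reached": 10_000, "responded": 610},
    "group_c": {"reached": 10_000, "responded": 790},
}

def selection_rates(results):
    """Response (selection) rate for each group."""
    return {g: r["responded"] / r["reached"] for g, r in results.items()}

def adverse_impact_ratios(rates):
    """Ratio of each group's rate to the highest group's rate.

    A ratio below 0.8 is a common screening threshold suggesting the
    outcome warrants closer fair lending review; it is not, by itself,
    a legal conclusion of discrimination.
    """
    best = max(rates.values())
    return {g: rate / best for g, rate in rates.items()}

rates = selection_rates(campaign_results)
ratios = adverse_impact_ratios(rates)
flagged = [g for g, r in ratios.items() if r < 0.8]
print(rates)    # per-group response rates
print(ratios)   # ratios relative to the highest-responding group
print(flagged)  # groups falling below the 0.8 screen, e.g. ['group_b']
```

A screen like this only surfaces disparities in observed responses; it cannot by itself distinguish platform-driven steering from other causes, which is why the earlier recommendations (diligence on platform algorithms, contractual protections) remain necessary complements.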
Given their potential benefits, financial institutions are unlikely to cease digital targeted marketing activities. But prudent institutions should engage in reasonable due diligence regarding the platforms they use—weighing the benefits against the risks of their use—while monitoring for future regulatory guidance or legal precedent.