October 26, 2020 Feature

Arcing Toward Justice: Strategies to Disrupt Implicit Bias and Limit Its Negative Impact

The Honorable Bernice B. Donald and Sarah E. Redfield

In an article published in the Summer 2019 issue of Criminal Justice, we defined “implicit bias” in terms of “learned associations absorbed from all around us, outside our direct awareness or control, that affect our understanding, actions, and decisions,” and emphasized the importance of becoming “bias literate” with respect to an understanding of not only “group preferences” but also “unintended micromessaging.” We then promised to share, in a future article, some “strategies for disrupting implicit bias/response.” Bernice B. Donald & Sarah Redfield, Arcing Toward Justice: Can Understanding Implicit Bias Help Change the Path of Criminal Justice?, 34 Crim. Just., no. 2, 2019.

Here, we make good on that promise.

Is Implicit Bias a Real Thing?

Social scientists have established conclusively that implicit bias exists. See Keith Payne, Laura Niemi & John M. Doris, How to Think About “Implicit Bias,” Sci. Am. (Mar. 27, 2018). To cite just one research-based example, consider the manifestation data on the prison population showing that the severity of one’s sentence relates to one’s skin color. See, e.g., Mark Bennett & Victoria Plaut, Looking Criminal and the Presumption of Dangerousness: Afrocentric Facial Features, Skin Tone, and Criminal Justice, 51 U.C. Davis L. Rev. 745 (2018). If one does not believe, as we do not, that the judges imposing these sentences are racists or some other kind of *ists, then we must look elsewhere for an explanation of these disturbing and longstanding differences. One place legal scholars and scientists are looking for that explanation is to our increasing understanding of the role of implicit bias in our decisions. See, e.g., Justin D. Levinson & Danielle Young, Different Shades of Bias: Skin Tone, Implicit Racial Bias, and Judgments of Ambiguous Evidence, 112 W. Va. L. Rev. 307, 334–35 (2010).

How Do We “Measure” Implicit Bias?

It used to be that if researchers wanted to know whether someone was biased, they would simply ask. In the mid-1990s, another approach came to the fore, by which researchers could measure a subject’s biases indirectly. One manifestation of this approach is the neuroscientific assessment of related brain activity. See generally Jennifer T. Kubota, Mahzarin R. Banaji & Elizabeth A. Phelps, The Neuroscience of Race, 15 Nature Neurosci. 940, 942, 944 (2012). Another is the use of reaction-time measures like the Implicit Association Test (IAT). Anthony G. Greenwald, Debbie E. McGhee & Jordan L.K. Schwartz, Measuring Individual Differences in Implicit Cognition: The Implicit Association Test, 74 J. Personality & Soc. Psychol. 1464, 1465 (1998); see generally Project Implicit, https://implicit.harvard.edu/implicit/.

Readers can take the IAT themselves at https://implicit.harvard.edu. It is a valuable tool for building personal awareness. Many who take the IAT are initially concerned by the results. It helps to recall that implicit bias is part of being human—we all are implicitly biased. What matters most here is giving some thought to what we might learn about our implicit, unintended thinking so that we can interrupt it where appropriate to achieve more equitable and fair results.

Overall, indirect measurement approaches have revealed that “people possess implicit attitudes and stereotypes about social groups that are often distinct from their explicitly endorsed beliefs and values. The evidence that behavior can be influenced by implicit social cognition contrasts with social policies that assume that people know and control the causes of their behavior.” Brian A. Nosek & Rachel G. Riskind, Policy Implications of Implicit Social Cognition, 6 Soc. Issues & Pol’y Rev. 113 (2012).

Can Knowledge-Based Training Lead to Transformation of Implicit Bias?

Not surprisingly, there are differences of opinion on whether training and interventions can neutralize the impact of negative implicit biases, and on which interventions are best suited to do so. In our own work, we agree with those who, like psychologist and leading STEM researcher Dr. Patricia Devine, find “compelling evidence” that recipients of knowledge-based, habit-breaking training can “recognize bias and its consequences for minorities, then address it in the world around them.” Patrick S. Forscher et al., Breaking the Prejudice Habit: Mechanisms, Time Course, and Longevity, 72 J. Experimental Soc. Psychol. 133, 143 (2017); Patricia G. Devine et al., A Gender Bias-Breaking Intervention Led to Increased Hiring of Female Faculty in STEM Departments, 73 J. Experimental Soc. Psychol. 211, 213–14 (2017).

There is also evidence closer to home. Cornell Law Professor Jeffrey Rachlinski and his colleagues have reached similar conclusions in studying judges’ decision-making in relation to bias:

Our research supports three conclusions. First, judges, like the rest of us, carry implicit biases concerning race. Second, these implicit biases can affect judges’ judgment, at least in contexts where judges are unaware of a need to monitor their decisions for racial bias. Third, and conversely, when judges are aware of a need to monitor their own responses for the influence of implicit racial biases, and are motivated to suppress that bias, they appear able to do so.

Jeffrey J. Rachlinski, Sheri Lynn Johnson, Andrew J. Wistrich & Chris Guthrie, Does Unconscious Racial Bias Affect Trial Judges?, 84 Notre Dame L. Rev. 1195 (2009).

There is further evidence that training judges about implicit bias, and providing tools to help limit discretion, produces a measurable change in their juvenile detention decisions, enabling them to meet their stated goal of keeping children with their families over an extended period. Jesse Russell & Alicia Summers, Reflective Decision-Making and Foster Care Placements, 19 Psychol. Pub. Pol’y & L. 2 (2013).

Based on our research and our own experience training lawyers and judges about implicit bias, our view is that interruption is possible and that techniques for interruption can be learned. In this work, we have been influenced by the early work on habit-breaking (Patricia G. Devine, Patrick S. Forscher, Anthony J. Austin & William T.L. Cox, Long-Term Reduction in Implicit Bias: A Prejudice Habit-Breaking Intervention, 48 J. Experimental Soc. Psychol. 1267 (2012)) and by the somewhat broader work of the STEM researchers as they developed the concept of bias literacy and related training. See, e.g., Ruta Sevo & Daryl E. Chubin, Bias Literacy: A Review of Concepts in Research on Discrimination (2008).

In our own training efforts, we start from the observation that “the path from implicit bias to discriminatory action is not inevitable” and that “[p]eople’s awareness of potential bias, their motivation and opportunity to control it, and sometimes their consciously held beliefs can determine whether biases in the mind will manifest in action.” Nilanjana Dasgupta, Implicit Ingroup Favoritism, Outgroup Favoritism, and Their Behavioral Manifestation, 17 Soc. Just. Rsch. 143 (2004).

Our goal is always to create that awareness and offer action-oriented strategies. We know that good training that is data- and science-driven can help us learn when to focus on our quick, indirect, “System 1” implicit responses (learned associations absorbed from all around us, outside our direct awareness or control, that affect our understanding, actions, and decisions), and use our slower, “System 2,” direct thinking (verbally endorsed evaluations, deliberately generated understandings, actions, or decisions, directly experienced as our own) to reach intended, fair results. Bernice B. Donald & Sarah E. Redfield, Implicit Bias: Should the Legal Community Be Bothered?, 2 PLI Current 615, 625–26 (2018).

How Can We Use This Knowledge to Disrupt Our Own Implicit Bias?

As devoted as we are to the added value of in-person training and focused collegial interaction on these topics, we encourage readers to begin taking this journey on their own, as outlined below.

Become Aware

These CJS articles and prior presentations are a beginning. We urge you to learn more: read more, observe more. Motivation is also significant: try to work with more diverse individuals and teams, and to flip the script and consider alternative perspectives. Notice the times when you might be reacting quickly and implicitly on meeting a new person. While it is true that we need to categorize to survive, it is sometimes more useful to individuate: take the time to ask a few more questions.

Watch the Media

Implicit bias is sometimes described as unsought cultural expertise condensed into stereotypes and attitudes; it’s expertise that comes from our culture, experience, environs, and trauma, and also from marketing and media. Mahzarin R. Banaji & Anthony G. Greenwald, The Implicit Revolution: Reconceiving the Relation Between Conscious and Unconscious, 72 Am. Psychol. 861, 867 (2017); Anti-Defamation League training materials. Media portrayals are pervasive and can be pernicious. Think about the difference between the message you absorb when a (White) defendant is consistently described as a star athlete and is pictured in his dress shirt, jacket, and tie as compared with a defendant of color whose picture is a mugshot. Compare Black College Student Sentenced to 12 Years in Prison for Kissing a White Girl, BlackNews.com (Apr. 10, 2019), https://bit.ly/3eZch22, with Gabriella Paiella, Brock Turner’s Childhood Friend Blames His Felony Sexual-Assault Conviction on Political Correctness, The Cut (June 6, 2016), https://bit.ly/30MnUEH; see also Travis Dixon, A Dangerous Distortion of Our Families: Representations of Families, by Race, in News and Opinion Media (2017). We challenge you all to hold your own media contest, looking for examples of messages that further embed implicit bias. To get you started, we offer one from earlier Gap back-to-school advertising: She’s dressed as the “social butterfly, chambray shirts + logo sweaters are the talk of the playground,” and he’s dressed as the “little scholar, your future starts here.” Tanveer Mann, Adverts Which Enforce Gender Stereotypes Could Be Banned as Early as Next Year, Metro (July 18, 2017), https://bit.ly/3jyJ4yC.

Focus in Your Own Context

Decide, in your own context, which decision points are most apt to call for interrupting implicit bias, and focus there. We are all cognitive misers, conserving our mental energy. See Philip E. Tetlock, Accountability: A Social Check on the Fundamental Attribution Error, 48 Soc. Psychol. Q. 227, 228 (1985). We have neither the time nor the need to carefully consider or interrupt every decision point. If Sarah always, implicitly, without direct thought, reaches for that Maine Wild Blueberry Pie, that is not all that significant. But when we make a particularly consequential decision, sentencing or placement, for example, that is the time to stare, to be more direct in our thinking, to focus on interrupting implicit bias. The key focus points will vary for each person and organization.

Be Especially Mindful of Confirmation Bias

Know that there is a strong tendency toward confirmation bias: we tend to hear and attend more to information that confirms our existing views and to disregard information that does not. (Recall that we discussed this in our last article, referencing research in which partners reviewing the same memo gave much higher scores to the author described as Caucasian, confirming the perspective they likely started with, that is, that Caucasian NYU graduates would be better writers. Arin N. Reeves, Written in Black & White: Exploring Confirmation Bias in Racialized Perceptions of Writing Skills 2–3 (2014).) Again, ask for more information, slow down at key decision points, and develop a checklist for certain decisions.

Be Mindful Around Ambiguity and Discretion

Consider where you can lessen ambiguity and limit your discretion so as to reduce the opportunities for implicit bias or stereotypes to take over without your conscious intent. See Mark W. Bennett, Manifestations of Implicit Bias in the Courts, in Enhancing Justice: Reducing Bias 65 (Sarah E. Redfield ed., 2017). Consider, for example, the blinding approach used in orchestras: once auditions were held behind screens, increasing numbers of women got seats. Claudia Goldin & Cecilia Rouse, Orchestrating Impartiality: The Impact of “Blind” Auditions on Female Musicians, 90 Am. Econ. Rev. 715 (2000). We obviously cannot make all our decisions with such a dramatic approach, but we can try to limit unnecessary cues that could prompt implicitly biased responses. See Anthony G. Greenwald & Calvin Lai, Implicit Social Cognition, 71 Ann. Rev. Psychol. 419 (2020).

Watch Your Messaging

The impact of negative micromessages, the small signals that can accumulate into negative results, is well documented. See generally Mary Rowe, The Saturn’s Rings Phenomenon, 50 Harv. Med. Alumni Bull. 14 (1975). Prompt yourself to change them. This can be as simple as thinking consciously about what you display in your office, shaking hands with everyone if you shake hands with anyone, or calling everyone by their title: Mr. Smith, Ms. Smith, Attorney Smith. Another key strategy is to be vigilant about not interrupting or talking over others, and to remind those in your purview to do likewise.

Be Accountable

As the adage goes, what’s counted is what counts. Decide to count something in terms of your own decisions. (This can be sophisticated data analysis, but even where that is not possible, counting can help: Professor Redfield, for a simple example, counted whom she called on first in large classes.) There are also many other layers of accountability beyond tracking. We are likely to be more deliberate and direct in our decisions if we know we are going to be evaluated, or even if we know we are going to be required to give a reason or will be individually identified as part of the decision. See, e.g., Galen V. Bodenhausen, Geoffrey P. Kramer & Karin Süsser, Happiness and Stereotypic Thinking in Social Judgment, 66 J. Personality & Soc. Psychol. 621 (1994); Jennifer S. Lerner & Philip E. Tetlock, Accounting for the Effects of Accountability, 125 Psychol. Bull. 255, 255 (1999).

Consider how these approaches work for you and your organization.

Think About and Plan for How/When to Intervene

There is interesting research on implementation intentions: if you form a specific plan to do something, you are more likely to do it than if you simply tell yourself you will. Marieke A. Adriaanse et al., Breaking Habits with Implementation Intentions: A Test of Underlying Processes, 37 Personality & Soc. Psychol. Bull. 502 (2011). Take a few minutes to think about an instance of implicit bias you may have seen or experienced, and set yourself the task of writing an IF, THEN statement. For example: IF I see my colleague being interrupted, THEN I will say something like, “Let’s let Sophie finish.” This is a simple example, but intervention, whether in the moment or later, can be a critical strategy for neutralizing bias.

Ask Yourself Just How Sure You Are

We tend to think we know more than perhaps we actually do. This is sometimes called the bias blind spot but is perhaps more fondly known as the Lake Wobegon effect, after Garrison Keillor’s “all the children are above average.” Professor Redfield joins psychologist Keith Payne in identifying as a “favorite study in this genre” one where “fellow college professors were asked to rate their teaching abilities compared with those of their colleagues. A stunning 94 percent said they were better than average.” Keith Payne, The Broken Ladder: How Inequality Affects the Way We Think, Live, and Die (2017). Professor Rachlinski and his coauthors have identified similar self-perceptions among judges. When they asked judges at an educational conference about their own abilities to “avoid racial prejudice in decisionmaking” as compared to other judges attending the same conference, 97 percent rated themselves in the top half. Rachlinski et al., supra, at 1226.

Enough said.

What Comes Next?

We’ve offered here a few sample pointers, derived from data-based work on successful approaches for training to interrupt implicit bias. There are others, including those that are more complex and more systemic in nature, that need a longer and more in-depth approach. But here is one more pointer for now: “Be Trained.”

For more information on the parameters to look for to ensure serious, high-quality training, please be in touch with us.


Honorable Bernice B. Donald


The Honorable Bernice B. Donald is a judge on the U.S. Court of Appeals for the Sixth Circuit. Judge Donald is a frequent speaker on implicit bias.

Sarah E. Redfield


Sarah E. Redfield is a law professor at the University of New Hampshire. Her research and teaching focus on issues of diversity, equity, inclusion, and access, particularly regarding the impact and interruption of implicit bias.