Diversity Recruitment Resources: Understanding Implicit Bias

One of the challenges to creating a truly inclusive environment is the prevalence of implicit biases. We believe that, in general, members of our campus community are motivated to be fair-minded and to avoid stereotypes and prejudice, and that they would not consciously engage in exclusionary practices. That does not mean, however, that we are immune to the effects of our own unconscious biases.

Implicit or unconscious bias is a phenomenon in which we quickly and unconsciously sort people into social categories, often based on salient identity features. We tend to have unconscious preferences for those who share our own identities or for those in powerful groups, such as men over women or white people over people of color.

Across a wide range of scenarios, people tend to be more helpful to members of their own group and more discriminatory toward those in an “outgroup.” In other words, when we encounter people of a different race/ethnicity, culture, language, gender, etc., we may unconsciously (or implicitly) treat them as the “other.” This capacity for ingroup-outgroup thinking is biological, but the social categories themselves are learned and then embedded in pathways in the brain that do not require conscious or rational thought.

The Implicit Association Test (IAT) was developed by Harvard researchers in the 1990s to better understand how our automatic responses differ from our reflective ones. The researchers started by asking participants to associate positive and negative concepts with either flowers or insects, timing their responses, and then moved on to pairing positive and negative concepts with various social categories. Drawing on hundreds of thousands of responses, the IAT has demonstrated that social category-based associations are extremely pervasive; that they can be unconscious and persist even when we don’t want them to; that, because most of our thinking is automatic, they can influence judgment and behavior; and that they can challenge our self-concept.

Unfortunately, implicit biases have been shown to affect hiring processes as well, particularly in ways that disadvantage women and people of color. Faculty hiring can also be fraught with other types of bias: relying on educational pedigree as a proxy for individual excellence and leaning too heavily on scholarly networks, which are often homogeneous by race and gender, can interfere with objective evaluation of candidates’ merit.

Fortunately, we also know that the effects of implicit bias can be mitigated by taking intentional steps. Implicit bias does the most damage when we make quick decisions and when there is ambiguity in the decision-making process. Slowing down, clearly defining and agreeing upon criteria, standardizing processes, and reducing subjectivity can help to reduce the impact of implicit biases.

We all have implicit biases. But if we work to make our unconscious biases conscious, we can begin to counteract their effects and make our hiring practices more equitable.


Learn more about implicit bias

If your search committee would like to participate in a workshop on implicit bias, please contact Jacki Black in the Office of Institutional Diversity and Inclusion at jacqueline.black@marquette.edu.

Here are some additional resources on implicit bias and its role in search and hiring practices:

  1. Preface: Biases and Heuristics (5:13)
  2. Lesson 1: Schemas (3:12)
  3. Lesson 2: Attitudes and Stereotypes (4:13)
  4. Lesson 3: Real World Consequences (3:45)
  5. Lesson 4: Explicit v. Implicit Bias (2:49)
  6. Lesson 5: The IAT (5:14)
  7. Lesson 6: Countermeasures (5:23)
