Study guide

Getting started

We recommend that you contemplate the following questions before delving into the case: What is algorithmic bias? What is implicit bias? How can biases (or disparities) be detected? Think precisely, and ground your thinking in real-world settings.
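To make the detection question concrete, here is a minimal sketch, not part of the case materials, of one simple check: comparing approval rates across two groups in a small synthetic dataset. The group labels, data, and gap threshold are all hypothetical illustrations.

```python
# Synthetic, hypothetical applicant records: (group, approved).
applicants = [
    ("A", True), ("A", True), ("A", False), ("A", True),
    ("B", True), ("B", False), ("B", False), ("B", False),
]

def approval_rate(records, group):
    """Fraction of applicants in `group` whose loans were approved."""
    decisions = [approved for g, approved in records if g == group]
    return sum(decisions) / len(decisions)

rate_a = approval_rate(applicants, "A")
rate_b = approval_rate(applicants, "B")
print(f"Group A: {rate_a:.2f}  Group B: {rate_b:.2f}")
print(f"Approval-rate gap: {rate_a - rate_b:.2f}")
```

A gap in raw approval rates is only a starting point: as the case goes on to explore, a disparity is not the same thing as a bias, and diagnosing its source requires looking beyond this single statistic.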

The FairBank case is not intended to explore all aspects of algorithmic bias. It focuses on a set of basic, widespread, and commonly misunderstood aspects of the topic.

Now start reading the case. The case is divided into three chronologically ordered sections, corresponding to the “Monday”, “Wednesday”, and “Friday” meetings. These sections address, respectively, the detection, diagnosis, and mitigation of bias.

The numbers in the exhibits were generated from the interactive app, so we recommend exploring the app immediately after reading each section of the case. You may find it helpful to experiment with the parameters in the app to bolster your understanding of the numbers you see in the exhibits.


Your assignment is to assume the role of CEO Stephanie Treynor and write a one-page memo to the Board of FairBank assessing the situation and outlining a clear plan of action. Your assessment and recommendations can go beyond what the CEO has heard from her executive team.

Note on context

We framed this case within the context of disparities in loan approval decisions across Black and White applicants. However, this is not a case about racial disparities per se. Algorithmic biases can arise in many different contexts. You should recognize that the issues raised apply broadly to biases and disparities based on any attribute and affecting any group.

In any setting, let the context aid your understanding of the topic rather than impair it. In particular, an objective understanding of the underlying statistical issues is necessary to accurately diagnose and fix problems of bias.