Case Study: COMPAS in the US Justice System
Learn about a case of data bias in the US justice system and how the algorithm built on that data became biased.
This is the first of about six case studies you will encounter in this course. In each one, we'll walk through how a pipeline disaster was discovered and then fixed or exposed. These case studies are in-context exercises that revisit previous material. While the specific setup of each study might not be critical, the takeaways certainly are.
Case study
Let’s dive into the first case study.
History
The COMPAS (Correctional Offender Management Profiling for Alternative Sanctions) dataset is an infamous dataset because oversights at the data level led to overtly racist decisions. It was an inflection point for many who realized that algorithms and decisions are only as good as the data they rely on. Much of this case study draws on the work of ProPublica, the organization that analyzed and exposed the dangers of the COMPAS dataset.
The COMPAS tool was developed by a company called Northpointe. The dataset at the center of this case study is a collection of information on individuals who were arrested in Broward County, Florida, between 2013 and 2014. It includes a variety of demographic and criminal-history information, as well as COMPAS scores, which are used to assess the likelihood that an individual will reoffend. The dataset (and the COMPAS algorithm applied to it) was highly biased against Black people. Let's dive into the details.
COMPAS data
ProPublica obtained data on 11,757 people who were assessed with COMPAS in Broward County to determine whether they would be detained or released before trial. The COMPAS score is a number from 1 to 10, with 10 indicating the highest risk of reoffending.
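To get a feel for what this data looks like, here is a minimal sketch that loads ProPublica's published COMPAS file and inspects the decile scores. The URL and the column names used here (such as decile_score and race) are assumptions based on ProPublica's public compas-analysis repository, not part of this course's materials.

```python
import pandas as pd

# ProPublica published their cleaned COMPAS data on GitHub. The URL and
# column names below are assumptions based on that public repository.
URL = (
    "https://raw.githubusercontent.com/propublica/"
    "compas-analysis/master/compas-scores-two-years.csv"
)

df = pd.read_csv(URL)

# Each defendant receives a decile score from 1 (lowest risk) to 10 (highest risk).
print(df["decile_score"].describe())

# A quick look at how the scores are distributed across racial groups.
score_by_race = (
    df.groupby("race")["decile_score"]
    .value_counts(normalize=True)
    .unstack(fill_value=0)
)
print(score_by_race.round(2))
```

Even this simple inspection hints at the pattern ProPublica investigated: the distribution of risk scores differs noticeably across racial groups, which is the starting point for the analysis that follows.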