This report refutes an article published by ProPublica claiming that commonly used risk assessment tools are racially biased.
In “Machine Bias: There’s Software Used Across the Country to Predict Future Criminals. And it’s Biased Against Blacks,” published by ProPublica, the authors, Angwin, Larson, Mattu, and Kirchner, purport to assess the racial bias of a commonly used risk assessment instrument, ultimately expanding their conclusions to imply that risk assessments commonly used in criminal justice settings are inherently biased against African Americans. The conclusions caused concern among criminal justice practitioners and researchers because of how important risk assessment has become throughout the criminal justice system and because 30 years of research have demonstrated that risk assessment can be conducted without such bias.
In this paper, researchers Anthony Flores, Chris Lowenkamp, and CJI’s Kristin Bechtel highlight the flaws and erroneous assumptions made in the ProPublica article and use the data employed by Larson et al. to conduct an independent analysis. The findings reveal that the conclusions reached by Larson et al. were not supported by the data: using well-established research methods, the authors failed to find any evidence of predictive bias by race in the COMPAS data used in the ProPublica article. Flores, Lowenkamp, and Bechtel challenge the ProPublica authors’ understanding of the COMPAS risk assessment, how the instrument is scored and used within the criminal justice field, their grasp of research methods and statistics, and their due diligence in attempting to understand the topic of risk assessment before reporting their story.
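For readers unfamiliar with the term, one common way to test for predictive bias is a differential prediction analysis: regressing the outcome on the risk score, group membership, and their interaction, then checking whether the group terms add anything. The sketch below is only an illustration of that general approach under assumed data, not the report’s actual code; the file path and column names (recidivated, decile_score, race) are hypothetical.

```python
# Minimal sketch of a differential prediction (predictive bias) test using a
# moderated logistic regression. Dataset path and column names are assumed
# for illustration only.
import pandas as pd
import statsmodels.formula.api as smf

# Assumed columns: 'recidivated' (0/1 outcome), 'decile_score' (risk score),
# and 'race' (categorical group indicator).
df = pd.read_csv("compas_sample.csv")

# Fit recidivism ~ score + race + score:race. If the race and interaction
# coefficients are near zero and non-significant, the score predicts the
# outcome similarly across groups, i.e., no evidence of differential
# prediction in this framework.
model = smf.logit("recidivated ~ decile_score * C(race)", data=df).fit()
print(model.summary())
```

In this framework, evidence of predictive bias would show up as meaningful intercept or slope differences by group; the report’s conclusion is that such differences were not found in the COMPAS data.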
Click here to read the report.