When Algorithms Import Private Bias into Public Enforcement: The Promise and Limitations of Statistical Debiasing Solutions


We make two contributions to understanding the role of algorithms in regulatory enforcement. First, we illustrate how big-data analytics can inadvertently import private biases into public policy. We show that a much-hyped use of predictive analytics – using consumer data to target food-safety enforcement – can disproportionately harm Asian establishments. Second, we study a solution by Pope and Sydnor (2011), which aims to debias predictors via marginalization while still using information from contested predictors. We find that the solution may be limited when protected groups have distinct predictor distributions, because the model must then extrapolate. Common machine-learning techniques heighten these problems. (JEL: I18, C53, K23, K42)
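A minimal sketch of the marginalization idea discussed in the abstract, assuming a linear model and a single binary protected-group indicator (variable names and data-generating process are hypothetical, not taken from the paper): fit the full model including the protected attribute so the coefficient on the contested predictor is not distorted, then score individuals with the protected attribute replaced by its population mean.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical illustration of a Pope-Sydnor-style marginalization,
# not the authors' code. Assumes a linear model and binary group.
rng = np.random.default_rng(0)
n = 5_000

group = rng.integers(0, 2, size=n)                 # protected attribute (0/1)
x = rng.normal(loc=0.5 * group, size=n)            # contested predictor, correlated with group
risk = 1.0 * x + 0.3 * group + rng.normal(scale=0.5, size=n)  # outcome to predict

# Step 1: fit the full model *including* the protected attribute,
# so x's coefficient is not contaminated by the omitted group effect.
X_full = np.column_stack([x, group])
full_model = LinearRegression().fit(X_full, risk)

# Step 2: predict with the protected attribute set to its population
# mean ("marginalized out"), so group no longer shifts scores directly
# while the contested predictor keeps its full-model coefficient.
X_marg = np.column_stack([x, np.full(n, group.mean())])
debiased_scores = full_model.predict(X_marg)

# Contrast: naively dropping the protected attribute lets x absorb part
# of the group effect, which is the distortion marginalization avoids.
naive_scores = LinearRegression().fit(x.reshape(-1, 1), risk).predict(x.reshape(-1, 1))
```

The abstract's caveat can be read against this sketch: when groups have very different distributions of x, scoring everyone at the pooled mean of the group indicator pushes the model into regions with little support for some groups, so the debiased predictions rest on extrapolation.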

Keywords: algorithmic fairness; antidiscrimination; predictive targeting; racial bias

Appeared or available online: November 14, 2018
