Algorithmic Biases

Consequences of Algorithmic Bias

If algorithmic biases are left unchecked, harm can come to marginalized communities. Broken systems in everyday life can encode white supremacist and patriarchal ideologies while undermining marginalized communities. When data collected from these systems is then used to build or train algorithms, that discrimination can be replicated in the digital realm. Ultimately, communities that have historically been vulnerable to harm from corporate practices will continue to be harmed online.

Figure 1. “Bail Money”, from Thomas Hawk (2015)

Example: One example of algorithmic bias is California Senate Bill 10. Signed into law in 2018 and made effective in October 2019, it replaced cash bail with a predictive algorithm in an attempt to make the bail system more equitable. The algorithm uses data about incarcerated people to place each individual into a low, medium, or high risk category. Those deemed low risk are released before their court date, a judge decides whether medium-risk individuals are released, and high-risk individuals must stay in jail. This model is problematic because it may classify an individual as high risk simply based on their zip code. Due to systemic racism that produced gerrymandering and redlining in America, many neighborhoods are economically and/or racially segregated, so certain lower-income zip codes have high proportions of people of color. When the algorithm uses this data, it disproportionately places marginalized people in the high-risk category and forces them to remain in jail.
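
To make the zip-code problem concrete, the sketch below shows how a neighborhood feature can act as a proxy for race or class in a pretrial risk score. This is a minimal, hypothetical illustration: the feature weights, thresholds, and zip codes are invented for demonstration and do not describe the actual assessment tool used under SB 10.

```python
# Hypothetical illustration of a zip code acting as a proxy variable in a
# pretrial risk score. All names, weights, and thresholds are invented;
# this is NOT the real SB 10 assessment algorithm.

HIGH_POVERTY_ZIPS = {"90001", "94621"}  # hypothetical segregated, lower-income areas

def risk_score(prior_arrests: int, age: int, zip_code: str) -> int:
    """Toy additive risk score; higher means the model treats the person as riskier."""
    score = 0
    score += 2 * prior_arrests           # criminal-history feature
    score += 1 if age < 25 else 0        # age feature
    # The problematic proxy: because of historical segregation, penalizing a
    # neighborhood effectively penalizes the racial groups who live there.
    score += 3 if zip_code in HIGH_POVERTY_ZIPS else 0
    return score

def risk_category(score: int) -> str:
    """Map the score to the low/medium/high bands described above."""
    if score <= 2:
        return "low"      # released before the court date
    if score <= 4:
        return "medium"   # judge decides whether to release
    return "high"         # remains in jail

# Two people with identical records receive different outcomes
# solely because of where they live.
for zc in ("90001", "90210"):
    score = risk_score(prior_arrests=1, age=30, zip_code=zc)
    print(zc, risk_category(score))
# Output: 90001 -> "high", 90210 -> "low"
```

In this toy model, the zip-code term alone pushes one person from a low-risk band into the high-risk band, which is the mechanism the paragraph above describes: a facially neutral feature reproduces the geography of segregation inside the algorithm's output.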