Is AI replicating the racism baked into the policing system?
The civil rights battles of the past were fought in the streets, on buses, and at lunch counters. Today, they are unfolding in algorithms, housing policies, and the unseen codes of artificial intelligence. The Jim Crow laws that once enforced segregation in the American South may have been dismantled, but their spirit has mutated, embedding itself in the digital and economic structures that shape everyday life. This is the new face of systemic racism: a blend of technology, urban planning, and data-driven discrimination that redraws the racial map of America without the need for explicit laws.
In the realm of law enforcement, AI-driven predictive policing tools disproportionately target Black communities, reinforcing historical biases in crime statistics. Systems trained on decades of racially skewed policing data flag Black neighbourhoods as high-crime areas, prompting heavier surveillance, a larger police presence, and, in turn, more recorded arrests: a self-fulfilling cycle in which the data appears to confirm the bias that produced it. One of the most infamous examples is PredPol, a predictive policing system used in cities across the U.S., which an investigation by The Markup found repeatedly directed police toward Black neighbourhoods. Facial recognition software compounds the problem. Research, including work from the MIT Media Lab, has shown that these systems misidentify Black individuals at far higher rates than white individuals, and such errors have already led to wrongful arrests and systemic violations of privacy under the guise of efficiency.
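The feedback loop described above can be made concrete with a toy simulation. This is a deliberately simplified sketch, not a model of PredPol or any real system: two districts share an identical underlying offence rate, but one starts with more recorded incidents because of historically heavier policing, and a "predictive" allocator sends patrols wherever the records point.

```python
import random

random.seed(0)

# Toy model: two districts with the SAME true offence rate, but district A
# begins with more recorded incidents due to historically heavier policing.
# All numbers here are hypothetical.
TRUE_RATE = 0.05                  # identical underlying rate in both districts
recorded = {"A": 120, "B": 60}    # skewed historical records
PATROLS_TOTAL = 100

for year in range(10):
    total = sum(recorded.values())
    for district in recorded:
        # The "predictive" step: allocate patrols in proportion to past records.
        patrols = round(PATROLS_TOTAL * recorded[district] / total)
        # More patrols -> more incidents observed, even though the true rate
        # is the same everywhere; each patrol detects offences at TRUE_RATE.
        observed = sum(random.random() < TRUE_RATE for _ in range(patrols))
        recorded[district] += observed

# District A receives roughly twice the patrols every year, so its records
# grow faster in absolute terms, and the allocator reads that growth as
# confirmation that A needs even more policing.
print(recorded)
```

The point of the sketch is that no step is overtly discriminatory: the allocator just "follows the data". The bias lives entirely in the starting conditions, which the loop then preserves and amplifies.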
Meanwhile, economic segregation is finding new life through algorithmic redlining. Mortgage lenders and landlords use AI-driven credit scoring systems that favour wealthier, predominantly white applicants while rejecting Black and brown homebuyers at disproportionate rates. Even ride-hailing and food delivery services have been found to discriminate against users based on ZIP codes, further entrenching disparities in access to transportation and employment opportunities.
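Algorithmic redlining works the same way: a model that never sees race can still discriminate through a correlated proxy such as ZIP code. The sketch below uses entirely hypothetical applicants, ZIP codes, and weights; it is not any lender's actual scoring formula, only an illustration of the proxy mechanism.

```python
# Hypothetical applicants: because of historical segregation, ZIP 10001 is
# predominantly white and ZIP 20002 predominantly Black in this toy setup.
applicants = [
    {"zip": "10001", "income": 52_000},
    {"zip": "10001", "income": 48_000},
    {"zip": "20002", "income": 52_000},   # same income, different ZIP
    {"zip": "20002", "income": 48_000},
]

# A penalty "learned" from historical default data that itself reflects
# decades of redlining (hypothetical values).
ZIP_PENALTY = {"10001": 0, "20002": 40}

def credit_score(app: dict) -> float:
    # Race is never an input, yet the ZIP penalty encodes it indirectly.
    return app["income"] / 1000 - ZIP_PENALTY[app["zip"]]

APPROVAL_THRESHOLD = 45
approved = [a for a in applicants if credit_score(a) >= APPROVAL_THRESHOLD]

# Applicants with identical incomes get opposite outcomes purely because
# of where they live.
print([a["zip"] for a in approved])
```

Auditing for this kind of bias requires testing outcomes across groups, not just checking whether race appears in the feature list, which is why "the model doesn't use race" is no defence.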
The racial divide also widens due to climate change and urban planning decisions. Historically Black neighbourhoods, already vulnerable due to decades of disinvestment, are being targeted for redevelopment, pushing long-term residents out through gentrification. At the same time, climate disasters like hurricanes and wildfires disproportionately devastate Black and brown communities, yet recovery funds and infrastructure investments continue to favour wealthier, whiter areas.
Unlike the overt segregation of the Jim Crow era, today’s racial inequities are coded into systems that appear neutral on the surface. The challenge now is recognising and dismantling these hidden mechanisms before they solidify into an unchallenged status quo. The new Jim Crow isn’t a set of laws—it’s an algorithm, a policy, a decision made by an unseen hand that keeps systemic racism alive in a digital age.
Sources
Benjamin, Ruha. "Race After Technology: Abolitionist Tools for the New Jim Code." Polity Press, 2019.
Noble, Safiya Umoja. "Algorithms of Oppression: How Search Engines Reinforce Racism." NYU Press, 2018.
The Markup. "PredPol’s Predictive Policing Software Disproportionately Targets Black Neighborhoods."
MIT Media Lab. "Facial Recognition Software Is Biased Against People of Color."
ProPublica. "Machine Bias: There’s Software Used Across the Country to Predict Future Criminals. And It’s Biased Against Black People."
The New York Times. "How Climate Change Disproportionately Affects Black Communities."