Biased AI perpetuates racial injustice – TechCrunch

As we increasingly rely on AI, we must be vigilant to ensure these programs are helping to solve problems of racial injustice, rather than perpetuate and magnify them.

6/28/2020 7:30:00 PM



Miriam Vogel is the president and CEO of EqualAI, a nonprofit organization focused on reducing unconscious bias in artificial intelligence.

The murder of George Floyd was shocking, but we know that his death was not unique. Too many Black lives have been stolen from their families and communities as a result of historical racism. Deep and numerous threads of racial injustice are woven into our country, and they have come to a head following the recent murders of George Floyd, Ahmaud Arbery and Breonna Taylor.


Just as important as the process underway to admit to and understand the origin of racial discrimination will be our collective determination to forge a more equitable and inclusive path forward. As we commit to addressing this intolerable and untenable reality, our discussions must include the role of artificial intelligence (AI). While racism has permeated our history, AI now plays a role in creating, exacerbating and hiding these disparities behind the facade of a seemingly neutral, scientific machine. In reality, AI is a mirror that reflects and magnifies the bias in our society.

I had the privilege of working with Deputy Attorney General Sally Yates to introduce implicit bias training to federal law enforcement at the Department of Justice, which I found to be as educational for those working on the curriculum as it was for those participating. Implicit bias is a fact of humanity that both facilitates (e.g., knowing it’s safe to cross the street) and impedes (e.g., forming false initial impressions based on race or gender) our activities. This phenomenon is now playing out at scale with AI.

As we have learned, law enforcement activities such as predictive policing have too often targeted communities of color, resulting in a disproportionate number of arrests of persons of color. These arrests are then logged into the system as data points, aggregated into larger data sets and, in recent years, used to create AI systems. This process creates a feedback loop: predictive policing algorithms lead law enforcement to patrol, and therefore to observe crime in, the same neighborhoods they already patrol, which shapes the data and thus future recommendations. Likewise, arrests made during the current protests will become data points in future data sets used to build AI systems.
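To make the mechanics of that loop concrete, the sketch below is a minimal, hypothetical simulation; it is an illustration of the general pattern, not the author's model and not based on any real policing system. It assumes two neighborhoods with identical underlying crime rates, a naive model that allocates patrols in proportion to recorded arrests, and the rule that arrests are only recorded where patrols occur. The neighborhood names, numbers and allocation rule are all illustrative assumptions.

```python
# Hypothetical toy model (not from the article, not real policing data):
# two neighborhoods, A and B, with the SAME underlying crime rate, but A
# starts with more recorded arrests because it was patrolled more heavily.
# Patrols follow recorded arrests, and new arrests are only recorded where
# patrols occur, so the historical imbalance is reproduced year after year.
import random

random.seed(0)

TRUE_CRIME_RATE = 0.05                   # identical in both neighborhoods
ENCOUNTERS_PER_PATROL = 20               # chances to observe crime per patrol
TOTAL_PATROLS = 100
recorded_arrests = {"A": 60, "B": 40}    # inherited imbalance from past patrols

for year in range(1, 11):
    total_recorded = sum(recorded_arrests.values())
    new_arrests = {}
    for hood, count in recorded_arrests.items():
        # The "predictive" model sends patrols where past arrests were recorded.
        patrols = round(TOTAL_PATROLS * count / total_recorded)
        # Crime is only observed (and logged) where officers actually patrol.
        new_arrests[hood] = sum(
            random.random() < TRUE_CRIME_RATE
            for _ in range(patrols * ENCOUNTERS_PER_PATROL)
        )
    for hood, n in new_arrests.items():
        recorded_arrests[hood] += n
    share_a = recorded_arrests["A"] / sum(recorded_arrests.values())
    print(f"year {year}: share of records (and patrols) pointing at A = {share_a:.2f}")
```

In this toy setup, the expected share of patrols sent to neighborhood A stays at its inherited 60 percent; nothing in the loop pulls it back toward an even split, because the system only measures what it is already looking at.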

This feedback loop of bias within AI plays out throughout the criminal justice system and our society at large, such as in determining how long to sentence a defendant, whether to approve an application for a home loan or whether to schedule an interview with a job candidate. In short, many AI programs are built on and propagate bias in decisions that will determine an individual’s and their family’s financial security and opportunities, or lack thereof, often without the user even knowing their role in perpetuating bias.

This dangerous and unjust loop did not create all of the racial disparities under protest, but it reinforced and normalized them under the protected cover of a black box.
