
Amid reckoning on police racism, algorithm bias in focus


5 Jul 2020, 6:24am


(Photo: Facial recognition technology is increasingly used in law enforcement, amid concerns that low accuracy for people of color could reinforce racial bias. AFP/David McNew)

WASHINGTON: A wave of protests over law enforcement abuses has highlighted concerns over artificial intelligence programs like facial recognition, which critics say may reinforce racial bias.


While the protests have focused on police misconduct, activists point out flaws that may lead to unfair applications of technologies for law enforcement, including facial recognition, predictive policing and "risk assessment" algorithms.

The issue came to the forefront recently with the wrongful arrest in Detroit of an African American man based on a flawed algorithm which identified him as a robbery suspect.

Critics of facial recognition use in law enforcement say the case underscores the pervasive impact of a flawed technology.

Mutale Nkonde, an AI researcher, said that even though the idea of bias and algorithms has been debated for years, the latest case and other incidents have driven home the message.

"What is different in this moment is we have explainability and people are really beginning to realise the way these algorithms are used for decision-making," said Nkonde, a fellow at Stanford University's Digital Society Lab and the Berkman-Klein Center at Harvard.

Amazon, IBM and Microsoft have said they would not sell facial recognition technology to law enforcement without rules to protect against unfair use. But many other vendors offer a range of technologies.

SECRET ALGORITHMS

Nkonde said the technologies are only as good as the data they rely on.

"We know the criminal justice system is biased, so any model you create is going to have 'dirty data,'" she said.Daniel Castro of the Information Technology & Innovation Foundation, a Washington think tank, said however it would be counterproductive to ban a technology which automates investigative tasks and enables police to be more productive.

"There are (facial recognition) systems that are accurate, so we need to have more testing and transparency," Castro said."Everyone is concerned about false identification, but that can happen whether it's a person or a computer."


Seda Gurses, a researcher at the Netherlands-based Delft University of Technology, said one problem with analysing the systems is that they use proprietary, secret algorithms, sometimes from multiple vendors.

"This makes it very difficult to identify under what conditions the dataset was collected, what qualities these images had, how the algorithm was trained," Gurses said.

PREDICTIVE LIMITS

The use of artificial intelligence in "predictive policing," which is growing in many cities, has also raised concerns over reinforcing bias.

The systems have been touted as helping make better use of limited police budgets, but some research suggests they increase deployments to communities which have already been identified, rightly or wrongly, as high-crime zones.

These models"are susceptible to runaway feedback loops, where police are repeatedly sent back to the same neighborhoods regardless of the actual crime rate," said a 2019 report by the AI Now Institute at New York University, based a study of 13 cities using the technology.

These systems may be gamed by "biased police data," the report said.

In a related matter, an outcry from academics prompted the cancellation of a research paper which claimed facial recognition algorithms could predict with 80 per cent accuracy whether someone is likely to be a criminal.

ROBOT VS HUMANS

Ironically, many artificial intelligence programs for law enforcement and criminal justice were designed with the hope of reducing bias in the system.

So-called risk assessment algorithms were designed to help judges and others in the system make unbiased recommendations on who is sent to jail, or released on bond or parole.

But the fairness of such systems was questioned in a 2019 report by the Partnership on AI, a consortium which includes tech giants such as Google and Facebook, as well as organisations such as Amnesty International and the American Civil Liberties Union.


"It is perhaps counterintuitive, but in complex settings like criminal justice, virtually all statistical predictions will be biased even if the data was accurate, and even if variables such as race are excluded, unless specific steps are taken to measure and mitigate bias," the report said.

Nkonde said recent research highlights the need to keep humans in the loop for important decisions.

"You cannot change the history of racism and sexism," she said. "But you can make sure the algorithm does not become the final decision maker."

Castro said algorithms are designed to carry out what public officials want, and the solution to unfair practices lies more with policy than technology.

"We can't always agree on fairness," he said. "When we use a computer to do something, the critique is leveled at the algorithm when it should be at the overall system."

