IBM has abandoned one of its technologies amid protests in the US


The company said it refuses to "work with any technology used for mass surveillance and violations of basic human rights and freedoms"

IBM has abandoned one of its technologies amid protests in the United States. Photo: parentology.com

IBM, one of the world's largest software makers, will no longer develop or sell face recognition software. The company's CEO, Arvind Krishna, announced this in a letter to the US Congress.

Krishna said IBM "refuses to work with any technology used for mass surveillance, racial profiling and violations of basic human rights and freedoms." In the letter, the head of the company also endorsed the need for police reform and called for a nationwide dialogue on whether and how law enforcement agencies should use modern facial recognition technology.

IBM had previously tried to address the problem of bias in face recognition technology. In 2018, the company published a training dataset with equal representation of people across ethnic groups, genders, and ages. The dataset was meant to improve the accuracy of AI models, which are often trained on images dominated by a single demographic group. As it turned out, the company had sourced the roughly one million pictures from the photo-hosting service Flickr. IBM later stated that it only used publicly available images and that the owners could withdraw their photos at any time.
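To illustrate the idea behind such a dataset, here is a minimal sketch of stratified downsampling: keeping the same number of images per demographic group so that no single group dominates training. The record fields (`path`, `group`) and the counts are hypothetical and do not describe IBM's actual pipeline:

```python
import random
from collections import defaultdict

def balance_dataset(records, group_key="group", per_group=100, seed=42):
    """Stratified downsampling: keep an equal number of images per
    demographic group so no group dominates the training set."""
    rng = random.Random(seed)
    buckets = defaultdict(list)
    for rec in records:
        buckets[rec[group_key]].append(rec)
    balanced = []
    for group, items in buckets.items():
        if len(items) < per_group:
            raise ValueError(f"group {group!r} has only {len(items)} images")
        balanced.extend(rng.sample(items, per_group))
    rng.shuffle(balanced)  # avoid long runs of one group during training
    return balanced

# Hypothetical metadata records; in practice these would come from
# the dataset's annotation files.
records = [{"path": f"img_{i:04d}.jpg", "group": g}
           for i, g in enumerate(["a", "b", "c"] * 200)]
print(len(balance_dataset(records)))  # 300: 100 images per group
```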

As a reminder, protests against police violence have continued across the United States since May 26, sparked by the death of 46-year-old African American George Floyd. After his death, people across the country joined peaceful demonstrations that later, amid clashes with police, escalated into riots and looting. As a result, dozens of major cities, including Washington, introduced curfews.

Background

We previously wrote that many face recognition systems often misidentify Black people. In late 2019, the US government published data to this effect, citing a study by the National Institute of Standards and Technology (NIST). For the report, NIST tested 189 algorithms from 99 developers.

The study revealed that in database searches, many algorithms incorrectly identify African American and Asian faces, with error rates 10 to 100 times higher than for Caucasian faces.
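For context on what such an error rate means in practice, evaluations of this kind typically compare the false match rate per demographic group, i.e. how often two images of different people are scored as a match. Below is a minimal sketch of that per-group comparison; the similarity scores, group labels, and threshold are made up, and this is not NIST's actual evaluation code:

```python
def false_match_rate(impostor_scores, threshold):
    """Share of impostor comparisons (two different people) that the
    system nevertheless scores above the match threshold."""
    if not impostor_scores:
        return 0.0
    return sum(s >= threshold for s in impostor_scores) / len(impostor_scores)

# Made-up similarity scores for impostor pairs, split by demographic group.
impostor_scores_by_group = {
    "group_a": [0.31, 0.85, 0.44, 0.91, 0.27],
    "group_b": [0.12, 0.22, 0.18, 0.09, 0.43],
}

for group, scores in impostor_scores_by_group.items():
    print(group, false_match_rate(scores, threshold=0.8))
# group_a: 0.4, group_b: 0.0 -- a large per-group disparity like the
# one the study describes.
```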


Based on material from theverge.com.

