Virginia State Senator Scott Surovell, who sponsored legislation permitting police use of facial recognition, speaks during a Senate session in Richmond, Virginia, U.S., in March 2022. (JoNathan Collins/Handout via REUTERS)

From 2019 through 2021, many American state and city governments passed laws restricting the use of facial recognition software in crime prevention.

Facial recognition is a way of using software to identify an individual by matching their face to a database of images. Studies at the time found the technology less successful at correctly identifying Black people than white people. So, under pressure from social justice organizations, many states put restrictions in place.

Now, local governments across the United States are starting to use facial recognition software again. The move comes as crime in the country has increased. At the same time, software companies say the technology has improved and no longer misidentifies Black people more often than others.

In the coming months, the states of Virginia and California will permit local police to use the recognition programs. Before this year, Vermont did not permit the software, but now officers can use the systems in cases of child sexual abuse.

In New Orleans, Louisiana, murder reports rose 67 percent over the last two years compared with the previous two. And police in the southern city are set to start using the technology later this month.

The leader of the New Orleans police department is Shaun Ferguson. He said the “technology is needed to solve these crimes and to hold individuals accountable.”

Improved technology

Ongoing research by the National Institute of Standards and Technology (NIST) has shown facial recognition technology is getting better. And testing from the Department of Homeland Security found the software is better at recognizing people of all skin colors.

But not all government agencies are certain about the improvements.

The General Services Administration reviews companies that want to work for the U.S. government. It recently said the tools do not work well enough when studying African Americans. In addition, the U.S. government is forming a group to study facial recognition and its use in police work.

In this Oct. 31, 2018, file photo, a man, who declined to be identified, has his face painted to represent efforts to defeat facial recognition during a protest at Amazon headquarters over the company's facial recognition system, "Rekognition," in Seattle. (AP Photo/Elaine Thompson, File)

Jake Parker works for the Security Industry Association, a trade group that represents software companies. He said there is a “growing interest” in the technology if it is used in “a nondiscriminatory way that benefits communities.”

One company that makes the software is Clearview AI. It recently settled a legal action brought by the American Civil Liberties Union (ACLU). The ACLU said Clearview violated the privacy of billions of people by collecting photos and information taken from social media without permission. Clearview, which helps police find matches in its social media database, said it welcomes “any regulation that helps society … while limiting potential downsides.”

Limited use of software

Starting on July 1, police in Virginia will be able to use facial recognition tools that have 98 percent or higher accuracy based on NIST testing. The software can only be used with some controls. For example, it cannot be used with live video. That means police can only review images that have already been gathered.

Parker, from the Security Industry Association, said Virginia is the first state in the nation to require that the software be approved by a U.S. government agency before police can use it.

FILE - A U.S. Customs and Border Protection facial recognition device is shown at a United Airlines gate, Wednesday, July 12, 2017, at George Bush Intercontinental Airport, in Houston. (AP Photo/David J. Phillip)

New York City Mayor Eric Adams is a former member of the city police department. He said facial recognition software could be used safely under existing rules.

Other states, such as Washington, are putting rules in place that require police departments to show that the software works under the same conditions in which it will be used in day-to-day police work.

In California, the use of facial recognition with body cameras may come back on January 1, 2023. Jennifer Jones is a lawyer for the ACLU in Northern California. She said news reports about crime are giving police departments an opening to restart conversations about the software.

“Police departments are exploiting people’s fears” about crime in order to gain more power, she said, adding that new technology is “pushed in moments of crisis.”

I’m Dan Friedell.

Dan Friedell adapted this story for VOA Learning English based on reporting by Reuters.

Write to us in the Comments Section.

_________________________________________________________

Words in This Story

accountable – adj. required to take responsibility for something that happens

benefit – v. to be useful or helpful to someone or something

regulation – n. an official rule that says how something should be done

potential – adj. possible; capable of happening or developing in the future

accuracy – n. the ability to work without making mistakes

exploit – v. to use something in a way that helps you unfairly