Amid privacy concerns and recent research showing racial disparities in the accuracy of facial recognition technology, some city and state officials are proposing to limit its use.
Law enforcement officials say facial recognition software can be an effective crime-fighting tool, and some landlords say it could enhance security in their buildings. But civil liberties activists worry that vulnerable populations such as residents of public housing or rent-stabilized apartments are at risk for law enforcement overreach.
“This is a very dangerous technology,” said Neema Singh Guliani, a senior legislative counsel at the American Civil Liberties Union. “Facial recognition is different from other technologies. You can identify someone from afar. They may never know. And you can do it on a massive scale.”
The earliest forms of facial recognition technology originated in the 1990s, and local law enforcement began using it in 2009. Today, its use has expanded to companies such as Facebook and Apple.
Such software uses biometrics to read the geometry of faces found in a photograph or video and compare the images to a database of other facial images to find a match. It’s used to verify personal identity — the FBI, for example, has access to 412 million facial images.
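As a rough illustration only, and not a depiction of any specific vendor's or agency's system, matching typically works by reducing each face to a numeric "embedding" derived from its geometry and then finding the closest stored embedding in a database. The short Python sketch below uses made-up vectors, a hypothetical gallery and a placeholder distance threshold to show how such a nearest-match lookup might operate.

```python
# Illustrative sketch only: compares a probe face "embedding" (a numeric
# vector derived from facial geometry) against a gallery of enrolled
# embeddings. All vectors and names here are hypothetical.
import numpy as np

def best_match(probe, gallery, threshold=0.6):
    """Return the gallery ID closest to the probe, or None if nothing is close enough."""
    ids = list(gallery.keys())
    vectors = np.array([gallery[i] for i in ids])
    # Euclidean distance between the probe and every enrolled face
    distances = np.linalg.norm(vectors - probe, axis=1)
    best = int(np.argmin(distances))
    return ids[best] if distances[best] < threshold else None

# Hypothetical enrolled database: ID -> embedding vector
gallery = {
    "person_a": np.array([0.12, 0.88, 0.44]),
    "person_b": np.array([0.95, 0.10, 0.30]),
}
probe = np.array([0.11, 0.90, 0.40])   # embedding extracted from a new image
print(best_match(probe, gallery))      # -> "person_a"
```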
“Our industry certainly needs to do a better job of helping educate the public [about] how the technology works and how it’s used,” said Jake Parker, senior director of government relations at the Security Industry Association, a trade association based in Silver Spring, Md.
“Any technology has the potential to be misused,” Parker said. “But in the United States, we have a number of constitutional protections that limit what the government can do.”
A 2018 study by the Massachusetts Institute of Technology found that the software more often misidentifies darker-skinned people, particularly women of color, raising concerns about bias built into the technology. The study found the software had an error rate of 34.7% for darker-skinned women, compared with 0.8% for lighter-skinned men.
This year several cities — San Francisco; Somerville, Mass.; and Oakland, Calif. — became the first to ban municipal departments, including police and housing agencies, from using facial recognition technology. And this year, lawmakers in at least 10 states introduced bills to ban or delay the use of the technology by government agencies and businesses.
“We’re concerned about government overreach,” Michigan state Rep. Isaac Robinson, a Democrat who sponsored one of the bills, told Stateline. “And preserving our right to walk freely down the street without having our faces scanned.”
A handful of private apartment complexes in New York have started using the technology. But for now, few public housing complexes seem to be embracing facial recognition software, said Adrianne Todman, CEO of the National Association of Housing and Redevelopment Officials.
In Detroit, one public housing complex uses live cameras as part of the citywide surveillance system Project Green Light Detroit. Images from those cameras could be loaded into the Detroit Police Department’s facial recognition software.
Agencies rely more on cameras and security personnel to manage safety issues in their communities, Todman said. “They also rely on information they get from residents, who often are the most informed about what’s happening on their floors, in their buildings and in their neighborhoods.”
Legislative action
In May, U.S. Housing and Urban Development Secretary Ben Carson, a Detroit native, was asked about the use of the technology in public housing by U.S. Rep. Rashida Tlaib, a Democrat who is also from Detroit.
“I oppose the inappropriate use of it,” Carson said. He did not specify what use