Untargeted Facial Recognition is Unvetted and Unsafe: New Orleans City Council Should Reject It

The New Orleans City Council is set to vote on an ill-conceived ordinance that would facilitate the use of untargeted facial recognition surveillance by the New Orleans Police Department (NOPD). Recent reports have revealed that the NOPD not only intentionally misled the public for years about its use of facial recognition, but also relied on an unauthorized partnership with a private organization to violate existing restrictions on the technology’s use.

The use of untargeted facial recognition by law enforcement is flawed by design, and it may have already led to improper police stops, investigations, and even arrests of innocent people. But because the program has been cloaked in secrecy, with the NOPD refusing to keep records of the alerts the system triggered and how officers responded to them, whatever harms it has caused remain hidden. By touting a small set of success stories while sweeping a potentially long list of failures under the rug, proponents of this system are promoting a distorted picture of untested, unvetted, and unsafe AI mass surveillance. The New Orleans City Council should not be fooled by this effort: it should reject the proposed ordinance and impose stronger restrictions on the NOPD’s use of facial recognition.