
Zwicker & Conaway Introduce Bill to Regulate NJ Law Enforcement’s Use of Facial Recognition Technology

Assembly Committee Discussed Measure Monday, Heard Expert Testimony

Just recently, the New Jersey Attorney General prohibited law enforcement in all 21 counties from using Clearview AI’s facial recognition software after a New York Times article revealed the start-up had built its database by scraping billions of photos from social media platforms.

In light of the many concerns surrounding this technology, Assembly Democrats Andrew Zwicker and Herb Conaway introduced a measure (A-1210) that would require the Attorney General or the governing body of a county or municipality, as appropriate, to hold a public hearing before any facial recognition technology is implemented by a State or local law enforcement agency.

“The use of facial recognition technology raises serious data privacy concerns, so we need to be thoughtful in our consideration of policy around this issue,” said Assemblyman Andrew Zwicker, chair of the committee. “In creating a public forum for discussion, we can involve local New Jersey residents in the dialogue about facial recognition and its use in police work. While the benefits of facial recognition technology are real, the inherent risks to privacy, civil liberties, and civil rights are severely consequential and must be weighed equally.”

During Monday’s meeting, the Assembly Science, Innovation and Technology Committee received testimony from academic, policy and industry experts on the use, prevalence and risks of facial recognition technology.

“The fact is facial recognition technology is out there and law enforcement is using it,” said Conaway (D-Burlington). “But without a better sense of how exactly they are using it, or an understanding of how effective it is in helping apprehend criminals and solve crimes, we can’t make any definitive assessments about whether applications of this technology are necessary, appropriate and ethical.”

‘Facial recognition technology,’ as defined in the bill, is a computer application that uses facial recognition algorithms to identify or verify a person from a digital image or from a video frame of a video source.

The legislation would further require that the initial public hearing identify clear objectives and goals for agencies’ use of the technology, and that the program’s efficacy be evaluated after five years.

A test run by the ACLU of Northern California revealed that facial recognition software misidentified 26 members of California’s State Legislature as individuals in an arrest photo database. More than half of the misidentified legislators were lawmakers of color, pointing to a worrisome racial disparity in the software’s accuracy.