Through the looking glass: The controversial use of facial recognition technology in New Orleans highlights tensions that cities face in balancing artificial intelligence-driven policing with civil liberties. Supporters call it a vital tool for modern law enforcement, while critics warn that unchecked surveillance endangers fundamental rights.
For two years, New Orleans quietly served as a testing ground for one of the most ambitious – and controversial – uses of facial recognition in American policing. More than 200 artificial intelligence-powered cameras scanned faces in real time, alerting police via phone when a match was detected. According to The Washington Post, the system is run not by the city but by a private nonprofit, Project NOLA, an arrangement that has sparked intense debate over privacy, oversight, and the future of law enforcement technology.
Unlike law enforcement's traditional use of facial recognition, which typically relies on submitting still images from crime scenes for later analysis, New Orleans took a more proactive and pervasive approach. Project NOLA installed cameras – many mounted outside businesses in high-crime areas like the French Quarter – that streamed video to a control room at the University of New Orleans. Advanced machine vision algorithms developed by Project NOLA founder Bryan Lagarde scanned footage for faces, even in poor lighting or at difficult angles.
At the system's core is a watchlist model built on a database of 30,000 faces that Lagarde, a former police officer, assembled from police mugshots and other law enforcement records. When a camera detected a face, the software compared it in real time against the faces in that database. If it found a match, the system instantly sent an alert to a law enforcement mobile app, identifying the person and their location. Officers could then respond immediately, often arriving within minutes.
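In practical terms, this kind of watchlist matching usually comes down to comparing a numerical "embedding" of a detected face against stored embeddings of everyone on the list. The Python sketch below illustrates that general pattern only; the WatchlistEntry structure, the send_alert stub, and the 0.6 similarity threshold are assumptions for illustration, not details of Project NOLA's actual software.

```python
# Minimal sketch of watchlist-style matching, assuming an upstream model has
# already detected faces and converted them to fixed-length embedding vectors.
# All names and the threshold below are illustrative assumptions.
from dataclasses import dataclass

import numpy as np


@dataclass
class WatchlistEntry:
    person_id: str
    embedding: np.ndarray  # face embedding vector for one watchlisted person


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity in [-1, 1] between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def check_against_watchlist(face_embedding, watchlist, camera_id, threshold=0.6):
    """Compare one detected face against every watchlist entry; alert on the best match."""
    best_entry, best_score = None, threshold
    for entry in watchlist:
        score = cosine_similarity(face_embedding, entry.embedding)
        if score > best_score:
            best_entry, best_score = entry, score
    if best_entry is not None:
        send_alert(best_entry.person_id, camera_id, best_score)


def send_alert(person_id: str, camera_id: str, score: float) -> None:
    # Stand-in for pushing a notification to an officer-facing mobile app.
    print(f"ALERT: {person_id} matched at camera {camera_id} (score={score:.2f})")
```

In a real-time deployment, a loop like this would run against every face detected in every incoming video frame, which is what turns a static mugshot database into continuous surveillance.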
Project NOLA's network is technically decentralized, with each camera owned by private citizens or businesses. Together, these cameras create an extensive surveillance web monitored by Project NOLA staff and occasionally by law enforcement analysts. The organization says footage is retained for no more than 30 days before deletion and that it does not sell or share data with private companies.
However, the technology's reach goes beyond real-time alerts. Project NOLA can upload an image and search archived footage from every camera for past appearances, effectively retracing a person's movements and associations over the previous month. This pervasive location tracking has raised significant Fourth Amendment concerns among civil liberties advocates.
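Mechanically, that kind of retrospective search is straightforward once every detection is logged with a timestamp, camera ID, and face embedding: finding someone's past appearances becomes a filtered similarity scan over the retention window. The sketch below is a hypothetical illustration of the concept; the Detection structure, the 30-day default, and the threshold are assumptions, not a description of Project NOLA's implementation.

```python
# Hypothetical sketch of retrospective search over a log of past detections.
from dataclasses import dataclass
from datetime import datetime, timedelta

import numpy as np


@dataclass
class Detection:
    timestamp: datetime
    camera_id: str
    embedding: np.ndarray  # face embedding captured at detection time


def find_past_appearances(query_embedding, detections, days=30, threshold=0.6):
    """Return (timestamp, camera_id) pairs where the queried face likely appeared."""
    cutoff = datetime.utcnow() - timedelta(days=days)
    hits = []
    for det in detections:
        if det.timestamp < cutoff:
            continue  # outside the assumed retention window
        sim = float(np.dot(query_embedding, det.embedding) /
                    (np.linalg.norm(query_embedding) * np.linalg.norm(det.embedding)))
        if sim >= threshold:
            hits.append((det.timestamp, det.camera_id))
    return sorted(hits)  # a chronological trace of the person's movements
```

The output of such a query is effectively a location history, which is why civil liberties advocates frame this capability as a Fourth Amendment issue rather than a simple search tool.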
The system's use reportedly violates a 2022 city ordinance restricting police facial recognition to targeted searches in violent crime investigations. The law requires officers to log and review each use. Officers must send images to a state-run fusion center, where trained examiners compare them against a database and confirm a match only if at least two experts agree. However, law enforcement officials frequently bypassed this process, relying instead on Project NOLA's automated alerts. The police department did not document most of these uses in its reports and excluded them from mandatory disclosures to the city council.
Project NOLA's involvement blurs the line between public and private surveillance. The nonprofit operates independently, setting watchlists and managing alerts without direct police control. Officers cannot add or remove names from the database but often request assistance or footage from Project NOLA staff. This setup lets police benefit from continuous surveillance while avoiding transparency and oversight rules that apply to official law enforcement operations. Civil liberties groups have condemned the system as a "nightmare scenario," warning it allows authorities to track people without their knowledge or consent.
The technology is sophisticated. Some cameras can identify individuals up to 700 feet away, drawing on facial features, clothing, and physical characteristics. Another controversial aspect is that most of the system's components are manufactured by Dahua, one of several Chinese electronics companies the US government has banned over security concerns. However, Project NOLA maintains that it uses highly secured, American-made servers.
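The report does not say how those cues are weighed against one another; a common approach in long-range identification is to fuse per-cue similarity scores into a single confidence value. The snippet below is purely illustrative: the cue names and weights are invented for the example and are not a description of Project NOLA's software.

```python
# Illustrative-only sketch of fusing several weak cues (face, clothing, build)
# into one match confidence, since long-range identification is described as
# relying on more than facial features alone. Weights and cue names are assumptions.
def fused_match_score(scores: dict[str, float]) -> float:
    """Combine per-cue similarity scores (each in [0, 1]) into a weighted total."""
    weights = {"face": 0.6, "clothing": 0.25, "build": 0.15}  # assumed weights
    return sum(weights[cue] * scores.get(cue, 0.0) for cue in weights)


# Example: a distant subject with a partially visible face but distinctive clothing.
print(fused_match_score({"face": 0.45, "clothing": 0.9, "build": 0.7}))  # ~0.6
```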
Since early 2023, Project NOLA's facial recognition network has contributed to at least 34 arrests, including some for nonviolent offenses. In one theft case, a detective provided Project NOLA with surveillance images. Using clothing and facial recognition tools, the nonprofit identified a suspect and added him to the watchlist. Later, cameras detected him in the French Quarter, triggering police alerts and leading to an arrest. However, the official report omitted any mention of facial recognition or real-time tracking.
In April, after inquiries from The Washington Post and internal concerns about the program's legality, Police Superintendent Anne Kirkpatrick paused the automated alerts pending a review of the department's compliance with city law. For now, Project NOLA staff still receive alerts and may relay information to police by phone, text, or email.