New Orleans used AI surveillance without public knowledge or full oversight

Skye Jacobs

Through the looking glass: The controversial use of facial recognition technology in New Orleans highlights tensions that cities face in balancing artificial intelligence-driven policing with civil liberties. Supporters call it a vital tool for modern law enforcement, while critics warn that unchecked surveillance endangers fundamental rights.

For two years, New Orleans quietly served as a testing ground for one of the most ambitious – and controversial – uses of facial recognition in American policing. More than 200 artificial intelligence-powered cameras scanned faces in real time, alerting police via phone when a match was detected. The Washington Post notes that a private nonprofit, Project NOLA, operates the system – not the city – sparking intense debate over privacy, oversight, and the future of law enforcement technology.

Unlike law enforcement's traditional use of facial recognition, which typically relies on submitting still images from crime scenes for later analysis, New Orleans took a more proactive and pervasive approach. Project NOLA installed cameras – many mounted outside businesses in high-crime areas like the French Quarter – that streamed video to a control room at the University of New Orleans. Advanced machine vision algorithms developed by Project NOLA founder Bryan Lagarde scanned footage for faces, even in poor lighting or at difficult angles.

The system's core is a watchlist-based model using a database of 30,000 faces that Lagarde, a former police officer, assembled from police mugshots and other law enforcement records. When a camera detected a face, the software performed real-time image comparisons with those in the database. If it found a match, the system instantly sent an alert to a law enforcement mobile app, identifying the person and their location. Officers could then respond immediately, often arriving within minutes.
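
The Post's reporting does not describe Project NOLA's software internals, but the workflow above – detect a face, compare it against a fixed watchlist, push an alert on a match – corresponds to a standard embedding-similarity loop. The sketch below is purely illustrative: the embed_face() placeholder, the similarity threshold, and the alert payload are assumptions, not details from the article.

```python
# Illustrative sketch only: a generic watchlist match-and-alert loop.
# Nothing here is taken from Project NOLA's actual software; embed_face(),
# the 0.6 threshold, and the alert format are hypothetical placeholders.
from dataclasses import dataclass

import numpy as np


@dataclass
class WatchlistEntry:
    name: str
    embedding: np.ndarray  # unit-length face embedding (e.g. 128-D)


def embed_face(face_image: np.ndarray) -> np.ndarray:
    """Placeholder: a real system would run a face-embedding model here."""
    vec = np.resize(face_image.astype(np.float64).ravel(), 128)
    norm = np.linalg.norm(vec)
    return vec / norm if norm else vec


def best_match(embedding: np.ndarray, watchlist: list[WatchlistEntry],
               threshold: float = 0.6) -> WatchlistEntry | None:
    """Return the closest watchlist entry by cosine similarity, if any clears the threshold."""
    best, best_score = None, threshold
    for entry in watchlist:
        score = float(np.dot(embedding, entry.embedding))
        if score > best_score:
            best, best_score = entry, score
    return best


def process_frame(face_image: np.ndarray, camera_id: str,
                  watchlist: list[WatchlistEntry]) -> dict | None:
    """Compare one detected face against the watchlist; return an alert payload on a hit."""
    match = best_match(embed_face(face_image), watchlist)
    if match is None:
        return None
    # In the system the article describes, an alert like this would be
    # pushed to a law enforcement mobile app with the camera's location.
    return {"camera": camera_id, "person": match.name}
```

The similarity threshold is the usual knob in a setup like this: lowering it catches more true matches but also produces more false alerts against a watchlist of 30,000 faces.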

Project NOLA's network is technically decentralized, with each camera owned by private citizens or businesses. Together, these cameras create an extensive surveillance web monitored by Project NOLA staff and occasionally by law enforcement analysts. The organization says it retains footage for no more than 30 days before deletion and does not sell or share data with private companies.

However, the technology's reach goes beyond real-time alerts. Project NOLA can upload an image and search all camera feeds for past appearances, effectively retracing a person's movements and associations over the previous month. This pervasive location tracking has raised significant Fourth Amendment concerns among civil liberties advocates.
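
The article does not say how this look-back capability is implemented, but what it describes – upload one image and find every prior sighting within the retention window – amounts to a similarity query over a log of stored detections. The sketch below only illustrates that idea; the Detection record, the embedding comparison, and the threshold are assumptions, with the 30-day window taken from the retention period mentioned above.

```python
# Illustrative sketch of a retrospective search over retained detections:
# given one probe image's embedding, scan a month of stored sightings.
# The event log, the embeddings, and the 0.6 threshold are assumptions,
# not details of Project NOLA's system.
from dataclasses import dataclass
from datetime import datetime, timedelta

import numpy as np


@dataclass
class Detection:
    timestamp: datetime
    camera_id: str
    embedding: np.ndarray  # unit-length face embedding stored at capture time


def trace_appearances(probe_embedding: np.ndarray,
                      detections: list[Detection],
                      days: int = 30,
                      threshold: float = 0.6) -> list[Detection]:
    """Return detections within the retention window that resemble the probe face."""
    cutoff = datetime.now() - timedelta(days=days)
    hits = [d for d in detections
            if d.timestamp >= cutoff
            and float(np.dot(probe_embedding, d.embedding)) > threshold]
    # Sorting by time turns scattered hits into a rough movement timeline,
    # which is exactly what raises the location-tracking concerns above.
    return sorted(hits, key=lambda d: d.timestamp)
```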

The system's use reportedly violates a 2022 city ordinance restricting police facial recognition to targeted searches in violent crime investigations. The law requires officers to log and review each use. Officers must send images to a state-run fusion center, where trained examiners compare them to a database and confirm matches only if at least two experts agree. However, law enforcement officials frequently bypassed this process, relying instead on Project NOLA's automated alerts. Police departments did not document most of the uses in their reports and excluded them from mandatory disclosures to the city council.

Project NOLA's involvement blurs the line between public and private surveillance. The nonprofit operates independently, setting watchlists and managing alerts without direct police control. Officers cannot add or remove names from the database but often request assistance or footage from Project NOLA staff. This setup lets police benefit from continuous surveillance while avoiding transparency and oversight rules that apply to official law enforcement operations. Civil liberties groups have condemned the system as a "nightmare scenario," warning it allows authorities to track people without their knowledge or consent.


The technology is sophisticated. Some cameras can identify individuals up to 700 feet away, using facial features, clothing, and physical characteristics. Another controversial aspect is that Dahua manufactures most of the system's components. Dahua is one of many Chinese electronics companies the US government has banned over security concerns. However, Project NOLA maintains that it uses highly secured, American-made servers.

Since early 2023, Project NOLA's facial recognition network has contributed to at least 34 arrests, including some for nonviolent offenses. In one theft case, a detective provided Project NOLA with surveillance images. Using clothing and facial recognition tools, the nonprofit identified a suspect and added him to the watchlist. Later, cameras detected him in the French Quarter, triggering police alerts and leading to an arrest. However, the official report omitted any mention of facial recognition or real-time tracking.

In April, after inquiries from The Washington Post and internal concerns about the program's legality, Police Superintendent Anne Kirkpatrick paused automated alerts while the department reviews its compliance with city law. For now, Project NOLA staff still receive alerts and may relay information to police by phone, text, or email.

Image credit: KSLA News

 
"Project NOLA's involvement blurs the line between public and private surveillance."

That's the only key point in this article – and in this whole topic – when it comes to surveillance.
 
When it comes to information, nothing is truly private. If a company or government wants to know, they will find out – even if it breaks laws. There is no law that would stop anyone determined to get the info.

There are agencies within every government – some unknown – that do things many wouldn't like.

Privacy is long gone. Any law meant to protect privacy is more of a pipe dream. It's just there for looks, not to really do or mean anything.

In the '70s, the FBI and the government had 20 servers monitoring a lot of calls. Some 55 years and many big advances in technology later, you can just imagine what they have now.
Everything is monitored. AI now helps with all of that. Imperfect as tech is, it's not going anywhere.
 
So basically, New Orleans outsourced the surveillance state to a nonprofit that’s not bound by pesky laws or oversight. What could possibly go wrong? Just waiting for the "Black Mirror: Bourbon Street" episode at this point.
 
Like it or hate it, this is not a legal issue. It's perfectly legal to film people in public, anytime, any place, for any reason.
 
"Like it or hate it, this is not a legal issue. It's perfectly legal to film people in public, anytime, any place, for any reason."
"The system's use reportedly violates a 2022 city ordinance restricting police facial recognition to targeted searches in violent crime investigations. The law requires officers to log and review each use. Officers must send images to a state-run fusion center, where trained examiners compare them to a database and confirm matches only if at least two experts agree. "

AHEM. It's actually ILLEGAL for authorities to do this. Reeding r hrd.
 
"The Washington Post notes that a private nonprofit, Project NOLA, operates the system – not the city – sparking intense debate over privacy, oversight, and the future of law enforcement technology."

"This pervasive location tracking has raised significant Fourth Amendment concerns among civil liberties advocates."

In general terms, the 4th Amendment does not apply to private businesses as such. It gets messy when the business acts as an agent of the government or is contracted by it. We already have red light cameras, license plate readers on roads, etc. It's only a matter of time before this becomes common and is no longer news.

 
"The system's use reportedly violates a 2022 city ordinance restricting police facial recognition to targeted searches in violent crime investigations. The law requires officers to log and review each use. Officers must send images to a state-run fusion center, where trained examiners compare them to a database and confirm matches only if at least two experts agree. "

"AHEM. It's actually ILLEGAL for authorities to do this. Reeding r hrd."
Weird city ordinance. If they want to make that illegal, sure, but it's not a constitutional issue in the slightest, as the tagline suggests.
 
I am all for it. If you have done nothing wrong, then nothing happens.

If there is an alleged rapist, murderer, etc. walking down the street and the cops come and swoop them up, great. Let's make the streets safe for law-abiding citizens.
 
Oh, for shame. How do people think we get all those great accident videos? I'd rather not be filmed, but I'm not doing anything I need to worry about.
 
Running such an intricate, extensive, and sophisticated network is apparently so cheap that an NGO can do it. It's also apparently very easy to obtain a "permit" to put it all over the city without the city government ever realizing, until some "clueless" journalist finds out.
 
It's an interesting legal question. What is there to keep the public from linking up their security cameras, accessing a public wanted database, and performing automatic look-ups? In communities with high crime rates, people might be racing to sign up once the infrastructure to join is created.
 