(ILLINOIS, UNITED STATES) Federal immigration agents are using a facial recognition app that state and local police in Illinois are barred from deploying, exposing a stark gap between one of the country’s toughest privacy laws and the tools still available to Washington. While Illinois law blocks police agencies from working with Clearview AI, a facial recognition company that scrapes billions of images from the internet, U.S. Immigration and Customs Enforcement continues to access the technology inside the state with little apparent oversight. Privacy advocates say the split leaves residents subject to powerful surveillance they cannot easily see or challenge.
At the center is Illinois’s Biometric Information Privacy Act, or BIPA, a statute widely described by lawyers and digital rights groups as the strongest biometric privacy protection in the United States. Passed to regulate how companies handle fingerprints, face scans, and other sensitive identifiers, the law requires written consent before a private company can collect, store, or use a person’s biometric data, and it allows individuals to sue when that standard is broken. State police and local departments have been forced to steer clear of Clearview AI under that framework. But federal entities are not bound by state restrictions, and immigration officers can keep running facial searches even in places where city cops and county sheriffs cannot.

The result is a two-tier system that leaves Illinois residents with uneven protections depending on who is investigating them. Local officers cannot use a facial recognition app such as Clearview AI without risking a lawsuit under state law, yet federal immigration agents, operating under federal authority, face no comparable state-level limits. Advocates say that imbalance puts federal use beyond the reach of Illinois's signature privacy safeguards and keeps critical details (how often the tool is used, which databases are searched, and what happens to the resulting matches) out of public view.
“This is what dystopian nightmares are made of, this kind of continual expansion of surveillance without any real oversight or restrictions,” said Jeramie Scott, senior counsel at the Electronic Privacy Information Center.
His warning echoes years of concern from civil liberties groups that the speed and secrecy of face matching outstrip the public rules that typically govern searches, warrants, and accountability.
The federal government has maintained contracts with Clearview AI, according to privacy advocates tracking procurement records and public statements, even as Illinois police departments pulled back in the face of lawsuits. That arrangement lets immigration officers tap a system built on billions of face images lifted from websites and social media platforms, then run rapid matches that can help identify a person whose name is unknown or confirm an identity when officers have only a photo. Clearview AI’s pitch to law enforcement has long been speed and scale. In Illinois, the pitch has collided with a law designed to slow down the harvesting and use of biometrics unless people sign off first.
The contrast is especially sharp because BIPA’s core rules are so clear. Companies must tell people what they are collecting, why, and for how long; they must secure written consent; they must publish a retention schedule and guidelines for destroying biometric data. The statute’s private right of action, a rare feature in state privacy law, enables Illinois residents to bring suits on their own and has driven a wave of litigation that reshaped how businesses handle face measurements and other identifiers. The American Civil Liberties Union of Illinois has pushed cases and settlements that reinforced those guardrails. Yet none of those remedies reach into federal agencies’ software choices.
Federal use in Illinois stands on a legal island because state restrictions do not bind Washington’s investigative playbook. There is no statewide ban that can stop ICE from using Clearview AI. There is also no blanket federal prohibition that would bar immigration officers from running face searches within Illinois. Instead, federal use continues in a patchwork environment where local transparency rules and consent requirements apply to Illinois companies and police agencies, but not to federal operations running on a separate track.
Privacy lawyers point to Illinois appellate decisions that have extended protections beyond those found in the U.S. Constitution by emphasizing the state’s explicit right to privacy. Those rulings cemented BIPA’s reach inside Illinois courts and helped turn the law into a national template. Still, the rulings draw their power from state law. When federal agents operate under federal authority, those state court interpretations do not limit them.
The broader national landscape is uneven. By the end of 2024, fifteen states had enacted laws limiting police use of facial recognition, adopting guardrails like warrant requirements or restricting the technology to serious crimes. Those measures set boundaries for state and local investigations but leave federal agencies largely untouched. ICE, like other federal entities, sits outside many of those state statutes, and while agency guidance may exist in pockets, privacy groups say it is not binding in the way Illinois’s statute is, nor is it transparent to the public.
That gap matters in real cases. When local detectives in Illinois want to use a face-matching tool, BIPA forces them to account for consent and imposes the risk of lawsuits if they cut corners. When immigration officers run a search through Clearview AI in the same city, residents have no clear mechanism to find out, challenge, or seek damages. The mismatch leaves advocates warning that the strongest state privacy regime in the country can be sidestepped at a federal keyboard.
For groups like the Electronic Privacy Information Center and the ACLU of Illinois, oversight means written policies, independent audits, public reporting on usage, and strict rules for when and how agents can run a face search. In their view, none of that is guaranteed for federal immigration use inside Illinois.
Illinois officials have not dismantled the federal carve-out because the state cannot regulate federal agencies the way it polices private companies and its own departments. The state's main tool remains BIPA, which has forced tech firms to rewrite terms of service, delete face templates, and revise their data retention schedules. Residents can sue and win damages when their biometric data is taken without consent. But when residents ask whether the same protections stop a federal officer from uploading a photo and running it through a massive database scraped from the public web, the answer is no.
Clearview AI's role keeps drawing scrutiny. The company's database, built by collecting images from social media and across the internet, turns casual photos into search entries. The facial recognition app produces potential matches within seconds. Supporters in law enforcement say that speed helps when time matters, such as identifying a suspect in a violent crime or finding a missing person from a still image. Critics counter that false matches can lead to wrongful stops and arrests, and that mass scraping of faces without consent turns privacy on its head. Illinois law came down on the side of consent and control for residents dealing with companies. But ICE's ongoing use illustrates a federal exemption that keeps one surveillance lane beyond the reach of that choice.
The ACLU of Illinois has spent years defending and strengthening the Biometric Information Privacy Act through court cases and advocacy, arguing that consent, transparency, and the ability to sue are essential to keep biometric technology in check. Their efforts underscore the stakes of letting a federal work-around grow in the shadows. Without state-level leverage, groups have pressed Congress to set nationwide rules or for federal agencies to adopt binding limits themselves, but no comprehensive federal law has passed to match Illinois’s model.
Illinois courts’ expansive view of privacy helps explain why the divide feels so stark. In decisions that emphasize the state’s explicit right to privacy, judges have treated a face scan much like a fingerprint—an immutable marker that, once compromised, cannot be changed. That logic underpins BIPA’s stiff requirements: get consent first, tell people what you are doing, and delete data on a schedule. Even as those principles harden at the state level, they stop short at the federal border.
In practical terms, that means an Illinois resident can demand to know and seek damages if a retailer’s camera scans their face without a signed release, but cannot sue a federal immigration officer for running their photo through Clearview AI during an investigation. The transparency that BIPA forces onto private companies—public policies, retention limits, and clear disclosures—does not carry over to ICE’s use inside Illinois, where reports on volume, accuracy, and error rates are sparse or nonexistent in public records.
The stakes are not only theoretical. While no recent public reports name specific Illinois residents harmed by federal facial recognition use, advocates warn that harm, when it happens, can be hard to detect and harder to remedy. A wrong match might lead to a stop, an interview, or a home visit that never shows up in a docket. A correct match, drawn from a photo scraped without consent, can build a case that a resident has no easy way to challenge. When local police run into BIPA’s limits, they may hand off to federal partners who are free to run the search. That dynamic, critics say, invites forum shopping for surveillance.
For now, the legal reality is stable. As of November 2025, Illinois police agencies remain barred from using Clearview AI and similar facial recognition tools. ICE and other federal agencies continue to use those technologies inside Illinois, taking advantage of a legal gap with little state-level oversight or transparency. The split leaves residents relying on a patchwork of protections that change depending on which badge is in front of them.
Lawmakers elsewhere are watching Illinois, both for its successes and its gaps. The fifteen states that set limits by the end of 2024 did not settle on a single model. Some require a warrant; others restrict certain uses or mandate audits. Few, if any, go as far as Illinois in allowing people to sue. None directly bind federal agencies. For immigrants in particular—people who may already fear government scrutiny—the knowledge that a federal facial recognition app can run where state tools cannot adds to the sense of unequal rules and limited recourse.
Officials at the federal level could narrow the divide by adopting clear, public policies that mirror state best practices: narrow use cases, written approvals, retention limits, and regular reporting. Congress could set a baseline federal privacy law that recognizes biometric data as uniquely sensitive and extends consent requirements nationally. Until then, Illinois’s strongest rules will continue to stop at the edge of federal authority.
Illinois residents who want to understand the protections that do exist can review the state’s Biometric Information Privacy Act, which lays out consent rules, data handling obligations, and the right to sue. That law explains why a grocery chain or a mall cannot quietly scan a shopper’s face. It also explains, indirectly, why a federal immigration agent can still run the same face through Clearview AI in the same city. The law’s teeth bite hard where state jurisdiction reaches—but not beyond.
The debate over Clearview AI’s role in Illinois is not just about one product. It is about whether the rules that govern powerful surveillance tools should depend on which government is holding them. As long as federal agencies operate outside Illinois’s biometric privacy regime, the answer, for people living in the state, remains uneven: strong rights against private companies and local police, and far fewer when the badge says federal.
This Article in a Nutshell
Illinois’s BIPA imposes strict consent, notice, and retention rules on biometric data for companies and local police, and allows private lawsuits. However, federal agencies like ICE are not bound by state law and continue using Clearview AI’s facial recognition inside Illinois, creating a two-tier system with limited transparency and oversight. Privacy advocates and civil-rights groups call for federal policies or legislation to close the gap, demanding written rules, audits, usage reports, and retention limits to protect residents.