DHS Wants a Single Search Engine to Flag Faces and Fingerprints Across Agencies

The Department of Homeland Security is moving to consolidate its face recognition and other biometric technologies into a single system capable of comparing faces, fingerprints, iris scans, and other identifiers collected across its enforcement agencies, according to records reviewed by WIRED.

The agency is asking private biometric contractors how to build a unified platform that would let employees search faces and fingerprints across large government databases already filled with biometrics gathered in different contexts. The goal is to connect components including Customs and Border Protection, Immigration and Customs Enforcement, the Transportation Security Administration, US Citizenship and Immigration Services, the Secret Service, and DHS headquarters, replacing a patchwork of tools that do not share data easily.

The system would support watchlisting, detention, or removal operations and comes as DHS is pushing biometric surveillance far beyond ports of entry and into the hands of intelligence units and masked agents operating hundreds of miles from the border.

The records show DHS is trying to buy a single “matching engine” that can take different kinds of biometrics—faces, fingerprints, iris scans, and more—and run them through the same backend, giving multiple DHS agencies one shared system. In theory, that means the platform would handle both identity checks and investigative searches.

For face recognition specifically, identity verification means the system compares one photo against a single stored record and returns a yes-or-no answer based on a similarity score. For investigations, it searches a large database and returns a ranked list of the closest-looking faces for a human to review, rather than making the identification itself.

Both types of search come with real technical limits. In identity checks, the match threshold is set high, so the system is less likely to wrongly flag an innocent person but more likely to miss a genuine match when the submitted photo is slightly blurry, angled, or outdated. In investigative searches, the threshold is considerably lower: the system is more likely to include the right person somewhere in the results, but it also produces many more false positives that require human review.
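The tradeoff between the two search modes can be sketched in a few lines of Python. Everything here is illustrative: the templates are random vectors standing in for real face embeddings, and the thresholds are arbitrary, but the logic mirrors how a strict one-to-one check differs from a looser one-to-many search.

```python
import numpy as np

rng = np.random.default_rng(0)

def cosine(a, b):
    """Similarity score in [-1, 1]; higher means more alike."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical 128-dimensional face templates; real systems use
# vendor-specific vectors, but the threshold logic is the same.
gallery = {f"person_{i}": rng.normal(size=128) for i in range(1000)}

def verify(probe, record, threshold=0.9):
    """1:1 identity check: yes/no against a single stored record."""
    return cosine(probe, record) >= threshold

def investigate(probe, gallery, threshold=0.3, top_k=5):
    """1:N search: ranked candidates above a looser cutoff,
    left for a human examiner to review."""
    scored = ((name, cosine(probe, vec)) for name, vec in gallery.items())
    hits = [(name, score) for name, score in scored if score >= threshold]
    return sorted(hits, key=lambda h: -h[1])[:top_k]

# A degraded photo of an enrolled person: added noise can push the
# strict 1:1 check below its threshold (a false non-match), while the
# looser 1:N search still tends to surface the right record.
probe = gallery["person_7"] + rng.normal(scale=0.6, size=128)
print(verify(probe, gallery["person_7"]))
print(investigate(probe, gallery))
```

Raising `threshold` in `investigate` trims false positives at the cost of missing the true match — the kind of per-context knob the documents suggest DHS wants to control.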

The documents make clear that DHS wants control over how strict or permissive match thresholds should be, depending on the context.

The department also wants the system wired directly into its existing infrastructure. Contractors would be expected to connect the matcher to current biometric sensors, enrollment systems, and data repositories so information collected in one DHS component can be searched against records held by another.

It’s unclear how workable this is. Different DHS agencies have bought their biometric systems from different companies over many years. Each system turns a face or fingerprint into a numerical template, but many templates are designed to work only with the specific software that created them.

In practice, this means a new department-wide search tool cannot simply “flip a switch” and make everything compatible. DHS would likely have to convert old records into a common format, rebuild them using a new algorithm, or create software bridges that translate between systems. All of these approaches take time and money, and each can affect speed and accuracy.
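A toy numerical sketch shows why this is hard and what the workarounds look like. The two "vendor algorithms" below are just random linear maps — hypothetical stand-ins, not real products — but they capture the core problem: templates from different systems occupy incompatible spaces, so DHS would either have to re-enroll records with one common algorithm or fit a translation bridge between them.

```python
import numpy as np

rng = np.random.default_rng(1)

# Two hypothetical vendor algorithms mapping the same raw image into
# incompatible template spaces (random stand-ins for proprietary models).
W_vendor_a = rng.normal(size=(128, 64))
W_vendor_b = rng.normal(size=(128, 64))

def template(image, model):
    vec = model @ image
    return vec / np.linalg.norm(vec)

image = rng.normal(size=64)  # stand-in for one raw face image

ta = template(image, W_vendor_a)
tb = template(image, W_vendor_b)

# Same person, same photo -- yet cross-vendor similarity is near zero,
# so the two databases cannot simply be pooled and searched together.
print(float(ta @ tb))

# Option 1: re-enroll, recomputing every record with one common model.
# This needs access to the original images, plus time and compute.
re_enrolled = template(image, W_vendor_a)

# Option 2: fit a software bridge that translates vendor B's template
# space into vendor A's, using a least-squares map over paired samples.
samples = rng.normal(size=(500, 64))
A = np.stack([template(x, W_vendor_a) for x in samples])
B = np.stack([template(x, W_vendor_b) for x in samples])
bridge, *_ = np.linalg.lstsq(B, A, rcond=None)

bridged = tb @ bridge  # vendor B template, translated into A's space
print(float(bridged @ ta / np.linalg.norm(bridged)))
```

Both workarounds cost something — re-enrollment requires the original images, and a learned bridge loses accuracy at the margins — which is why small compatibility gaps matter at the scale the documents describe.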

At the scale DHS is proposing—potentially billions of records—even small compatibility gaps can spiral into large problems.

The documents also contain a placeholder indicating that DHS wants to incorporate voiceprint analysis, but no detailed plans for how voiceprints would be collected, stored, or searched. The agency previously used voiceprints in its “Alternative to Detention” program, which allowed immigrants to remain in their communities but required them to submit to intensive monitoring, including GPS ankle trackers and routine check-ins that confirmed their identity using biometric voiceprints.

The Justice Department’s Criminal Resource Manual notes that most federal courts have ruled voiceprint evidence admissible, citing cases from the 1970s and ’80s, while also acknowledging that at least one federal appeals court found the technique inadmissible and that its scientific validity has remained in question since the Supreme Court’s 1993 decision in Daubert v. Merrell Dow Pharmaceuticals. Those questions have only intensified given the rise of AI systems capable of convincingly mimicking a person’s voice.

Got a Tip?
Are you a current or former government employee who wants to talk about US immigration enforcement? We’d like to hear from you. Using a nonwork phone or computer, contact the reporter securely on Signal at dell.3030.

Civil liberties advocates and lawmakers are warning that DHS biometric tools are bleeding into political policing, with Americans photographed and face-scanned in public spaces during and after protests using tools designed to identify people, map relationships, and augment “derogatory” watchlists with little transparency or realistic avenues for redress.

DHS did not immediately respond to questions about how the proposed system would be used in investigative face searches involving US persons, or what safeguards, threshold settings, and oversight mechanisms would govern its use across components.

DHS has not made public the privacy rules that govern how its agents use facial recognition in the field, leaving the public unable to see basic guardrails, like when agents can scan someone, what counts as a valid reason for doing so, and how long results are kept.

Reporting by WIRED this month found DHS rolled out a mobile face recognition tool known as Mobile Fortify last year after rolling back centralized privacy review and department-wide limits. At the same time, the agency revoked the Biden-era policy governing the use of biometrics and has yet to publish a replacement, leaving it unclear what rules now govern the technology.

In announcing the ICE Out of Our Faces Act in early February, US senator Ed Markey framed facial recognition as an enforcement tool that is no longer confined to controlled checkpoints. ICE and CBP are using it “to track, target, and surveil individuals across the country,” he says, arguing the point is not simply identification, but intimidation—technology that can be pointed at people in public spaces to determine who they are and pull up information about them, including in situations where people are merely engaged in lawful protest or public criticism of the government.

Markey and the bill’s other sponsors describe the legislation as a direct attempt to cut off that capability by barring ICE and CBP from acquiring or using facial recognition and other biometric identification systems. It goes beyond symbolic restrictions, requiring the agencies to delete biometric identifiers they’ve already collected, and allowing individuals and state attorneys general to seek civil penalties for violations.

Jeff Migliozzi, who directs communications for the nonprofit Freedom for Immigrants, which runs the National Immigration Detention Hotline, says the expansion of DHS’s biometric infrastructure raises serious civil rights concerns.

“This development is deeply alarming,” he tells WIRED. “These biometric technologies, which have long been riddled with inherent racial biases, are aiding the government’s ability to rapidly expand its surveillance power not only over immigrants and other over-policed communities, but, as we are increasingly seeing with this administration, over political dissidents and anyone in this country regardless of citizenship status as well.”

“The sheer scale and speed at which big tech is converging its biometric, data, and AI capabilities in direct support of expanding the government’s surveillance and deportation dragnet,” Migliozzi says, “is incredibly alarming and is a nightmare for civil rights and personal privacy.”

Publisher: wired.com