Facial Recognition Technology Catches Impostor at Airport, Officials Say

A man entering the U.S. from Brazil on a French passport at Washington, D.C.’s Dulles airport was identified as an impostor by newly installed facial-recognition technology, U.S. Customs and Border Protection said on Thursday.

The identification came three days into the use of a cutting-edge facial-comparison system that matches a person’s face against the photo in a passport, visa, or other travel document.

The traveler was sent to secondary screening, at which point, the CBP said, he became “visibly nervous” and was subjected to a search, which revealed a Republic of Congo I.D. card beneath an insert in his shoe. The photo on that card matched his face.

Attempted entry into the U.S. using false documents is a crime.

The CBP provided information about the incident along with a partially redacted image of the ID found in the man’s shoe. But the agency didn’t provide any independent verification that the facial-recognition technology was what flagged the traveler.

The facial-comparison system has been installed as a technology demonstration at 14 airports and was put into use on August 20. It’s intended to improve accuracy and speed the processing of passengers arriving from international destinations. American citizens are currently allowed to decline the comparison scan.

The CBP’s privacy policy states that it doesn’t store the “biographic data” captured for any travelers, and that the photos are transmitted only for identity verification. Photos of U.S. citizens are deleted within 12 hours of verification, and those of non-U.S. citizens within 14 days.

Facial-recognition systems used in public places, like airports, and public accommodations, like malls and football stadiums, have drawn criticism ever since such systems became capable of matching faces in video against a database, and of recording faces for future matching or other purposes.

The American Civil Liberties Union, for instance, notes that governments can use the technology for continuous surveillance without any suspicion of wrongdoing, and use motor-vehicle agency photographic databases to identify and track people without their knowledge.

At the same time, the tendency of such systems to produce false positives—inaccurate matches that are presented as correct—could put innocent people in the path of law enforcement. The ACLU recently demonstrated that risk with a version of Amazon’s Rekognition facial-recognition technology, which is marketed to various organizations, including police departments.

The ACLU ran the official photo of every member of Congress through the system, which matched 28 of them to criminal mugshots. Amazon said the system was designed to filter candidate matches for further review by humans, and that the ACLU could have followed its best practices by setting a “confidence threshold” for matching that reduces false positives.
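The role of a confidence threshold can be sketched in a few lines of code. This is an illustrative model, not Rekognition’s actual API or scoring; the similarity scores and threshold values below are hypothetical.

```python
# Illustrative sketch (not Amazon's implementation): a face-matching
# system returns a similarity score for each pair of photos, and a
# "confidence threshold" decides which scores count as matches.
# Raising the threshold trades false positives for false negatives.

def is_match(similarity: float, threshold: float) -> bool:
    """Report a match only when the similarity score meets the threshold."""
    return similarity >= threshold

# Hypothetical scores for photo pairs of *different* people.
impostor_scores = [0.62, 0.71, 0.83, 0.88, 0.91]

# At a permissive 80% threshold, three different-person pairs are
# wrongly reported as matches (false positives).
false_positives_low = sum(is_match(s, 0.80) for s in impostor_scores)

# At a stricter 99% threshold, none are.
false_positives_high = sum(is_match(s, 0.99) for s in impostor_scores)

print(false_positives_low, false_positives_high)  # 3 0
```

The trade-off is the crux of the dispute: a low threshold surfaces more candidate matches for human review but produces more false positives of the kind the ACLU's test exposed.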

The facial-comparison system at Dulles, as described by the CBP, only matches people against photos the CBP already has of them. That presents a different set of potential risks: false negatives, in which the system fails to match a traveler to his or her own photo, which the CBP says it mitigates through manual screening.