On March 26, 2022, at around 8:20 a.m., a man in light-blue Nike sweatpants boarded a bus near a shopping plaza in Timonium, outside Baltimore. After the bus driver ordered him to observe a rule requiring passengers to wear face masks, he approached the fare box and began arguing with her. “I hit bitches,” he said, leaning over a plastic shield that the driver was sitting behind. When she pulled out her iPhone to call the police, he reached around the shield, snatched the device, and raced off. The bus driver followed the man outside, where he punched her in the face repeatedly. He then stood by the curb, laughing, as his victim wiped blood from her nose.
By the time police officers canvassed the area, the assailant had fled, but the incident had been captured on surveillance cameras. Officers with the Maryland Transit Administration Police extracted still images from the footage and created a Be on the Lookout bulletin, which was disseminated to law-enforcement agencies. It included several pictures of the alleged perpetrator: a slender Black man whose face was partially obscured by a baseball cap and a hoodie. The bulletin was also sent to the state’s attorney’s office of nearby Harford County, and an analyst there decided to run a facial-recognition search. She fed a still image into software that used algorithms to search a vast database of pictures for faces with similar characteristics. This “probe photograph” generated a list of potential matches. (Researchers have identified roughly eighty “nodal points” that convey the distinct geometry of a human face.) The match that stood out to the analyst was Alonzo Cornelius Sawyer, a Maryland resident in his mid-fifties.
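The matching step can be pictured as a nearest-neighbor search over numerical “embeddings” of faces. The sketch below is a toy illustration, not the vendor’s actual software: the embeddings here are random vectors standing in for the geometric features a real model would extract, and cosine similarity is assumed as the distance measure, which is one common choice.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face-embedding vectors (1.0 = identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def rank_candidates(probe: np.ndarray, gallery: dict, top_k: int = 5):
    """Score the probe embedding against every gallery identity and
    return the top_k most similar, as (name, score) pairs."""
    scores = {name: cosine_similarity(probe, emb) for name, emb in gallery.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)[:top_k]

# Toy gallery: 1,000 identities with random 128-dimensional embeddings.
rng = np.random.default_rng(0)
gallery = {f"person_{i}": rng.normal(size=128) for i in range(1000)}

# A "probe photograph" of person_42, degraded by noise (poor lighting, camera angle).
probe = gallery["person_42"] + rng.normal(scale=0.1, size=128)

matches = rank_candidates(probe, gallery)
```

The output is a ranked list of candidates with scores, not a definitive identification; an analyst still decides which match, if any, “stands out.”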
On March 28th, Sawyer became a person of interest in the case, and his name was forwarded to the M.T.A.’s criminal-investigation unit, where detectives combed through police databases for information about him. They discovered that he had recently been on probation for a string of traffic violations. In three days, he was scheduled to answer a summons by appearing at a courthouse in Harford County, after being stopped for allegedly driving without a license. Sawyer showed up at the hearing in good spirits, laughing with a guard at the entrance. On his way out, after he’d learned that his case had been postponed, a U.S. deputy marshal grabbed him from behind, slammed him against a wall, and asked him if he was Alonzo Sawyer. “Yeah,” Sawyer said. The marshal told him that he had a warrant for his arrest. “Tell me what it’s about,” Sawyer pleaded. The marshal told him that he’d find out soon enough.
Sawyer was handcuffed and taken to the M.T.A.’s police headquarters, in Baltimore, where two officers interrogated him about where he’d been on March 26th. Sawyer said that he and his wife, who were about to move into a new apartment, had been staying at the house of his sister-in-law, who lived in Abingdon, a suburb forty minutes northeast of Baltimore, in Harford County. But he could not remember if he’d spent the entire day there. When was the last time he’d been in Baltimore County? Sawyer said that he couldn’t recall, but insisted that he didn’t ride M.T.A. buses and hadn’t been involved in a confrontation that day. The officers then showed him the Be on the Lookout bulletin, and one of them asked, “Then who was this?” Sawyer stared at the photographs and said, “I don’t know—who is it?” Like him, the man had a thin face and a goatee. But he looked no older than thirty-five, Sawyer thought—young enough to be his son. And although their skin color was similar, the color of their clothing was not. Pointing at the assailant’s sweatpants, he said, “I don’t wear light blue—I don’t even own anything in that color.”
The officers weren’t persuaded. Sawyer was transported to the Baltimore County Detention Center and charged with two counts of second-degree assault and multiple charges related to the theft of the phone. He was denied bail, owing to the viciousness of the crime; the charges carried a potential twenty-five-year sentence.
In 2016, the Center on Privacy and Technology, at Georgetown University Law Center, published a report, “The Perpetual Line-Up,” which estimated that the faces of a hundred and seventeen million Americans were in facial-recognition databases that state and local law-enforcement agencies could access. Many of these images came from government sources—driver’s-license photographs, mug shots, and the like. Other pictures came from sources such as surveillance cameras and social media.
In the years since the report’s publication, the technology has only grown more ubiquitous, not least because selling it is a lucrative business, and A.I. companies have successfully persuaded law-enforcement agencies to become customers. A 2021 investigation by BuzzFeed News found that employees at nearly two thousand public agencies had used or tested software developed by Clearview AI, a facial-recognition firm with a database containing billions of images that have been scraped off the Internet. The company has marketed its services to the police by promising that its software is “100% accurate across all demographic groups.”
Proponents view facial-recognition technology as an invaluable tool that can help make policing more efficient and insure that criminals are held accountable. The technology’s reputation got a boost after it helped investigators identify numerous rioters who stormed the U.S. Capitol on January 6, 2021. In “Your Face Belongs to Us,” a new book that traces the history of facial-recognition technology, Kashmir Hill, a reporter at the Times, describes how, in 2019, a Department of Homeland Security agent investigating a child-sex-abuse case e-mailed a suspect’s photograph to colleagues, one of whom ran the image through Clearview AI’s platform. The agent received back an Instagram photograph of a muscular man and a muscular woman posing at a bodybuilding expo in Las Vegas. In the background of the image was someone who resembled the suspect; he was standing behind a table at the booth of a dietary-supplement company. The agent called the company, which was based in Florida. The man, identified as Andres Rafael Viola, was arrested, and in his subsequent trial federal authorities presented enough other evidence, such as images obtained from his electronic devices, to secure a conviction. Viola was sentenced to thirty-five years in prison.
It’s not hard to imagine why law-enforcement officials might desire a tool capable of such feats. Critics, however, fear that the police could use automated face recognition for more objectionable purposes, such as monitoring the activities of peaceful protesters and impinging on citizens’ privacy. And questions remain about how reliable the tool is. Like all machine-learning systems, facial-recognition software makes predictions by discerning patterns in large volumes of data. This analysis is often done using artificial neural networks, which mimic the function of the human brain. The technology is trained with photographs of faces, just as ChatGPT is trained with text, and builds a statistical model that can assign a confidence score to indicate how similar two images are. But even a confidence score of ninety-nine per cent isn’t a guaranteed match. The companies that market such technology acknowledge that the score reflects an “algorithmic best guess”—one whose accuracy may vary depending on the quality of the probe photograph, which can be compromised by factors such as lighting and camera angle. Moreover, if the data set used to train the algorithm is imbalanced—more male faces than female ones, or more white faces than Black ones—the model may perform worse for some demographic groups. Jonathan Frankle, a neural-networks specialist who has researched facial-recognition technology, told me, “As with all things in machine learning, you’re only as good as your data. If my training data heavily represents a certain group, my model will likely be more reliable at assessing members of that group, because that’s what it saw.”
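Frankle’s point about imbalanced training data can be made concrete with a toy simulation. The code below is not a real face-recognition model; it simply assumes, for illustration, that a group the model rarely saw during training ends up with noisier embeddings, and then measures how often two photographs of the same person clear a fixed match threshold.

```python
import numpy as np

rng = np.random.default_rng(1)
DIM = 128  # dimensionality of the toy face embeddings

def verification_accuracy(noise_scale: float, n_trials: int = 2000,
                          threshold: float = 0.5) -> float:
    """Fraction of same-person photo pairs whose cosine similarity clears the
    match threshold, given how noisily the model embeds that group's faces."""
    correct = 0
    for _ in range(n_trials):
        identity = rng.normal(size=DIM)            # the person's "true" face vector
        photo_a = identity + rng.normal(scale=noise_scale, size=DIM)
        photo_b = identity + rng.normal(scale=noise_scale, size=DIM)
        sim = np.dot(photo_a, photo_b) / (np.linalg.norm(photo_a) * np.linalg.norm(photo_b))
        if sim >= threshold:
            correct += 1
    return correct / n_trials

# Assumption: underrepresentation in training shows up as noisier embeddings.
well_represented = verification_accuracy(noise_scale=0.5)
under_represented = verification_accuracy(noise_scale=1.0)
```

At the same confidence threshold, the well-represented group is matched far more reliably than the underrepresented one, which is the shape of the disparity critics worry about: the system’s errors are not distributed evenly.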