A pair of students at Harvard have built what big tech companies refused to release publicly due to the overwhelming risks and dangers involved: smart glasses with facial recognition technology that automatically looks up someone’s face and identifies them. The students have gone a step further too. Their customized glasses also pull other information about their subject from around the web, including their home address, phone number, and family members.
The project is designed to raise awareness of what is possible with this technology, and the pair are not releasing their code, AnhPhu Nguyen, one of the creators, told 404 Media. But the experiment, tested in some cases on unsuspecting people in the real world according to a demo video, still shows the razor-thin line between a world in which people can move around with relative anonymity and one where your identity and personal information can be pulled up in an instant by strangers.
Nguyen and co-creator Caine Ardayfio call the project I-XRAY. It uses a pair of Meta’s commercially available Ray-Ban smart glasses, and allows a user to “just go from face to name,” Nguyen said.
The demo video posted to X on Tuesday shows the pair using the tech against various people. In one of the first examples, Ardayfio walks towards the wearer. “To use it, you just put the glasses on, and then as you walk by people, the glasses will detect when somebody’s face is in frame,” the video says. “After a few seconds, their personal information pops up on your phone.”
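The video doesn’t spell out what “detect when somebody’s face is in frame” involves under the hood. As a rough stand-in only (the students haven’t said which detector they use, and the video source and size threshold below are assumptions), the trigger could look something like this in Python with OpenCV’s stock face detector:

    import cv2

    # Stand-in for the glasses' camera feed: any OpenCV-readable video source works here.
    face_cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
    )

    def watch_stream(source=0, min_face_px=120, on_face=print):
        cap = cv2.VideoCapture(source)  # 0 = default webcam; a stream URL would also work
        while cap.isOpened():
            ok, frame = cap.read()
            if not ok:
                break
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
            for (x, y, w, h) in faces:
                if w >= min_face_px:  # ignore faces too small or too far away to match reliably
                    on_face(frame[y:y + h, x:x + w])  # hand the face crop to the lookup pipeline
        cap.release()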
In another example, the demo shows a test on what it describes as “a REAL person in the subway.” Ardayfio looks at the results of a face match on his phone, and then approaches a woman he calls Betsy. He introduces himself and claims the pair met through a particular foundation, presumably referencing something included in the search results.
“In our video, we purposefully added reactions we got from random people on the subway in Boston, acting as if we knew them,” Nguyen told 404 Media.
The video beeps out the surname of the woman, but 404 Media was able to easily identify her based on information included in the demo. The woman did not respond to a request for comment, and 404 Media is not publishing her name because it is unclear whether she consented to being used as a test subject.
In the demo, the pair say they were able to identify dozens of people without their knowledge. In some cases, the data was not accurate and provided the wrong name, according to some responses in the video.
“The motivation for this was mainly because we thought it was interesting, it was cool,” Nguyen said. When the pair started to show their project to others, “a lot of people reacted that, oh, this is obviously really cool, we can use this for networking, I can use this to play pranks on my friends, make funny videos,” Nguyen said. Then, some mentioned the potential for stalking. Nguyen gave the example of “Some dude could just find some girl’s home address on the train and just follow them home.”
Ardayfio told 404 Media that when the pair did show the technology to other Harvard students and people on the subway, some said, “Dude, holy shit, this is the craziest thing I’ve ever seen. How do you know my mom’s phone number?”
FROM FACE TO NAME TO ADDRESS TO MORE
Being able to use a pair of glasses or a smartphone’s camera to instantly unmask someone has been a red line in technology for decades. In her book about the rise of facial recognition, New York Times reporter Kashmir Hill detailed how both Facebook and Google had the technology to use facial recognition in combination with a camera feed, but declined to release it. As Hill mentions, Google’s then-chairman Eric Schmidt said more than ten years ago that Google “built that technology, and we withheld it.”
“As far as I know, it’s the only technology that Google built and, after looking at it, we decided to stop,” he added.
A company called Clearview AI broke that unwritten rule and developed a powerful facial recognition system using billions of images scraped from social media. Primarily, Clearview sells its product to law enforcement. Clearview has also explored a pair of smart glasses that would run its facial recognition technology. The company signed a contract with the U.S. Air Force on a related study.
Now, although Nguyen and Ardayfio haven’t released the code for their project, they have demonstrated in a public setting that it is absolutely possible for someone to use mostly off-the-shelf products and services to build a pair of glasses that automatically doxes people.
Nguyen showed 404 Media the glasses in a demonstration over a Google Hangout on Tuesday. He took a photo of Ardayfio, and the system automatically sent his picture to a facial recognition site online. It then scraped the sites where his face was found elsewhere on the web. A couple of minutes or so later, Nguyen’s phone showed Ardayfio’s name, and a range of biographical information such as the school he went to, a program he was previously on, and an essay he wrote.
In an accompanying Google Doc laying out the project, the pair say I-XRAY uses Pimeyes to look up people’s faces. Pimeyes is a facial recognition service that, unlike Clearview, is available to anyone. It has been used by researchers to identify January 6 rioters and by stalkers to unmask sex workers. After uploading a photo of someone’s face, Pimeyes provides a list of faces it believes are a match, and the URLs where those images came from. In the demonstration to 404 Media, the system worked by automatically visiting the Pimeyes website, uploading a photo like a human user would, then rapidly opening the resulting URLs. The test did not work on me because I’ve previously requested that Pimeyes block lookups of my face; you can request a block yourself here.
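As described above, the students’ automation drives the Pimeyes website the way a human user would rather than calling any documented interface. A minimal sketch of that step using Playwright follows; the CSS selectors, URL, and fixed wait are placeholders rather than details from their project, and the live site would need consent clicks and different selectors:

    from playwright.sync_api import sync_playwright

    def reverse_face_search(photo_path):
        """Upload a face photo to the Pimeyes site and collect the result URLs.

        The selectors below are placeholders; the students' actual automation isn't public.
        """
        with sync_playwright() as p:
            browser = p.chromium.launch(headless=True)
            page = browser.new_page()
            page.goto("https://pimeyes.com/en")                   # landing page with the upload widget
            page.set_input_files("input[type=file]", photo_path)  # attach the captured face crop
            page.wait_for_timeout(10_000)                         # crude wait for results to render
            # Grab the outbound links shown next to each matched face (placeholder selector).
            urls = page.eval_on_selector_all("a.result-link", "els => els.map(e => e.href)")
            browser.close()
            return urls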
Those URLs can include things like yearbook archives, profiles on employers’ websites, or local sports clubs someone might be a member of. I-XRAY then scrapes those URLs and uses a large language model (LLM) to infer the person’s name, job, and other personal details, the document says.
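That scrape-and-infer step is simple to sketch. The document doesn’t name the model or the prompt the pair use, so the LLM call below (OpenAI’s chat completions client) and the five-page limit are assumptions:

    import requests
    from openai import OpenAI

    client = OpenAI()  # assumes an OPENAI_API_KEY; the students' actual model and prompt aren't disclosed

    def infer_identity(urls):
        """Fetch pages where a matched face appears and ask an LLM to pull out personal details."""
        pages = []
        for url in urls[:5]:  # a handful of matches is usually enough
            try:
                pages.append(requests.get(url, timeout=10).text[:5000])
            except requests.RequestException:
                continue
        prompt = (
            "These web pages all feature the same person. Infer their likely name, "
            "employer or school, and other personal details, noting which page supports each guess:\n\n"
            + "\n---\n".join(pages)
        )
        resp = client.chat.completions.create(
            model="gpt-4o-mini",
            messages=[{"role": "user", "content": prompt}],
        )
        return resp.choices[0].message.content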
Armed with the name, I-XRAY then performs a lookup on a people search site. These are commercially accessible data brokers that often store a wide range of people’s personal information, such as phone numbers, home addresses, and social media profiles. They can also include information about the subject’s family members. During the demo to 404 Media, Nguyen said they removed Ardayfio’s home address from the people search site used in case of “crazy people.”
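Chained together, the pipeline the document describes reduces to a few calls. The people search sites involved aren’t named here and their lookups aren’t standardized, so lookup_people_search below is only a stub, and pulling the name off the first line of the LLM’s answer is a deliberately naive assumption:

    def lookup_people_search(name):
        # Placeholder stub: a real implementation would query or scrape a people search site.
        return {"name": name, "addresses": [], "phones": [], "relatives": []}

    def dox_from_face(photo_path):
        """Conceptual end-to-end chain: face photo -> reverse face search -> LLM-inferred
        identity -> people search lookup, matching the steps laid out in the students' document."""
        urls = reverse_face_search(photo_path)   # step 1: Pimeyes-style match (sketched above)
        summary = infer_identity(urls)           # step 2: LLM pulls a name and details from the pages
        name = summary.splitlines()[0]           # naive: assume the name is on the first line
        record = lookup_people_search(name)      # step 3: address, phone number, family members
        return {"summary": summary, "broker_record": record}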
From that, the wearer of the glasses automatically has information that, in many circumstances, will likely be enough to identify a stranger on the street: where they work, where they went to school, where they live, and their contact information.
“We would show people photos of them from kindergarten, and they had never even seen the photo before,” Ardayfio said. “Most people were surprised by how much data they have online.”
The document says I-XRAY uses a pair of Meta Ray-Bans 2. When asked for comment, a Meta spokesperson wrote in an email “that Pimeyes facial recognition technology could be used with ANY camera, correct? In other words, this isn't something that only is possible because of Meta Ray-Bans? If so, I think that's an important point to note in the piece.”
Of course, that ignores why the pair specifically chose to use Meta’s Ray-Bans: because in passing, they look just like any other pair of glasses. Nguyen said they decided on smart glasses when thinking of the creepiest way for a bad actor to use this string of different technologies.
The Meta spokesperson declined to comment further, but pointed to Meta’s terms of service for Facebook View, Facebook’s accompanying app for the glasses, which say “You are also responsible for using Facebook View in a safe, lawful, and respectful manner.” Meta’s Ray-Bans do include a light that is designed to turn on when the device is filming, indicating to other people that they might be recorded.
Pimeyes told 404 Media in an email that “we must state that the details provided are quite surprising to us.”
“Our system finds websites that publish similar images but is not designed to identify individuals, either directly or indirectly. The only information our users receive is a list of sources where images with a high similarity rate to the search material are found,” the email added, ignoring the obvious fact that showing a face similar to the one in an uploaded image, along with a link to where that face appears online, is a way to identify someone.
Last year 404 Media reported on a TikTok account whose owner was using off-the-shelf facial recognition tech like Pimeyes to dox random people on the internet for the amusement of millions of viewers. One victim said at the time they “felt a bit violated really.”
“I think people could definitely take the idea and run with it,” Ardayfio said, referring to the glasses. He added that if someone wanted to stalk somebody else, they could have already done it in a less technical way before the pair’s project, such as using Pimeyes and manually going through the results. “If people do run with this idea, I think that’s really bad. I would hope that awareness that we’ve spread on how to protect your data would outweigh any of the negative impacts this could have.” Those guides are included in the Google document.