Forbes recently published the Forbes 400 List for 2024, listing the
400 richest people in the United States.
This inspired me to
make a histogram to show the distribution of wealth in the United States.
It turns out that if you put Elon Musk on the graph,
almost the entire US population is crammed into a vertical bar, one pixel wide.
Each pixel is $500 million wide, illustrating that $500 million essentially rounds to zero from the
perspective of the wealthiest Americans.
The histogram above shows the wealth distribution in red.
Note that the visible red line is one pixel wide at the left and vanishes everywhere else. That is the important point: essentially the entire US population is in that first bar.
The graph is drawn at a scale of 1 pixel = $500 million on the X axis and 1 pixel = 1 million people on the Y axis.
Away from the origin, the red line is invisible—a tiny fraction of a pixel tall since so few people have more than 500 million dollars.
Since the median US household wealth is about $190,000, half the population would be crammed into a microscopic red line 1/2500 of a pixel wide using the scale above.
(The line would be much narrower than the wavelength of light so it would be literally invisible).
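The pixel arithmetic above can be checked with a few lines of Python. This is a minimal sketch using the graph's stated scale ($500 million per pixel) and an assumed physical pixel size of about 0.25 mm; the exact fraction depends on rounding, which is why it comes out near, rather than exactly at, 1/2500.

```python
# Width of the median-wealth bar at the graph's scale of $500 million per pixel.
MEDIAN_HOUSEHOLD_WEALTH = 190_000    # dollars (Federal Reserve, 2022)
DOLLARS_PER_PIXEL = 500_000_000      # X-axis scale used in the graph

bar_width_px = MEDIAN_HOUSEHOLD_WEALTH / DOLLARS_PER_PIXEL
print(f"Median wealth bar: {bar_width_px:.6f} px "
      f"(about 1/{round(1 / bar_width_px)} of a pixel)")

# Compare to the wavelength of visible light (~400-700 nm),
# assuming a typical display pixel of about 0.25 mm.
pixel_size_m = 0.25e-3
bar_width_nm = bar_width_px * pixel_size_m * 1e9
print(f"Physical width: {bar_width_nm:.0f} nm (visible light is ~400-700 nm)")
```

At that width the bar really is narrower than a wavelength of visible light, so it could not be rendered even in principle.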
The very rich are so rich that you could take someone with a thousand times the median amount of money,
and they would still have almost nothing compared to the richest Americans. If you increased their money
by a factor of a thousand yet again, you'd be at Bezos' level, but still well short of Elon Musk.
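The two factor-of-a-thousand jumps are easy to verify. In this sketch the Bezos and Musk figures are approximate 2024 Forbes values, not exact numbers:

```python
median = 190_000            # median US household wealth, dollars
rich = median * 1_000       # $190 million: a thousand times the median
richer = rich * 1_000       # $190 billion: roughly Bezos territory

musk = 244_000_000_000      # Musk's approximate 2024 Forbes figure
print(f"1000x the median:      ${rich:,}")
print(f"1000x that again:      ${richer:,}")
print(f"Still short of Musk by ${musk - richer:,}")
```

Even after multiplying the median by a million, the result trails the top of the list by tens of billions of dollars.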
Another way to visualize the extreme distribution of wealth in the US is to imagine everyone in the US standing up while someone counts
off millions of dollars, once per second. When your net worth is reached, you sit down.
At the first count of $1 million, most people sit down, with 22 million people left standing.
As the count continues—$2 million, $3 million, $4 million—more people sit down.
After 6 seconds, everyone except the "1%" has taken their seat.
As the counting approaches the 17-minute mark, only billionaires are left standing, but there are still days
of counting ahead.
Bill Gates sits down after a bit over one day, leaving 8 people,
but the process is nowhere near the end.
After about two days and 20 hours of counting, Elon Musk finally sits down.
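The sit-down times in the thought experiment follow directly from dividing net worth by the counting rate. A quick sketch, with the Gates and Musk figures assumed to be their approximate 2024 Forbes values:

```python
# Counting off net worth at $1 million per second: when does each group sit down?
RATE = 1_000_000  # dollars counted per second

def sit_down_time_seconds(net_worth_dollars):
    """Seconds of counting before this net worth is reached."""
    return net_worth_dollars / RATE

milestones = [
    ("First billionaire threshold", 1_000_000_000),
    ("Bill Gates (~$107B, 2024)", 107_000_000_000),
    ("Elon Musk (~$244B, 2024)", 244_000_000_000),
]
for label, worth in milestones:
    s = sit_down_time_seconds(worth)
    print(f"{label}: {s / 60:.0f} min = {s / 86_400:.2f} days")
```

A billion dollars takes about 17 minutes; Gates sits down after roughly 1.2 days, and Musk after about 2.8 days, matching the figures in the text.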
Sources
The main source of data is the Forbes 400 List for 2024.
Forbes claims there are 813 billionaires in the US here.
Median wealth data is from the Federal Reserve;
note that it is from 2022 and household rather than personal.
The current US population estimate is from Worldometer.
I estimated wealth above $500 million, extrapolating from
2019 data.
I made a similar graph in 2013; you can see my post here for comparison.
Disclaimers: Wealth data has a lot of sources of error including people vs households, what gets counted, and changing time periods, but I've tried to make this graph as accurate as possible.
I'm not making any prescriptive judgements here, just presenting the data.
Obviously, if you want to see the details of the curve, a logarithmic scale makes more sense, but I
want to show the "true" shape of the curve.
I should also mention that wealth and income are very different things; this post looks strictly at wealth.
A pair of students at Harvard have built what big tech companies refused to release publicly due to the overwhelming risks and danger involved: smart glasses with facial recognition technology that automatically looks up someone’s face and identifies them. The students have gone a step further too. Their customized glasses also pull other information about their subject from around the web, including their home address, phone number, and family members.
The project is designed to raise awareness of what is possible with this technology, and the pair are not releasing their code, AnhPhu Nguyen, one of the creators, told 404 Media. But the experiment, tested in some cases on unsuspecting people in the real world according to a demo video, still shows the razor-thin line between a world in which people can move around with relative anonymity and one where your identity and personal information can be pulled up in an instant by strangers.
Nguyen and co-creator Caine Ardayfio call the project I-XRAY. It uses a pair of Meta’s commercially available Ray Ban smart glasses, and allows a user to “just go from face to name,” Nguyen said.
The demo video posted to X on Tuesday shows the pair using the tech against various people. In one of the first examples, Ardayfio walks towards the wearer. “To use it, you just put the glasses on, and then as you walk by people, the glasses will detect when somebody’s face is in frame,” the video says. “After a few seconds, their personal information pops up on your phone.”
In another example, the demo shows a test on what it describes as “a REAL person in the subway.” Ardayfio looks at the results of a face match on his phone, and then approaches a woman he calls Betsy. He introduces himself and claims the pair met through a particular foundation, presumably referencing something included in the search results.
“In our video, we purposefully added reactions we got from random people on the subway in Boston, acting as if we knew them,” Nguyen told 404 Media.
The video beeps out the surname of the woman, but 404 Media was able to easily identify her based on information included in the demo. The woman did not respond to a request for comment, and 404 Media is not publishing her name because it is unclear whether she consented to being used as a test subject.
In the demo, the pair say they were able to identify dozens of people without their knowledge. In some cases, the data was not accurate and provided the wrong name, according to some responses in the video.
“The motivation for this was mainly because we thought it was interesting, it was cool,” Nguyen said. When the pair started to show their project to others, “a lot of people reacted that, oh, this is obviously really cool, we can use this for networking, I can use this to play pranks on my friends, make funny videos,” Nguyen said. Then, some mentioned the potential for stalking. Nguyen gave the example of “Some dude could just find some girl’s home address on the train and just follow them home.”
Ardayfio told 404 Media that when the pair did show the technology to other Harvard students and people on the subway, some said, “Dude, holy shit, this is the craziest thing I’ve ever seen. How do you know my mom’s phone number?”
FROM FACE TO NAME TO ADDRESS TO MORE
Being able to use a pair of glasses or a smartphone’s camera to instantly unmask someone has been a redline in technology for decades. In her book about the rise of facial recognition, New York Times reporter Kashmir Hill detailed how both Facebook and Google had the technology to use facial recognition in combination with a camera feed, but declined to release it. As Hill mentions, Google’s chairman Eric Schmidt said more than ten years ago that Google “built that technology, and we withheld it.”
“As far as I know, it’s the only technology that Google built and, after looking at it, we decided to stop,” he added.
A company called Clearview AI broke that unwritten rule and developed a powerful facial recognition system using billions of images scraped from social media. Primarily, Clearview sells its product to law enforcement. Clearview has also explored a pair of smart glasses that would run its facial recognition technology. The company signed a contract with the U.S. Air Force on a related study.
Although Nguyen and Ardayfio haven't released the code for their project, they have demonstrated in a public setting that it is absolutely possible for someone to use mostly off-the-shelf products and services to build a pair of glasses that automatically dox people.
Nguyen showed 404 Media the glasses in a demonstration over a Google Hangout on Tuesday. He took a photo of Ardayfio, and the system automatically sent his picture to a facial recognition site online. It then scraped the sites where his face was found elsewhere on the web. A couple of minutes or so later, Nguyen’s phone showed Ardayfio’s name, and a range of biographical information such as the school he went to, a program he was previously on, and an essay he wrote.
In an accompanying Google Doc laying out the project, the pair say I-XRAY uses Pimeyes to look up people's faces. Pimeyes is a facial recognition service that, unlike Clearview, is available to anyone. It has been used by researchers to identify January 6 rioters and by stalkers to unmask sex workers. After uploading a photo of someone's face, Pimeyes provides a list of faces it believes are a match, and the URLs where those images came from. In the demonstration to 404 Media, the system worked by automatically visiting the Pimeyes website, uploading a photo like a human user would, then rapidly opening the resulting URLs. The test did not work on me because I've previously requested that Pimeyes block lookups of my face; you can request a block yourself here.
Those URLs can include things like yearbook archives, profiles on employer’s websites, or local sports clubs someone might be a member of. I-XRAY then scrapes those URLs, and uses a large language model (LLM) to infer the person’s name, job, and other personal details, the document says.
Armed with the name, I-XRAY then performs a lookup on a people search site. These are commercially accessible data brokers that often store a wide range of people's personal information, such as phone numbers, home addresses, and social media profiles. They can also include information about the subject's family members. During the demo to 404 Media, Nguyen said they removed Ardayfio's home address from the people search site used in case of "crazy people."
From that, the wearer of the glasses automatically has information that, in many circumstances, will likely be enough to identify a stranger on the street: where they work, where they went to school, where they live, and their contact information.
“We would show people photos of them from kindergarten, and they had never even seen the photo before,” Ardayfio said. “Most people were surprised by how much data they have online.”
The document says I-XRAY uses a pair of Meta Ray Bans 2. When asked for comment, a Meta spokesperson wrote in an email “that Pimeyes facial recognition technology could be used with ANY camera, correct? In other words, this isn't something that only is possible because of Meta Ray-Bans? If so, I think that's an important point to note in the piece.”
Of course, that ignores why the pair specifically chose to use Meta's Ray Bans: because in passing, they look just like any other pair of glasses. Nguyen said they decided on smart glasses when thinking of the creepiest way for a bad actor to use this string of different technologies.
The Meta spokesperson declined to comment further, but pointed to Meta’s terms of service for Facebook View, Facebook’s accompanying app for the glasses, which say “You are also responsible for using Facebook View in a safe, lawful, and respectful manner.” Meta’s Ray Bans do include a light that is designed to turn on when the device is filming, indicating to other people that they might be recorded.
Pimeyes told 404 Media in an email that “we must state that the details provided are quite surprising to us.”
“Our system finds websites that publish similar images but is not designed to identify individuals, either directly or indirectly. The only information our users receive is a list of sources where images with a high similarity rate to the search material are found,” the email added, ignoring the obvious fact that showing a similar face to that in an uploaded image, along with a link to where that face appears online, is a way to identify someone.
Last year 404 Media reported on a TikTok account whose owner was using off-the-shelf facial recognition tech like Pimeyes to dox random people on the internet for the amusement of millions of viewers. One victim said at the time they “felt a bit violated really.”
“I think people could definitely take the idea and run with it,” Ardayfio said, referring to the glasses. He added that if someone wanted to stalk somebody else, they could have already done it in a less technical way before the pair’s project, such as using Pimeyes and manually going through the results. “If people do run with this idea, I think that’s really bad. I would hope that awareness that we’ve spread on how to protect your data would outweigh any of the negative impacts this could have.” Those guides are included in the Google document.