The photo shows the face of a young woman with long dark hair and a soft smile who says she is “feeling pretty today.” And on Reddit — where Claudia, as she’s named, offers to sell nude photos to anyone who privately messages her — she is quite popular: “Holy crap you are beautiful,” one commenter said.
But Claudia is fake — a bundle of surprisingly convincing photos made by artificial-intelligence image tools, possibly deployed to pull in cash from unsuspecting buyers, according to two synthetic-media researchers.
The rapid advances in AI-image generators like Midjourney and Stable Diffusion have gained global attention in recent weeks for their inventive art pieces and impressive fakes of ex-presidents and popes.
But Claudia’s case hints at the technology’s more explicit side: By allowing anyone to create images of fake people that look uncannily real, the tools are reshaping how porn is made and consumed.
New technology has for years been pioneered through porn, and AI-image tools have not broken from that pattern. Thousands of accounts are now registered in discussion boards and chatrooms devoted to the creation and refinement of synthetic people, the majority of whom resemble girls and women — a rapid shift that could upend a multibillion-dollar industry, undermine demand for real-world models and actors and fuel deeper concerns about female objectification and exploitation.
A systems administrator at a hospital in the Midwest — who, like the other AI-porn creators and viewers interviewed for this story, spoke on the condition of anonymity — said he has been using Stable Diffusion tools to create fetish photos of adult women in diapers, and that advances in image quality have made it so their fakeness doesn’t matter.
“The average person who’s looking at this stuff, I don’t think they care,” he said. “I don’t expect the person I’m looking at online to be the person they say they are. I’m not going to meet this person in real life. … At the end of the day, if they’re not real, who really cares?”
The Claudia account didn’t respond to requests for comment, making it impossible to confirm how the photos were made — or how much money they raised from the months-old ruse.
But the researchers said the photos carried several clear hallmarks of a fake, including strange background details and a neck mole that went missing between poses. “Actually rather easy to create,” one AI programmer said.
The researchers identified several online profiles of women they believe are fake avatars based on the telltale artifacts that some AI image generators leave behind. Using profiles on Instagram, Reddit, Twitter and OnlyFans, the accounts shared images of women in varying stages of undress — and told viewers they should pay or subscribe if they wanted to see more.
The suspected fake accounts did not respond to questions. And because most AI-generated images are not watermarked or fingerprinted in any way at the time of creation, it can be challenging for any viewer to confirm whether they’re real or not.
One account published the videos of an amateur porn actor from Puerto Rico alongside edited images showing the woman’s face on someone else’s body. Neither the fake nor the real account responded to requests for comment.
Hundreds of online accounts followed the fake porn profiles, leaving comments that suggested they believed the women were real.
“Feel a bit cheated,” Reddit user “legalbeagle1966” said after a Washington Post reporter told him Claudia was likely a fraud. A week earlier, he’d commented on her photo that she looked “pretty sexy and perfect.”
Many of the newest fake images rely on AI programs, known as diffusion models, that allow anyone to type in a short prompt of words and create a fake photo for free. The images can then be edited even further to make them more convincing, including to cover up glitchy spots and refine their quality.
The tools are even simpler to use than the “deepfake” software that fueled worries over AI images in 2017. Where deepfakes used deep-learning AI techniques to edit existing videos, diffusion models generate entirely new photos by following the patterns found across the billions of images they’ve analyzed before.
But the new class of images raises many of the same concerns, including that they could be used to impersonate real women. On some forums, users discuss how to use diffusion models and other AI-powered techniques, such as “inpainting,” to superimpose the faces of real women onto the bodies of AI-generated fakes.
“To humiliate and push women out of the public sphere, they don’t even need to look exactly like the women. They rely on the shock effect,” said Sam Gregory, the executive director of Witness, a nonprofit group that specializes in video technology and human rights.