
AI fake nudes are booming. It’s ruining real teens’ lives.

cigaretteman

HR King
May 29, 2001
When Gabi Belle learned there was a naked photo of her circulating on the internet, her body turned cold. The YouTube influencer had never posed for the image, which showed her standing in a field without clothes. She knew it must be fake.


But when Belle, 26, messaged a colleague asking for help removing the image, he told her there were nearly 100 fake photos scattered across the web, mostly housed on websites known for hosting porn generated by artificial intelligence. They were taken down in July, Belle said, but new images depicting her in graphic sexual situations have already surfaced.

“I felt yucky and violated,” Belle said in an interview. “Those private parts are not meant for the world to see because I have not consented to that. So it’s really strange that someone would make images of me.”


Artificial intelligence is fueling an unprecedented boom this year in fake pornographic images and videos. It’s enabled by a rise in cheap and easy-to-use AI tools that can “undress” people in photographs — analyzing what their naked bodies would look like and imposing it into an image — or seamlessly swap a face into a pornographic video.
On the top 10 websites that host AI-generated porn photos, fake nudes have ballooned by more than 290 percent since 2018, according to Genevieve Oh, an industry analyst. These sites feature celebrities and political figures such as New York Rep. Alexandria Ocasio-Cortez alongside ordinary teenage girls, whose likenesses have been seized by bad actors to incite shame, extort money or live out private fantasies.
Victims have little recourse. There’s no federal law governing deepfake porn, and only a handful of states have enacted regulations. President Biden’s AI executive order, issued Monday, recommends but does not require that companies label AI-generated photos, videos and audio to indicate computer-generated work.

Meanwhile, legal scholars warn that AI fake images may not fall under copyright protections for personal likenesses, because they draw from data sets populated by millions of images. “This is clearly a very serious problem,” said Tiffany Li, a law professor at the University of San Francisco.
The advent of AI images poses a particular risk for women and teens, many of whom aren’t prepared for such visibility. A 2019 study by Sensity AI, a company that monitors deepfakes, found 96 percent of deepfake images are pornography, and 99 percent of those photos target women.
“It’s now very much targeting girls,” said Sophie Maddocks, a researcher and digital rights advocate at the University of Pennsylvania. “Young girls and women who aren’t in the public eye.”

‘Look, Mom. What have they done to me?’​

On Sept. 17, Miriam Al Adib Mendiri was returning to her home in southern Spain from a trip when she found her 14-year-old daughter distraught. Her daughter showed her a nude picture of herself.

“Look, Mom. What have they done to me?” Al Adib Mendiri recalled her daughter saying.
She’d never posed nude. But a group of local boys had grabbed clothed photos from the social media profiles of several girls in their town and used an AI “nudifier” app to create the naked pictures, according to police.
The application is one of many AI tools that use real images to create naked photos, which have flooded the web in recent months. By analyzing millions of images, AI software can better predict how a body will look naked and fluidly overlay a face into a pornographic video, said Gang Wang, an expert in AI at the University of Illinois at Urbana-Champaign.

Though many AI image-generators block users from creating pornographic material, open source software, such as Stable Diffusion, makes its code public, letting amateur developers adapt the technology — often for nefarious purposes. (Stability AI, the maker of Stable Diffusion, did not return a request for comment.)


Once public, these apps often run referral programs that encourage users to share the AI-generated photos on social media in exchange for cash, Oh said.

When Oh examined the top 10 websites that host fake porn images, she found more than 415,000 had been uploaded this year, garnering nearly 90 million views.
AI-generated porn videos have also exploded across the web. After scouring the 40 most popular websites for faked videos, Oh found more than 143,000 videos had been added in 2023 — a figure that surpasses all new videos from 2016 to 2022. The fake videos have received more than 4.2 billion views, Oh found.

The Federal Bureau of Investigation warned in June of an uptick in sexual extortion by scammers demanding payment or photos in exchange for not distributing sexual images. While it’s unclear what percentage of these images are AI-generated, the practice is expanding. As of September, more than 26,800 people had been victims of “sextortion” campaigns, a 149 percent rise from 2019, the FBI told The Post.

‘You’re not safe as a woman’​

In May, a poster on a popular pornography forum started a thread called “I can fake your crush.” The idea was simple: “Send me whoever you want to see nude and I can fake them” using AI, the moderator wrote.


Within hours, photos of women came flooding in. “Can u do this girl? not a celeb or influencer,” one poster asked. “My co-worker and my neighbor?” another one added.

Minutes after a request, a naked version of the image would appear on the thread. “Thkx a lot bro, it’s perfect,” one user wrote.
Celebrities are a popular target for fake porn creators aiming to capitalize on search interest for nude photos of famous actors. But websites featuring famous people can lead to a surge in other kinds of nudes. The sites often include “amateur” content from unknown individuals and host ads that market AI porn-making tools.
Google has policies in place to prevent nonconsensual sexual images from appearing in search results, but its protections for deepfake images are not as robust. Deepfake porn and the tools to make it show up prominently on the company’s search engine, even without specifically searching for AI-generated content. Oh documented more than a dozen examples in screenshots, which were independently confirmed by The Post.

Ned Adriance, a spokesman for Google, said in a statement the company is “actively working to bring more protections to search” and that the company lets users request the removal of involuntary fake porn.
Google is in the process of “building more expansive safeguards” that would not require victims to individually request that content be taken down, he said.
Li, of the University of San Francisco, said it can be hard to penalize creators of this content. Section 230 of the Communications Decency Act shields social media companies from liability for the content posted on their sites, leaving little burden for websites to police images.
Victims can request that companies remove photos and videos of their likeness. But because AI draws from a plethora of images in a data set to create a faked photo, it’s harder for a victim to claim the content is derived solely from their likeness, Li said.

“Maybe you can still say: ‘It’s a copyright violation, it’s clear they took my original copyrighted photo and then just added a little bit to it,’” Li said. “But for deep fakes … it’s not that clear … what the original photos were.”
In the absence of federal laws, at least nine states — including California, Texas and Virginia — have passed legislation targeting deepfakes. But these laws vary in scope: In some states victims can press criminal charges, while others only allow civil lawsuits — though it can be difficult to ascertain whom to sue.
 
AI is going to be a disaster all around for mankind. We simply won't be able to deal with it responsibly.
No kidding. AI is taking the "wouldja" to a different and quite violating level. Society, at its worst, is pretty dark and bleak.

Have I mentioned the cabin in the woods yet? ...or maybe an RV and I just drive around to secluded places with my better half in retirement.
 
I'm not sure how exactly we would regulate this, but it seems like something both sides ought to be interested in passing.
Going to have to implement clear laws, stern repercussions, then they'll have to have a task force assigned to combat it.


...more great use of taxpayers' money because humans are terrible.
 
The application is one of many AI tools that use real images to create naked photos, which have flooded the web in recent months. By analyzing millions of images, AI software can better predict how a body will look naked and fluidly overlay a face into a pornographic video, said Gang Wang, an expert in AI at the University of Illinois at Urbana-Champaign.

Oh, come on.
 
I'm not sure how exactly we would regulate this, but it seems like something both sides ought to be interested in passing.
Pandora's box has been opened. AI is probably the most significant invention of our time, and not in a good way, imo. The shit just hit the streets; the thought of where this technology will be in 5 years is terrifying.
 
Pandora's box has been opened. AI is probably the most significant invention of our time, and not in a good way, imo. The shit just hit the streets; the thought of where this technology will be in 5 years is terrifying.
As a tech writer of sorts, I fear I may soon be out of a job. Engineers and LCM professionals will be able to generate their own documentation once this thing goes mainstream. AT&T already has its proprietary version of ChatGPT. I need to make as much money as I can before I have to start stocking shelves at Lowe's.
 
In the long run, these fakes might actually help. I think there will be so much of it nobody will believe any of it is real.
 
Does anyone believe that the Internet is a net positive for humankind? I surely don't, but maybe I'm just an old guy shaking my fist at the clouds.
I don't think it's the internet so much as social media, and simple things made more complex. We took some good things like SMS (literally short messages--Honey, I'll be 30 min late for dinner) and MySpace (ooo, cool, I can do a little html and post some pics for mom back home) and destroyed them.

 
I dunno, I see it as kind of a good thing, as this should kill revenge porn and teen suicide over threats to reveal shared photos. Just tell everyone it's AI now if threatened and move on.
 
As a tech writer of sorts, I fear I may soon be out of a job. Engineers and LCM professionals will be able to generate their own documentation once this thing goes mainstream. AT&T already has its proprietary version of ChatGPT. I need to make as much money as I can before I have to start stocking shelves at Lowe's.
A friend of mine is a VP of marketing and communications at a large corporation. He recently showed me how they’ve used AI for a marketing campaign… from the concept stage to the copywriting to design. It was actually really good. He just plugged in some key words for the type of campaign they wanted, and within minutes he had a concept, storyboards, the writing, etc. So yes, writers will likely be a thing of the past at some point.
 
I'm not sure how exactly we would regulate this, but it seems like something both sides ought to be interested in passing.
I agree...but I have little hope Congress would be able to keep up. They'd always be behind the power curve, and AI will advance at an exponential rate.
 
A friend of mine is a VP of marketing and communications at a large corporation. He recently showed me how they’ve used AI for a marketing campaign… from the concept stage to the copywriting to design. It was actually really good. He just plugged in some key words for the type of campaign they wanted, and within minutes he had a concept, storyboards, the writing, etc. So yes, writers will likely be a thing of the past at some point.
For right now, I'm simply able to double my output. There's still a manual design phase to what I do, so there's a good balance, but once people catch on--I'll be out on my arse. I've got about 12 years left in the work force, so time is NOT on my side. :confused::(
 
If people are generating AI nude photos with actual underage kids' faces, they should be subject to the same laws as any other pedophile would be. Shut that shit down now.

While I agree, I think laws should be equally strict for people doing this to anyone without consent. It's all gross and makes everyone feel horrible.
 