Just four days after its release, an AI-powered software tool that creates realistic nude images of women from photographs was pulled by its creator.
DeepNude, an app created by a developer using the alias “Alberto,” let users upload photographs of women and returned images of them “undressed.” The app used AI algorithms to generate realistic-looking nude photos of the targeted women, images that could easily be passed off as if the subjects had taken them themselves.
Over the past several months, deepfakes, or videos and images manipulated by artificial intelligence, have become an increasingly prominent threat.
Just this past week, a bipartisan group of U.S. senators introduced legislation that would crack down on deepfake videos and content on the internet.
DeepNude, which relies on the very techniques targeted by that legislation, was taken down just days after its release, following intense criticism of the app.
The application was first spotted by Samantha Cole of Motherboard, who reported that it was available for download on Windows, with a $99 offer that provided higher-resolution photos.
After discovering the software, Motherboard went a step further, downloading the free version and running various test photos through it to see how accurate the application was.
“The results vary dramatically, but when fed a well lit, high-resolution image of a woman in a bikini facing the camera directly, the fake nude images are passably realistic,” Cole wrote. “The algorithm accurately fills in details where clothing used to be, angles of the breasts beneath the clothing, nipples, and shadows.”
Cole also noted that “in a paid version, which costs $50, the watermark is removed, but a stamp that says ‘FAKE’ is placed in the upper-left corner.”
“Cropping out the ‘fake’ stamp or removing it with Photoshop would be very easy,” she added.
The reactions to the software have been negative across the board. Katelyn Bowden, founder and CEO of Badass, an activist organization that fights revenge porn, called the software “terrifying.”
“This is absolutely terrifying,” she said. “Now anyone could find themselves a victim of revenge porn, without ever having taken a nude photo. This tech should not be available to the public.”
Danielle Citron, professor of law at the University of Maryland Carey School of Law, called the software an “invasion of sexual privacy.”
Citron, who testified before Congress earlier this year about the deepfake threat, warned of the implications the technology would have for victims.
“Yes, it isn’t your actual vagina, but… others think that they are seeing you naked,” Citron said. “As a deepfake victim said to me—it felt like thousands saw her naked, she felt her body wasn’t her own anymore.”
Hany Farid, a computer-science professor at UC Berkeley, pointed out that regardless of laws put in place, there needs to be an effort to get better at detecting deepfakes.
“We are going to have to get better at detecting deepfakes, and academics and researchers are going to have to think more critically about how to better safeguard their technological advances so that they do not get weaponized and used in unintended and harmful ways,” Farid said.
“In addition, social media platforms are going to have to think more carefully about how to define and enforce rules surrounding this content. And, our legislators are going to have to think about how to thoughtfully regulate in this space.”
The creator of “DeepNude” said he built the app because he was fascinated with the idea of “x-ray glasses”
Alberto, the developer behind the app, explained to Motherboard in an email that he created it out of a long-held fascination with the idea of “x-ray glasses.”
“Like everyone, I was fascinated by the idea that they could really exist and this memory remained,” he wrote to Motherboard. “About two years ago I discovered the potential of AI and started studying the basics. When I found out that GAN networks were able to transform a daytime photo into a nighttime one, I realized that it would be possible to transform a dressed photo into a nude one.”
“Eureka,” he wrote. “I realized that x-ray glasses are possible! Driven by fun and enthusiasm for that discovery, I did my first tests, obtaining interesting results.”
Alberto also pointed out in his email that the algorithm works only on women, because nude images of women are easier to find online.
“The networks are multiple, because each one has a different task: locate the clothes. Mask the clothes. Speculate anatomical positions. Render it,” he said. “All this makes processing slow (30 seconds in a normal computer), but this can be improved and accelerated in the future.”
He added that he has asked himself, “Is this right? Can it hurt someone?”
“I think that what you can do with DeepNude, you can do it very well with Photoshop (after a few hours of tutorial),” he said.
“I also said to myself: the technology is ready (within everyone’s reach),” he said. “So if someone has bad intentions, having DeepNude doesn’t change much… If I don’t do it, someone else will do it in a year.”