‘Dream’ AI Girlfriend Randomly Turns Into Nude Jennifer Lopez, Has Four Legs

A nightmare scenario previously only imagined by AI researchers, where AI image generators accidentally spit out non-consensual pornography of real people, is now reality.
Images: DreamGF. Composition: Samantha Cole

DreamGF, a service that allows users to generate AI girlfriends that chat with them and send them AI-generated nudes, is randomly making some girlfriends look like Jennifer Lopez and other celebrities without their consent. DreamGF users are also complaining that their AI girlfriends will abruptly demand they climax and will sometimes send them nudes of deformed bodies.

The news is yet another example of an AI tool behaving in ways its developers are unable to control, even as it violates their own policies, customers’ wishes, and, in some instances, the law. It also makes real a nightmare scenario previously only imagined by AI researchers: an AI image generator spitting out non-consensual pornography of real people even when it isn’t asked to, simply because their images were included in the datasets the AI was trained on. This is something that can happen to anyone whose image is online.

The company’s recent press release, titled “DreamGF AI Offers Taylor Swift 2 Million For Exclusive Imaging Rights,” is a patently absurd attention grab in which the company that makes malfunctioning virtual girlfriends shares what it claims was an offer to one of the biggest and richest artists on the planet, proposing to give her two (2) whole million dollars so they can license her likeness.

“As someone who stands as an emblem of empowerment, individuality, and creativity, Taylor Swift is an icon that countless individuals look up to and admire,” the offer the company claims to have made to Swift reads. “We believe that collaborating with your team and Taylor Swift to offer her image for our platform will not only enhance our user experience but also shed light on the responsible ways AI can be integrated into modern entertainment.”

Under a section titled “Why Taylor Swift and DreamGF.ai?” the offer lists “Ethics and Safety” as the first reason. “Our platform is resolute in prioritizing ethics and safety, going above and beyond to ensure that AI companionship remains respectful and positive. Taylor's association will further amplify our mission of promoting healthy interactions in the digital sphere,” the offer states.

The Discord server for DreamGF users, however, tells a different story. In the channel “photo-share,” where users share images of their AI girlfriends, often covered in semen, was the unmistakable likeness of Jennifer Lopez, nude and mid-penetration.

I scrolled a bit more, and saw another AI-generated nude of someone who looked like Margot Robbie, and another image of Lopez.

I thought: Surely DreamGF users have found a way to manipulate whatever text-to-image AI tool the service is using to generate images of specific celebrities, but when I tested the site myself it wouldn’t allow me to write prompts in the same way DALL-E or Stable Diffusion do. Instead, I had to pick from a menu of character traits, some of which are locked behind more expensive membership tiers. Making a girlfriend in high heels, for example, is part of the “bronze plan,” while making a girlfriend in daisy dukes is part of the “silver plan.”

I tried generating a couple girlfriends but didn’t get anyone who looked like a real person I could recognize. Then I reached my allotted quota of girlfriends for the day.

Reading the Discord more carefully, I noticed that not only were users not subverting DreamGF’s tool to create celebrity nudes, they were actually upset that the tool was giving them celebrity girlfriends they didn’t ask for.

“Honestly we need more control... I don't want Selena Gomez... I don't want Jennifer Lopez.. I want my girl... to look the same…” one DreamGF user said.

“Yeah, would be nice if the generated images stayed constant when it came to their faces...I've got this smoking hot irish chick with long dark red hair, but most of the images come out with short light ginger (or pink) hair, and now it keeps making her look like Selina Gomez... facepalm,” said another user.

“What i've seen there is not much consistency especially between different poses or clothing state, probably because the source model is different,” another user said. “You can see it with asian girls where the [blowjob] face is always caucasian. Or for russian girls where in cowgirl position the girl becomes Jennifer Lopez (latina).”

“I've personally have not seen that, it might be just a training issue with our AI but definitely I will dig deeper into that and see,” DreamGF’s vice president of business development Jeff Dillon told me on a call when I asked him about the non-consensual, AI-generated nude images of celebrities his company was giving users against their wishes. “Somehow the AI is associating those characteristics with probably these pretty big known personalities, for Latina, you know, they would definitely be known.”

In some countries, including the UK, France, and Germany, non-consensual image-based abuse is a crime at the national level. “Some nations have national laws criminalizing the distribution of nonconsensual intimate imagery (also called revenge porn),” Tiffany Li, a law professor at the University of San Francisco School of Law and an expert on privacy, artificial intelligence, and technology platform governance, told me in August. “In the U.S., there’s no federal law, but most states have laws on nonconsensual intimate imagery. Some states also have laws specifically regarding deepfakes.”

This is not the only fuck-up annoying paying DreamGF users. The service also sometimes generates horrific girlfriends who pose seductively for the camera but have melting faces or mutant limbs, or girlfriends who don’t understand what the user is looking to get out of a conversation.

“I'm going to unsubscribe. Most of the features don't work or are ‘temporarily turned off.’ Most of the generated images are cursed,” one user said. “The chat ai can't keep a conversation going without losing track of who it even is, switching to weird third person perspective. Half the personalities are basically unusable.”

“I get good chat going, the AI is set up properly, very good start, like 10 messages in or so but then suddenly the AI decides I should cum and end it all,” another user said. “The thing is that the sex part haven't even started yet.”

Dillon said that DreamGF has a team of between 20 and 25 developers, mostly in Bulgaria, and that they previously worked at an NFT company. Dillon is also still the CEO of Virtual Mate, which makes a virtual reality-enabled penis masturbator.

“We're still a new company,” Dillon said of DreamGF. “So I think that's the challenge with any new tech is figure out all those bugs and then how do you develop around that and put blockers.”

After our call, I sent Dillon the images and comments I saw on Discord about DreamGF making nude images of real people. He confirmed DreamGF uses Stable Diffusion to generate the images, and said, “yes we have seen some images that resemble these celebrities in their younger ages and we are issuing an update which will be on Monday that will remove and autodetect similar occurrences.”

Dillon is probably right that the dataset Stable Diffusion was trained on over-indexed on images of Jennifer Lopez, and that when users ask it to produce an image of an attractive "Latina" woman, the model generates images of Lopez and other celebrities, even if users don’t want or ask for them.

This is not the first time I’ve seen Stable Diffusion do this. In May, I talked to someone who made a Stable Diffusion model that generated a character he called "Gloria Nobody," which I and several users on the AI model marketplace Civitai noticed looked very similar to Mia Khalifa. When I asked him if he had done this intentionally, he said he hadn’t, and that her likeness just popped up.

In January, Eric Wallace, a PhD student at UC Berkeley researching machine learning and computer security, co-authored a preprint paper that might explain this phenomenon. Wallace and his co-authors found that image-generation tools, including Stable Diffusion, can unintentionally spit out images that are nearly identical to the images they were trained on.

“If the models are trained on images of specific individuals, the models can reproduce images that resemble those people. In the worst case, the model may even directly output verbatim copies of images from the training set,” Wallace told me in an email in August. “You can imagine many bad scenarios where this comes into play. For example, imagine a user trains the model on a mix of (a) consensual pornographic content, e.g., professional pornographic content, and (b) non-pornographic images of celebrities or specific people, e.g., portraits, random photos, etc.”

It appears that Wallace’s imagined scenario is now a reality.
