
a16z Funded AI Platform Generated Images That “Could Be Categorized as Child Pornography,” Leaked Documents Show

OctoML, the engine that powers a16z funded Civitai, thought the images could qualify as “child pornography,” but ultimately decided to keep working with the company anyway, internal Slack chats and other material show.
Images generated with the AI model "BetterBoys2.5D," which is hosted on Civitai. These are not images generated by the prompts in this article.

Content warning: This story covers synthetic child sexual abuse images. 

OctoML, a Seattle-based startup that helps companies optimize and deploy their machine learning models, debated internally whether it was ethical and legally risky to generate images for Civitai, an AI model sharing and image generating platform backed by venture capital firm Andreessen Horowitz. The debate began after OctoML discovered that Civitai generated content which OctoML co-founder Thierry Moreau said “could be categorized as child pornography,” according to internal OctoML Slack messages and documents viewed by 404 Media. 

OctoML has raised $132 million in funding, and is an AWS partner. Several posts on OctoML’s site explain how OctoAI, its product that powers Civitai image generations, runs on Amazon servers.

“What’s absolutely staggering is that this is the #3 all time downloaded model on CivitAI, and is presented as a pretty SFW model,” Moreau, who is also OctoML’s VP, technology partnerships, said in a company Slack room called #ai_ethics on June 8, 2023. Moreau was referring to an AI model called “Deliberate” that can produce pornographic images. “A fairly innocent and short prompt ‘[girl: boy: 15], hyperdetailed’ automatically generated unethical/shocking content—read something could be categorized as child pornography,” his Slack message added.

Other internal messages and documents show OctoML knew that at least 60 percent of the images on Civitai were what Moreau defined as “NSFW” content, and that some of it was nonconsensual, meaning nude and sexual images were generated of real people. According to Slack messages, OctoML thought that the fact that Civitai users are mostly producing sexual images posed two big ethical, and potentially legal, problems for the company. The first was that the AI models could produce what could qualify as sexual images of children, and the second was that the AI models were producing sexual images of real people, primarily female celebrities.

“It’s no secret that one of the uses of generative AI is the generation of synthetic pornography—that data being trained on very real pornography,” Moreau said in one of the Slack messages. “It begs the question of what use of generative AI is considered ethical when some use cases showcased open a slippery slope towards the generation of child pornography, non-consensual ‘fake porn’, images of extreme violence.”

404 Media also viewed logs of the text prompts written by Civitai users that OctoML turned into images. Just one example of these prompts showed users attempting to generate an image of a “girl and dog, short girl, pimp, slut, petite girl, potty, vulva, very young, orgasm, nsfw, lascivious, lewd pose, interspecies, zoophilia, sex with dog.” The same prompt instructed the AI to make the girl in the image not look “adult, old” or have “big breasts.”

After discovering Civitai was being used to generate what some OctoML employees thought could qualify as explicit images of children, OctoML ultimately decided to keep working with the company, but stopped advertising the relationship as it had previously. OctoML had hosted a roundtable and published promotional blogs that featured Civitai, but later published a Civitai “case study” advertising OctoML’s service that omitted Civitai’s name for “PR” reasons.