
Meta Sues Nudify App That Keeps Advertising on Instagram

As part of what it claims is a new crackdown, Meta is suing a nudify app and "strengthening" its enforcement.
Photo by Dima Solomin / Unsplash

Meta said it is suing a nudify app that 404 Media reported had bought thousands of ads on Instagram and Facebook, repeatedly violating Meta's policies.

Meta is suing Joy Timeline HK Limited, the entity behind the CrushAI nudify app, which allows users to take an image of anyone and AI-generate a nude image of them without their consent. Meta said it filed the lawsuit in Hong Kong, where Joy Timeline HK Limited is based, “to prevent them from advertising CrushAI apps on Meta platforms.”

In January, 404 Media reported that CrushAI, also known as Crushmate and other names, had run more than 5,000 ads on Meta’s platforms, and that 90 percent of Crush’s traffic came from Meta’s platforms, a clear sign that the ads were effective at leading people to tools that create nonconsensual media. Alexios Mantzarlis, now of Indicator, was the first to report that Crush’s traffic was coming from Meta. At the time, Meta told us that “This is a highly adversarial space and bad actors are constantly evolving their tactics to avoid enforcement, which is why we continue to invest in the best tools and technology to help identify and remove violating content.”

“This legal action underscores both the seriousness with which we take this abuse and our commitment to doing all we can to protect our community from it,” Meta said in a post on its site announcing the lawsuit. “We’ll continue to take the necessary steps—which could include legal action—against those who abuse our platforms like this.”

However, CrushAI is far from the only nudify app to buy ads on Meta’s platforms. Last year I reported that these ads were common, and although our reporting led Meta to remove the ads and Apple and Google to pull the apps from their app stores, new apps and ads continue to crop up.

To that end, Meta said that when it removes ads for nudify apps, it will now share the URLs for those apps and sites with other tech companies through the Tech Coalition’s Lantern program, so those companies can investigate and take action against the apps as well. Members of that group include Google, Discord, Roblox, Snap, and Twitch. Additionally, Meta said that it’s “strengthening” its enforcement against these “adversarial advertisers.”

“Like other types of online harm, this is an adversarial space in which the people behind it—who are primarily financially motivated—continue to evolve their tactics to avoid detection. For example, some use benign imagery in their ads to avoid being caught by our nudity detection technology, while others quickly create new domain names to replace the websites we block,” Meta said. “That’s why we’re also evolving our enforcement methods. For example, we’ve developed new technology specifically designed to identify these types of ads—even when the ads themselves don’t include nudity—and use matching technology to help us find and remove copycat ads more quickly. We’ve worked with external experts and our own specialist teams to expand the list of safety-related terms, phrases and emojis that our systems are trained to detect within these ads.”

Based on our reporting, and on testing by AI Forensics, a European non-profit that investigates influential and opaque algorithms, content in Meta ads generally does not appear to be moderated as effectively as regular content users post to Meta’s platforms. Specifically, AI Forensics found that the exact same image containing nudity was removed as a normal post on Facebook but allowed when it ran as part of a paid ad.

404 Media’s reporting has led to some pressure from Congress. Meta’s press release did mention last month’s passage of the federal Take It Down Act, which holds platforms liable for hosting this type of content, but the company said the law was not the reason it is taking these actions now.
