Microsoft Closes Loophole That Created AI Porn of Taylor Swift

Following 404 Media’s reporting, Microsoft has made changes to a tool people were using to make AI nudes of celebrities.

Microsoft has introduced more protections to Designer, an AI text-to-image generation tool that people were using to make nonconsensual sexual images of celebrities. Microsoft made the changes after 404 Media reported that the AI-generated nude images of Taylor Swift that went viral on Twitter last week came from 4chan and a Telegram channel where people were using Designer to make AI-generated images of celebrities. 

"We are investigating these reports and are taking appropriate action to address them," a Microsoft spokesperson told us in an email on Friday. "Our Code of Conduct prohibits the use of our tools for the creation of adult or non-consensual intimate content, and any repeated attempts to produce content that goes against our policies may result in loss of access to the service. We have large teams working on the development of guardrails and other safety systems in line with our responsible AI principles, including content filtering, operational monitoring and abuse detection to mitigate misuse of the system and help create a safer environment for users.”
