
‘What Was She Supposed to Report?:’ Police Report Shows How a High School Deepfake Nightmare Unfolded

An in-depth police report obtained by 404 Media shows how a school, and then the police, investigated a wave of AI-powered “nudify” apps in a high school.
Image: Unsplash/Moren Hsu. Collage by 404 Media.

Police in Washington state were alarmed that administrators at a high school did not report that students used AI to take photos from other students’ Instagram accounts and “undress” around seven of their underage classmates, which police characterized as a possible sex crime against children. A police report obtained by 404 Media shows in excruciating detail how the deepfake scandal took over the school, expresses frustration that school administration tried to handle it internally, and shows that a school staffer, who was also a victim of a nonconsensual AI-generated nude, was put in charge of the school’s investigation. 

The incident, which happened at Issaquah High School in suburban Seattle, has been mentioned in several local and national news articles in recent months, but very few details about what happened have been made public. State lawmakers say they want to update laws because of what happened. The police report, which hasn’t been covered previously, shows how relatively common, easy-to-use AI tools, which are being promoted to minors via massive social media platforms, are already being used to harass and bully girls, spread fear in small communities, and create horrifying situations for students, parents, teachers, and police.

The police report makes clear that the images were created with a web-based “nudify” or “undress” app, which automatically and instantly edits photos of women to make them appear naked. The students who used the app to create naked images of other students told police they discovered the app on TikTok and posted some of them on Snapchat or showed them to other students at the lunch table at school. A student “reportedly admitted to making the photos,” the police report says. “[Redacted] went on to tell his friends that he found an app on TikTok for ‘naked AI.’ He then went onto [sic] Safari app and gave them a step by step of how it was done.” The redaction is in the copy of the police report that 404 Media obtained. 404 Media is not publishing the full police report because, even though it is redacted, there is enough information throughout that individual students could be identified.
