No One Knows How to Deal With 'Student-on-Student' AI CSAM

A new report from Stanford finds that schools, parents, police, and our legal system are not prepared to deal with the growing problem of minors using AI to generate CSAM of other minors.
Photo by Redd Francisco / Unsplash

Schools, parents, police, and existing laws are not prepared to deal with the growing problem of students and minors using generative AI tools to create child sexual abuse material of their peers, according to a new report from researchers at the Stanford Cyber Policy Center.

The report, which is based on public records and interviews with NGOs, internet platform staff, law enforcement, government employees, legislators, victims, parents, and groups that offer online training to schools, found that despite the harm this nonconsensual content causes, the practice has been normalized by mainstream online platforms and certain online communities.
