
AI Dataset for Detecting Nudity Contained Child Sexual Abuse Images

The Canadian Centre for Child Protection found more than 120 images of identified or known victims of CSAM in the dataset.

A large image dataset used to develop AI tools for detecting nudity contains a number of images of child sexual abuse material (CSAM), according to the Canadian Centre for Child Protection (C3P). 

The NudeNet dataset, which contains more than 700,000 images scraped from the internet, was used to train an AI image classifier that automatically detects nudity in images. C3P found that more than 250 academic works have either cited or used the NudeNet dataset since it was made available for download from Academic Torrents, a platform for sharing research data, in June 2019.
