A large image dataset used to develop AI tools for detecting nudity contains a number of images of child sexual abuse material (CSAM), according to the Canadian Centre for Child Protection (C3P).
The NudeNet dataset, which contains more than 700,000 images scraped from the internet, was used to train an AI image classifier that can automatically detect nudity in an image. C3P found that more than 250 academic works have either cited or used the NudeNet dataset since it was made available for download on Academic Torrents, a platform for sharing research data, in June 2019.