Child Abuse Images Removed from Artificial Intelligence Image Generator Training Sources

2024-08-31

Artificial intelligence researchers have removed more than 2,000 web links to suspected child sexual abuse images from a database used to train popular AI image generation tools.

The LAION research database is a huge index of online images and captions that has served as a training source for leading AI image generators such as Stable Diffusion and Midjourney. But a report last year by the Stanford Internet Observatory found that the database contained links to child sexual abuse material, making it easier for some AI tools to produce photorealistic deepfake images depicting children. That report prompted the nonprofit that manages the database, the Large-scale Artificial Intelligence Open Network (LAION), to immediately remove its dataset. Eight months later, LAION announced that it had worked with the Stanford University watchdog group and anti-abuse organizations in Canada and the U.K. to fix the problem and release a cleaned-up database for future AI research.

The release of the cleaned-up version of the LAION database comes at a time when governments around the world are paying close attention to how technological tools are being used to produce or distribute illegal images of children.