Popular artificial intelligence (AI) applications such as the image generator Stable Diffusion were trained on a database containing child sexual abuse material, according to research published on Wednesday by the Stanford University Internet Observatory.

The report's authors say they found at least 1,008 images of child abuse in LAION-5B, a popular open-source dataset used by various companies to train artificial intelligence models. The material was reportedly scraped from sources ranging from social networks to pornographic websites.

LAION is a non-profit organization that maintains these databases, which contain billions of items collected from the internet. After being alerted, it told Bloomberg that it would temporarily withdraw the data it makes available to companies. The developers of Imagen, a Google artificial intelligence tool, also found that another library they accessed contained "a wide range of inappropriate content, including pornographic images, racial slurs and harmful social stereotypes."

Facilitating pedophilic content

Experts warn that the presence of pedophilic content in this database could give the image generators trained on it a greater ability to produce fake content that realistically depicts child pornography.

"These give the model an advantage in being able to produce child abuse content that resembles real-life child abuse," says David Thiel, chief technologist at the Stanford Internet Observatory and author of the report. It also creates the risk of re-victimizing specific, real victims.

This possibility has worried experts for some time. Another report, published in July by the University of California in conjunction with the Thorn organization, documented that improvements in artificial intelligence programs were accelerating the creation of synthetic but credible images that "facilitate" the sexual exploitation of minors.