AI models are trained on child pornography – Kommersant
The Stanford Internet Observatory (SIO) has found hundreds of images of child sexual abuse in a public dataset used to train generative neural networks, including Stable Diffusion, the press service of the Stanford Cyber Policy Center said.
The report states that many large neural networks that generate images from text prompts rely on a data bank maintained by the non-profit organization LAION. Among this data, researchers found child pornography, the press service reports.
Work is reportedly underway to remove the source materials; the researchers provided links to them to the US National Center for Missing and Exploited Children and the Canadian Child Advocacy Center. A previous SIO report, produced jointly with the non-profit child online safety group Thorn, found that some generative neural networks can create realistic images that facilitate child sexual exploitation.