He allegedly used Stable Diffusion, a text-to-image generative AI model, to create “thousands of realistic images of prepubescent minors,” prosecutors said.
You don’t need CSAM training data to create CSAM images. If your model knows what children look like, and what naked human bodies look like, then it can create naked children. That’s simply how generative models like this work, and it has absolutely nothing to do with models specifically trained on actual CSAM material.
So while I disagree with him that lack of education is the cause of CSAM or pedophilia… I’d say it could help with the general hysteria about generative AI, like the kind coming from you, where people just let their emotions run wild whenever these topics arise. You need to understand that the goal should be the protection of potential victims, not the punishment of victimless thought crimes.