Even in cases where the content is fully artificial and no real victim is depicted, as in Operation Cumberland, AI-generated CSAM still contributes to the objectification and sexualisation of children.
I get how fucking creepy and downright sickening this all feels, but I’m genuinely surprised that it’s illegal or criminal if there are no actual children involved.
It mentions sexual extortion, and that’s definitely something that should be illegal, same for spreading AI-generated explicit material of real people without their consent, whether they’re children or adults, but idk about the case mentioned here.
On one hand I don’t think this kind of thing can be consequence-free (from a practical standpoint). On the other hand… how old were the subjects? You can’t determine a person’s age just by looking at them, and depicting someone who looks like a child but is actually an adult wouldn’t get you charged as a child pornographer. The whole reason age limits are set is to give reasonable assurance that the subject is not being exploited or otherwise harmed by the act.
This is a massive grey area and I just hope sentences are proportional to the crime. I could live with this kind of thing being classified as a misdemeanor provided the creator didn’t use underage subjects to train or influence the output.
I think it’s pretty stupid. It borders on thoughtcrime territory.
I’d rather see that enforcement effort go toward actually finding people who are harming children.
Ehhhhh…
It also borders on real CSAM.
Walk me through how any rendering “borders on” proof that a child was raped.