He allegedly used Stable Diffusion, a text-to-image generative AI model, to create “thousands of realistic images of prepubescent minors,” prosecutors said.
For some reason the US seems to hold a weird position on this one. I don’t really understand it.
It’s written to be illegal, but if you look at actual prosecutions, I think there have been only a handful of charged cases, and the prominent ones also involved relevant prior offenses, or worse.
It’s also interesting when you consider that there are almost certainly large image boards hosted in the US that host what could be construed as “cartoon CSAM”, notably e621. I’d have to verify their hosting location, but I believe they’re in the US, and so far I don’t believe they’ve ever had any issues with it. I’m sure there are other good examples as well.
I suppose you could argue they’re exempt under the publisher rules, but these sites generally don’t moderate against these images, and I feel like this would be the rare exception where that wouldn’t be applicable.
The law is fucking weird, dude. There is a massive disconnect between what we should be seeing and what we are seeing. I assume it’s because the authorities who moderate this shit almost exclusively go after real CSAM, on account of it actually being a literal offense, as opposed to drawn CSAM, which is a proxy offense.
It seems to me to be a lesser charge: a net that catches a larger population, which they can then go fishing through for bigger fish to make the prosecutor look good. Or, as I’ve heard from others, it’s used to simplify prosecution: PedoAnon can’t argue “it’s a deepfake, not a real kid” to the SWAT team.
> There is a massive disconnect between what we should be seeing and what we are seeing. I assume it’s because the authorities who moderate this shit almost exclusively go after real CSAM, on account of it actually being a literal offense, as opposed to drawn CSAM, which is a proxy offense.
This can be attributed to a lack of proper funding for CSAM enforcement. Pedos get picked up if they become an active embarrassment, like the dude in the article. Otherwise, all the money is just spent on the database getting bigger and keeping the lights on, which works for Congress: a public pedo gets nailed to the wall because of the database, the spooky spectre of the pedo out for your kids remains, vote for me, please…
> It seems to me to be a lesser charge: a net that catches a larger population, which they can then go fishing through for bigger fish to make the prosecutor look good. Or, as I’ve heard from others, it’s used to simplify prosecution: PedoAnon can’t argue “it’s a deepfake, not a real kid” to the SWAT team.
Ah, that could be a possibility as well. Just ensuring reasonable flexibility in prosecution so you can be sure of what you get.
Cartoon CSAM is illegal in the United States
https://www.thefederalcriminalattorneys.com/possession-of-lolicon
https://en.wikipedia.org/wiki/PROTECT_Act_of_2003