• 41 Posts
  • 2.3K Comments
Joined 1 year ago
Cake day: December 20th, 2023


  • As an advocate for the online and offline safety of children, I did read into the research. None of the research I’ve found confirms, with any sort of evidence, that AI-generated CSAM materials increase the risk of other illicit behavior. We need more evidence, and I do recommend exercising caution with statements, but for the time being we can rely on studies of other forms of illegal behavior and the effects of their decriminalization, which paint a fairly positive picture. Generally, people tend to opt for what is legal and more readily accessible - and we can make AI CSAM exactly that.

    For now, people are being criminalized for a zero-evidence-it's-even-bad crime, while I tend to look quite positively on what it can bring to the table instead.

    Also, pedophiles are not human trash, and this line of thinking is itself harmful: it makes more of them hide and never get adequate help from a therapist, increasing their chances of offending. Which, well, harms children.

    They are regular people who, involuntarily, have their sexuality warped in a way that includes children. They never chose it, they cannot do anything about it in itself, and can only figure out what to do with it going forward. You could be one, I could be one. What matters are the decisions they make based on their sexuality. The correct way is celibacy and the refusal of any source of direct harm towards children, including the consumption of real CSAM. This might be hard on many, and to aid them, we can provide fictional materials so they can let off some steam. Otherwise, many are likely to turn to real CSAM as a source of satisfaction, or even turn to actually abusing children IRL.


  • As with most of modern AI, it’s able to train without much human intervention.

    My point is, even if the results are not perfectly accurate and don’t fully resemble a child’s body, they work. They are widely used - in fact, so widely that Europol made a giant issue out of it. People get off to whatever it manages to produce, and that’s what matters.

    I do not care how accurate it is, because it’s not me who consumes this content. I care about how effective it is at curbing worse desires in pedophiles, because I care about the safety of children.



  • I actually do not agree with them being arrested.

    While I recognize the identification issue posed in the article, I hold a strong opinion that it should be tackled in another way.

    AI-generated CSAM might be a powerful tool for reducing demand for content featuring real children. If we keep it legal to watch and produce, while keeping the actual materials illegal, we can make more pedophiles turn to what is less harmful and impactful - a computer-generated image produced with no children being harmed.

    By taking action against AI-generated materials, they make such materials as illegal as the real thing, and an interested party now has one less reason to avoid a CSAM site and watching actual children being abused - perpetuating the cycle and leading to more real-world victims.


  • I’m afraid Europol is shooting themselves in the foot here.

    What should be done is developing better ways to mark and identify AI-generated content, not a blanket ban and criminalization.

    Let whoever happens to crave CSAM (remember: sexuality, however perverted or terrible, is not a choice) use the most harmless outlet - otherwise, they may just turn to the real materials, and as ongoing investigations suggest, there’s no shortage of supply or demand on that front. If everything is illegal, and some of it is going to be sought out anyway, it’s easier to escalate - and that’s dangerous.

    As sickening as it may sound to us, these people often need something, or else things are quickly gonna go downhill. Give them their drawings.





  • Allero to Linux@lemmy.ml · Which browser do you use and why? (edited, 3 days ago)

    Zen for regular activities (I pin all important services), Firefox when browsing for anything else.

    GNU IceCat is also amazing as a concept, but generally unusable, since it ends up blocking too much, and manually allowing everything is a hassle. Still, the pages that do work are clean, and I love that by default the browser doesn’t do anything without your permission - it doesn’t even connect to update and telemetry services. It makes 0 connections on startup, unlike almost anything else (qutebrowser does the same, but unless you are a strong Vim fanboy, you won’t like the experience).


  • Allero to Proton@lemmy.world · I'm out (3 days ago)

    They ended their official Mastodon presence, citing insufficient resources to maintain communities everywhere, which caused outrage across the Fediverse.

    Previously, they vocally supported the actions of the Trump administration on matters of Internet privacy, which caused a massive backlash.

    So, essentially, they have alienated a large part of their userbase through questionable moves.