• deathbird@mander.xyz · +17/−2 · 5 months ago

    the AI has to be trained on something first. It has to somehow know what a naked minor looks like. And to do that, well… You need to feed it CSAM.

    First of all, not every image of a naked child is CSAM. This has actually been something of a problem: automated CSAM-detection systems trigger false positives on non-sexual images and get innocent people into trouble.

    But also, AI systems can blend multiple elements together. They don’t need CSAM training material to create CSAM; the individual elements, crafted into a prompt that avoids any safeguards, are sufficient to create the image.

    • PotatoKat@lemmy.world · +2/−7 · 5 months ago

      You ignored the second part of their post. Even if it didn’t use any CSAM, is it right to use pictures of real children to generate CSAM? I really don’t think it is.

      • deathbird@mander.xyz · +1 · 5 months ago

        There are probably safeguards in place to prevent the creation of CSAM, just as there are for other illegal and offensive things, but determined people work around them.