• KairuByte@lemmy.dbzer0.com · 10 hours ago

    I mean, there’s another side to this.

    Assume you have exacting control of the training data. You give it consensual sexual play, including rough play, BDSM play, and CNC (consensual non-consent) play. We are 100% certain the content is consensual in this hypothetical.

    Is the output a grey area, even if it seems like real rape?

    Now another hypothetical. A person closes their eyes and imagines raping someone. “Real” rape. Is that a grey area?

    Let’s build on that. Let’s say this person is a talented artist, and they draw out their imagined rape scene, which we are 100% certain is a non-consensual scene imagined by the artist. Is this a grey area?

    We can build on that further. What if they take the time to animate this scene? Is that a grey area?

    When does the above cross into a problem? Is it the AI making something that seems like rape but is built on consensual content? The thought of a person imagining a real rape? Putting that thought into a still image? Animating it?

    Or is it none of them?

    • surewhynotlem@lemmy.world · 2 hours ago

      Consensual training data makes it ok. I think AI companies should be accountable for curating inputs.

      Any art is ok as long as the artist consents. Even if they’re drawing horrible things, it’s just a drawing.

      Now the real question is: should we include depictions of the rape of people who have died and have no family? Because then you can’t even argue increased suffering of the victim.

      But maybe this just gets solved by curation and the “don’t be a dick” rule. Because the above sounds kinda dickish.

    • KillingTimeItself@lemmy.dbzer0.com · 10 hours ago

      Is the output a grey area, even if it seems like real rape?

      On a base semantic and mechanical level, no, not at all. These aren’t real people; there are no victims and no perpetrators. You might even argue the opposite: that it’s actually a net positive, because it prevents people from consuming real abuse.

      Now another hypothetical. A person closes their eyes and imagines raping someone. “Real” rape. Is that a grey area?

      Until you can publicly display your own or someone else’s thought process, or read people’s minds, this is definitionally an impossible question to answer. So the default is no, because a pure thought isn’t grounded in any frame of reality.

      Let’s build on that. Let’s say this person is a talented artist, and they draw out their imagined rape scene, which we are 100% certain is a non-consensual scene imagined by the artist. Is this a grey area?

      Assuming it depicts no real persons or identities, no, there is nothing necessarily wrong with this; in fact, I would defer back to my first answer here.

      We can build on that further. What if they take the time to animate this scene? Is that a grey area?

      This is the same as the previous question; the media format makes no difference, since it’s telling the same story.

      When does the above cross into a problem?

      Most people would argue, and I think precedent would probably agree, that this starts to become a problem when explicit external influences are part of the motivation, rather than it being an explicitly internally motivated process. There is necessarily a moral line that must be crossed for this to become more negative than positive. The question is how to define that line with regard to AI.

    • Clent@lemmy.dbzer0.com · 10 hours ago

      We already allow simulated rape in TV and movies. AI simply allows a more graphic portrayal.