Julia, 21, has received fake nude photos of herself generated by artificial intelligence. The phenomenon is exploding.

“I’d already heard about deepfakes and deepnudes (…) but I wasn’t really aware of it until it happened to me. It was a somewhat distant thing that happened in other people’s lives, but surely not in mine,” thought Julia, a 21-year-old Belgian marketing student and semi-professional model.

At the end of September 2023, she received an email from an anonymous sender. Subject: “Realistic?” “We wonder which photo would best resemble you,” she reads.

Attached were five photos of her.

In the original content, posted on her social networks, Julia poses clothed. Before her eyes are the same photos. Only this time, Julia is completely naked.

Julia has never posed naked. She never took these photos. The Belgian model realises that she has been the victim of a deepfake.

    • Artyom@lemm.ee · ↑54 · 8 months ago

      It’s much easier to do now. You should be able to do several in a single minute and the barrier to entry of using the software is way lower than Photoshop. Legally though, these seem indistinguishable.

    • fidodo@lemmy.world · ↑28 ↓4 · edited · 8 months ago

      They’re easier to create and more realistic. The prevalence and magnitude of an immoral act impacts how it should be legislated. Personally I don’t care if people make these and keep it to themselves, but as soon as you spread it I think it’s immoral and harassment and there should be laws to prevent it.

    • Djtecha@lemm.ee · ↑23 ↓6 · 8 months ago

      Probably should have sued those people too… People need to cut this shit out. You’re fucking with other people’s lives.

      • PoliticalAgitator@lemmy.world · ↑7 ↓11 · 8 months ago

        They’re not going to. There is an insane amount of entitlement around people’s jerk off material. Right here on Lemmy, I’ve seen someone (who denied being a child) call pornography a “human right” and groups of people insisting they should be able to openly trade images of child rape as long as they’re AI generated.

        • Cryophilia@lemmy.world · ↑16 ↓4 · 8 months ago

          Fuck you people who equate pornography with child porn. You know what you’re doing, you sick bastards.

          Pornography is not at all the same thing as child porn. Do not speak about them in the same way.

        • Fedizen@lemmy.world · ↑5 ↓1 · 8 months ago

          That sounds like an insane amount of entitlement from the one guy you found. Hopefully that entitles you to ignore everyone with even a fraction more nuance.

          • PoliticalAgitator@lemmy.world · ↑1 ↓1 · 8 months ago

            How dare I ignore the many subtle layers of nuance in “Using AI to create pornographic images of a woman and then sending them to her so she knows you’ve done it”.

        • Flying Squid@lemmy.world · ↑2 · 8 months ago

          and groups of people insisting they should be able to openly trade images of child rape as long as they’re AI generated.

          “Be able to” in what sense? Morally and ethically? No, absolutely not obviously. But what would the legal reason be to make it illegal since no actual children were involved? If I paint an explicit painting of a child being raped, is that illegal? I don’t think it would be. It would certainly give people good reason to be suspicious of me, but would it be illegal? And would an AI-generated image really be different?

          • PoliticalAgitator@lemmy.world · ↑1 ↓2 · 8 months ago

            But what would the legal reason be to make it illegal since no actual children were involved

            Prove it. Trawl through thousands and thousands of images and videos of child sexual assault and tell me which ones were AI generated and which were not. Prove the AI hadn’t been set up to produce CSAM matching a real child’s likeness. Prove it won’t normalize and promote the sexual assault of real children. Prove it wasn’t trained on images and videos of real children being raped.

            Legalising AI-generated child pornography is functionally identical to legalising all child pornography.

            • Flying Squid@lemmy.world · ↑3 · 8 months ago

              Legalizing or already legal? Because that’s my question. I don’t think it would be illegal, at least not in the U.S. I can’t speak for other countries, but here, proving a negative in court isn’t a thing.

    • inspxtr@lemmy.world · ↑4 ↓1 · 8 months ago

      I think porn generation (image, audio and video) will eventually be very realistic and very easy to make with only a few clicks and some well-crafted prompts. Things would be on a whole other level than what Photoshop used to be.