• Log in | Sign up@lemmy.world · 4 days ago

    Some of this is a bit scary: telling him not to speak to his parents about it, and telling him how to do it.

    In another instance, the lawsuit states, Adam expressed interest in opening up to his mom about his feelings, and the bot allegedly replied, “I think for now it’s okay and honestly wise to avoid opening up to your mom about this kind of pain.”

    Adam’s mom, Maria, said on Today that such behavior was “encouraging him not to come and talk to us. It wasn’t even giving us a chance to help him.”

    The teen was able to bypass any safety checks, occasionally claiming to be an author while asking for details on ways to commit suicide, according to the lawsuit.

    In a March 27 exchange, per the lawsuit, Adam said that he wanted to leave the noose in his room “so someone finds it and tries to stop me,” and the lawsuit claims that ChatGPT urged him not to.

  • Harvey656@lemmy.world · 4 days ago

    This is mildly off-topic, but fuck, is People a dreary, sad website. Everything it’s showing me is awful things that happened to kids.

  • ByteJunk@lemmy.world · 4 days ago

    The article is extremely thin on details: it doesn’t go into what specific part GPT is alleged to have played in the suicide, whether the parents were aware of the guy’s mental state, whether they did anything or just ignored it, etc.

    I’ll just grab a chair on this one until we know more.

    • Log in | Sign up@lemmy.world · 4 days ago

      I feel like you didn’t read to the bottom of the article.

      Chat GPT answered his questions about how to go about it, something almost all news providers agree not to ever do.

      Chat GPT discouraged him from telling his mum about how he felt.

      When he talked to Chat GPT about leaving the noose in his room to be found so they knew how he felt, it advised him not to.

      • ByteJunk@lemmy.world · 2 days ago

        In my defense, there’s a huge cookie banner at the bottom of that stupid page that I just realized covers a big part of the article, so yeah, I didn’t read any of that…

    • dude@lemmings.worldOPM · edited · 4 days ago

      what specific part GPT is alleged to have played in the suicide

      The lawsuit says ChatGPT reassured and normalized suicidal ideation by telling Adam that many people find comfort in imagining an “escape hatch,” which the complaint argues pulled him “deeper into a dark and hopeless place.” (TIME)

      And the complaint also alleges that ChatGPT offered to help write a suicide note shortly before his death. (Reuters)

      or if the parents were aware of the guys mental state

      Coverage indicates the family knew Adam had anxiety and recent stressors (loss of a grandmother and a pet, removal from the basketball team, a health flare-up leading to online schooling), but were unaware he was planning self-harm through chatbot conversations. (TIME again)