• mods_mum · 3 months ago

    I mean, LLMs can and will produce completely nonsensical outputs. It’s less AI and more like bad text prediction.

    • davidgro@lemmy.world · 3 months ago

      Yeah, but the point of the post is to highlight bias - and if there’s one thing an LLM has, it’s bias. I mean that literally: considering their probabilistic nature, it could be said that the only thing an LLM consists of is bias to certain words given other words. (The weights, to oversimplify)
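      That “bias to certain words given other words” can be made concrete with a toy bigram model (a hypothetical sketch; real LLMs learn weights over long contexts, but the principle of sampling from a conditional word distribution is the same; the corpus here is made up):

```python
import random
from collections import defaultdict

corpus = "the cat sat on the mat the cat ate the fish".split()

# Count how often each word follows each word.
# These counts ARE the entire model: nothing but word-given-word bias.
counts = defaultdict(lambda: defaultdict(int))
for prev, nxt in zip(corpus, corpus[1:]):
    counts[prev][nxt] += 1

def next_word(prev, rng=random):
    """Sample a next word, biased by how often it followed `prev`."""
    options = counts[prev]
    words = list(options)
    weights = [options[w] for w in words]
    return rng.choices(words, weights=weights, k=1)[0]

# Generate text by repeatedly applying that conditional bias.
rng = random.Random(0)
word = "the"
out = [word]
for _ in range(6):
    if not counts[word]:
        break  # dead end: no word ever followed this one
    word = next_word(word, rng)
    out.append(word)
print(" ".join(out))
```

      The output is locally plausible yet carries no notion of truth, which is exactly the point: scale the table up to billions of weights and a much longer context, and you get an LLM.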

    • slazer2au@lemmy.world · 3 months ago

      “Regurgitation machine prone to hallucinations” is my go-to for explaining what LLMs really are.

      • Steve@communick.news · 3 months ago (edited)

        I’ve heard them described as bullshitting machines. They have no concept of, or regard for, truth or lies; they just spout whatever sounds good. Much of the time it’s true. Too often it’s not. Sometimes it’s hard to tell the difference.