cm0002@lemmy.world to Technology@lemmy.zip · English · 6 days ago

**ChatGPT's hallucination problem is getting worse according to OpenAI's own tests and nobody understands why** (www.pcgamer.com)

16 comments · cross-posted to: technology@hexbear.net, technology@lemmygrad.ml, technology@lemmy.ml
Optional@lemmy.world · 5 days ago

*raises hand* Because it never “understood” what any “word” ever “meant” anyway?
geekwithsoul@lemm.ee · 5 days ago

Yeah, it’s all hallucinations — it’s just that sometimes the hallucinations manage to approximate correctness, and it can’t tell one from the other.