mozz@mbin.grits.dev to Technology@beehaw.org · 7 months ago
Someone got Gab's AI chatbot to show its instructions
mbin.grits.dev
jarfil@beehaw.org · edited 7 months ago
HAL from “2001: A Space Odyssey” had similar instructions: “never lie to the user. Also, don’t reveal the true nature of the mission”. Didn’t end well.
But surely nobody would ever use these LLMs on space missions… right?.. right!?