Da Cap’n@lemmy.dbzer0.com to Technology@lemmy.world · English · 1 month ago
AI Programming Assistant Tells User to Stop Being Lazy and Learn to Code (futurism.com)
PattyMcB@lemmy.world · 1 month ago
I’ll take “bullshit that never happened” for 400, Alex
qantravon@lemmy.world · 1 month ago
Oh, it did. They fed it data from Stack Overflow.
Rhaedas@fedia.io · 1 month ago
Could have been Reddit data. Stack Overflow would be more likely to tell them it’s a question that’s already been asked.
kameecoding@lemmy.world · 1 month ago
Or that what they’re trying to do is stupid and they should just do X instead, which, to be fair, is usually the correct take.
Deebster@infosec.pub · 1 month ago
Not that it couldn’t be faked, but here’s the bug report with screenshot: https://forum.cursor.com/t/cursor-told-me-i-should-learn-coding-instead-of-asking-it-to-generate-it-limit-of-800-locs/61132
Gloomy@mander.xyz · 1 month ago
So, going through this, the reason the AI replied the way it did is that the user used a feature called “quick question”, which can only respond with text, not code. (It’s the last answer, marked as “solution”, in the linked thread.) Still funny though.