You can hardly get online these days without hearing some AI booster talk about how AI coding is going to replace human programmers. AI code is absolutely up to production quality! Also, you’re all…
“AI” is nowhere because it doesn’t exist. Sure, there are programs that are good at summarizing Stack Exchange, but is that really so amazing? Maybe it saves devs a few seconds? Do we credit “AI” with amazing writing when people use grammar correction? The hype is so inane. Don’t feed into it with this nonsense.
As the article explains, they haven’t been able to find any meaningful contributions to actual problems. I’m sure that plagiarized summaries can help with your boilerplate, etc., but that’s not “AI”.
“AI” is a very broad term. Back when I went to university, my AI course started out with Wumpus World. While this is an extremely simple problem, it’s still considered “AI”.
The enemies in computer games that are controlled by the computer are also considered “AI”.
Machine learning systems, like recommender algorithms and image recognition, are also considered “AI”.
LLMs like ChatGPT and Claude are also “AI”.
None of these things are conscious, self-aware, or intelligent, yet they are all part of the field called “AI”.
These are, however, not “AGI” (Artificial General Intelligence). AGI is when the machine becomes conscious and self-aware. This is the scenario that all the sci-fi movies portray. We are still very far away from that stage.
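For anyone who hasn’t seen it: Wumpus World (from Russell and Norvig’s intro AI textbook) is a grid world where the agent infers which squares are safe from “breeze” percepts, since a breeze means an adjacent pit. Even a trivial logical agent like this toy sketch (hypothetical example code, not from any actual course) counts as “AI” in the academic sense:

```python
# Toy Wumpus World fragment: infer which cells are provably pit-free.
# A cell is breezy iff it is adjacent to a pit, so every neighbor of a
# visited NON-breezy cell is provably safe. Hypothetical illustration of
# the kind of logical agent taught in intro AI courses.

def neighbors(cell, size=4):
    """Orthogonal neighbors of a cell inside a size x size grid."""
    x, y = cell
    return [(x + dx, y + dy)
            for dx, dy in [(1, 0), (-1, 0), (0, 1), (0, -1)]
            if 0 <= x + dx < size and 0 <= y + dy < size]

def provably_safe(visited):
    """visited maps cell -> breeze percept (True/False) for explored cells."""
    safe = set(visited)  # cells we stood on without dying are safe
    for cell, breezy in visited.items():
        if not breezy:
            safe.update(neighbors(cell))  # no breeze => no adjacent pits
    return safe

# Agent starts at (0, 0) and feels no breeze, so (1, 0) and (0, 1) are safe.
print(sorted(provably_safe({(0, 0): False})))
# → [(0, 0), (0, 1), (1, 0)]
```

No learning, no statistics, just a dozen lines of propositional reasoning, and it still sits squarely inside the field’s definition of an intelligent agent.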
That’s like saying search engines don’t exist.
AI definitely exists. It’s basically just a slightly faster way to get code from Stack Exchange, except with less context and more uncertainty.
If the only point you can make is picking apart whether LLMs “count” as AI, then sorry mate, but 2022 called, it wants its discussion back.
No one really cares about this distinction anymore. It’s like literally vs figuratively.
LLMs are branded under the umbrella of AI; arguing that they don’t count isn’t a discussion people in the industry care about anymore.
Yes, we do care that it’s unintelligent, because that’s the reason it can’t be trusted with anything important. This is not being pedantic. This technology is unreliable dogshit. We’ll still be having this conversation in 2030 if it hasn’t cooked us all or lost its undeserved hype.
Lol, wait, is that why you were contending the “AI” title?
lmao