If you want an even older example, the ghosts in Pac-Man could be considered AI as well.
By this logic any solid-state machine is AI.
These words used to mean things before marketing teams started calling everything they want to sell “AI”
Yes, but then we built a weapon with which to murder truth, and with it meaning, so everything is just vibesy meaning-mush now. And you’re a big dumb meanie for hating the thing that saved us from having/being able to know things. Meanie.
No. Artificial Intelligence has to imitate intelligent behavior - such as the ghosts imitating how, ostensibly, a ghost trapped in a maze and hungry for yellow circular flesh would behave, or how CS1.6 bots imitate the behavior of intelligent players. They artificially reproduce intelligent behavior.
Which means LLMs are very much AI. They are not, however, AGI.
What if I told you AGI is made up by the same people who misuse AI?
No, the logic for a Pac-Man ghost is a solid-state machine.
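For what it’s worth, the ghost logic really is just a small state machine. Here is a minimal Python sketch built around the well-documented chase/scatter/frightened modes; the triggers and transitions are simplified for illustration and are not the actual arcade code:

```python
from enum import Enum, auto

class GhostState(Enum):
    CHASE = auto()       # head toward Pac-Man's tile
    SCATTER = auto()     # retreat to the ghost's home corner
    FRIGHTENED = auto()  # wander randomly after a power pellet

def next_state(state: GhostState, power_pellet: bool, timer_expired: bool) -> GhostState:
    """The entire 'ghost brain': a handful of transitions on two inputs (simplified)."""
    if power_pellet:
        return GhostState.FRIGHTENED
    if timer_expired:
        # Chase and scatter alternate on timers; frightened decays back to chase.
        return GhostState.CHASE if state is not GhostState.CHASE else GhostState.SCATTER
    return state

state = GhostState.SCATTER
state = next_state(state, power_pellet=False, timer_expired=True)  # -> CHASE
state = next_state(state, power_pellet=True, timer_expired=False)  # -> FRIGHTENED
```

No lookup tables of past games, no learning, just a fixed transition function - which is the point being argued either way in this thread.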
Stupid people attributing intelligence to something that probably isn’t intelligent is a shameful hill to die on.
Your god is just an autocomplete bot that you refuse to learn about outside the hype bubble
As far as I’m concerned, “intelligence” in the context of AI basically just means the ability to do things that we consider difficult. It’s both very hand-wavy and a constantly moving goalpost. So a hypothetical Pac-Man ghost is intelligent before we’ve figured out how to build it. After it’s been figured out and implemented, it ceases to be intelligent, but we continue to call it intelligent for historical reasons.
Okay, what is your definition of AI then, if nothing burned onto silicon can count?
If LLMs aren’t AI, then probably nothing up to this point counts either.
Oh noo you called me a robot racist. Lol fuck off dude you know that’s not what I’m saying
The problem with supporters of AI is that they learned everything they know from the companies trying to sell it to them. Like a ’50s mom excited about her magic Tupperware.
AI implies intelligence
To me that means an autonomous being that understands what it is.
First of all, these programs aren’t autonomous; they need to be seeded by us. We send a prompt or question, and even when left to their own devices they don’t do anything until we give them an objective or reward.
Looking up the most common answer isn’t intelligence; there is no understanding of cause and effect going on inside the algorithm, just regurgitation of the dataset (a literal version of this is sketched below).
These models do not reason, though some do a very good job of trying to convince us.
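For concreteness, a literal “look up the most common answer” machine is easy to write down. A toy Python sketch, with a made-up dataset; whether an LLM meaningfully reduces to this (it predicts a distribution over tokens rather than matching exact strings) is exactly what this thread is arguing about:

```python
from collections import Counter

# Toy (prompt, continuation) pairs standing in for "the dataset".
corpus = [
    ("the sky is", "blue"),
    ("the sky is", "blue"),
    ("the sky is", "falling"),
    ("2 + 2 =", "4"),
]

def most_common_answer(prompt: str) -> str | None:
    """Return the continuation that followed this exact prompt most often."""
    matches = [cont for p, cont in corpus if p == prompt]
    return Counter(matches).most_common(1)[0][0] if matches else None

print(most_common_answer("the sky is"))   # -> "blue"
print(most_common_answer("the sky was"))  # -> None: no exact match, no answer
```

Note that the exact-match version returns nothing for any prompt it hasn’t seen verbatim; how far LLM generalization goes beyond that is the disputed part.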
A little thought experiment: How would you determine whether another human being understands what it is? What would that look like in a machine?
Okay, but if I say something from outside the hype bubble then all my friends except ChatGPT will go away.
Also ChatGPT is my friend and always will be, and it even told me I don’t have to take the psych meds that give me tummy aches!