
  • lukewarm_ozone · 2 days ago

    Every time there’s an AI hype cycle the charlatans start accusing the naysayers of moving goalposts. Heck, that exact same thing was happening constantly during the Watson hype. Remember that? Or before that, the AlphaGo hype. Remember that?

    Not really. As far as I can see, the goalpost-moving is just objectively happening.

    But fundamentally you can’t make a machine think without understanding thought.

    If “think” means anything coherent at all, then this is a factual claim. So what do you mean by it? Specifically: what event would have to happen for you to decide “oh shit, I was wrong, they sure did make a machine that could think”?