Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you'll near-instantly regret.

Any awful.systems sub may be subsneered in this subthread, techtakes or no.

If your sneer seems higher quality than you thought, feel free to cut'n'paste it into its own post; there's no quota for posting and the bar really isn't that high.

The post-Xitter web has spawned soo many "esoteric" right wing freaks, but there's no appropriate sneer-space for them. I'm talking redscare-ish, reality-challenged "culture critics" who write about everything but understand nothing. I'm talking about reply-guys who make the same 6 tweets about the same 3 subjects. They're inescapable at this point, yet I don't see them mocked (as much as they should be).

Like, there was one dude a while back who insisted that women couldn't be surgeons because they didn't believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can't escape them, I would love to sneer at them.

(Credit and/or blame to David Gerard for starting this...)

  • swlabr@awful.systems · 16 upvotes · 8 days ago

    Utterly rancid linkedin post:

    text inside image:

    Why can planes "fly" but AI cannot "think"?

    An airplane does not flap its wings. And an autopilot is not the same as a pilot. Still, everybody is ok with saying that a plane "flies" and an autopilot "pilots" a plane.

    This is the difference between the same system and a system that performs the same function.

    When it comes to flight, we focus on function, not mechanism. A plane achieves the same outcome as birds (staying airborne) through entirely different means, yet we comfortably use the word "fly" for both.

    With Generative AI, something strange happens. We insist that only biological brains can "think" or "understand" language. In contrast to planes, we focus on the system, not the function. When AI strings together words (which it does, among other things), we try to create new terms to avoid admitting similarity of function.

    When we use a verb to describe an AI function that resembles human cognition, we are immediately accused of "anthropomorphizing." In some way, popular opinion dictates that no system other than the human brain can think.

    I wonder: why?

    • scruiser@awful.systems · 14 upvotes · 8 days ago (edited)

      I can use bad analogies also!

      • If airplanes can fly, why can't they fly to the moon? It is a straightforward extension of existing flight technology, and plotting airplane max altitude from 1900-1920 shows exponential improvement in max altitude. People who are denying moon-plane potential just aren't looking at the hard quantitative numbers in the industry. In fact, with no atmosphere in the way, past a certain threshold airplanes should be able to get higher and higher and faster and faster without anything to slow them down.

      I think Eliezer might have started the bad airplane analogies... let me see if I can find a link... and I found an analogy from the same author as the 2027 fanfic forecast: https://www.lesswrong.com/posts/HhWhaSzQr6xmBki8F/birds-brains-planes-and-ai-against-appeals-to-the-complexity

      Eliezer used a tortured metaphor about rockets, so I still blame him for the tortured airplane metaphor: https://www.lesswrong.com/posts/Gg9a4y8reWKtLe3Tn/the-rocket-alignment-problem

      • swlabr@awful.systems · 5 upvotes · 7 days ago

        JFC I click on the rocket alignment link, it's a yud dialogue between "alfonso" and "beth". I am not dexy'ed up enough to read this shit.

    • Soyweiser@awful.systems · 13 upvotes · 8 days ago (edited)

      Yes, the "2 Rs in strawberry" machine thinks. In the same way that an airplane flies. /s

      E: it gets even worse, as half the AI field claims the airplanes fly the way birds do. That is why the anthropomorphization is bad: it neither thinks in the sense of the function nor thinks in the sense of the system, and by anthropomorphizing, people make it look like it can do both.

    • rook@awful.systems · 11 upvotes · 8 days ago (edited)

      Dijkstra did it first, but it is very AI-booster to steal work without credit or understanding, I guess.

      The question of whether Machines Can Think... is about as relevant as the question of whether Submarines Can Swim.

      Threats to computing science

      • bitofhope@awful.systems · 6 upvotes · 7 days ago

        You ever see a random shitpost video, like the background music, look it up and realize you already have the vinyl record? That just happened to me.