cross-posted from: https://fedia.io/m/fuckcars@lemmy.world/t/2201156

In case you were worried about the roads being too safe, you can rest easy knowing that Teslas will be rolling out with unsupervised “Full Self Driving” in a couple of days.

It doesn’t seem to be going great, even in supervised mode. This one couldn’t safely drive down a simple, perfectly straight road in broad daylight :( It veered off the road for no good reason. Glad nobody got badly hurt.

We analyze the onboard camera footage and try to figure out what went wrong. Turns out, a lot. We also talk through how camera-only autonomous cars work, Tesla’s upcoming autonomous taxi rollout, and how AI hallucinations figure into everything.

  • Ulrich@feddit.org · 1 day ago (+4/−17)

    You and I are (mostly) able to safely navigate a vehicle with 3D stereoscopic vision. It’s not a sensor issue; it’s a computation issue.
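
    To make the “computation issue” concrete: the geometry of stereo depth is the easy part. Below is a minimal Python sketch of how a depth value falls out of a pixel disparity; the focal length, baseline, and function name are made-up illustrative values, not anything from a real vehicle.

    ```python
    # A minimal sketch of pinhole-stereo geometry. Depth follows from
    # Z = f * B / d: f = focal length in pixels, B = baseline between the
    # two cameras, d = disparity (pixel shift of the same point between
    # the left and right images). All numbers are illustrative.

    def depth_from_disparity(disparity_px: float,
                             focal_px: float = 800.0,   # hypothetical focal length
                             baseline_m: float = 0.3) -> float:
        """Return depth in metres for one matched pixel pair."""
        if disparity_px <= 0:
            raise ValueError("disparity must be positive")
        return focal_px * baseline_m / disparity_px

    # The formula is trivial; the hard, compute-hungry part is the *matching*:
    # deciding which pixel in the right image corresponds to which in the left.
    print(depth_from_disparity(12.0))  # -> 20.0 m for a 12-pixel shift
    ```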

    • brygphilomena@lemmy.dbzer0.com · 7 hours ago (+3)

      If I eventually end up in a fully self-driving vehicle, I want it to be better than what you and I can do with our eyes.

      Is it possible to drive with just stereoscopic vision? Yes. But why is Tesla against BEING BETTER than humans?

    • Computation NOW cannot replicate what humans do with our rather limited senses.

      “Self-driving” cars are being made NOW.

      That means it’s the NOW computation we worry about, not some hypothetical future computational capability. And the NOW computation cannot do the job safely with just vision.

    • TwanHE@lemmy.world · 21 hours ago (+6)

      In theory maybe, but our brains are basically a supercomputer on steroids when it comes to interpreting and improving the “video feed” our eyes give us.

      Could it be done with just cameras? Probably, some time in the future. But why the fuck wouldn’t you use a depth sensor now, and even in the future, as redundancy?
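
      The redundancy point can be made concrete with a toy cross-check between two independent depth estimates for the same obstacle. This is a hedged sketch with an assumed function name and threshold, not any shipping system’s actual logic:

      ```python
      # Cross-check a camera-derived depth estimate against a LIDAR return.
      # Names and the disagreement threshold are illustrative assumptions.

      def fused_depth(camera_depth_m: float,
                      lidar_depth_m: float,
                      max_disagreement_m: float = 2.0) -> float:
          """Cross-check two depth estimates and pick a safe value."""
          if abs(camera_depth_m - lidar_depth_m) <= max_disagreement_m:
              # Sensors agree: average them for a slightly better estimate.
              return (camera_depth_m + lidar_depth_m) / 2.0
          # Sensors disagree: plan around the *nearer* estimate. A camera-only
          # system can never take this branch, because it has no second
          # opinion to disagree with.
          return min(camera_depth_m, lidar_depth_m)

      print(fused_depth(25.0, 24.5))  # agreement -> 24.75
      print(fused_depth(60.0, 18.0))  # camera hallucinates free space -> trust 18.0
      ```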

      • Ulrich@feddit.org · 16 hours ago (+1)

        > our brains are basically a supercomputer on steroids

        Yeah I mean that’s what I said.

    • this_1_is_mine@lemmy.world · 22 hours ago (+5)

      I can also identify a mirror. A Tesla smashed into one head-on. If you can’t effectively understand the image, then it’s not enough information.

      • Ulrich@feddit.org · 21 hours ago (+1/−2)

        > I can also identify a mirror.

        My point exactly.

        > If you can’t effectively understand the image, then it’s not enough information.

        No, it’s just not able to process the information it has.

    • arrakark@lemmy.ca · 22 hours ago (+2)

      I get what you’re saying. But adding a new form of data input to the system would probably improve performance, not degrade it; the sketch after this comment illustrates why. I don’t think it makes sense not to add LIDAR to Teslas.

      All of this feels like Elon was asked to justify not putting a rather expensive (at the time) set of sensors into the Teslas, and he just doubled down and said they would compensate with software.
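
      For the “improve performance, not degrade it” claim, there is a textbook version: inverse-variance fusion of two independent, unbiased measurements always yields a fused variance no larger than the better sensor’s. The function name and numbers below are illustrative assumptions.

      ```python
      # Inverse-variance weighting: the optimal linear fusion of two
      # independent, unbiased measurements of the same quantity.

      def fuse(z1: float, var1: float, z2: float, var2: float):
          """Fuse two noisy measurements; return (estimate, variance)."""
          w1 = var2 / (var1 + var2)  # the less-noisy sensor gets more weight
          w2 = var1 / (var1 + var2)
          fused = w1 * z1 + w2 * z2
          fused_var = (var1 * var2) / (var1 + var2)  # <= min(var1, var2)
          return fused, fused_var

      # Camera depth: 20 m with variance 4; LIDAR: 19 m with variance 0.04.
      print(fuse(20.0, 4.0, 19.0, 0.04))  # ~(19.01, 0.0396): better than either alone
      ```

      The key property is that the fused variance is never worse than the best single sensor, so in this idealized setting an extra input can only help; the real engineering argument is about cost and the complexity of handling disagreements, not about the math.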