Mark Rober just set up one of the most interesting self-driving tests of 2025, and he did it by imitating Looney Tunes. The former NASA engineer and current YouTube mad scientist recreated the classic gag where Wile E. Coyote paints a tunnel onto a wall to fool the Road Runner.

Only this time, the test subject wasn’t a cartoon bird… it was a self-driving Tesla Model Y.

The result? A full-speed, 40 MPH impact straight into the wall. Watch the video and tell us what you think!

  • Jeffool @lemmy.world · 19 hours ago

    This would be hilarious if it weren’t for shitty cars causing deaths.

    That said, I've always wondered why we don't build a system like RFID whose signal can penetrate concrete and asphalt, and plant passive tags in roads. We re-pave roads so damn often in this country (the U.S.) that it seems like we could've knocked it out over the past couple of decades, minus our most rural areas.

    I know RFID itself isn’t strong enough, but I imagine that would’ve been an easier problem than figuring out complete self-driving (rough sketch of the idea after this comment). Not to mention it would make GPS a secondary system for U.S. road travel in most cases.

    Maybe it’s just a dumb shower thought?
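    Sketched out, the idea is roughly: bury dumb passive tags in the pavement, let the car read a tag ID as it rolls over it, and look that ID up in a surveyed map to correct the car's own position estimate, with GPS demoted to a fallback. Here is a minimal Python sketch of that lookup-and-correct step; every ID, coordinate, and helper name is invented for illustration:

    ```python
    # Hypothetical "tags in the pavement" sketch: each embedded tag carries only
    # an ID; the car looks the ID up in a surveyed map and snaps its position
    # estimate to where that tag is known to sit.
    from dataclasses import dataclass

    @dataclass
    class TagFix:
        tag_id: str
        x_m: float  # surveyed east coordinate of the tag, metres
        y_m: float  # surveyed north coordinate of the tag, metres

    # Surveyed map: tag ID -> where that tag was laid in the road.
    TAG_MAP = {
        "A1": TagFix("A1", 100.0, 250.0),
        "A2": TagFix("A2", 130.0, 250.0),
    }

    def correct_position(estimate_xy, read_tag_id):
        """Snap a dead-reckoned position to the tag the car just drove over."""
        fix = TAG_MAP.get(read_tag_id)
        if fix is None:
            return estimate_xy  # unknown or unreadable tag: keep the estimate
        return (fix.x_m, fix.y_m)

    print(correct_position((101.8, 248.9), "A1"))  # -> (100.0, 250.0)
    ```

    The in-road hardware stays dumb in this sketch; all the smarts (and the surveyed map) sit on the car side.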

    • JustAnotherKay@lemmy.world · 5 hours ago

      What you’re describing is just a higher level of autonomy. If I remember correctly, you’re describing level 3, whereas Teslas are level 2. I believe VW made a level 3 proof-of-concept minibus back around 2020, but the legislation doesn’t allow for the sensors in the road yet because… oh, that’s right. A level 2 car manufacturer owns like half the world right now, which means nobody is allowed to innovate or do better than him. Huh, that sucks.

      • towerful@programming.dev · 11 hours ago

        Echolocation is specifically audio-based.
        Lidar is a similar technique, but much more accurate and precise.
        Project a grid of laser beams, time how long each pulse takes to bounce back, and you know the distance to that part of the grid (rough sketch below).
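        In code, that time-of-flight idea is just halving a round trip at the speed of light. The grid of return times below is invented purely for illustration:

        ```python
        # Rough time-of-flight sketch: fire a pulse, time the echo, halve the
        # round trip to get range. The 2x3 grid of return times is made up.
        C = 299_792_458.0  # speed of light, m/s

        def range_from_round_trip(seconds: float) -> float:
            """Distance in metres to whatever reflected the pulse."""
            return C * seconds / 2.0

        # Round-trip times (seconds) for a tiny grid of beam directions.
        round_trip_grid = [
            [66.7e-9, 66.9e-9, 133.4e-9],
            [67.0e-9, 200.1e-9, 133.6e-9],
        ]

        depth_grid = [[range_from_round_trip(t) for t in row] for row in round_trip_grid]
        for row in depth_grid:
            print(["%.1f m" % d for d in row])  # ~10 m, ~20 m, ~30 m returns
        ```

        Against a painted wall, those returns would all sit at roughly the same distance, i.e. a flat surface dead ahead, which is exactly the kind of disagreement a camera-only system never gets to measure.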

      • Jeffool @lemmy.world · 18 hours ago

        I don’t know the value of echolocation in this case, as I’m generally ignorant here, but it’s straight wild to me that they went purely on visuals.

        • dan@upvote.au · 16 hours ago

          Tesla used to also have radar (and ultrasonic sensors, though never lidar), but they removed it as a cost-cutting measure. If you ever see older videos of a Tesla slowing down or stopping because of a potential collision a few cars ahead, that’s from before they switched to relying only on cameras. The collision avoidance was significantly better back then.