New Footage Shows Tesla On Autopilot Crashing Into Police Car After Alerting Driver 150 Times

Six officers who were injured in the crash are suing Tesla despite the fact that the driver was allegedly impaired.

  • hark@lemmy.world · +42/−3 · 1 year ago

    Setting aside the driver issue, isn’t this another case that could’ve been prevented with LIDAR?

  • redcalcium@lemmy.institute · +27 · edited · 1 year ago

    Data from the Autopilot system shows that it recognized the stopped car 37 yards or 2.5 seconds before the crash.

    Is the video slowed down? If you pause the video 2.5 s before the crash, the stopped police car already seems very close. An awake human driver would’ve recognized the stopped police car from much farther away than that.
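Taking the two quoted figures at face value (37 yards of warning, 2.5 seconds before impact), a quick unit conversion gives the implied closing speed toward the stopped car. This is just arithmetic on the numbers in the quote above, not data from the crash report:

```python
# Implied closing speed from the quoted detection figures:
# 37 yards of remaining distance, 2.5 seconds before the crash.
YARDS_TO_METERS = 0.9144
MS_TO_MPH = 2.23694

distance_m = 37 * YARDS_TO_METERS   # ~33.8 m
time_s = 2.5
speed_ms = distance_m / time_s      # ~13.5 m/s
speed_mph = speed_ms * MS_TO_MPH    # ~30 mph

print(f"{distance_m:.1f} m in {time_s} s -> {speed_ms:.1f} m/s ({speed_mph:.0f} mph)")
```

A closing speed of roughly 30 mph would be on the low side for a highway, which supports the suspicion that the video timing and the quoted figures don't line up cleanly.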

    • Thetimefarm@lemm.ee · +11 · 1 year ago

      I find that it can be hard to tell when a car ahead is stopped; maybe the visual system on the Tesla has similar limitations. I think Autopilot is controlled by the cameras alone, but I’m not super up to date on Tesla stuff. I would assume even a basic radar setup could tell something was stationary from quite far away.

  • Snapz@lemmy.world · +30/−5 · 1 year ago

    This source keeps pushing Tesla propaganda. There’s always an angle trying to sell that it wasn’t the Tesla’s fault.

  • hoodlem@hoodlem.me · +18 · 1 year ago

    In fact, by the time the crash happens, it’s alerted the driver to pay more attention no less than 150 times over the course of about 45 minutes. Nevertheless, the system didn’t recognize a lack of engagement to the point that it shut down Autopilot

    I blame the driver, but if the above is true there was a problem with the Tesla as well. The Tesla is intended to disengage and disable autopilot for the remainder of the drive after a small number of ignored alerts. If the car didn’t do that, there’s a bug in the Tesla software.

    I think it’s more likely the driver used a trick to make the car think he was engaged when he was not. You can do things like put a water bottle wedged in the steering wheel to make the car think you have tugged on the steering wheel to prove you are engaged. (Don’t ask me how I know)

    • RushingSquirrel@lemm.ee · +4 · 1 year ago

      After 3 alerts, it’s off until you park. There are visual cues that precede the alert, though, and these do not count. I don’t recall how many there are or how long they last, but you start by seeing a message asking you to keep your hands on the wheel, then a blue line at the top, then the line starts pulsing, then you get an audio alert, which is the first strike. Three strikes during the same drive and you need to park before using Autopilot again.
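The escalation described above can be sketched as a tiny state machine. This is a hypothetical reconstruction of the commenter's recollection, not official Tesla behavior: visual nags don't count as strikes, each ignored audio alert is a strike, and three strikes lock out Autopilot until the car is parked.

```python
# Hypothetical sketch of the three-strike lockout described above.
# Stage names and counts follow the comment, not any official spec.

class AutopilotNag:
    MAX_STRIKES = 3

    def __init__(self):
        self.strikes = 0
        self.locked_out = False

    def audio_alert_ignored(self):
        """Driver let an audio alert lapse: record a strike."""
        if self.locked_out:
            return
        self.strikes += 1
        if self.strikes >= self.MAX_STRIKES:
            # Autopilot unavailable for the rest of the drive.
            self.locked_out = True

    def park(self):
        """Parking resets the lockout for the next drive."""
        self.strikes = 0
        self.locked_out = False

nag = AutopilotNag()
for _ in range(3):
    nag.audio_alert_ignored()
print(nag.locked_out)  # True: three strikes, locked out until parked
```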

      • meco03211@lemmy.world · +1 · 1 year ago

        And those alerts don’t come if you’ve overridden the system by putting a weight on the wheel or something.

  • Jeena@jemmy.jeena.net · +16 · 1 year ago

    So if the guy behind the wheel died and couldn’t react to the alerts, the car can’t decide to just stop instead of crashing into a police car?

    • pec@sh.itjust.works · +13 · edited · 1 year ago

      He was reacting to the alerts, complying with them by simply touching the steering wheel. He did that 150 times during the 45-minute trip (not all of the trip was on Autopilot).

      So if the guy had died, the car would have disengaged Autopilot (I’m not sure exactly how this works).

      You can check the video in the article. It’s quite informative.

      Edit: I saw another video, and it takes ~60 seconds after you take your hands off the steering wheel for the car to come safely to a full stop.

      • socsa@lemmy.ml · +12/−1 · 1 year ago

        So the headline should be “drunk driver hits police car.”

        • Landmammals@lemmy.world · +3 · 1 year ago

          Was he drunk? The article seems to use the fact that the car nagged him 150 times as evidence that he was impaired.

      • tony@lemmy.hoyle.me.uk · +2 · 1 year ago

        TBH, if you’re not used to it, the steering wheel check can warn frequently. It checks for a small amount of torque on the wheel rather than for you actually holding it (there are no pressure sensors), which catches people out, but the prompt says to put your hands on the wheel… I could believe 150 times on a long journey.

  • Pablo@lemmy.world · +11 · 1 year ago

    It’s also misleading that Tesla uses the word Autopilot for what is basically adaptive cruise control plus lane assist.

  • thatKamGuy@sh.itjust.works · +8 · 1 year ago

    Driver is definitely the one ultimately at fault here, but how is it that the Tesla doesn’t perform an emergency stop in this situation, and instead just barrels into an obstacle?

    Even my relatively ‘dumb’ car with adaptive cruise control handles this type of situation better than Tesla?!

    • RushingSquirrel@lemm.ee · +0/−1 · 1 year ago

      I believe this was caused by the fog combined with the flashing lights and the upward/curved road. The Tesla autopilot system is super impressive in almost all situations, but you can clearly see its limits in extreme ones. Here, the drunk driver is definitely at fault; I don’t understand why they’d sue Tesla.

  • MrSpArkle@lemmy.ca · +4 · 1 year ago

    I think Mercedes is the only car company that will accept blame for a self-driving or self-parking failure. That should tell you something.

    • Thorny_Thicket@sopuli.xyz · +0/−2 · edited · 1 year ago

      A Tesla on Autopilot/FSD is almost 4 times less likely to be involved in a crash than a human-driven Tesla, which even then is half as likely to end up in an accident compared to the average car. You not liking Musk fortunately doesn’t change these facts.

      In the 2nd quarter, we recorded one crash for every 4.41 million miles driven in which drivers were using Autopilot technology (Autosteer and active safety features). For drivers who were not using Autopilot technology (no Autosteer and active safety features), we recorded one crash for every 1.2 million miles driven. By comparison, NHTSA’s most recent data shows that in the United States there is an automobile crash every 484,000 miles.

      Source
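The ratios behind the quoted figures are easy to check. The numbers come straight from the quote above; note they are Tesla's own miles-per-crash statistics, not independently normalized for road type or driver demographics:

```python
# Miles per crash, as quoted in Tesla's Q2 safety report above.
autopilot = 4_410_000     # drivers using Autopilot technology
no_autopilot = 1_200_000  # Tesla drivers without Autopilot
us_average = 484_000      # NHTSA overall US figure

ap_vs_human = autopilot / no_autopilot  # ~3.7x: the "almost 4 times" claim
tesla_vs_avg = no_autopilot / us_average  # ~2.5x: the "half as likely" claim

print(f"Autopilot vs. human-driven Tesla: {ap_vs_human:.2f}x")
print(f"Human-driven Tesla vs. US average: {tesla_vs_avg:.2f}x")
```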

        • Thorny_Thicket@sopuli.xyz · +0/−1 · 1 year ago

          Perhaps. I’m sure you’ll provide me with the independent data you’re basing that “Teslas are not safe” claim on.

      • tiny_electron@sh.itjust.works · +0 · 1 year ago

        There is a bias in these numbers. Teslas are expensive, and not everyone buys them; the lower accident rate can be explained by the different demographic driving the vehicle rather than by Teslas being better. For example, younger people might be more likely to cause accidents because of various factors, and they are also less likely to buy a Tesla because Teslas are so expensive. I don’t have the numbers for this, but we should all be careful with Tesla’s safety claims when they compare themselves to the global average.
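The confounding argument above can be made concrete with a toy calculation. All numbers here are invented for illustration: if Tesla buyers skew toward a lower-risk demographic, the fleet's miles-per-crash figure improves even if the car itself changes nothing.

```python
# Toy illustration of selection bias (all numbers invented).
# Two driver groups with different baseline crash risk per mile;
# the "Tesla" fleet is skewed toward the lower-risk group.
crash_rate = {"low_risk": 1 / 2_000_000, "high_risk": 1 / 400_000}
tesla_mix = {"low_risk": 0.9, "high_risk": 0.1}    # who buys the car
average_mix = {"low_risk": 0.5, "high_risk": 0.5}  # overall population

def fleet_rate(mix):
    """Fleet-wide crashes per mile for a given demographic mix."""
    return sum(mix[group] * crash_rate[group] for group in mix)

tesla_miles_per_crash = 1 / fleet_rate(tesla_mix)      # ~1.43M miles
average_miles_per_crash = 1 / fleet_rate(average_mix)  # ~0.67M miles

print(f"Tesla-like mix:  {tesla_miles_per_crash:,.0f} miles per crash")
print(f"Average mix:     {average_miles_per_crash:,.0f} miles per crash")
```

With identical cars, the demographic skew alone makes the Tesla-like fleet look about twice as safe, which is why per-mile comparisons against the global average need demographic and road-type controls.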

        • Thorny_Thicket@sopuli.xyz · +0/−1 · 1 year ago

          Sure. There are always multiple factors in play. However I’d still be willing to bet that there’s nothing in Teslas that makes them inherently unsafe compared to other cars.

  • r00ty@kbin.life · +1 · 1 year ago

    I’m not so sure that disengaging autopilot because the driver’s hands were not on the wheel while on a highway is the best option. Engage the hazard lights, remain in lane (or, if able, move to the slowest lane), and come to a stop. Surely that’s the better way?

    Just disengaging the autopilot seems like such a copout to me. Also the fact it disengaged right at the end “The driver was in control at the moment of the crash” just again feels like bad “self” driving. Especially when the so-called self-driving is able to come to a stop as part of its software in other situations.

    Also if you cannot recognize an emergency vehicle (I wonder if this was a combination of the haze and the usually bright emergency lights saturating the image it was trying to analyse) it’s again a sign you shouldn’t be releasing this to the public. It’s clearly just not ready.

    Not taking any responsibility away from the human driver here. I just don’t think the behaviour was good enough for software controlling a car used by the public.

    Not to mention, of course, the reason for suing Tesla isn’t because they think they’re more liable. It’s because they can actually get some money from them.

      • r00ty@kbin.life · +1 · 1 year ago

        The question here is: could you see there was a reason to stop the car significantly (more than 3 seconds) before the autopilot did? If we can recognize it through the haze, the autopilot must be able to as well.

        Moreover, it needs to now be extra good at spotting vehicles in bad lighting conditions because other sensors are removed on newer Teslas. It only has cameras to go on.

  • Peanut@sopuli.xyz · +2/−1 · 1 year ago

    I still think Tesla did a poor job of conveying the limitations on the larger scale. They piggybacked on Waymo’s capability and practice without matching it, which is probably why so many drivers are over-reliant. I’ve always been against mass-producing semi-autonomous vehicles for the general public. This is why.

    And then this garbage is used to attack the general concept of autonomous vehicles, which may become a fantastic life-saver, because then it can safely drive these assholes around.

  • Jordan Lund@lemmy.one · +1/−2 · 1 year ago

    Don’t see how that’s a Tesla problem… Drunk/high driver operating their car incorrectly.