A representative for Tesla sent Ars the following statement: “Today’s verdict is wrong and only works to set back automotive safety and jeopardize Tesla’s and the entire industry’s efforts to develop and implement life-saving technology. We plan to appeal given the substantial errors of law and irregularities at trial. Even though this jury found that the driver was overwhelmingly responsible for this tragic accident in 2019, the evidence has always shown that this driver was solely at fault because he was speeding, with his foot on the accelerator—which overrode Autopilot—as he rummaged for his dropped phone without his eyes on the road. To be clear, no car in 2019, and none today, would have prevented this crash. This was never about Autopilot; it was a fiction concocted by plaintiffs’ lawyers blaming the car when the driver—from day one—admitted and accepted responsibility.”

So, you admit that the company’s marketing has continued to lie for the past six years?

  • partial_accumen@lemmy.world · 2 days ago

    Don’t take my post as a defense of Tesla, even though there’s blame on both sides here. However, I lay the huge majority of it on Tesla’s marketing.

    I had to find two other articles to figure out whether the system being used here was Tesla’s free, included Autopilot or the more advanced paid (one-time fee or subscription) version called Full Self-Driving (FSD). The answer in this case: Autopilot.

    There are many important distinctions between the two systems. However, Tesla frequently conflates the two when speaking about autonomous technology for its cars, so I blame Tesla. What was required here to avoid these deaths actually has very little to do with autonomous technology as most people understand it; it’s really about collision avoidance systems. Only in 2024 was there first talk of requiring collision avoidance systems in new vehicles in the USA (source). The cars that include them now (Teslas and some models from other brands) do so on their own, without a legal mandate.

    Tesla claims that the collision avoidance systems would have been overridden anyway because the driver was holding down the accelerator (which is not normal under Autopilot or FSD conditions). Even if that’s true, Tesla has positioned its cars as being highly autonomous, and oftentimes doesn’t call out that that level of autonomy only comes with the paid Full Self-Driving upgrade or subscription.
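    For what it’s worth, the claimed override is easy to picture in code. Here’s a minimal, hypothetical sketch of that arbitration; none of these names or thresholds come from Tesla, it’s just the common “deliberate pedal input wins” pattern:

    ```python
    # Hypothetical sketch (not Tesla's actual logic) of how a driver-assistance
    # stack might arbitrate between automatic emergency braking (AEB) and a
    # pressed accelerator: pedal input is treated as driver intent.
    from dataclasses import dataclass

    @dataclass
    class VehicleState:
        accelerator_pct: float    # 0.0 (released) .. 1.0 (floored)
        collision_imminent: bool  # output of a forward-collision sensor/model

    def brake_command(state: VehicleState, override_threshold: float = 0.05) -> float:
        """Return a brake request in [0, 1]; 0 when the driver's accelerator
        input overrides the automatic braking request."""
        if state.collision_imminent and state.accelerator_pct < override_threshold:
            return 1.0  # full automatic braking
        return 0.0      # driver is on the pedal: stand down

    # With a foot on the accelerator, even an imminent collision yields no braking:
    print(brake_command(VehicleState(accelerator_pct=0.4, collision_imminent=True)))  # 0.0
    ```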

    So I DO blame Tesla, even if the driver contributed to the accident.

    • TranscendentalEmpire@lemmy.today · 2 days ago

      I feel like calling it Autopilot is already risking liability; Full Self-Driving is just audacious. There’s a reason other companies with similar technology have gone with names like “driver assistance.” This has probably had lawyers at Tesla sweating bullets for years.

      • partial_accumen@lemmy.world · 2 days ago

        I feel like calling it Autopilot is already risking liability;

        From an aviation point of view, Autopilot is pretty faithful to the original aviation reference. The original autopilot, introduced for aircraft in 1912, would simply hold an aircraft at a specified heading and altitude without human input, operating the aircraft’s control surfaces to keep it on its directed path. However, if you were at an altitude that would fly you into a mountain, the autopilot would do exactly that. The current Tesla Autopilot is pretty close to that level of functionality, with the added feature of maintaining a set speed. Note that modern aviation autopilot is much more capable; on specific models it can even handle takeoff and landing.
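        As a toy illustration of that original level of functionality, here’s a bare proportional hold loop (gains and names invented for this comment; real flight control is far more involved). Note there is no terrain input anywhere, which is exactly the mountain problem:

        ```python
        # Toy 1912-style autopilot: proportional corrections that hold a
        # commanded heading, altitude, and (for the Tesla analogy) speed.
        # Illustrative only; it has no awareness of what lies ahead.
        def hold_setpoints(heading, altitude, speed,
                           tgt_heading, tgt_altitude, tgt_speed, k=0.1):
            """One control step: (rudder, elevator, throttle) corrections
            proportional to the error in each held quantity."""
            rudder = k * ((tgt_heading - heading + 180) % 360 - 180)  # shortest turn
            elevator = k * (tgt_altitude - altitude)
            throttle = k * (tgt_speed - speed)
            return rudder, elevator, throttle

        # Drifted 5 degrees right, 200 ft low, 10 kt slow -> corrective outputs:
        print(hold_setpoints(95, 9800, 240, 90, 10000, 250))  # approx (-0.5, 20.0, 1.0)
        ```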

        Full Self-Driving is just audacious. There’s a reason other companies with similar technology have gone with names like “driver assistance.” This has probably had lawyers at Tesla sweating bullets for years.

        I agree. I think Musk always intended FSD to live up to the name, and perhaps named it that aspirationally, which is all well and good. But most consumers don’t share that mindset; if you call it that right now, they assume the car has that functionality when they buy it today, which it doesn’t. I agree with you that it was a legal liability waiting to happen.

        • Auli@lemmy.ca · 1 day ago

          So you’re comparing, let’s say, 2020 technology to the 1912 version of autopilot and not the kind in the 2020s that is much more advanced. Yeah, what BS.

          • atrielienz@lemmy.world · edited · 1 day ago

            Because it still basically does what they said. For the purposes of this conversation, the only new addition to the autopilot system besides maintaining speed, heading, and altitude is the ability to set a GPS heading and waypoints. It will absolutely still fly into a mountain if not for other collision avoidance systems. Your average 737 or A320 is not going to spontaneously change course just because the elevation of the ground below it changed. But you can program other systems in the plane to avoid a specific flight path because there is a known hazard. I want you to understand that we know where a mountain is. Mountains don’t move around much in short periods of time. Cars and pedestrians are another story entirely.
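            To put that contrast in code: a fixed hazard can be screened against a static database well ahead of time, while cars and pedestrians cannot. A tiny sketch, with the grid cell and elevation invented for illustration:

            ```python
            # Terrain doesn't move, so a planned route can be checked against
            # a pre-built elevation table. No such table can exist for a
            # pedestrian stepping into the road. All values here are made up.
            TERRAIN_FT = {("N47", "W121"): 14411}  # known peak in this grid cell

            def route_is_clear(route_cells, cruise_alt_ft, margin_ft=2000):
                """True if the planned route clears all known, fixed terrain."""
                return all(cruise_alt_ft >= TERRAIN_FT.get(cell, 0) + margin_ft
                           for cell in route_cells)

            print(route_is_clear([("N47", "W121")], 10000))  # False: below the peak
            print(route_is_clear([("N47", "W121")], 17000))  # True: clears with margin
            ```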

            There’s a reason we still have air traffic controllers. Even then, pilots and air traffic control aren’t infallible, and they have way more systems to make flying safe than the average car has (yes, even the average Tesla).

    • NotMyOldRedditName@lemmy.world · edited · 2 days ago

      FSD wasn’t even available (edit: to use) in 2019. It was a future-purchase add-on that only went into a very limited, invite-only beta in 2020.

      In 2019 there was much less confusion on the topic.

    • Geyser@lemmy.world · 2 days ago

      Did the car try to stop and fail to do so in time due to the speeding, or did the car not try despite expected collision detection behavior?

      Going off of OP’s quote, the jury found the driver responsible but also found Tesla liable, which is pretty confusing. It might make some sense if expected Autopilot functionality didn’t work despite the driver’s foot being on the pedal.

      • partial_accumen@lemmy.world · 2 days ago

        Did the car try to stop and fail to do so in time due to the speeding, or did the car not try despite expected collision detection behavior?

        From the article, it looks like the car didn’t even try to stop: automatic braking was overridden because the driver had their foot pressed on the accelerator (which isn’t normal during Autopilot use).