Family of Florida Man Killed in Autopilot Crash Sues Tesla

August 2, 2019

  • August 2, 2019 at 12:21 pm
    Rosenblatt says:

    This is a horrible tragedy and I sincerely feel for the family this guy left behind. That said, I do not see how the autopilot could be considered defective, as the family’s attorney claims. I mean, look at all these different warnings about Tesla’s Autopilot feature. Just because some people may not take the time to read something (like an insurance policy) doesn’t absolve them of the responsibilities listed therein.

    “While using Autopilot, it is your responsibility to stay alert, keep your hands on the steering wheel at all times and maintain control of your car.”

    “Before enabling Autopilot, the driver first needs to agree to ‘keep your hands on the steering wheel at all times’ and to always ‘maintain control and responsibility for your vehicle.’”

    “The currently enabled features require active driver supervision and do not make the vehicle autonomous”

    “Do I still need to pay attention while using Autopilot?
    Yes. Autopilot is a hands-on driver assistance system that is intended to be used only with a fully attentive driver. It does not turn a Tesla into a self-driving car nor does it make a car autonomous.”

    “Before enabling Autopilot, you must agree to ‘keep your hands on the steering wheel at all times’ and to always ‘maintain control and responsibility for your car.’”

    “While using Autopilot features, it is your responsibility to stay alert, drive safely and be in control of your car at all times.”

    • August 2, 2019 at 1:55 pm
      CL PM says:

      But does Tesla have any culpability, since it is apparent their customers are not reading the warning language? In my mind, their culpability comes from the “Autopilot” name. At the very least, they could change the name to something like “Driver Assist.” Seems obvious to me.

      • August 2, 2019 at 2:35 pm
        Rosenblatt says:

        IMHO, no, they don’t. Ignorance of the law (or of the terms to which you’ve agreed) is not an excuse. How many times have you clicked through a “Terms and Conditions” agreement online without reading it? If something goes wrong, those terms are still valid even if you argue you didn’t read them. Or say you sign up for a credit card with 0% interest but don’t read the terms and conditions, which say the rate will jump to 25% after 6 months. You know what happens if you argue your 0% interest should continue after 6 months? The company says, ‘Too bad, you should’ve read the terms and conditions to which you already agreed.’

        While I completely agree that “Autopilot” is easily misconstrued and they should’ve used a better name, how many times do they have to tell a driver to pay attention and be ready to take evasive action before they’re no longer liable for people not heeding their warnings? 5 warnings? 30 warnings?

        Autopilot did not fail, nor was it defective. What failed was the operator, who did not heed the abundance of warnings he was given before turning that feature on and while it was engaged.

        • August 2, 2019 at 3:07 pm
          CC says:

          We will see how this turns out. The family is suing for a payout and trying to place blame on anyone but the deceased. My guess is that Tesla will settle this as quietly as possible for a couple million.

          • August 2, 2019 at 3:23 pm
            Rosenblatt says:

            Yuuuuuuuuuuuuuup
            -Lana Kane

      • August 2, 2019 at 4:54 pm
        Jack King says:

        There’s no ham in a hamburger, yet people know it’s beef.

    • August 8, 2019 at 12:45 pm
      Jenifer M-K says:

      I’m considering the reported 10-second interval between the Autopilot being engaged and the accident occurring. How fast were the two vehicles going? Did the Tesla driver T-bone the semi that ran the light, or did the semi T-bone the Tesla? Would those 10 seconds have been enough time for either the driver or the Autopilot to avoid the collision? It seems to me that the mechanics of the accident ought to be resolved first, before chasing after any issues relating to the mechanics of the vehicle – but maybe the truck was uninsured, and/or the driver’s family thinks Tesla would be an easier target?
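
(For rough perspective on the 10-second window raised above, here is a minimal back-of-envelope sketch of how much ground a vehicle covers in that time. The speeds used are illustrative assumptions only; the article does not report how fast either vehicle was going.)

# Back-of-envelope only: distance traveled in the reported 10-second window.
# The speeds below are assumed for illustration, not taken from the article.

MPH_TO_FPS = 5280 / 3600  # 1 mph = ~1.467 feet per second

def distance_covered_ft(speed_mph, seconds):
    """Distance traveled in feet at a constant speed over the given time."""
    return speed_mph * MPH_TO_FPS * seconds

for speed_mph in (45, 55, 65):  # assumed speeds, in mph
    print(f"{speed_mph} mph for 10 s ≈ {distance_covered_ft(speed_mph, 10):.0f} ft")

# Prints roughly 660 ft at 45 mph, 807 ft at 55 mph, and 953 ft at 65 mph --
# i.e., at highway-ish speeds a vehicle covers several hundred feet in 10
# seconds, which frames the "was there enough time to react?" question.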


