Autopilot Warning May Not Help Tesla in Crash Defense

Telling Tesla drivers that its Autopilot feature doesn’t mean their cars can drive themselves may not be enough to keep Elon Musk off the hot seat if the technology comes up short.

This month, two Teslas equipped with Autopilot veered into barriers following disclosure of the first fatal wreck, a Model S slamming into an 18-wheeler crossing a Florida highway after the semi-autonomous car failed to distinguish the truck’s white trailer from the sky.

Tesla Motors Inc. warns drivers they must still pay attention and be ready to grab back control of the car, but there’s a lot in a name.

“The moment I saw Tesla calling it Autopilot, I thought it was a bad move,” said Lynn Shumway, a lawyer who specializes in product liability cases against carmakers. “Just by the name, aren’t you telling people not to pay attention?”

Joshua Brown’s death in Florida was the first involving Tesla’s semi-autonomous technology, triggering chatter in legal circles about who was liable for the crash and prompting a probe by the National Highway Traffic Safety Administration as well as the National Transportation Safety Board, which typically devotes its attention to mishaps involving planes and trains. Some details remain in dispute, including whether Brown, a former Navy SEAL, might have been watching a Harry Potter movie in a DVD player found in the car.

Musk had anticipated the moment for at least two years, telling drivers to keep their hands on the wheel because they will be held accountable if their cars crash while on Autopilot. Tesla buyers must activate the Autopilot software, which requires them to acknowledge the technology is a beta platform and isn’t meant to be used as a substitute for the driver.

Driver’s Responsibility

When U.S. investigators began evaluating Brown’s crash, Tesla doubled down in a statement: “Autopilot is an assist feature. You need to maintain control and responsibility of your vehicle.”

But people will be people, and they often don’t do what they’re supposed to.

Lawyers compare giving Tesla drivers Autopilot to building a swimming pool without a fence; the property owner should know that neighborhood kids will find it hard to resist and may get hurt.

“There’s a concept in the legal profession called an attractive nuisance,” said Tab Turner, another lawyer specializing in auto-defect cases. “These devices are much that way right now. They’re all trying to sell them as a wave of the future, but putting in fine print, ‘Don’t do anything but monitor it.’ It’s a dangerous concept.”

As with so-called smart features before it, such as anti-lock brakes and electronic stability control, telling drivers Autopilot might not prevent an accident won’t help Tesla in court if the technology is found to be defective, Turner said.

“Warnings alone are never the answer to a design problem,” he said.

Possible Arguments

In a court case, lawyers for accident victims or their families would have other lines of attack if Tesla blames accidents on drivers failing to heed warnings. They could assert that Tesla’s software is defective because it doesn’t do enough to make sure drivers are paying attention.

Attorneys could also argue that, in Brown’s case for example, the car should have recognized the tractor-trailer as an obstacle, or that Tesla could have easily updated its system to address such a foreseeable problem.

“Any argument will try to establish that Tesla acted in an unreasonable way that was a cause of the crash,” said Bryant Walker Smith, a University of South Carolina law professor who researches automation and connectivity. “It doesn’t even need to be the biggest cause, but just a cause.”

If Brown’s May 7 crash doesn’t end up in court, others might.

A 77-year-old driver from Michigan, a state that has passed laws allowing semi-autonomous and fully autonomous vehicles, struck a concrete median in Pennsylvania, and his 2016 Model X SUV rolled over. Also this month, a driver in Montana said his Tesla veered off the highway and into a guardrail. Both drivers said their cars were operating on Autopilot at the time, and both were cited for careless driving.

Pennsylvania and Montana are among the 42 states without legislation regulating autonomous and semi-autonomous cars.

Musk Response

Musk fired back in a tweet, saying onboard vehicle logs show Autopilot was turned off in the Pennsylvania crash and that the accident wouldn’t have happened if it had been on. The company said the Montana driver hadn’t placed his hands on the wheel for more than two minutes while the car was on Autopilot.

Musk and Tesla are certain to argue that while the technology has yet to meet the threshold for “autonomous vehicle,” the Model S has achieved the best safety rating of any car ever tested.

Even with that record, Consumer Reports last Thursday called on Tesla to disable Autopilot on more than 70,000 vehicles. “By marketing their feature as ‘Autopilot,’ Tesla gives consumers a false sense of security,” said Laura MacCleery, vice president of consumer policy and mobilization for Consumer Reports.

“Tesla is consistently introducing enhancements proven over millions of miles of internal testing to ensure that drivers supported by Autopilot remain safer than those operating without assistance,” the carmaker said Thursday in a statement. “We will continue to develop, validate, and release those enhancements as the technology grows. While we appreciate well-meaning advice from any individual or group, we make our decisions on the basis of real-world data, not speculation by media.”

Khobi Brooklyn, a spokeswoman for the Palo Alto, California-based carmaker, cited the company’s earlier comments on the three accidents and declined to comment further on possible litigation involving Autopilot.

National Rules

The U.S. government will soon offer the auto industry guiding principles for safe operation of fully autonomous vehicles, part of a plan that includes $4 billion for safety research by 2026.

For now, the double line between Autopilot and full autonomy is a blurry one.

In 2013, NHTSA released a five-rung autonomous vehicle rating system based on cars’ computerized capabilities, ranging from level 0 for “no-automation” to level 4 for “full self-driving automation.”

Tesla is likely to argue its technology has yet to surpass level 2: automation of at least two primary control functions designed to work in unison to relieve the driver of those tasks. Plaintiffs will counter that the car has been marketed more like a level 3 vehicle, in which the driver can cede full control of all safety-critical functions under certain conditions while remaining available for occasional intervention.

“It’s great technology. I hope they get this right and put people like us out of business,” said Steve Van Gaasbeck, an auto products lawyer in San Antonio, Texas. “There’s really no excuse for missing an 18-wheeler.”
