Tesla ‘Operational Limitations’ Cited for ‘Major Role’ in Crash by Safety Board

September 13, 2017

The chairman of the U.S. National Transportation Safety Board (NTSB) said on Tuesday “operational limitations” in the Tesla Model S played a “major role” in a May 2016 crash that killed a driver using the vehicle’s semi-autonomous “Autopilot” system.

The system's limitations include its inability to ensure driver attention when the car is traveling at high speeds, to restrict Autopilot's use to the roads for which it was designed, and to adequately monitor driver engagement, the NTSB said.

The NTSB recommended auto safety regulators and automakers take steps to ensure that semi-autonomous systems are not misused.

“System safeguards were lacking,” NTSB Chairman Robert Sumwalt said. “Tesla allowed the driver to use the system outside of the environment for which it was designed and the system gave far too much leeway to the driver to divert his attention.”

Tesla Inc. said in a statement that “Autopilot significantly increases safety,” citing an earlier government study that suggested the system reduced the incidence of crashes.

The automaker said it would evaluate the NTSB’s recommendations.

“We will also continue to be extremely clear with current and potential customers that Autopilot is not a fully self-driving technology and drivers need to remain attentive at all times,” Tesla said.

Joshua Brown, a 40-year-old Ohio man, was killed near Williston, Florida, when his Model S collided with a truck while Autopilot was engaged.

SYSTEM LIMITS

The incident raised questions about the safety of systems that can perform driving tasks for extended stretches of time with little or no human intervention, but which cannot completely replace human drivers.

The NTSB recommendations will put new pressure on regulators and automakers to deal with the limitations of driver-assistance technologies.

In its findings on Tuesday, the NTSB said the self-driving system’s “operational design” was a contributing factor to the 2016 crash because it allowed drivers to go without steering or watching the road for lengthy periods of time that were “inconsistent” with Tesla’s warnings.

The NTSB said Tesla could have taken further steps to prevent the system’s misuse, and faulted the driver for not paying attention and for “overreliance on vehicle automation.”

The agency said the Autopilot system operated as designed but did not do enough to ensure drivers paid adequate attention. On some roads, drivers could use Autopilot at up to 90 miles (145 km) per hour, it said.

Sumwalt noted that Tesla did not ensure the system was used only on the highways and limited-access roads recommended in the owner’s manual.

The NTSB recommended that automakers monitor driver attention in ways other than through detecting steering-wheel engagement.

The system could not reliably detect cross traffic and “did little to constrain the use of autopilot to roadways for which it was designed,” the board said.

‘TEN SECONDS TO REACT’

Monitoring driver attention by measuring the driver’s touching of the steering wheel “was a poor surrogate for monitored driving engagement,” said the board.

Tesla said in June 2016 that Autopilot “is not perfect and still requires the driver to remain alert.”

At a public hearing Tuesday on the crash involving Brown, the NTSB said the truck driver and the Tesla driver “had at least 10 seconds to observe and respond to each other.”

Brown’s family said on Monday the car was not to blame for the crash.

“We heard numerous times that the car killed our son. That is simply not the case,” the family’s statement said. “There was a small window of time when neither Joshua nor the Tesla features noticed the truck making the left-hand turn in front of the car.”

“People die every day in car accidents,” the statement said. “Change always comes with risks, and zero tolerance for deaths would totally stop innovation and improvements.”

A spokeswoman for Tesla and a lawyer for the family, Jack Landskroner, have declined to say if the automaker has reached a legal settlement with the Brown family.

The NTSB recommended that the National Highway Traffic Safety Administration (NHTSA) require automakers to include safeguards that prevent the misuse of semi-autonomous vehicle features.

NHTSA said it would review the safety board’s findings.

In January, NHTSA said its investigation of the crash found no evidence of a defect. NHTSA and the NTSB said Brown did not apply the brakes, and that his last action was to set the cruise control at 74 miles per hour (119 kph), above the 65-mph speed limit, less than two minutes before the crash.

(Editing by Bernadette Baum and Matthew Lewis)
