Safety Experts Weigh In on Autonomous Car Standards After Uber, Tesla Fatal Crashes

By Nick Carey and Paul Lienert | April 4, 2018

Autonomous cars should be required to meet standards for detecting potential hazards, and better methods are needed to keep human drivers ready to assume control, U.S. auto safety and technology experts said after fatal crashes involving Uber Technologies Inc. and Tesla Inc. vehicles.

Automakers and tech companies deploying self-driving technology rely on human drivers to step in when necessary. But in the two recent crashes, which involved different kinds of technology, neither human driver took any action before the accident.

Driverless cars rely on lidar, which uses laser light pulses to detect road hazards, as well as sensors such as radar and cameras. There are, however, no performance standards for these systems; companies use different combinations of sensors, and some vehicles may have blind spots.

“Humans don’t have the ability to take over the vehicle as quickly as may be expected” in those situations, said self-driving expert and investor Evangelos Simoudis.

In the Uber crash last month, the ride services company was testing a fully driverless system intended for commercial use when the prototype vehicle struck and killed a woman walking across an Arizona road. Video of the crash, taken from inside the vehicle, shows the safety driver at the wheel, apparently looking down and not at the road. Just before the video stops, the driver looks up toward the road and appears shocked.

In the Tesla incident last month, which involved a car that any consumer can buy, a Model X vehicle was in semi-autonomous Autopilot mode when it crashed, killing its driver. The driver had received earlier warnings to put his hands on the wheel, Tesla said.

Some semi-automated cars, like the Tesla, employ different technologies to help drivers stay in their lane or maintain a certain distance behind the vehicle in front. Those systems rely on alerts – beeping noises or a vibrating steering wheel – to get drivers’ attention.

“Immature Technology”

Duke University mechanical engineering professor Missy Cummings said the recent Uber and Tesla crashes show the “technology they are using is immature.”

Tesla says its technology is statistically proven to save lives through better driving. In a response to Reuters on Tuesday, Tesla said drivers have a “responsibility to maintain control of the car” whenever they enable Autopilot and need to be ready to respond to “audible and visual cues.”

An Uber spokesperson said “safety is our primary concern every step of the way.”

A consumer group, Advocates for Highway and Auto Safety, says a self-driving car bill now stalled in the U.S. Senate is an opportunity to improve safety, a shift from the bill’s original intent of quickly allowing self-driving cars without human controls to be tested on public roads. The group has proposed amending the bill, the AV START Act, to set standards for those vehicles, for instance by requiring a “vision test” to check what an automated vehicle’s sensors actually see.

The group believes the bill should also cover semi-automated systems like Tesla’s Autopilot – a lower level of technology than what is included in the current proposed legislation.

Other groups have also put forth proposals on self-driving cars, including performance targets for the vehicles and even for semi-automated systems, greater transparency and data-sharing from makers and operators, increased regulatory oversight, and better monitoring of and engagement with human drivers.

Others want to focus on the human driver. In November, Consumer Reports magazine called on automakers to provide responsible labeling “to help consumers fully understand” their vehicles’ autonomous functions.

Jake Fisher, Consumer Reports’ head of automotive testing, said human drivers “are bad at paying attention to automation and this technology is not capable of reacting to all types of emergencies.

“It’s like being a passenger with a toddler driving the car,” he said.

The Massachusetts Institute of Technology is doing tests using semi-automated vehicles including models from Tesla, Volvo, Jaguar Land Rover and General Motors Co. The aim is to see how drivers use semi-autonomous technology – some watch the road with their hands above the wheel, others do not – and which warnings get their attention.

“We just don’t know enough about how drivers use any of these systems in the wild,” said MIT research scientist Bryan Reimer.

Timothy Carone, an autonomous systems expert and professor at the University of Notre Dame’s Mendoza College of Business, said autonomous technology’s proponents must “find the right balance so the technology is tested right, but it isn’t hampered or halted.”

“Because in the long run it will save lives,” he said.

(Reporting by Nick Carey and Paul Lienert in Detroit; Editing by Leslie Adler)
