Elon Musk said Tesla Motors Inc. is updating its Autopilot technology to emphasize radar images over cameras, in his latest attempt to convince critics that the electric-car maker’s semi-autonomous driving features are safe for the public.
The biggest change in Autopilot version 8.0 is the new starring role for radar, which since October 2014 has only been used to supplement camera sensors, Musk said in a blog post Sunday. In another key development, the automatic-steering software has been designed to disengage if a driver ignores repeated warnings to keep hands safely on the steering wheel and to stay off until the driver parks the car, the blog stated.
“I am highly confident this will be a substantial improvement,” Musk, Tesla’s chief executive officer, said Sunday during a conference call with reporters.
The enhanced radar may even have been enough to save the life of a driver whose Model S struck a tractor-trailer in blinding sunshine in May while using Autopilot, he said.
Palo Alto, California-based Tesla is upgrading the software, which automatically rolls out to customers over the airwaves, as it remains under fire for that May 7 crash. The U.S. National Highway Traffic Safety Administration opened an investigation into the accident and safety advocates criticized Tesla for beta-testing autonomous-driving features with the public. Version 8.0 includes automatic feedback from the cars to GPS systems, to catalog fixed items that the radar sees — such as a sign over a highway — to prevent future false alarms for other drivers.
Though controversial, Tesla’s use of such so-called fleet learning is exciting and will allow the technology to progress much faster, said Bryant Walker Smith, a law professor at the University of South Carolina who has written extensively on driverless-car liability.
“Tesla is actually using its customers as trainers,” Smith said via e-mail Sunday. “These drivers will teach not only their own vehicle but also all of the other Tesla vehicles on the road to correctly recognize roadway objects.”
In the latest update, Tesla said the Autopilot system should now recognize objects like trucks crossing the road, piles of junk metal — even “a UFO” — and would know enough not to hit them, even if the system can’t decipher exactly what it’s seeing. Radar is no longer merely supplemental to the camera; the blog post stated that radar alone “can be used as a primary control sensor without requiring the camera to confirm visual image recognition.”
When asked Sunday whether Tesla’s improved radar might have saved the life of Joshua Brown in the May accident, Musk said it likely would have. Brown was killed when his Model S sedan drove under a semi trailer as the rig crossed the road. In a blog post June 30, Tesla stressed that “neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky,” so the brakes were never applied.
“We believe it would have seen a large metal object across the road,” Musk said, referring to the tractor trailer. “Knowing that there is no overhead road sign there, it would have braked.”
NHTSA revealed in late June that it was probing the Florida crash, the first known fatality involving Tesla’s driver-assistance features known as Autopilot. The government said in a preliminary report that Brown’s Tesla was traveling at 74 miles (119 kilometers) per hour in a zone where the limit was 65 mph.
Autopilot accidents are far more likely to occur among drivers who have been using the technology for a while, Musk said. “It’s not the neophytes, it’s the experts,” he said. “They get very comfortable with it and they repeatedly ignore the warnings.”
With the latest update, the technology that Tesla calls Autosteer will hold drivers more accountable for keeping their hands on the steering wheel. After repeated warnings, if owners do not hold the steering wheel, the car will apply hazard lights, slow down and come to a stop. The Autosteer system can’t be reengaged until the car has been parked.
Tesla has pushed back against criticism by Fortune magazine and others that it should have disclosed the May fatality to investors before a secondary stock offering later that month. In July, Consumer Reports called Tesla’s Autopilot “Too Much Autonomy Too Soon” and called on the company to stop referring to the system as “Autopilot” as it is “misleading and potentially dangerous.”
The federal government is drafting guidelines, expected to be released this month, for automakers racing to bring fully self-driving cars to market.
While Ford Motor Co. and Alphabet Inc.’s Google espouse an all-or-nothing approach — saying that only fully autonomous cars are safe — Tesla has introduced driver-assist technology in “beta” form. Almost 1,000 Tesla customers are part of the company’s “early access program,” which allows them to test new features and give feedback before they are rolled out to customers as a whole.
About 35,200 people were killed in U.S. auto accidents in 2015, according to NHTSA. The overwhelming majority of vehicle accidents — 94 percent — are due to human error. Safety regulators want to improve human behavior while promoting technology that will protect people in crashes and help prevent them from occurring.
Musk, in his defense of Autopilot, said Tesla vehicles have driven about 200 million cumulative miles with Autopilot engaged and it would be “morally wrong” to withhold the technology. “It’s quite unequivocal that Autopilot improves safety,” he said. “And with this update, it improves it even more.”