I don’t think that’s reasonable, and I argued for better training: maybe making the new owner watch a video about how Autopilot works, and even making them answer a single question — can you engage Autopilot, take your eyes and hands off the wheel, stop paying attention, and never worry about taking control of the car to avoid an accident? — before they can drive off the lot.
Does anyone have any other suggestions on how to better educate the public on how to properly use AutoPilot?
Perhaps if/when humans have little to no oversight required in driving and passenger vehicles are able to communicate with each other there will be greater infallibility with the Autopilot/autonomous vehicles. Time will tell.
100% agree Andrew, but that type of automation (classified as Level 5) is many, many years away – if not decades. In the meantime, what can we do to get more people to realize something I think is simple: “I shouldn’t turn around and play with my dog when my car is in Autopilot mode, because that’s not how the technology currently works”?
If you cannot stop people from texting in traffic, or get them to follow simple traffic codes like signaling 100 feet before turning or coming to a complete stop at a stop sign, there is really no way to remove the human factor that exists with current Autopilot vehicles and test autonomous vehicles. The good news, obviously, is that the human factor is greatly reduced in Autopilot/autonomous vehicles (though I don’t recommend sleeping or watching Netflix in them).
I hear you – it’s definitely a daunting task, and with the human factor we won’t get a 100% success rate. And I understand you weren’t saying we shouldn’t try – just because something is difficult doesn’t mean we shouldn’t try to educate and reinforce good behavior.
But man, the whole signaling thing gets me every time. The “worst” is when you’re on a multi-lane highway and the person signals their lane change when they’re 4/5ths of the way into the new lane.
December 13, 2019 at 8:35 am
PolarBeaRepeal says:
‘Difficult’ is your categorization of Tesla’s technology. Mine is ‘dangerous’. For the sake of public safety – and everyone’s Life, Liberty, and Pursuit of Happiness without undue concern over safety – Teslas should be kept off the road until their AI is confirmed safe by thorough, unbiased OFF-ROAD testing.
December 13, 2019 at 5:32 am
Pelon Tusk says:
I do not think training drivers helps. The psychological temptation to not supervise a machine that is right 95% of the time is just too strong. I believe Waymo tried it but did not succeed in educating their drivers to properly supervise their autopilot system, so they decided to go to Level 4 – that is, without a driver.
And yes, I think these cars should not be insured, for the safety of their drivers AND of other people in traffic. It should stop ASAP, before more people die. IMHO this is a BIG fuck up by regulators.
For insurers I also think it makes sense not to insure these cars, for several other reasons:
1. Good PR when regulators finally act – which will happen as more of these cars come on the road.
2. In the end it is a lose-lose situation to insure these cars: customers are not happy because of much higher premiums, and you do not want all this unpredictable risk on your books.
These points are particularly valid for Tesla’s Autopilot, because Tesla does a very poor job of continuously checking that the driver is paying attention – much poorer than some other car manufacturers do.
Remember when Toyotas were having all those “sudden unintended accelerations” and people were dying? Were you crying for Toyota not to be insured? What about the Ford Pinto that exploded when rear-ended? What about the millions of recalls by domestic and foreign carmakers every year? At some point there has to be accountability for the driver’s own inattention, ignorance, or purposeful disregard for safety that is not a defect the manufacturer built in. Every day there are plenty of people in traffic, in cars without Autopilot, who put on makeup, text, read magazines or newspapers, fall asleep, or turn around to tend to children or animals (never mind the ones who drive with animals in their lap).
In another article (https://www.insurancejournal.com/news/east/2019/12/09/550701.htm/?comments) Yogi suggested the solution is simply not to insure Teslas.
Yes, don’t insure Teslas until they pass proper tests.
What do you mean by “proper tests”?