Should drivers using vehicles with autonomous features be trained on those features before they get behind the wheel?
A new paper, “What Do We Tell the Drivers? Toward Minimum Driver Training Standards for Partially Automated Cars,” recently published by the Journal of Cognitive Engineering and Decision Making, explores that question and suggests looking to the experience of pilots for an answer.
Although autonomous car systems are designed to respond to conflicts, crashes could still occur without proper human training, according to authors Stephen M. Casner of NASA and Edwin L. Hutchins of the University of California San Diego.
Casner says this issue is increasingly relevant as more vehicles are equipped with autonomous features such as lane keeping and automatic braking. By providing no standard training on these features, automakers are handing average drivers complex pieces of equipment and hoping for the best, the authors say.
So far, there has been little coordinated effort to train drivers of vehicles with autonomous features on how the systems work or how they affect driver behavior. “The current strategy seems to be to place additional pages in the operator’s manual and hope drivers will pull it out of the glovebox and carefully read it,” Casner said.
Casner and Hutchins argue that, while autonomous-feature vehicles will reduce common crash scenarios, the lack of educated drivers could lead to a slew of other types of accidents.
To demonstrate their point, they cite the introduction of autopilot in airplanes. Decades ago, when automation was first deployed in airplane cockpits, pilots received little up-front training on how these systems worked. This led to new kinds of crashes and ultimately pushed the industry to expand training for these systems.
“That early research yielded a number of unexpected findings,” the article states. “We found that pilots were sometimes surprised by the behavior of the automation, unable to predict what a complex system would do next. Ironically, being able to predict what the automation would do next seemed to require more knowledge about how the automation works than we originally anticipated.”
They say people need training not just on how the technology works, but also on how humans respond to it and the general concept of a human-automation team.
The paper explores how lessons learned nearly 50 years ago in aviation could foreshadow what will happen if society doesn’t start educating drivers of autonomous-feature vehicles about how their machines work.
The authors point to studies showing that drivers often let their attention drift and don’t understand how the automation works.
While they applaud the fact that carmakers, insurance firms, government agencies, and others are discussing some of these issues, they conclude that there appear to be few actual plans to provide drivers with the training they need.
The authors offer a first attempt at a set of minimum standards for what drivers should know before they operate a partially automated car.
The paper, which will appear in print in the Journal of Cognitive Engineering and Decision Making in June, is available online now.
The Human Factors and Ergonomics Society is a professional organization for human factors/ergonomics professionals with an interest in designing systems, tools, consumer products, and equipment to be safe and effective.