How much people trust technology, and which types of driving alerts they respond to, may matter as much to the success of autonomous vehicles as technological, legal, and security concerns, if not more, according to two new studies.
The studies on human behavior and self-driving cars have been published in Human Factors: The Journal of the Human Factors and Ergonomics Society.
One paper assesses drivers’ trust in an autonomous car by monitoring how often they interrupt a non-driving task to look at their surroundings; it presents the first empirical evidence connecting gaze behavior to automation trust.
The other study suggests that drivers will respond best to verbal prompts, as opposed to sounds or visual displays, alerting them to driving conditions and the state of the vehicle (for example, low tire pressure).
“Keep Your Scanners Peeled: Gaze Behavior as a Measure of Automation Trust During Highly Automated Driving” is the work of Sebastian Hergeth, Lutz Lorenz, and Roman Vilimek of the BMW Group in Munich, and Josef F. Krems from Technische Universität Chemnitz, Germany.
In this study, 35 BMW Group employees ages 18 to 55 participated in a self-driving car simulation while engaging in a visually demanding non-driving task. The driving scenario was a standard three-lane highway with a hard shoulder in which uneventful driving was periodically interrupted by incidents requiring the driver to take control. Although trust is difficult to quantify, drivers’ use of eye-tracking glasses enabled the researchers to capture data about how frequently participants looked away from the secondary task to observe the driving scene. Hergeth and his team then used these data to draw preliminary conclusions about drivers’ levels of trust in the simulated car’s automation.
The more the participants trusted the automation, the less frequently they looked at their surroundings. They also grew more trusting of the car as they became familiar with the system: more than half the drivers said they trusted the car more at the end of the trials than at the beginning. The researchers conclude that appropriately calibrated trust in automation is crucial for drivers to get the maximum benefit from self-driving vehicles.
In “Speech Auditory Alerts Promote Memory for Alerted Events in a Video-Simulated Self-Driving Car Ride,” human factors researchers Michael A. Nees, Benji Helbein, and Anna Porter of Lafayette College, Easton, Pennsylvania, studied the usefulness of speech alerts in helping drivers perceive and remember driving conditions while engaged in a non-driving activity.
Eighty-five undergraduate students performed a word search task while watching three driving simulation videos. Each scenario showed a routine driving condition. The participants were randomly assigned to one of three display conditions: sounds such as a jackhammer, indicating construction ahead; a visual display with text; and speech alerts such as “pedestrian” or “front hazard.”
After watching the videos, participants reported what they recalled about the driving scenario, how useful and how annoying the alerts were, and how confident they would feel if they had to resume control of the car at the moment the video stopped. Participants who heard the speech alerts had better recall than those given the sound icons or visual displays. However, both types of audio alerts were rated as annoying, and prior research shows that users tend to disable annoying alerts.
Both research teams said they plan further investigations into how these findings affect safety, and into how quickly and effectively drivers would take over the controls when necessary.
The Human Factors and Ergonomics Society is an association for human factors/ergonomics professionals, with more than 4,800 members globally. HFES members include psychologists and other scientists, designers and engineers.
Source: The Journal of the Human Factors and Ergonomics Society