Abstract
We tested whether head movements under automated driving can be used to classify a vehicle occupant as either situation-aware or unaware. While manually cornering, an active driver's head tilt correlates with the road angle, which serves as a visual reference, whereas an inactive passenger's head follows the g-forces. Transferred to partial/conditional automation, the question arises whether aware occupants' head movements are comparable to those of drivers and whether this can be used for classification. In a driving-simulator study (n=43, within-subject design), four scenarios were used to generate or deteriorate situation awareness (manipulation checked). Recurrent neural networks were trained on the resulting head movements. Inferential statistics were used to extract the discriminating feature, ensuring explainability. A very accurate classification was achieved, and the mean side rotation rate was identified as the most differentiating factor. Aware occupants behave more like drivers. Therefore, head movements can be used to classify situation awareness not only in experimental settings but also in real driving.
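The abstract names only "recurrent neural networks" over head-movement signals and the mean side rotation rate as the discriminating feature; the actual architecture and features are not given. The following is therefore a minimal, hypothetical sketch in PyTorch of that kind of classifier, assuming yaw/pitch/roll rotation-rate sequences as input; the class name, feature set, and all hyperparameters are illustrative assumptions, not the authors' model.

```python
# Hypothetical sketch of a recurrent "aware vs. unaware" classifier over
# head rotation-rate sequences. Architecture and hyperparameters are
# assumptions; the paper specifies neither.
import torch
import torch.nn as nn

class HeadMovementClassifier(nn.Module):
    def __init__(self, n_features=3, hidden_size=32):
        super().__init__()
        # GRU over (yaw, pitch, roll) rotation rates per time step
        self.rnn = nn.GRU(n_features, hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, 1)  # single logit

    def forward(self, x):
        # x: (batch, time, n_features), rotation rates at a fixed sample rate
        _, h = self.rnn(x)           # h: (num_layers, batch, hidden_size)
        return self.head(h[-1])      # (batch, 1) logit: aware vs. unaware

# Usage example: 8 sequences of 200 samples (e.g., 10 s at 20 Hz)
model = HeadMovementClassifier()
x = torch.randn(8, 200, 3)
probs = torch.sigmoid(model(x))      # P(occupant is situation-aware)
```

For the explainability step, a per-sequence summary such as the mean absolute side (yaw) rotation rate, the feature the abstract identifies as most differentiating, could then be compared between the aware and unaware conditions with standard inferential statistics.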
Original language | English |
---|---|
Title of host publication | Proceedings of the Human Factors and Ergonomics Society Annual Meeting |
Publisher | Taylor & Francis |
Pages | 2078-2082 |
Number of pages | 5 |
Volume | 63 |
DOIs | |
Publication status | Published - 2019 |
Externally published | Yes |
Event | Human Factors and Ergonomics Society Annual Meeting, HFES 2019, Sheraton Grand Seattle, Seattle, United States. Duration: 28 Oct 2019 → 1 Nov 2019. https://techsage.gatech.edu/human-factors-and-ergonomics-society-hfes-2019 |
Conference
Conference | Human Factors and Ergonomics Society Annual Meeting, HFES 2019 |
---|---|
Abbreviated title | HFES |
Country/Territory | United States |
City | Seattle |
Period | 28/10/19 → 1/11/19 |
Internet address | https://techsage.gatech.edu/human-factors-and-ergonomics-society-hfes-2019 |
Keywords
- ITC-CV