8 Head Pose as an Indicator of Drivers’ Visual Attention

  • Sumit Jha and Carlos Busso
A chapter from the book Vehicles, Drivers, and Safety

Abstract

Estimating the direction of visual attention is important for predicting driver distraction, assessing the driver's situational awareness, and improving in-vehicle dialog systems. Tracking eye movement can accurately identify the exact location of a driver's gaze. However, robustly measuring gaze in a driving environment is challenging due to changes in illumination, occlusions, and changes in the driver's head pose. Head pose provides a coarse estimate of the gaze direction, which may be helpful in determining the driver's visual attention for most applications. To what extent can head pose be a useful indicator of visual attention? In addition to head pose, gaze activity is characterized by eye movements, so the relation between head pose and gaze is not deterministic. This chapter summarizes our effort to understand and model the relation between head pose and visual attention. We are interested in understanding how much the head pose deviates from the actual gaze of the driver, and how much the head pose varies for a given gaze direction. We observe that the deviation is much higher in the vertical direction than in the horizontal direction, making vertical gaze more difficult to estimate. We also observe that the deviation between gaze direction and head pose increases as the direction of visual attention moves further away from the frontal direction. Given that the relation between gaze and head pose is not deterministic, we propose probabilistic maps to describe visual attention. Instead of estimating the exact direction of the gaze, this formulation provides confidence regions that are mapped to either the windshield or the road scene. We describe a parametric probabilistic map built with Gaussian process regression (GPR), and a nonparametric probabilistic map built with upsampling convolutional neural networks (CNNs).
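
To make the probabilistic formulation concrete, the following is a minimal, hypothetical sketch of a parametric map in the spirit of the GPR approach: it regresses a 2-D windshield gaze point from head pose and reports a confidence interval per axis instead of a single hard estimate. The (yaw, pitch) features, the synthetic calibration data, and the kernel settings are illustrative assumptions rather than the chapter's exact configuration (Python, scikit-learn).

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Synthetic calibration data: head pose (yaw, pitch) in degrees and the
# corresponding gaze point (x, y) on the windshield plane (arbitrary units).
rng = np.random.default_rng(0)
head_pose = rng.uniform(low=[-60, -20], high=[60, 20], size=(200, 2))
# Larger noise on the second (vertical) axis mimics the higher vertical
# deviation between head pose and gaze discussed in the abstract.
gaze_xy = 1.3 * head_pose + rng.normal(scale=[4.0, 8.0], size=(200, 2))

# One GPR per output dimension; kernel hyperparameters are fit by
# maximizing the marginal likelihood during fit().
kernel = RBF(length_scale=15.0) + WhiteKernel(noise_level=1.0)
models = []
for dim in range(2):
    gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
    gpr.fit(head_pose, gaze_xy[:, dim])
    models.append(gpr)

# For a new head pose, the posterior mean is the most likely gaze location
# and the posterior standard deviation defines a confidence region
# (roughly +/- 2 sigma for ~95%) on the windshield.
query = np.array([[30.0, -5.0]])          # yaw = 30 deg, pitch = -5 deg
for dim, axis in enumerate(["horizontal", "vertical"]):
    mean, std = models[dim].predict(query, return_std=True)
    print(f"{axis} gaze: {mean[0]:.1f} +/- {2 * std[0]:.1f}")

The nonparametric alternative replaces the GPR posterior with a dense probability grid predicted by a neural network. The sketch below assumes a small network that seeds a coarse feature map from the head-pose vector and upsamples it with transposed convolutions into a 16x32 probability map over the windshield; the layer sizes and grid resolution are assumptions for illustration only (Python, PyTorch).

import torch
import torch.nn as nn

class GazeHeatmapNet(nn.Module):
    """Head pose (yaw, pitch, roll) -> probability map over a windshield grid."""
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(3, 64 * 2 * 4)  # seed a coarse 2x4 feature map
        self.upsample = nn.Sequential(
            nn.ConvTranspose2d(64, 32, kernel_size=4, stride=2, padding=1),  # 4x8
            nn.ReLU(),
            nn.ConvTranspose2d(32, 16, kernel_size=4, stride=2, padding=1),  # 8x16
            nn.ReLU(),
            nn.ConvTranspose2d(16, 1, kernel_size=4, stride=2, padding=1),   # 16x32
        )

    def forward(self, pose):
        x = self.fc(pose).view(-1, 64, 2, 4)
        logits = self.upsample(x).flatten(1)
        # Softmax turns the grid into a probability map; the high-mass cells
        # form the confidence region on the windshield or road scene.
        return torch.softmax(logits, dim=1).view(-1, 16, 32)

heatmap = GazeHeatmapNet()(torch.tensor([[30.0, -5.0, 0.0]]))
print(heatmap.shape, heatmap.sum())  # torch.Size([1, 16, 32]), ~1.0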

Chapters in this book

  1. Frontmatter
  2. Contents
  3. Contributing Authors
  4. Introduction
  5. Part A: Driver/Vehicle Interaction Systems
  6. 1 MobileUTDrive: A Portable Device Platform for In-vehicle Driving Data Collection
  7. 2 Semantic Analysis of Driver Behavior by Data Fusion
  8. 3 Predicting When Drivers Need AR Guidance
  9. 4 Driver’s Mental Workload Estimation with Involuntary Eye Movement
  10. 5 Neurophysiological Driver Behavior Analysis
  11. 6 Modeling the Relationship between Driver Gaze Behavior and Traffic Context during Lane Changes Using a Recurrent Neural Network
  12. 7 A Multimodal Control System for Autonomous Vehicles Using Speech, Gesture, and Gaze Recognition
  13. 8 Head Pose as an Indicator of Drivers’ Visual Attention
  14. Part B: Models & Theories of Driver/Vehicle Systems
  15. 9 Evolving Neural Network Controllers for Tractor-Trailer Vehicle Backward Path Tracking
  16. 10 Spectral Distance Analysis for Quality Estimation of In-Car Communication Systems
  17. 11 Combination of Hands-Free and ICC Systems
  18. 12 Insights into Automotive Noise PSD Estimation Based on Multiplicative Constants
  19. 13 In-Car Communication: From Single- to Four-Channel with the Frequency Domain Adaptive Kalman Filter
  20. Part C: Self-Driving and the Mobility in 2050
  21. 14 The PIX Moving KuaiKai: Building a Self-Driving Car in Seven Days
  22. 15 Vehicle Ego-Localization with a Monocular Camera Using Epipolar Geometry Constraints
  23. 16 Connected and Automated Vehicles: Study of Platooning
  24. 17 Epilogue – Future Mobility 2050
  25. Index