6 Modeling the Relationship between Driver Gaze Behavior and Traffic Context during Lane Changes Using a Recurrent Neural Network
Daiki Hayashi
Abstract
Lane changes are among the most difficult tasks drivers face because they demand a high degree of situational awareness. Previous studies have shown that drivers exhibit typical gaze patterns prior to lane changes. In this study, we assume a correlation between driver gaze behavior and the surrounding traffic context, and propose a method for modeling that correlation. Driver gaze behavior during lane changes is modeled using a recurrent neural network (RNN). The input to the RNN consists of eight features calculated from the positions of surrounding vehicles in eight areas around the ego-vehicle. The output of the RNN is interpreted as the probabilities of the driver looking in each of ten gaze directions, for example, “front,” “right,” and “mirror.” The RNN thus generates a sequence of gaze-direction probabilities for a given traffic context. To evaluate our gaze behavior model, we conducted a risky lane change detection experiment. We collected driving data from nine drivers operating an instrumented vehicle on expressways while passing other vehicles, yielding a total of 859 lane change scenes. Ten evaluators then watched the front-view video of each lane change scene and rated how risky it felt; the normalized average of these risk scores served as the ground-truth risk level for each scene. We selected the 10% safest and the 10% riskiest scenes to train two RNNs, one modeling safe lane changes and one modeling risky lane changes. The similarity of the driver's gaze behavior during the ten seconds of each lane change to the risky and safe models was used to detect risky lane change scenes. Detection performance was evaluated using the leave-one-out method. Our experimental results showed that the proposed model achieved AUCs of 0.90 and 0.61 for right and left lane changes, respectively.
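The pipeline described above can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the dimensions (eight context features in, ten gaze directions out) come from the abstract, while the single-layer tanh cell, the hidden size, the random weights, the 10 Hz sampling rate, and the log-likelihood-ratio form of the similarity score are all assumptions added for the example.

```python
import numpy as np

N_FEATURES = 8   # features for the 8 areas around the ego-vehicle
N_GAZE = 10      # gaze directions, e.g. "front", "right", "mirror", ...
N_HIDDEN = 16    # hidden state size (assumed)

rng = np.random.default_rng(0)
W_xh = rng.normal(scale=0.1, size=(N_HIDDEN, N_FEATURES))
W_hh = rng.normal(scale=0.1, size=(N_HIDDEN, N_HIDDEN))
W_hy = rng.normal(scale=0.1, size=(N_GAZE, N_HIDDEN))

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def gaze_probabilities(context_seq):
    """Map a sequence of traffic-context vectors (T, 8) to a
    sequence of gaze-direction distributions (T, 10)."""
    h = np.zeros(N_HIDDEN)
    out = []
    for x in context_seq:
        h = np.tanh(W_xh @ x + W_hh @ h)   # simple Elman-style RNN cell
        out.append(softmax(W_hy @ h))
    return np.array(out)

def log_likelihood(probs, gaze_seq):
    """Sum of log-probabilities a model assigns to an observed
    sequence of gaze-direction labels."""
    return float(np.log(probs[np.arange(len(gaze_seq)), gaze_seq]).sum())

# Ten seconds of traffic context, sampled at an assumed 10 Hz.
seq = rng.random((100, N_FEATURES))
probs = gaze_probabilities(seq)

# A hypothetical observed gaze sequence, one label per time step.
gaze_seq = probs.argmax(axis=1)
ll = log_likelihood(probs, gaze_seq)
```

With two trained models, a detection score in the spirit of the similarity ratio could then be `log_likelihood(probs_risky, gaze_seq) - log_likelihood(probs_safe, gaze_seq)`, thresholded to flag risky lane changes.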
Chapters in this Book
- Frontmatter I
- Contents V
- Contributing Authors VII
- Introduction XI
-
Part A: Driver/Vehicle Interaction Systems
- 1 MobileUTDrive: A Portable Device Platform for In-vehicle Driving Data Collection 3
- 2 Semantic Analysis of Driver Behavior by Data Fusion 25
- 3 Predicting When Drivers Need AR Guidance 35
- 4 Driver’s Mental Workload Estimation with Involuntary Eye Movement 49
- 5 Neurophysiological Driver Behavior Analysis 67
- 6 Modeling the Relationship between Driver Gaze Behavior and Traffic Context during Lane Changes Using a Recurrent Neural Network 87
- 7 A Multimodal Control System for Autonomous Vehicles Using Speech, Gesture, and Gaze Recognition 101
- 8 Head Pose as an Indicator of Drivers’ Visual Attention 113
-
Part B: Models & Theories of Driver/Vehicle Systems
- 9 Evolving Neural Network Controllers for Tractor-Trailer Vehicle Backward Path Tracking 135
- 10 Spectral Distance Analysis for Quality Estimation of In-Car Communication Systems 149
- 11 Combination of Hands-Free and ICC Systems 165
- 12 Insights into Automotive Noise PSD Estimation Based on Multiplicative Constants 183
- 13 In-Car Communication: From Single- to Four-Channel with the Frequency Domain Adaptive Kalman Filter 213
-
Part C: Self-Driving and Mobility in 2050
- 14 The PIX Moving KuaiKai: Building a Self-Driving Car in Seven Days 233
- 15 Vehicle Ego-Localization with a Monocular Camera Using Epipolar Geometry Constraints 251
- 16 Connected and Automated Vehicles: Study of Platooning 263
- 17 Epilogue – Future Mobility 2050 285
- Index 311