4 Driver’s Mental Workload Estimation with Involuntary Eye Movement
Anh Son Le
Abstract
Mental workload estimation plays an important role in traffic safety, both during manual control and during transitions to and from autonomous control. This chapter provides an overview of our research on using involuntary eye movement to capture the driver’s mental workload state. The series of studies addressed here includes: (1) development of an algorithm for capturing the driver’s mental workload from the involuntary eye movement of the vestibulo-ocular reflex (VOR); (2) parameter identification for simulating the VOR; (3) development of a combined VOR and optokinetic response (OKR) model; and (4) validation of the algorithm with data from both a driving simulator and realistic driving conditions. The results indicate that our algorithm can detect high mental workload while driving, even under naturalistic conditions. This study also indicates both the potential and the limitations of using eye movement information to enhance advanced vehicle safety systems in the future.
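As a rough illustration of the idea behind items (1) and (3), the sketch below fits a single VOR gain on a baseline (low-workload) segment and then tracks the residual between the model-predicted compensatory eye velocity and the measured one; a rise in that residual is taken as a proxy for increased mental workload. This is a minimal, assumption-laden simplification: the function names, the single-gain model, and the synthetic signals are illustrative only and do not reproduce the chapter’s actual VOR/OKR model.

```python
import numpy as np

def fit_vor_gain(head_vel, eye_vel):
    """Least-squares estimate of a single VOR gain g such that
    eye_vel ~= -g * head_vel. A hypothetical simplification of a full
    VOR/OKR model; baseline data should come from a low-workload segment."""
    denom = np.dot(head_vel, head_vel)
    if denom == 0.0:
        raise ValueError("head velocity signal is constant zero")
    return -np.dot(head_vel, eye_vel) / denom

def vor_residual_index(head_vel, eye_vel, gain):
    """RMS error between the model-predicted compensatory eye velocity
    (-gain * head_vel) and the measured eye velocity. Used here as a
    hypothetical workload index: larger residual = poorer compensation."""
    predicted = -gain * head_vel
    return float(np.sqrt(np.mean((eye_vel - predicted) ** 2)))

if __name__ == "__main__":
    # Synthetic signals for demonstration only (all numbers illustrative).
    rng = np.random.default_rng(0)
    t = np.linspace(0.0, 10.0, 1000)            # 10 s at 100 Hz
    head = 20.0 * np.sin(2 * np.pi * 1.0 * t)   # head yaw velocity, deg/s

    # Baseline segment: eye movement closely compensates head motion.
    eye_baseline = -0.95 * head + rng.normal(0.0, 0.5, t.size)
    # "High workload" segment: compensation is weaker and noisier.
    eye_loaded = -0.80 * head + rng.normal(0.0, 2.0, t.size)

    g = fit_vor_gain(head, eye_baseline)
    print("baseline residual:", vor_residual_index(head, eye_baseline, g))
    print("loaded residual:  ", vor_residual_index(head, eye_loaded, g))
```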
Chapters in this Book
- Frontmatter I
- Contents V
- Contributing Authors VII
- Introduction XI
Part A: Driver/Vehicle Interaction Systems
- 1 MobileUTDrive: A Portable Device Platform for In-vehicle Driving Data Collection 3
- 2 Semantic Analysis of Driver Behavior by Data Fusion 25
- 3 Predicting When Drivers Need AR Guidance 35
- 4 Driver’s Mental Workload Estimation with Involuntary Eye Movement 49
- 5 Neurophysiological Driver Behavior Analysis 67
- 6 Modeling the Relationship between Driver Gaze Behavior and Traffic Context during Lane Changes Using a Recurrent Neural Network 87
- 7 A Multimodal Control System for Autonomous Vehicles Using Speech, Gesture, and Gaze Recognition 101
- 8 Head Pose as an Indicator of Drivers’ Visual Attention 113
Part B: Models & Theories of Driver/Vehicle Systems
- 9 Evolving Neural Network Controllers for Tractor-Trailer Vehicle Backward Path Tracking 135
- 10 Spectral Distance Analysis for Quality Estimation of In-Car Communication Systems 149
- 11 Combination of Hands-Free and ICC Systems 165
- 12 Insights into Automotive Noise PSD Estimation Based on Multiplicative Constants 183
- 13 In-Car Communication: From Single- to Four-Channel with the Frequency Domain Adaptive Kalman Filter 213
Part C: Self-Driving and Mobility in 2050
- 14 The PIX Moving KuaiKai: Building a Self-Driving Car in Seven Days 233
- 15 Vehicle Ego-Localization with a Monocular Camera Using Epipolar Geometry Constraints 251
- 16 Connected and Automated Vehicles: Study of Platooning 263
- 17 Epilogue – Future Mobility 2050 285
- Index 311