
4 Driver’s Mental Workload Estimation with Involuntary Eye Movement

  • Anh Son Le, Hirofumi Aoki, and Tatsuya Suzuki
This chapter is in the book Vehicles, Drivers, and Safety

Abstract

Mental workload estimation plays an important role in traffic safety, both during manual control and during transitions to and from autonomous control. This chapter provides an overview of our research on using involuntary eye movement to capture the driver’s mental workload state. The series of studies addressed here includes: (1) algorithm development for capturing the driver’s mental workload from involuntary eye movement of the vestibulo-ocular reflex (VOR); (2) parameter identification for simulating the VOR; (3) development of a combined VOR and optokinetic response (OKR) model; and (4) validation of the algorithm with data from both a driving simulator and realistic driving conditions. The results indicate that our algorithm can detect high mental workload while driving, even under naturalistic conditions. The study also indicates the potential, as well as the limitations, of using eye-movement information to enhance advanced vehicle safety systems in the future.
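This page reproduces only the abstract, so the chapter’s equations and parameter values are not shown here. As a rough, hypothetical sketch of what VOR simulation and parameter identification can involve, the Python snippet below drives a generic first-order VOR model (eye velocity counter-rotating against head velocity) with synthetic data and recovers its gain by least squares; the model form, sampling rate, and all names are illustrative assumptions, not the chapter’s algorithm.

import numpy as np

# Hypothetical illustration only: a generic first-order VOR model and a
# least-squares gain fit. This is NOT the VOR/OKR model or the workload
# estimation algorithm described in the chapter.

def simulate_vor(head_vel, dt, gain=0.95, tau=0.3):
    """Simulate eye angular velocity (deg/s) that counter-rotates against
    head angular velocity through a first-order lag."""
    eye_vel = np.zeros_like(head_vel)
    for k in range(1, len(head_vel)):
        target = -gain * head_vel[k]          # ideal compensatory eye velocity
        eye_vel[k] = eye_vel[k - 1] + (dt / tau) * (target - eye_vel[k - 1])
    return eye_vel

def estimate_vor_gain(head_vel, eye_vel):
    """Least-squares fit of g in eye_vel ≈ -g * head_vel."""
    return -float(np.dot(head_vel, eye_vel) / np.dot(head_vel, head_vel))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    dt = 1.0 / 250.0                                   # assumed 250 Hz sampling
    t = np.arange(0.0, 10.0, dt)
    head_vel = 20.0 * np.sin(2.0 * np.pi * 0.5 * t)    # synthetic head sway, deg/s
    eye_vel = simulate_vor(head_vel, dt) + rng.normal(0.0, 0.5, t.size)
    print(f"estimated VOR gain: {estimate_vor_gain(head_vel, eye_vel):.3f}")

Approaches in this line of research typically compare measured eye movement against such a model’s prediction and treat systematic deviations, for example a reduced VOR gain, as an indicator of elevated mental workload.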

Chapters in this book

  1. Frontmatter I
  2. Contents V
  3. Contributing Authors VII
  4. Introduction XI
  5. Part A: Driver/Vehicle Interaction Systems
  6. 1 MobileUTDrive: A Portable Device Platform for In-vehicle Driving Data Collection 3
  7. 2 Semantic Analysis of Driver Behavior by Data Fusion 25
  8. 3 Predicting When Drivers Need AR Guidance 35
  9. 4 Driver’s Mental Workload Estimation with Involuntary Eye Movement 49
  10. 5 Neurophysiological Driver Behavior Analysis 67
  11. 6 Modeling the Relationship between Driver Gaze Behavior and Traffic Context during Lane Changes Using a Recurrent Neural Network 87
  12. 7 A Multimodal Control System for Autonomous Vehicles Using Speech, Gesture, and Gaze Recognition 101
  13. 8 Head Pose as an Indicator of Drivers’ Visual Attention 113
  14. Part B: Models & Theories of Driver/Vehicle Systems
  15. 9 Evolving Neural Network Controllers for Tractor-Trailer Vehicle Backward Path Tracking 135
  16. 10 Spectral Distance Analysis for Quality Estimation of In-Car Communication Systems 149
  17. 11 Combination of Hands-Free and ICC Systems 165
  18. 12 Insights into Automotive Noise PSD Estimation Based on Multiplicative Constants 183
  19. 13 In-Car Communication: From Single- to Four-Channel with the Frequency Domain Adaptive Kalman Filter 213
  20. Part C: Self-Driving and the Mobility in 2050
  21. 14 The PIX Moving KuaiKai: Building a Self-Driving Car in Seven Days 233
  22. 15 Vehicle Ego-Localization with a Monocular Camera Using Epipolar Geometry Constraints 251
  23. 16 Connected and Automated Vehicles: Study of Platooning 263
  24. 17 Epilogue – Future Mobility 2050 285
  25. Index 311