Abstract
This paper provides a methodology to directly compare the performance of event-based and traditional area scan cameras. Event-based cameras offer high temporal resolution, a wide dynamic range (120 dB), low power consumption, and a compressed output stream, making them attractive for robotics and computer vision in challenging environments. However, the lack of a standardized comparison method between the two types of sensors has made it difficult to evaluate their relative performance and to decide which camera is better suited for which application. Here, we show that a set of three performance parameters, namely event contrast detectability (ECD), latency or maximum event rate, and event detection efficiency (EDE), can be used to quantitatively compare the performance of area scan and event-based cameras. The theoretical investigations are verified by a comparison study with three event-based and three industry-standard area scan sensors, including a high-dynamic-range SONY IMX490 sensor to match the dynamic range of event-based cameras. For this purpose, the dynamic range of the EMVA 1288-compliant test equipment was extended to 120 dB.
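To make the comparison concrete, the following minimal sketch shows how the three metrics named above could be tabulated for a single camera from measurement data. The definitions used here (ECD as the smallest applied temporal contrast step that reliably triggers a response, latency as the reciprocal of the maximum sustained event rate, and EDE as the fraction of stimulus-induced brightness changes that produce an event) are simplified assumptions for illustration; all function names and numbers are hypothetical and do not reproduce the exact formulations of the paper.

```python
# Illustrative sketch only: simplified stand-ins for the three comparison
# metrics named in the abstract (ECD, latency / maximum event rate, EDE).
# Definitions, names, and numbers are hypothetical assumptions.
import numpy as np

def event_contrast_detectability(contrast_steps, response_probability, threshold=0.5):
    """Smallest applied temporal contrast step whose measured response
    probability reaches `threshold` (assumed simplified ECD definition)."""
    steps = np.asarray(contrast_steps, dtype=float)
    prob = np.asarray(response_probability, dtype=float)
    detected = steps[prob >= threshold]
    return detected.min() if detected.size else np.inf

def latency_from_event_rate(max_event_rate_eps):
    """Reciprocal of the maximum sustained event rate, taken here as a
    rough per-event latency (assumption)."""
    return 1.0 / max_event_rate_eps

def event_detection_efficiency(detected_events, stimulus_events):
    """Fraction of stimulus-induced brightness changes that produced at
    least one event (assumed simplified EDE definition)."""
    return detected_events / stimulus_events

# Hypothetical measurement results for one event-based camera:
contrast = [0.05, 0.10, 0.15, 0.20, 0.30]   # applied temporal contrast steps
prob     = [0.02, 0.31, 0.74, 0.95, 0.99]   # observed response probabilities
print("ECD     :", event_contrast_detectability(contrast, prob))     # -> 0.15
print("latency :", latency_from_event_rate(50e6), "s/event")          # 50 Meps assumed
print("EDE     :", event_detection_efficiency(9_200, 10_000))         # -> 0.92
```

The same three numbers can be derived for an area scan sensor from its EMVA 1288 characterization data, which is what makes a direct, quantitative comparison between the two sensor types possible.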
Zusammenfassung
In diesem Beitrag wird eine Methodik zum direkten Vergleich der Leistung von ereignisbasierten und herkömmlichen Flächenkameras vorgestellt. Ereignisbasierte Kameras bieten eine hohe zeitliche Auflösung, einen großen Dynamikbereich (120 dB), einen geringen Stromverbrauch und einen komprimierten Ausgabestrom, was sie für Robotik und Computer Vision in schwierigen Umgebungen attraktiv macht. Da es jedoch keine standardisierte Vergleichsmethode zwischen den beiden Sensortypen gibt, ist es schwierig, ihre relative Leistung zu bewerten und zu entscheiden, welche Kamera für welche Anwendung besser geeignet ist. Hier zeigen wir, dass ein Satz von drei Leistungsparametern, nämlich die Ereignis-Kontrast-Detektierbarkeit (ECD), die Latenzzeit oder maximale Ereignisrate und die Ereignis-Detektions-Effizienz (EDE), verwendet werden kann, um die Leistung von Flächenkameras und ereignisbasierten Kameras quantitativ zu vergleichen. Die theoretischen Untersuchungen werden durch eine Vergleichsstudie mit drei ereignisbasierten und drei branchenüblichen Flächensensoren verifiziert, darunter ein SONY IMX490-Sensor mit hohem Dynamikbereich, der dem Dynamikbereich ereignisbasierter Kameras entspricht. Zu diesem Zweck wurde der Dynamikbereich der EMVA 1288-konformen Messeinrichtung auf 120 dB erweitert.
Acknowledgments
The authors thank AEON Imaging for providing the EMVA 1288-compliant setup for camera performance measurements and IMAGO Technologies, Fraunhofer IOSB and Prophesee for supplying test cameras. AM and BJ mourn the death of co-author Helmut Herrmann, who passed away after the initial submission of the manuscript. We dedicate this paper to his memory.
- Research ethics: Not applicable.
- Informed consent: Not applicable.
- Author contributions: Not applicable.
- Use of Large Language Models, AI and Machine Learning Tools: None declared.
- Conflict of interest: The authors state no conflict of interest.
- Research funding: None declared.
- Data availability: The raw data can be obtained on request from the corresponding author.
© 2025 Walter de Gruyter GmbH, Berlin/Boston