Evaluating sensor performance and configuration for autonomous operation of small ferries in maritime environments
-
Julius Schmalz, Christian Schyr and Martina Gerken
Abstract
Autonomous ferries offer a promising solution to challenges in public waterway transport, such as crew shortage and environmental impact. Safe navigation requires reliable sensor performance, particularly in adverse maritime conditions like fog, rain, and low visibility. This paper evaluates LiDAR, mmWave radar, and infrared/RGB cameras under real-world conditions aboard a medium-sized autonomous ferry prototype. A virtual test field in Unreal Engine 5 was used to assess sensor configurations and field trials validated performance in varying weather conditions. Results show LiDAR’s range and density loss in fog and rain, the robustness of mmWave radar for long-range detection, and the benefit of pairing it with PTZ-mounted infrared cameras for improved tracking in low-visibility scenarios. We propose an optimized multi-sensor configuration combining these technologies to maximize perception accuracy. By linking simulation and experimental findings, this study provides actionable recommendations for weather-resilient perception systems and informs sensor fusion strategies for small to medium-sized autonomous vessels.
Zusammenfassung
Autonome Fähren bieten eine vielversprechende Lösung für Herausforderungen im öffentlichen Wasserverkehr, wie Personalengpässe und Umweltbelastung. Für eine sichere Navigation ist eine zuverlässige Sensorleistung entscheidend, insbesondere bei widrigen Bedingungen wie Nebel, Regen und eingeschränkter Sicht. Diese Arbeit bewertet LiDAR, mmWave-Radar und Infrarot/RGB-Kameras unter realen Bedingungen an Bord eines mittelgroßen autonomen Fährprototyps. Ein virtuelles Testfeld in Unreal Engine 5 diente zur Analyse verschiedener Sensorkonfigurationen; Praxistests validierten die Ergebnisse bei unterschiedlichen Wetterbedingungen. Die Ergebnisse zeigen Reichweitenverluste und eine reduzierte Punktdichte von LiDAR bei Nebel und Regen, die Robustheit von mmWave-Radar für die Langstreckenerkennung sowie den Vorteil der Kombination mit schwenkbaren Infrarotkameras (PTZ) für eine verbesserte Objektverfolgung bei schlechter Sicht. Auf Basis dieser Erkenntnisse stellen wir eine optimierte Multisensor-Konfiguration vor, die diese Technologien zur Maximierung der Wahrnehmungsgenauigkeit kombiniert. Durch die Verknüpfung von Simulation und Experiment liefert die Studie konkrete Empfehlungen für wetterfeste Wahrnehmungssysteme und unterstützt die Entwicklung robuster Sensorfusionsstrategien für kleine und mittelgroße autonome Schiffe.
1 Introduction
Autonomous Surface Vehicles (ASVs) have been widely studied for navigation, environmental monitoring, and surveillance [1], [2], [3], [4], [5], [6], [7], [8]. These works highlight the need for perception systems being able to cope with dynamic maritime environments. For autonomous ferries, unstructured coastal waters and conditions such as fog, rain, and waves present particular challenges to sensor reliability.
In autonomous road transport, sensor suites combining LiDAR, cameras, and radar have achieved strong results in object detection and navigation. Each technology has limitations in adverse weather conditions. LiDAR, for example, is degraded by fog, rain, and snow through pulse attenuation and noise, prompting research into filtering, wavelength adaptation [9], [10], [11], and specialized processing to handle interference such as snow reflections [12], [13]. Recent work has developed weather-sensitive signal models to refine detection under varying conditions [14], [15].
Multi-sensor fusion addresses the limitations of individual sensors. Combining LiDAR and camera data improves detection in low-visibility conditions [16], [17], while integrating mmWave radar with thermal or optical imaging supports robust tracking [18], [19].
In the maritime domain, projects such as AMN have developed modular fusion frameworks for adaptable operation [4], and datasets like eM/S Salama promote shipborne deep learning [20]. Integrating LiDAR, an inertial measurement unit (IMU), and real-time kinematic (RTK) positioning enhances mapping accuracy and pose estimation, particularly during berthing [21], while active vision strategies improve trajectory tracking in GPS-denied environments [22]. Maritime autonomy also depends on path planning and collision avoidance, increasingly incorporating predictive risk assessment and decision-making under uncertainty [23], [24], [25], [26]. Although this study focuses on perception, the results support these systems by improving environmental awareness.
Most existing research targets large ocean-going vessels, with fewer studies addressing small to medium-sized ferries in coastal and inland waters. This work experimentally evaluates LiDAR, mmWave radar, and cameras under real maritime conditions, including fog, rain, and nighttime. Thombre et al. [27] emphasize integrating GNSS, radar, LiDAR, and vision within AI-driven fusion frameworks, but many studies do not quantify weather effects for small ferries. Helgesen et al. [28] and Kim et al. [29] investigate fusion in nearshore environments, but their vessels (<8 m) and contexts differ from medium-sized ferries, and they do not directly compare sensor types under varied maritime weather. Here, we focus on evaluating and selecting sensors for the autonomous operation of the research vessel MV Wavelab, with emphasis on robustness in challenging visibility.
We evaluate the performance of multiple sensor types for autonomous ferry operations, considering resilience to adverse weather. Using the MV Wavelab, a prototype for high-frequency public transport in coastal waters, we assess sensor reliability and propose optimized configurations for enhanced maritime perception. We examine the impact of weather on LiDAR, mmWave radar and cameras, as well as the benefits of combining mmWave radar with infrared cameras for tracking and classification in low visibility. Furthermore, we explore placement strategies to ensure coverage and the potential of adaptive fusion to counter environmental disturbances. By focusing on medium-sized ferries in fjords and nearshore waters, we extend previous research through empirical validation of sensor performance in real conditions, providing actionable recommendations for deployment.
2 Sensor selection
Selecting sensors for an autonomous ferry requires dependable, high-precision detection over varied ranges and environmental conditions. Complementary strengths across modalities ensure robust and accurate sensing in diverse scenarios.
LiDAR provides high-resolution mapping in the mid-range, offering detailed 3D spatial data for navigation and obstacle avoidance. The Ouster OS2 [30] was chosen for long-range, high-resolution collision avoidance, while the Blickfeld Cube 1 [31] supports precise docking and nearby obstacle detection.
For extended-range detection, the Continental ARS548 mmWave radar [32] provides coverage of up to 1,500 m. Although less resolved than LiDAR, it is well-suited for the early identification and tracking of distant objects.
Optical sensing complements these modalities by adding classification capabilities. The AXIS P1455-LE RGB camera [33] with optical zoom supplies rich visual data, while the PTZ-mounted infrared camera with telephoto lens (AXIS Q8642-E) [34] enhances detection in fog or at night by dynamically adjusting its field of view.
Together, these sensors balance precision, range, and bandwidth, enabling reliable performance across operating conditions (see Table 2).
3 Sensor simulation
In parallel to the real sensor measurements, sensor models were used within a virtual test field implemented in the Unreal Engine 5 VR environment. Built-in Unreal Engine features modeled the camera, and a custom LiDAR model was developed and integrated as C code (Figure 1(a)). This virtual environment enabled testing of sensor configurations under controlled conditions.

Virtual test field and simulated vessel lineup. (a) Digital twins of the research vessel, camera, and LiDAR sensors. (b) Top-down view of the lineup used for sensor performance analysis.
The LiDAR model was parameterized to match three sensors: the Ouster OS2 128, Blickfeld Cube 1 Outdoor, and Velodyne Puck VLP-16 [35]. Parameters included field of view, resolution, scan rate, point count per frame, and range.
The simulation setup (Figure 1, Table 1) consisted of the ego vessel (MV Wavelab, i.e., the own ship under control) and four rows of vessels at increasing distances, each with a ferry and a small dinghy spaced 50 m apart. This provided targets of different size and shape, enabling systematic evaluation of detection at various ranges.
Object positions and range as the distance from the ego (own-ship) LiDAR sensor to each vessel.
| Object | Position x/m | Range/m |
|---|---|---|
| Ego_Lidar | 28.1 | 0 |
| Vessel row 1 | 100 | 72 |
| Vessel row 2 | 150 | 122 |
| Vessel row 3 | 200 | 172 |
| Vessel row 4 | 250 | 222 |
A ray-tracing approach simulated scanning, calculated target distances, and analyzed coverage, blind spots, and placement effects. This allowed direct comparison of configurations under realistic maritime scenarios.
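As a simplified illustration of this ray-based evaluation (not the C-based Unreal Engine model itself), the following Python sketch generates beam directions from a LiDAR's field of view and angular resolution and counts how many beams hit a flat, dinghy-sized target at a given range. The target dimensions, the target being centred on the sensor axis, and the rounded horizontal step counts are illustrative assumptions.

```python
import numpy as np

def beam_directions(h_fov_deg, v_fov_deg, h_steps, v_steps):
    """Unit direction vectors for a regular azimuth/elevation scan pattern."""
    az = np.radians(np.linspace(-h_fov_deg / 2, h_fov_deg / 2, h_steps))
    el = np.radians(np.linspace(-v_fov_deg / 2, v_fov_deg / 2, v_steps))
    az_g, el_g = np.meshgrid(az, el)
    # Sensor frame: x forward, y to port, z up
    dirs = np.stack([np.cos(el_g) * np.cos(az_g),
                     np.cos(el_g) * np.sin(az_g),
                     np.sin(el_g)], axis=-1)
    return dirs.reshape(-1, 3)

def hits_on_target(dirs, distance, width, height, max_range):
    """Count beams hitting a flat width x height target facing the sensor at `distance` metres."""
    if distance > max_range:
        return 0
    fwd = dirs[dirs[:, 0] > 1e-6]            # beams pointing towards the target plane
    t = distance / fwd[:, 0]                 # scale factor to reach the plane x = distance
    y, z = fwd[:, 1] * t, fwd[:, 2] * t
    # Target centred on the sensor axis for simplicity (mounting height ignored)
    return int(((np.abs(y) <= width / 2) & (np.abs(z) <= height / 2)).sum())

# Datasheet-based scan patterns (horizontal step counts rounded for brevity)
sensors = {
    "Velodyne VLP-16": dict(h_fov_deg=360, v_fov_deg=30.0, h_steps=1800, v_steps=16,  max_range=100),
    "Ouster OS2":      dict(h_fov_deg=360, v_fov_deg=22.5, h_steps=2048, v_steps=128, max_range=400),
}
for name, p in sensors.items():
    d = beam_directions(p["h_fov_deg"], p["v_fov_deg"], p["h_steps"], p["v_steps"])
    # Hypothetical dinghy-sized target: 2 m wide, 0.5 m above the waterline, at 72 m
    print(name, hits_on_target(d, 72.0, 2.0, 0.5, p["max_range"]))
```

With these parameters the 2° vertical beam spacing of the VLP-16 yields no returns on the low target at 72 m, whereas the denser OS2 pattern yields several, consistent with the point-cloud comparison below.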
Figure 2 compares point clouds from the three LiDAR models. Differences in vertical resolution are clear, especially for the dinghy at 72 m: the VLP-16 fails to detect it, while the Blickfeld and Ouster succeed. The Ouster’s extended range enables detection of the farthest targets, while shorter-range sensors are limited. Beam spacing on the ferries shows how resolution affects detail.

Comparison of sensor data from Velodyne VLP16 (a), Blickfeld Cube 1 (b), and Ouster OS2 (c). Each column represents one sensor, with images stacked vertically to show different perspectives or scenarios for that sensor.
Ray-hit visualizations supported efficient performance analysis in varied scenarios. Tests with small craft such as kayaks and paddleboards showed that adaptive parameterization can improve detection by distance. The flexible virtual test field thus supports optimizing sensor placement and performance before deployment.
4 Experimental setup & placement strategy
The autonomous sensor system was tested aboard the MV Wavelab and on a fixed frame at Arsenal Hafen Kiel. The 21 m long, 8 m wide catamaran, developed with regional companies and universities, serves as a platform for autonomous navigation and propulsion experiments. Workspaces, computers, and control panels provide an environment for extensive real-world testing.
A modular, transportable sensor frame (Figure 3) was designed for deployment on both vessel and pier. It housed LiDAR, mmWave radar, RGB and IR cameras, and a 5G router. Using the public mobile network instead of a dedicated campus network avoided high infrastructure costs for Kiel Fjord coverage. Deploying the frame in both locations enabled data capture from different vantage points, supporting comprehensive evaluation. The design ensured precise alignment, reliable data collection, and easy maintenance, while real-time 5G transmission allowed immediate analysis and remote monitoring.

Sensor frame mounted on the MV Wavelab. (a) View of the frame on the port bow. (b) Same view with sensors highlighted for identification, including LiDAR, mmWave radar, RGB and PTZ-IR cameras, and the 5G antenna.
During the experiments, sensors were placed for practicality rather than for optimal autonomous operation. While effective for data collection, these locations did not make full use of the vessel’s dedicated sensor frame, which provides an unobstructed 360° field of view above passenger height, ensuring safety.
An optimized placement would mount the Ouster OS2 on the mast for full 360° detection, two mmWave radar units at the bow with overlapping 120° coverage, and RGB cameras around the frame for complete visual coverage. The PTZ-mounted IR camera would also be mast-mounted, focusing dynamically on objects identified by other sensors. For docking and short-range detection, Blickfeld LiDAR units would be positioned around the perimeter. Figure 4 shows proposed coverage and update rates.
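As a sketch of how such a layout can be checked for blind spots before installation, the following snippet computes uncovered azimuth intervals for a set of horizontally mounted sensors. The mounting azimuths are hypothetical; only the fields of view follow the proposed configuration.

```python
import numpy as np

def coverage_gaps(sensors, step_deg=1.0):
    """Return azimuth intervals (deg, ship frame) not covered by any sensor.

    `sensors` maps a name to (mount_azimuth_deg, horizontal_fov_deg).
    """
    az = np.arange(0.0, 360.0, step_deg)
    covered = np.zeros_like(az, dtype=bool)
    for _, (mount, fov) in sensors.items():
        diff = (az - mount + 180.0) % 360.0 - 180.0   # signed angular distance to boresight
        covered |= np.abs(diff) <= fov / 2.0
    gaps, start = [], None
    for a, c in zip(az, covered):
        if not c and start is None:
            start = a
        elif c and start is not None:
            gaps.append((start, a))
            start = None
    if start is not None:
        gaps.append((start, 360.0))
    return gaps

# Hypothetical mounting azimuths (0° = bow); FOVs follow the proposed configuration
layout = {
    "radar_port": (330.0, 120.0),
    "radar_stbd": (30.0, 120.0),
    "ouster_os2": (0.0, 360.0),
}
print(coverage_gaps(layout))   # -> [] once the mast-mounted 360° LiDAR is included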

Proposed sensor coverage and update rates on the MV Wavelab, showing view angles of the mmWave radar, Blickfeld Cube 1 LiDAR, Ouster OS2 LiDAR, and PTZ-mounted infrared camera.
While the experimental placement was sufficient for evaluation, the vessel’s design allows a highly optimized configuration. Combining real-world tests with virtual placement validation provides a replicable approach for configuring sensors on vessels of various sizes and purposes.
5 Results
To evaluate sensor performance under various weather conditions, we analyzed data collected using LiDAR, mmWave radar, and IR/RGB cameras. The following sections compare each sensor’s ability to detect objects under clear, foggy, rainy, and nighttime conditions, highlighting their limitations and strengths in autonomous ferry applications.
5.1 Performance of LiDAR and mmWave radar under fog
Fog is known to attenuate LiDAR signals significantly. A visual comparison of the scene with and without fog (Figure 5) provides context for the point-cloud analysis in Figure 6. In clear conditions the LiDAR exhibits a robust range and high point density (a). Under fog (c), the corresponding histogram (d) quantifies the degradation: the LiDAR experiences a range loss of 158 m and an average resolution loss of 72 %, demonstrating its vulnerability to fog-induced signal attenuation.

Camera images of the same scene in clear conditions (a) and under fog (b), providing visual reference for evaluating LiDAR and mmWave performance in adverse weather.
The range loss and resolution loss values are calculated by analyzing the Euclidean distances of detected points in both clear and foggy conditions. The maximum range is determined by computing the farthest detected point from the sensor in both scenarios and taking their difference. To quantify the reduction in point density, the detected points are binned into distance intervals, and the relative decrease in the number of points per bin is computed. The average resolution loss is then obtained by averaging these percentage reductions across all bins where points were detected in clear conditions.
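A minimal Python sketch of this computation is shown below, assuming the point clouds are given as N × 3 arrays in the sensor frame and using an illustrative 10 m bin width.

```python
import numpy as np

def degradation_metrics(points_clear, points_adverse, bin_width=10.0):
    """Range loss and average per-bin resolution loss between two point clouds (N x 3)."""
    r_clear = np.linalg.norm(points_clear, axis=1)
    r_adv = np.linalg.norm(points_adverse, axis=1)

    # Range loss: difference of the farthest detected points
    range_loss = r_clear.max() - r_adv.max()

    # Bin both clouds into common distance intervals
    edges = np.arange(0.0, r_clear.max() + bin_width, bin_width)
    n_clear, _ = np.histogram(r_clear, bins=edges)
    n_adv, _ = np.histogram(r_adv, bins=edges)

    # Relative point-count reduction per bin, averaged over bins populated in clear conditions
    occupied = n_clear > 0
    per_bin_loss = 1.0 - n_adv[occupied] / n_clear[occupied]
    avg_resolution_loss = float(np.clip(per_bin_loss, 0.0, 1.0).mean())

    return range_loss, avg_resolution_loss  # metres, fraction (0.72 -> 72 %)
```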

Comparison of LiDAR point clouds and distance distributions with and without fog; (a) and (c) show point distributions under clear and foggy conditions, with semi-circles marking distance intervals. (b) and (d) Present the corresponding histograms, highlighting reduced point density and range in fog.
Conversely, the mmWave radar demonstrates better penetration through fog (Figure 7). In both clear and foggy conditions ((a) and (c)), the radar maintains a more consistent range with minimal reduction in point density, as shown in the histograms ((b) and (d)). Notably, the mmWave radar exhibits an almost negligible range loss of only 0.1 m and an average resolution loss of 7 %, confirming its robustness against fog-related interference. This makes mmWave radar a more reliable option in foggy weather.

Comparison of mmWave sensor point clouds and distance distributions with and without fog; (a) and (c) show point distributions under clear and foggy conditions, with semi-circles marking distance intervals. (b) and (d) Present the corresponding histograms, illustrating reduced range and point density in fog.
5.2 Performance under rainy conditions
In rain, LiDAR performance likewise degrades, with a significant loss of point density and range, as shown in Figure 8. The LiDAR points missing at the top of (c) indicate areas obscured by rain droplets and high humidity. This effect is also visible in the distance histogram (d), confirming the impact of rain on LiDAR’s effectiveness. Compared to fog, rain-induced degradation is less severe but still substantial, with a range loss of 25 m and an average resolution loss of 30 %.

Comparison of LiDAR point clouds and distance distributions with and without rain; (a) and (c) show point distributions under clear and rainy conditions, with semi-circles marking distance intervals. In (c), missing points at the top correspond to the harbor mole, obscured by rain, humidity, and spray. (b) and (d) Present the corresponding distance histograms, showing reduced point density and range in rain.

mmWave radar data in rainy conditions. (a) Displays data captured during heavy rain, while (b) provides a reference after the rain. A significant reduction in point density at greater distances is observed under rainy conditions.
The mmWave radar data under rain conditions (Figure 9) reveals that, while rain reduces point density at greater distances (a), the radar retains functional range and detection capability better than LiDAR. However, compared to fog, the impact of rain on mmWave radar is more pronounced (Figure 10), with a range loss of 381 m and an average resolution loss of 60 %. Although the radar remains more robust than LiDAR, this indicates that heavy rain significantly affects its long-range detection capability.

Comparison of mmWave sensor point clouds and distance distributions with and without rain; (a) and (c) show spatial point distributions under clear and rainy conditions, with semicircles marking distance intervals. In (c), rain and airborne moisture cause missing points, especially near the harbor mole. (b) and (d) Present the corresponding distance histograms, highlighting rain-induced reductions in range and point density.
5.3 Impact of sunlight and nighttime on sensor performance
Direct sunlight can cause false positives in LiDAR measurements, as seen in Figure 11. The cone-shaped regions in (b) contain random points caused by sunlight glare, which can be filtered out with minimal loss in resolution. This challenge is less pronounced for mmWave radar and IR cameras, which are not affected by visible-light interference.
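The paper does not specify the filter used; a simple radius-based outlier filter of the following form illustrates how such isolated glare points can be removed with little effect on genuine returns. The radius and neighbor thresholds are illustrative assumptions.

```python
import numpy as np
from scipy.spatial import cKDTree

def remove_isolated_points(points, radius=0.5, min_neighbors=3):
    """Drop points with fewer than `min_neighbors` other points within `radius` metres.

    Glare-induced false positives are typically isolated, while real surfaces
    form locally dense clusters, so a radius-based outlier filter removes most
    of them with little effect on genuine returns.
    """
    tree = cKDTree(points)
    # Neighbor lists include the query point itself, hence the +1 offset
    counts = np.array([len(n) for n in tree.query_ball_point(points, r=radius)])
    return points[counts >= min_neighbors + 1]
```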

Direct sunlight glare causes cone-shaped regions with false positive LiDAR measurements. (a) Shows the camera image of a tall ship on the pier with glare from direct sunlight, while (b) depicts the corresponding LiDAR image with cone-shaped false positives. These random points can be effectively filtered, with only a minor reduction in overall resolution.
At night, the performance of RGB cameras is limited due to low visibility, as shown in Figure 12. While the RGB camera image (a) lacks detail, the mmWave radar (b) and IR camera (c) successfully detect key objects like vessels and seamarks. This outcome reinforces the advantages of radar and IR sensors in low-visibility conditions.

Scene from a night trip using camera, mmWave radar, and IR imaging. (a) Shows the dark, low-contrast camera image with only boat and seamark lights visible. (b) Presents mmWave radar data, clearly detecting objects, while (c) provides a clear IR view of the boat and seamark. Key objects – a container vessel, sailboat, and seamark – are highlighted with red circles for comparison. LiDAR data is omitted, as objects were too distant to capture effectively.
5.4 Detection of small and distant objects
Detecting small or distant objects, such as a person in water, is critical for ferry safety. Figure 13 compares LiDAR and IR camera capabilities in detecting a person at 170 m ((a) and (b)) and 75 m ((c) and (d)). The IR camera captures clear images of the person’s head at both distances, whereas LiDAR detection is limited to a few points, indicating its reduced sensitivity for small, distant targets.

Detection of a person in water using LiDAR and IR camera. (a) Shows the IR image and (b) shows the LiDAR data of a person at a distance of 170 m. In comparison, (c) and (d) display the IR and LiDAR data, respectively, for the same person at 75 m. While the person’s head is clearly visible in both IR images, only a few LiDAR points are detected at 75 m, and a single point at 170 m, highlighting the reduced LiDAR sensitivity for distant, small objects in water.
5.5 Clustering and tracking capabilities
Our object tracking system combines data from mmWave radar and LiDAR sensors, each offering distinct advantages for tracking under varying conditions. This dual-sensor approach enhances tracking flexibility and resilience in diverse environmental settings.
The primary tracking method relies on mmWave radar data, which provides a sparse set of points for each detected object. To handle this low point density, we applied DBSCAN (Density-Based Spatial Clustering of Applications with Noise) [36] to identify object clusters. DBSCAN efficiently separates objects by finding core points with a sufficient number of neighbors within a specified radius, making it robust against noise and varying densities in radar detections.
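A minimal sketch of this clustering step, assuming the radar detections are available as an N × 2 array of Cartesian coordinates and using illustrative DBSCAN parameters, could look as follows.

```python
import numpy as np
from sklearn.cluster import DBSCAN

def cluster_radar_points(points_xy, eps=5.0, min_samples=3):
    """Cluster sparse mmWave detections (N x 2, metres) and return per-cluster centroids.

    eps and min_samples are illustrative values; they must be tuned to the
    radar's point density and the expected target separation.
    """
    labels = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(points_xy)
    centroids = {}
    for lbl in set(labels):
        if lbl == -1:            # -1 marks noise points
            continue
        centroids[lbl] = points_xy[labels == lbl].mean(axis=0)
    return labels, centroids
```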
Once clustered, objects are tracked using a Kalman filter, which estimates their movement trajectories and predicts future positions. This enables real-time assessment of potential collision risks, ensuring continuous situational awareness (see Figure 14(e), which visualizes the mmWave radar detections and tracking data).
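The sketch below shows a basic 2-D constant-velocity Kalman filter of the kind described above, applied to one cluster centroid per track; the noise parameters and update rate are illustrative assumptions, not the values used on the MV Wavelab.

```python
import numpy as np

class ConstantVelocityTracker:
    """2-D constant-velocity Kalman filter for one radar cluster centroid.

    State x = [px, py, vx, vy]; measurements are cluster centroids in metres.
    """

    def __init__(self, z0, dt=0.1, q=0.5, r=2.0):
        self.x = np.array([z0[0], z0[1], 0.0, 0.0])
        self.P = np.eye(4) * 10.0
        self.F = np.array([[1, 0, dt, 0],
                           [0, 1, 0, dt],
                           [0, 0, 1, 0],
                           [0, 0, 0, 1]], dtype=float)
        self.H = np.array([[1, 0, 0, 0],
                           [0, 1, 0, 0]], dtype=float)
        self.Q = np.eye(4) * q          # process noise
        self.R = np.eye(2) * r          # measurement noise

    def predict(self):
        """Propagate the state one time step; returns the predicted position."""
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x[:2]

    def update(self, z):
        """Correct the state with a new cluster centroid measurement z = (px, py)."""
        y = np.asarray(z) - self.H @ self.x
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
```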

Radar-guided infrared target verification using a PTZ camera. (a) Visible spectrum image captured by the camera, showing a seamark, a sailing vessel, and a second vessel. (b)–(d) Show the corresponding infrared (IR) images, captured as the camera is automatically oriented to objects identified by the mmWave radar. The small camera icons in (b), (c), and (d) illustrate the orientation dynamically adjusting for each target. (e) mmWave radar detections and tracking data, displaying tracked positions of the objects, each labeled with ID, angle, and distance from the radar.
A PTZ-mounted infrared camera is integrated into the system to provide detailed images of detected objects. After each object is identified and tracked by the mmWave radar, the camera sequentially adjusts its orientation to capture images of individual targets. For each detected object, the system records a visible spectrum image of the scene (Figure 14(a)), an infrared image of the object (Figure 14(b)–(d)), as well as the angle and distance information provided by the radar (Figure 14(e)).
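A geometric sketch of this radar-to-camera cueing is given below: a detection reported as range and azimuth is converted into pan/tilt angles for the PTZ camera, assuming hypothetical mounting positions in a common ship frame. The actual coordinate transformation and camera control interface of the system are not detailed here.

```python
import numpy as np

def radar_target_to_pan_tilt(rng_m, azimuth_deg, radar_pos, cam_pos):
    """Convert a radar detection (range, azimuth) into pan/tilt angles for the PTZ camera.

    radar_pos and cam_pos are (x, y, z) mounting positions in a common ship frame
    (x forward, y to port, z up). The reported range is treated as approximately
    horizontal and the target is assumed to lie near the waterline (z = 0).
    """
    az = np.radians(azimuth_deg)
    target = np.array(radar_pos) + np.array([rng_m * np.cos(az),
                                             rng_m * np.sin(az),
                                             -radar_pos[2]])
    v = target - np.array(cam_pos)
    pan = np.degrees(np.arctan2(v[1], v[0]))
    tilt = np.degrees(np.arctan2(v[2], np.hypot(v[0], v[1])))
    return pan, tilt

# Hypothetical geometry: radar at the bow, PTZ-IR camera on the mast
pan, tilt = radar_target_to_pan_tilt(850.0, 12.0,
                                     radar_pos=(10.0, 0.0, 3.0),
                                     cam_pos=(0.0, 0.0, 8.0))
# The resulting angles are then sent to the camera's PTZ interface.
```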
This approach provides a robust and automated method to acquire high-resolution images of distant objects, overcoming the limitations of standard wide-angle cameras, where objects may appear too small for classification. The combination of mmWave radar and an adaptive PTZ camera significantly improves object detection reliability in challenging maritime conditions, particularly in low-visibility environments such as nighttime or fog.
For the second approach, we use LiDAR data, which provides a high-density point cloud and enables detailed object detection and tracking. The increased resolution of LiDAR allows for precise spatial mapping but can lead to complex clustering challenges. For instance, single objects sometimes appear as multiple clusters, especially when the object’s shape or distance causes variations in point density. This effect is due to using fixed clustering parameters for all distances, which may not be optimal for both close and distant objects. An improvement could involve distance-dependent clustering parameters, which adjust clustering sensitivity based on the object’s range from the sensor, thus maintaining a consistent representation of objects at various distances. Figure 15 demonstrates the clustering results on LiDAR data, highlighting the level of detail achievable with this high-resolution sensor.
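One possible form of such distance-dependent clustering is sketched below under the simplifying assumption that points are grouped into range rings and each ring is clustered with a linearly growing epsilon. Ring boundaries can still split objects, and all parameter values are illustrative.

```python
import numpy as np
from sklearn.cluster import DBSCAN

def cluster_lidar_adaptive(points_xyz, eps_near=0.5, eps_slope=0.01, ring_width=20.0):
    """Cluster a LiDAR point cloud with a distance-dependent DBSCAN epsilon.

    Points are grouped into range rings and each ring is clustered with
    eps = eps_near + eps_slope * range, reflecting that point spacing on an
    object grows roughly linearly with distance.
    """
    ranges = np.linalg.norm(points_xyz[:, :2], axis=1)
    labels = np.full(len(points_xyz), -1)
    next_label = 0
    for r0 in np.arange(0.0, ranges.max() + ring_width, ring_width):
        mask = (ranges >= r0) & (ranges < r0 + ring_width)
        if mask.sum() < 3:
            continue
        eps = eps_near + eps_slope * (r0 + ring_width / 2)
        ring_labels = DBSCAN(eps=eps, min_samples=3).fit_predict(points_xyz[mask])
        ring_labels[ring_labels >= 0] += next_label   # keep labels unique across rings
        labels[mask] = ring_labels
        if ring_labels.max() >= 0:
            next_label = ring_labels.max() + 1
    return labels
```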

Results of clustering LiDAR points, with one clustered object highlighted in green. A red rectangle marks the center of the cluster, which can serve as a reference point for object tracking. This clustering approach enables precise tracking of detected objects based on LiDAR data.
In terms of environmental resilience, each sensor offers unique strengths. mmWave radar performs consistently well in adverse conditions, such as fog and rain, as it is less affected by atmospheric interference than LiDAR. However, it lacks the spatial resolution that LiDAR provides in clear conditions, which is essential for precise object detection and close-proximity tracking. Together, these complementary sensors support a robust tracking system, with mmWave radar providing reliable long-range tracking and collision prediction and LiDAR offering high-resolution mapping for detailed object recognition in favorable conditions.
6 Discussion
This study systematically evaluated the performance of LiDAR, mmWave radar, and RGB/infrared (IR) cameras under real-world maritime conditions, including fog, rain, and nighttime. LiDAR provided high-resolution spatial mapping and was highly effective for docking and short-range navigation, but its range and point density degraded substantially in adverse weather. mmWave radar maintained reliable long-range detection in all tested conditions, though its lower spatial resolution limited detailed classification. RGB cameras offered rich visual detail in clear weather, while PTZ-mounted IR cameras enabled targeted observation of distant objects under low-visibility conditions. However, both camera types primarily provide angular information, with distance only roughly estimated from object size and perspective. From a bandwidth perspective, LiDAR and cameras generate large data volumes, especially at higher frame rates, whereas mmWave radar transmits lower-volume range and velocity data, making it less demanding on communication links.
A central finding of this work is that combining mmWave radar with a PTZ-mounted telephoto infrared camera is particularly effective in maritime environments. The radar provides reliable detection and tracking of objects even in poor visibility, and the PTZ-IR camera can then be automatically directed to these targets for a detailed visual view. This pairing enables future object classification algorithms to operate on high-quality imagery that would not be available from fixed cameras or LiDAR alone, thus improving overall situational awareness.
The virtual test field, implemented in Unreal Engine, allowed rapid evaluation of sensor configurations during the early design phase when access to the vessel was limited. Simulations provided quantitative insights into field of view, detection range, and the detectability of small targets, such as dinghies, using only datasheet specifications. This approach enabled optimization of sensor placement, identification of blind spots, and assessment of trade-offs before physical installation, reducing both costs and on-water testing time. The flexibility of simulation also supported comparative testing of multiple LiDAR models, confirming the operational benefits of higher vertical resolution (elevation in the case of radar) and extended range.
Previous studies on autonomous vessels have emphasized multi-sensor fusion but rarely quantified individual sensor degradation under real-world maritime conditions. For example, Helgesen et al. [28] and Kim et al. [29] demonstrate the feasibility of integrating LiDAR, radar, and automatic identification system (AIS) signals, which provide vessel identity and position information, but do not explicitly address the effects of fog, rain, or nighttime on performance. Review articles such as Thombre et al. [27] and Liu et al. [37] outline theoretical advantages of combined GNSS, radar, LiDAR, and vision-based perception but rely on idealized models or controlled experiments. Our results address this gap by providing empirical degradation metrics, validating mmWave radar’s robustness and LiDAR’s weather sensitivity in realistic ferry operations.
Performance alone cannot dictate sensor selection; cost and maintainability are equally important for operational deployment. High-end LiDAR systems offer excellent spatial resolution but are expensive and weather-sensitive. In contrast, mmWave radar is approximately one-tenth the cost, requires less bandwidth, and maintains performance in adverse conditions, making it an attractive backbone for perception. Infrared cameras, while less expensive than LiDAR, add critical classification capabilities in low-visibility scenarios. The cost–performance trade-off strongly supports a mixed sensor suite rather than relying on a single technology.
Based on the experimental results, simulations, and cost analysis, we recommend for the research vessel MV Wavelab a configuration consisting of an Ouster OS2 LiDAR mounted on the mast for unobstructed 360° mapping in favorable conditions, a Continental ARS548 mmWave radar at the bow for robust long-range detection in all weather, a Blickfeld Cube 1 LiDAR for short-range precision during docking, and a PTZ-mounted infrared camera dynamically cued by the mmWave radar for detailed observation and classification. This configuration ensures balanced coverage, robustness to adverse weather, and cost-effective operation while addressing the unique spatial constraints and mission requirements of the Kiel Fjord.
While this study focused on individual sensor performance, future work should investigate adaptive fusion methods that integrate additional data sources such as IMU, GPS, and AIS, as well as the effects of wind and sea state on sensor stability. Refining AI-driven perception models to enhance small-object detection will further strengthen maritime autonomy.
7 Conclusions
Overall, the study demonstrates the operational value of combining mmWave radar with a PTZ-mounted telephoto infrared camera. mmWave radar provides reliable long-range detection in all tested weather conditions, while the PTZ-IR camera can be directed to radar targets for detailed visual inspection, enabling high-quality imagery for future object classification and significantly improving situational awareness in low visibility. For mid-sized autonomous ferries such as the MV Wavelab, this configuration offers a cost-effective balance of robustness and precision. LiDAR remains essential for high-resolution mapping and docking under clear conditions, with a short-range unit such as the Blickfeld Cube 1 further supporting precise maneuvers.
The findings deliver an empirical dataset of sensor degradation in real maritime environments, validate the advantages of high-resolution mmWave radar, and highlight the role of virtual test fields in reducing the risks associated with sensor deployment. By combining resilient sensing hardware, adaptive fusion strategies, and simulation-driven optimization, autonomous ferries can achieve safe and reliable operation across a wide range of environmental conditions.
Funding source: Bundesministerium für Verkehr und Digitale Infrastruktur
Award Identifier / Grant number: 45FGU139_H
About the authors

Julius Schmalz received the M.Sc. and Ph.D. degrees in electrical engineering from the University of Kiel, Germany, in 2014 and 2021, respectively. He was a research assistant from 2014 to 2020 and has been a postdoctoral researcher in the Integrated Systems and Photonics Group, led by Prof. Dr. Martina Gerken, since 2021. His early research focused on magnetoelectric sensing, including magnetic field simulation and sensor modeling. Since March 2023, he has been working on autonomous ferry systems within the CAPTN Initiative project Förde5G. His research interests include sensor fusion for autonomous maritime systems, LiDAR, mmWave radar, infrared cameras, and real-time sensor data processing.

Christian Schyr received the Diploma degree in mechanical engineering from the Vienna University of Technology, Vienna, Austria, in 1992, the M.Sc. degree in aerospace engineering from The University of Texas at Austin, Austin, TX, USA, in 1994, and the Ph.D. degree from the Karlsruhe Institute of Technology, Karlsruhe, Germany, in 2006. He is currently a Principal Engineer with the Advanced Solution Laboratory at AVL Deutschland GmbH in Karlsruhe. His research focuses on the development of new methods and tools for highly automated and electrified mobility systems across the automotive, agricultural, and maritime domains.

Martina Gerken received the Dipl.-Ing. degree in electrical engineering from the University of Karlsruhe, Karlsruhe, Germany, in 1998, and the Ph.D. degree in electrical engineering from Stanford University, Stanford, CA, USA, in 2003. From 2003 to 2008, she was an Assistant Professor with the University of Karlsruhe. In 2008, she was appointed as a Full Professor of Electrical Engineering and the Head of the Chair of Integrated Systems and Photonics, Kiel University, Kiel, Germany.
-
Research ethics: Not applicable.
-
Informed consent: Not applicable.
-
Author contributions: All authors have accepted responsibility for the entire content of this manuscript and approved its submission. Conceptualization, J.S. and M.G.; validation, J.S. and C.S.; investigation, J.S. and C.S.; methodology, J.S. and M.G.; writing–original draft preparation, J.S.; writing–review and editing, J.S., C.S., and M.G.; visualization, J.S.; supervision, M.G.; funding acquisition, M.G.
-
Use of Large Language Models, AI and Machine Learning Tools: None declared.
-
Conflict of interest: Author Christian Schyr was employed by AVL Deutschland GmbH. The funders had no role in the design of the study, the collection, analyses, or interpretation of data, the writing of the manuscript, or the decision to publish the results. The remaining authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.
-
Research funding: This research was funded by the Bundesministerium für Digitales und Verkehr (BMDV) 45FGU139_H.
-
Data availability: The raw data can be obtained on request from the corresponding author.
Comparison of LiDAR, mmWave, and camera sensors.
| Specification | Ouster OS2-128 | Blickfeld Cube 1 | ARS548 (mmWave) | AXIS Q8642-E | AXIS P1455-LE |
|---|---|---|---|---|---|
| Field of view (H, V) | H: 360°, V: 22.5° | H: 70°, V: 30° | H: ±60°, V: ±4° (300 m) to ±14° (100 m) | 10° | 114°–37° |
| Angular resolution (H, V) | H: 0.18°–0.7°, V: 0.18° | H: 0.4°, V: 0.3° | H: ±0.1° to ±0.5°, V: ±0.1° | – | – |
| Scan frequency/frame rate | 10 or 20 Hz | 1.5 Hz | 10 Hz | Up to 8.3 fps and 30 fps | 50 or 60 fps |
| Range/coverage | 400 m | 75 m | 300 m (up to 1,500 m) | – | – |
| Wavelength | 865 nm | 905 nm | 3.90–3.95 mm | 8–14 µm | ≈400–700 nm |
| Sensitivity | – | – | – | NETD < 50 mK | – |
| Resolution | 2,048 × 128 | 175 × 400 | – | 640 × 480 | 1,920 × 1,080 |
| Bandwidth/data rate | 128–256 Mbit/s | 24 Mbit/s | 8 Mbit/s | 4 Mbit/s | 8 Mbit/s |
| Interfaces | Ethernet, UDP | Ethernet, UDP | BRR-Ethernet, UDP | Ethernet | Ethernet |
| Dimensions/weight | 119 × 119 × 99 mm3, 1.1 kg | 60 × 82 × 86 mm3, 330 g | 137 × 90 × 39 mm3, 526 g | 557 × 229 × 289 mm3, 12.2 kg | 132 × 132 × 260 mm3, 1.0 kg |
References
[1] Y. Cheng, M. Jiang, J. Zhu, and Y. Liu, “Are we ready for unmanned surface vehicles in inland waterways? The USVInland multisensor dataset and benchmark,” CoRR, abs/2103.05383, 2021, https://doi.org/10.48550/arXiv.2103.05383.
[2] J. Choi, J. Park, J. Jung, Y. Lee, and H.-T. Choi, “Development of an autonomous surface vehicle and performance evaluation of autonomous navigation technologies,” Int. J. Control Autom. Syst., vol. 18, no. 3, pp. 535–545, 2020, https://doi.org/10.1007/s12555-019-0686-0.
[3] M. Dunbabin, A. Grinham, and J. Udy, “An autonomous surface vehicle for water quality monitoring,” in Australasian Conference on Robotics and Automation (ACRA), Citeseer, 2009, pp. 2–4.
[4] L. Elkins, D. Sellers, and W. R. Monach, “The autonomous maritime navigation (AMN) project: field tests, autonomous and cooperative behaviors, data fusion, sensors, and vehicles,” J. Field Robot., vol. 27, no. 6, pp. 790–818, 2010, https://doi.org/10.1002/rob.20367.
[5] H. K. Heidarsson and G. S. Sukhatme, “Obstacle detection and avoidance for an autonomous surface vehicle using a profiling sonar,” in 2011 IEEE International Conference on Robotics and Automation, 2011, pp. 731–736, https://doi.org/10.1109/ICRA.2011.5980509.
[6] T. Huntsberger, H. Aghazarian, A. Howard, and D. C. Trotz, “Stereo vision-based navigation for autonomous surface vessels,” J. Field Robot., vol. 28, no. 1, pp. 3–18, 2011, https://doi.org/10.1002/rob.20380.
[7] E. T. Steimle and M. L. Hall, “Unmanned surface vehicles as environmental monitoring and assessment tools,” in OCEANS 2006, 2006, pp. 1–5, https://doi.org/10.1109/OCEANS.2006.306949.
[8] M. T. Wolf, et al., “360-degree visual detection and target tracking on an autonomous surface vehicle,” J. Field Robot., vol. 27, no. 6, pp. 819–833, 2010, https://doi.org/10.1002/rob.20371.
[9] C. Dannheim, C. Icking, M. Mader, and P. Sallis, “Weather detection in vehicles by means of camera and lidar systems,” in 6th International Conference on Computational Intelligence, Communication Systems and Networks, CICSyN 2014, Institute of Electrical and Electronics Engineers Inc., 2014, pp. 186–191, https://doi.org/10.1109/CICSyN.2014.47.
[10] M. Kutila, P. Pyykönen, W. Ritter, O. Sawade, and B. Schäufele, “Automotive lidar sensor development scenarios for harsh weather conditions,” in IEEE Conference on Intelligent Transportation Systems, Proceedings, ITSC, 2016, pp. 265–270, https://doi.org/10.1109/ITSC.2016.7795565.
[11] M. Kutila, P. Pyykonen, H. Holzhuter, M. Colomb, and P. Duthon, “Automotive lidar performance verification in fog and rain,” in IEEE Conference on Intelligent Transportation Systems, Proceedings, ITSC, 2018, pp. 1695–1701, https://doi.org/10.1109/ITSC.2018.8569624.
[12] J.-I. Park, J. Park, and K. S. Kim, “Fast and accurate desnowing algorithm for lidar point clouds,” IEEE Access, vol. 8, pp. 160202–160212, 2020, https://doi.org/10.1109/ACCESS.2020.3020266.
[13] J. Wu, H. Xu, Y. Tian, R. Pi, and R. Yue, “Vehicle detection under adverse weather from roadside lidar data,” Sensors, vol. 20, no. 12, pp. 1–17, 2020, https://doi.org/10.3390/s20123433.
[14] J.-I. Park, S. Jo, H.-T. Seo, and J. Park, “LiDAR-based snowfall level classification for safe autonomous driving in terrestrial, maritime, and aerial environments,” Sensors, vol. 24, no. 17, p. 5587, 2024, https://doi.org/10.3390/s24175587.
[15] Y. Xie, C. Nanlal, and Y. Liu, “Reliable lidar-based ship detection and tracking for autonomous surface vehicles in busy maritime environments,” Ocean Eng., vol. 312, no. 3, p. 119288, 2024, https://doi.org/10.1016/j.oceaneng.2024.119288.
[16] H. Hilmarsen, N. Dalhaug, T. A. Nygård, E. F. Brekke, R. Mester, and A. Stahl, “Maritime tracking-by-detection with object mask depth retrieval through stereo vision and lidar,” in 2024 27th International Conference on Information Fusion (FUSION), 2024, pp. 1–8, https://doi.org/10.23919/FUSION59988.2024.10706307.
[17] Z. Wei, F. Zhang, S. Chang, Y. Liu, H. Wu, and Z. Feng, “mmWave radar and vision fusion for object detection in autonomous driving: a review,” Sensors, vol. 22, no. 7, p. 2542, 2022, https://doi.org/10.3390/s22072542.
[18] D. Cormack, I. Schlangen, J. R. Hopgood, and D. E. Clark, “Joint registration and fusion of an infrared camera and scanning radar in a maritime context,” IEEE Trans. Aero. Electron. Syst., vol. 56, no. 2, pp. 1357–1369, 2020, https://doi.org/10.1109/TAES.2019.2929974.
[19] B. Iepure and A. W. Morales, “A novel tracking algorithm using thermal and optical cameras fused with mmWave radar sensor data,” IEEE Trans. Consum. Electron., vol. 67, no. 4, pp. 372–382, 2021, https://doi.org/10.1109/TCE.2021.3128825.
[20] J. Kalliovaara, et al., “Deep learning test platform for maritime applications: development of the eM/S Salama unmanned surface vessel and its remote operations center for sensor data collection and algorithm development,” Remote Sens., vol. 16, no. 9, p. 1545, 2024, https://doi.org/10.3390/rs16091545.
[21] H. Lu, Y. Zhang, C. Zhang, Y. Niu, Z. Wang, and H. Zhang, “A multi-sensor fusion approach for maritime autonomous surface ships berthing navigation perception,” Ocean Eng., vol. 316, p. 119965, 2025, https://doi.org/10.1016/j.oceaneng.2024.119965.
[22] H. He, N. Wang, D. Huang, and B. Han, “Active vision-based finite-time trajectory-tracking control of an unmanned surface vehicle without direct position measurements,” IEEE Trans. Intell. Transport. Syst., vol. 25, no. 9, pp. 12151–12162, 2024, https://doi.org/10.1109/TITS.2024.3364770.
[23] G. Al-Falouji, T. Beyer, S. Gao, and S. Tomforde, “Steering towards maritime safety with true motion predictions ensemble,” in 2024 IEEE International Conference on Autonomic Computing and Self-Organizing Systems Companion (ACSOS-C), 2024, pp. 7–12, https://doi.org/10.1109/ACSOS-C63493.2024.00021.
[24] D. F. Campos, E. P. Gonçalves, H. J. Campos, M. I. Pereira, and A. M. Pinto, “Nautilus: an autonomous surface vehicle with a multilayer software architecture for offshore inspection,” J. Field Robot., vol. 41, no. 4, pp. 966–990, 2024, https://doi.org/10.1002/rob.22304.
[25] N. Smirnov and S. Tomforde, “Navigation support for an autonomous ferry using deep reinforcement learning in simulated maritime environments,” in 2022 IEEE Conference on Cognitive and Computational Aspects of Situation Management (CogSIMA), 2022, pp. 142–149, https://doi.org/10.1109/CogSIMA54611.2022.9830689.
[26] J. Zalewski and S. Hożyń, “Computer vision-based position estimation for an autonomous underwater vehicle,” Remote Sens., vol. 16, no. 5, p. 741, 2024, https://doi.org/10.3390/rs16050741.
[27] S. Thombre, et al., “Sensors and AI techniques for situational awareness in autonomous ships: a review,” IEEE Trans. Intell. Transport. Syst., vol. 23, no. 1, pp. 64–83, 2022, https://doi.org/10.1109/TITS.2020.3023957.
[28] Ø. K. Helgesen, K. Vasstein, E. F. Brekke, and A. Stahl, “Heterogeneous multi-sensor tracking for an autonomous surface vehicle in a littoral environment,” Ocean Eng., vol. 252, p. 111168, 2022, https://doi.org/10.1016/j.oceaneng.2022.111168.
[29] J. Kim, et al., “Field experiment of autonomous ship navigation in canal and surrounding nearshore environments,” J. Field Robot., vol. 41, no. 2, pp. 470–489, 2024, https://doi.org/10.1002/rob.22262.
[30] Ouster, Inc., Ouster OS2: Datasheet (Rev 7, v2.5), Ouster, Inc., 2025. Available at: https://data.ouster.io/downloads/datasheets/datasheet-rev7-v2p5-os2.pdf.
[31] Blickfeld GmbH, Blickfeld Cube 1 Outdoor: Datasheet (v1.1), Blickfeld GmbH, 2022. Available at: https://www.blickfeld.com/wp-content/uploads/2022/10/blickfeld_Datasheet_Cube1-Outdoor_v1.1.pdf.
[32] Continental Engineering Services GmbH, Continental ARS 548 RDI: Datasheet, Continental Engineering Services GmbH, 2023. Available at: https://engineering-solutions.aumovio.com/wp-content/uploads/2025/08/RadarSensors_ARS548RDI.pdf.
[33] Axis Communications AB, AXIS P1455-LE: Datasheet, Axis Communications AB, 2025. Available at: https://www.axis.com/dam/public/6a/e6/fd/datasheet-axis-p1455-le-network-camera-de-DE-476974.pdf.
[34] Axis Communications AB, AXIS Q8642-E PT: Datasheet, 2025. Available at: https://www.axis.com/dam/public/4b/7f/c3/datasheet-axis-q8642-e-pt-thermal-network-camera-de-DE-291211.pdf.
[35] Ouster, Inc., Velodyne VLP-16: Datasheets, 2025. Available at: https://data.ouster.io/downloads/datasheets/velodyne/Puck%20Datasheets.zip.
[36] M. Ester, H.-P. Kriegel, J. Sander, and X. Xu, “A density-based algorithm for discovering clusters in large spatial databases with noise,” in Second International Conference on Knowledge Discovery and Data Mining, KDD’96, AAAI Press, 1996, pp. 226–231.
[37] Z. Liu, Y. Zhang, X. Yu, and C. Yuan, “Unmanned surface vehicles: an overview of developments and challenges,” Annu. Rev. Control, vol. 41, pp. 71–93, 2016, https://doi.org/10.1016/j.arcontrol.2016.04.018.
© 2025 the author(s), published by De Gruyter, Berlin/Boston
This work is licensed under the Creative Commons Attribution 4.0 International License.