
Fusion of the optical and microwave images for cloud removal

  • Bhavneet Kaur, Raman Maini and Sartajvir Singh
A chapter from the book RADAR

Abstract

The presence of clouds in optical images is a notable threat that must be addressed, since it compromises the accuracy of the observation and analysis process. Cloud-contaminated pixels can be identified and removed by fusion techniques. Microwave and optical images are fused to leverage their respective strengths: optical images provide detailed, high-resolution multispectral information, whereas microwave images offer better penetration, independence from weather conditions, and longer wavelengths. This chapter reviews state-of-the-art strategies for fusing optical and microwave images to support effective cloud removal. The work also covers significant challenges encountered during image fusion for cloud removal. In this work, we take optical and microwave images from standard datasets obtained from the Copernicus source and merge them using the NNDiffuse pansharpening technique. The resulting fused image provides improved visualization, with cloud pixels effectively removed. Additionally, the chapter highlights future alternatives and refinements that could enhance the overall process.
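As a rough illustration of the fusion idea described in the abstract, the sketch below replaces cloud-contaminated optical pixels with a co-registered SAR backscatter layer rescaled to the statistics of the clear pixels. It is not the NNDiffuse pansharpening algorithm used in the chapter; the arrays, brightness threshold, and moment-matching step are illustrative assumptions applied to synthetic data.

    # Minimal sketch of cloud-pixel replacement by optical/SAR fusion.
    # Simplified stand-in for the chapter's NNDiffuse pansharpening workflow:
    # cloudy optical pixels are detected by a brightness threshold and filled
    # with a SAR backscatter layer rescaled to match the clear-pixel statistics.
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic co-registered inputs (reflectance-like optical band, SAR backscatter).
    optical = rng.uniform(0.05, 0.3, size=(256, 256))
    sar = rng.uniform(0.0, 1.0, size=(256, 256))

    # Simulate a bright cloud patch in the optical band.
    optical[80:140, 100:180] = rng.uniform(0.7, 0.9, size=(60, 80))

    # 1. Cloud mask: simple brightness threshold (operational workflows would use
    #    a dedicated detector such as Fmask or s2cloudless).
    cloud_mask = optical > 0.6
    clear = ~cloud_mask

    # 2. Rescale SAR to the mean/std of the cloud-free optical pixels (moment matching).
    sar_scaled = (sar - sar[clear].mean()) / (sar[clear].std() + 1e-9)
    sar_scaled = sar_scaled * optical[clear].std() + optical[clear].mean()

    # 3. Fuse: keep clear optical pixels, substitute rescaled SAR under clouds.
    fused = np.where(cloud_mask, sar_scaled, optical)

    print(f"cloudy pixels replaced: {cloud_mask.sum()}")

In an operational workflow the SAR layer would typically be speckle-filtered, terrain-corrected, and precisely co-registered to the optical scene before fusion, and the substitution step would be replaced by the actual pansharpening or learning-based fusion method under study.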

Chapters in this book

  1. Frontmatter I
  2. Preface V
  3. Contents VII
  4. Integrating Sentinel-1 satellite data with machine learning for land use classification 1
  5. A systematic review of deep learning techniques in microwave remote sensing: challenges, applications, and future directions 17
  6. Fundamentals of active and passive microwave remote sensing: principles and applications 31
  7. Comprehensive overview of active and passive microwave remote sensing satellite sensors 55
  8. Essentials of RADAR remote sensing and AI integration 73
  9. Fusion of scatterometer and optical remote sensing: enhanced classification and change detection 91
  10. AI-powered urban infrastructure monitoring using RADAR-based remote sensing 103
  11. Fusion of the optical and microwave images for cloud removal 123
  12. Integrating AI in RADAR remote sensing: enhancing data processing, interpretation, and decision-making 141
  13. Revolutionizing precision agriculture: the synergy of RADAR, Internet of things (IoT), and satellite technology 155
  14. Integrating AI with RADAR remote sensing: applications in disaster mitigation, defense, and climate change 171
  15. Computational techniques in RADAR remote sensing from a machine and deep learning perspective 189
  16. Deep learning-based water body segmentation in SAR imagery: enhancing accuracy with CNN-U-Net and EfficientNet 205
  17. Artificial intelligence in RADAR remote sensing: advances, challenges, and future prospects 215
  18. Revolutionizing agricultural and environmental analytics with synthetic aperture radar (SAR): innovations, challenges, and future directions 229
  19. Editors’ biographies
  20. Index 247