- Title:
- Satellite Image Fusion Using a Hybrid Traditional and Deep Learning Method
- Authors:
-
Hammad, Mahmoud M.
Mahmoud, Tarek A.
Amein, Ahmed Saleh
Ghoniemy, Tarek S.
- Links:
- https://bibliotekanauki.pl/articles/27314300.pdf
- Publication date:
- 2023
- Publisher:
- Akademia Górniczo-Hutnicza im. Stanisława Staszica w Krakowie. Wydawnictwo AGH
- Topics:
-
deep learning image fusion
remote sensing image fusion
remote sensing optical image
pan-sharpening
remote sensing image
- Description:
- Due to the growing demand for ground-truth data in deep learning-based remote sensing satellite image fusion, numerous approaches have been presented, of which Wald’s protocol is the most commonly used. In this paper, a new workflow is proposed consisting of two main parts. The first part targets obtaining the ground-truth images using the results of a pre-designed and well-tested hybrid traditional fusion method, which combines the Gram–Schmidt and curvelet transform techniques to generate accurate and reliable fusion results. The second part focuses on training the proposed deep learning model on the rich and informative data provided by the first stage to improve fusion performance. The model relies on a series of residual dense blocks to increase network depth and facilitate effective feature learning. These blocks are designed to capture both low-level and high-level information, enabling the model to extract intricate details and meaningful features from the input data. The performance of the proposed model is evaluated using seven metrics, including peak signal-to-noise ratio and quality with no reference. The experimental results demonstrate that the proposed approach outperforms state-of-the-art methods in terms of image quality; they also show the robustness of the approach, which has the potential to be applied to many remote sensing applications in agriculture, environmental monitoring, and change detection.
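The abstract describes a model built from a series of residual dense blocks. For illustration only, the sketch below shows a generic residual dense block in PyTorch; the channel width, growth rate, and layer count are hypothetical placeholders and are not taken from the paper.

```python
# Minimal sketch of a residual dense block (RDB), assuming PyTorch.
# Channel width, growth rate, and number of layers are illustrative only.
import torch
import torch.nn as nn


class ResidualDenseBlock(nn.Module):
    def __init__(self, channels: int = 64, growth: int = 32, num_layers: int = 4):
        super().__init__()
        self.layers = nn.ModuleList()
        for i in range(num_layers):
            # Each conv sees the block input plus all previous layer outputs
            # (dense connectivity), so its input width grows by `growth` per layer.
            self.layers.append(
                nn.Sequential(
                    nn.Conv2d(channels + i * growth, growth, kernel_size=3, padding=1),
                    nn.ReLU(inplace=True),
                )
            )
        # 1x1 convolution fuses the concatenated dense features back to the input width.
        self.fusion = nn.Conv2d(channels + num_layers * growth, channels, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        features = [x]
        for layer in self.layers:
            features.append(layer(torch.cat(features, dim=1)))
        # Local residual connection: fused dense features are added to the block input.
        return x + self.fusion(torch.cat(features, dim=1))


if __name__ == "__main__":
    block = ResidualDenseBlock()
    out = block(torch.randn(1, 64, 64, 64))  # a 64-channel feature map
    print(out.shape)  # torch.Size([1, 64, 64, 64])
```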
- Source:
-
Geomatics and Environmental Engineering; 2023, 17, 5; 145–162
1898-1135
- Appears in:
- Geomatics and Environmental Engineering
- Content provider:
- Biblioteka Nauki