Disaster monitoring and response are critical for mitigating the impact of natural catastrophes on the environment and on populations, and they play a vital role in disaster management, environmental protection, and long-term sustainable development. This research develops an automated feature extraction framework that uses a U-Net architecture to segment disaster-affected areas from remote sensing images, specifically targeting floods, landslides, and wildfires. The proposed model is trained on a dataset of annotated remote sensing images and demonstrates high accuracy and robustness. The results show that the model can successfully extract relevant features from remote sensing imagery, enabling timely and accurate identification of disaster-affected areas. The proposed single-stage semantic segmentation network achieves an accuracy of 97.3%, a recall of 95.5%, and an F1-score of 95.3%, outperforming existing methods such as BRRNet, DRNet, and ENRU-Net. The choice of the U-Net architecture is motivated by its ability to capture both global contextual information and fine-grained spatial details, which are crucial for identifying disaster-affected regions in high-resolution remote sensing imagery. Furthermore, by leveraging transfer learning, the dependency on large volumes of labeled data is significantly reduced, enhancing the practicality and scalability of the proposed approach. Overall, this framework supports intelligent disaster management strategies and contributes to Sustainable Cities by enabling rapid damage assessment, informed decision-making, and efficient resource allocation, thereby aligning with global goals for Climate Action and sustainable disaster resilience.
| Primary Language | English |
|---|---|
| Subjects | Satellite Communications |
| Journal Section | Research Article |
| Authors | |
| Submission Date | December 14, 2025 |
| Acceptance Date | January 28, 2026 |
| Publication Date | May 1, 2026 |
| IZ | https://izlik.org/JA62CW85XM |
| Published in Issue | Year 2026 Volume: 10 Issue: 2 |
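The accuracy, recall, and F1-score reported in the abstract are standard pixel-wise segmentation metrics derived from confusion counts. A minimal sketch of how they are computed (the function name and example counts are illustrative, not taken from the paper):

```python
def segmentation_metrics(tp, fp, tn, fn):
    """Pixel-wise accuracy, recall, and F1 from confusion counts.

    tp/fp/tn/fn are counts of true/false positive/negative pixels
    accumulated over the evaluation set.
    """
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    recall = tp / (tp + fn)            # fraction of affected pixels found
    precision = tp / (tp + fp)         # fraction of predictions that are correct
    f1 = 2 * precision * recall / (precision + recall)
    return accuracy, recall, f1

# Illustrative counts: 8 true positives, 2 false positives,
# 88 true negatives, 2 false negatives over 100 pixels.
acc, rec, f1 = segmentation_metrics(tp=8, fp=2, tn=88, fn=2)
```

Because disaster-affected pixels are typically a small minority of each scene, accuracy alone can be misleadingly high, which is why the paper also reports recall and F1.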