Boundary extraction from remote sensing imagery is an important task in studies such as environmental observation, risk management and urban growth monitoring. Although the various computational methods proposed have made significant progress, issues remain that need improvement, especially in terms of accuracy, efficiency and speed. In this study, a dual-stream network architecture with three different models that perform boundary extraction by using the normalized Digital Surface Model (nDSM), the Normalized Difference Vegetation Index (NDVI) and the Near-Infrared (NIR) band as the second stream is presented. Model I is designed as the original Holistically-Nested Edge Detection (HED) network, whereas the second streams of Models II, III and IV use nDSM, nDSM + NDVI and nDSM + NDVI + NIR, respectively. Thus, by comparing models trained on different data combinations, the contribution of each input to the success of boundary extraction was revealed. The models were trained on boundary maps produced from the International Society for Photogrammetry and Remote Sensing (ISPRS) Potsdam data set, with input data augmented by rotation and mirroring. When the test results of the two-stream, multi-data models are evaluated, Model IV achieved 11% better accuracy than the original HED. The outcomes clearly reveal the importance of using the multispectral band, height data and vegetation information as input data in boundary extraction besides the commonly used RGB images.
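The NDVI used as an auxiliary input above has a standard definition, (NIR − Red) / (NIR + Red). A minimal sketch of computing it from reflectance arrays (the function name, array shapes and `eps` guard are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray, eps: float = 1e-8) -> np.ndarray:
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red).

    nir and red are reflectance arrays of the same shape; eps guards
    against division by zero over dark pixels (water, shadow).
    Output lies in [-1, 1], with dense vegetation near +1.
    """
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    return (nir - red) / (nir + red + eps)

# Example: a vegetated pixel with high NIR and low red reflectance.
print(ndvi(np.array([0.5]), np.array([0.1])))  # close to 0.667
```

In a dual-stream setup such as the one described, a map like this would simply be stacked with the other auxiliary layers (e.g. nDSM) to form the second-stream input tensor.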
Acknowledgement This research was supported by The Scientific and Technological Research Council of Turkey (TÜBİTAK), Project No: 119Y363.
| Primary Language | English |
| --- | --- |
| Subjects | Artificial Intelligence, Engineering |
| Journal Section | Research Article |
| Authors | |
| Project Number | 119Y363 |
| Publication Date | September 25, 2021 |
| Submission Date | April 7, 2021 |
| Published in Issue | Year 2021 |
As of 2024, JARNAS is licensed under a Creative Commons Attribution-NonCommercial 4.0 International Licence (CC BY-NC).