<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Publishing DTD v1.4 20241031//EN"
        "https://jats.nlm.nih.gov/publishing/1.4/JATS-journalpublishing1-4.dtd">
<article article-type="research-article" dtd-version="1.4">
    <front>
        <journal-meta>
            <journal-id></journal-id>
            <journal-title-group>
                <journal-title>Mersin Photogrammetry Journal</journal-title>
            </journal-title-group>
            <issn pub-type="epub">2687-654X</issn>
            <publisher>
                <publisher-name>Mersin University</publisher-name>
            </publisher>
        </journal-meta>
        <article-meta>
            <article-id pub-id-type="doi">10.53093/mephoj.1575877</article-id>
            <article-categories>
                <subj-group xml:lang="en">
                    <subject>Photogrammetry and Remote Sensing</subject>
                </subj-group>
                <subj-group xml:lang="tr">
                    <subject>Fotogrametri ve Uzaktan Algılama</subject>
                </subj-group>
            </article-categories>
            <title-group>
                <article-title>Automatic detection of active fires and burnt areas in forest areas using optical satellite imagery and deep learning methods</article-title>
            </title-group>
            
            <contrib-group content-type="authors">
                <contrib contrib-type="author">
                    <contrib-id contrib-id-type="orcid">https://orcid.org/0000-0002-5582-984X</contrib-id>
                    <name>
                        <surname>Demirel</surname>
                        <given-names>Yasin</given-names>
                    </name>
                    <aff>Bartın University</aff>
                </contrib>
                <contrib contrib-type="author">
                    <contrib-id contrib-id-type="orcid">https://orcid.org/0000-0002-2671-7590</contrib-id>
                    <name>
                        <surname>Türk</surname>
                        <given-names>Tarık</given-names>
                    </name>
                    <aff>Sivas Cumhuriyet University</aff>
                </contrib>
            </contrib-group>
                        
            <pub-date pub-type="pub" iso-8601-date="2024-12-31">
                <day>31</day>
                <month>12</month>
                <year>2024</year>
            </pub-date>
                                        <volume>6</volume>
                                        <issue>2</issue>
                                        <fpage>66</fpage>
                                        <lpage>78</lpage>
                        
            <history>
                <date date-type="received" iso-8601-date="2024-10-30">
                    <day>30</day>
                    <month>10</month>
                    <year>2024</year>
                </date>
                <date date-type="accepted" iso-8601-date="2024-11-28">
                    <day>28</day>
                    <month>11</month>
                    <year>2024</year>
                </date>
            </history>
                                        <permissions>
                    <copyright-statement>Copyright © 2019, Mersin Photogrammetry Journal</copyright-statement>
                    <copyright-year>2019</copyright-year>
                    <copyright-holder>Mersin Photogrammetry Journal</copyright-holder>
                </permissions>
            
            <abstract><p>Forest fires have important ecological, social and economic consequences, causing loss of life and property. To prevent these consequences, it is critical both to intervene in active fires in a timely manner and to determine the extent of burnt areas as soon as possible. In such studies, remote sensing methods provide great benefits in terms of speed and cost. In recent years, various methods have been developed to segment active fires and burnt areas from satellite images. Deep learning methods successfully perform segmentation in many domains, such as disease detection in healthcare, crop type determination in agriculture, and land use and building detection in urban studies. In this study, a method was developed that, using deep learning and a single-date Sentinel-2 scene, automatically detects both active fires and burnt areas requiring rehabilitation, in terms of their location and areal extent. In particular, a new training and validation dataset was created to train the U-Net+InceptionResNetV2 (CNN) model. By combining the powerful features of U-Net with InceptionResNetV2, a convolutional neural network trained on more than one million images from the ImageNet database, we aim to examine its capabilities in burnt area and active fire detection. The model applied to the test data gave successful results, with an overall accuracy of 0.97 and an IoU (Intersection over Union) value of 0.88 in the detection of burnt areas, and an overall accuracy of 0.99 and an IoU value of 0.82 in the detection of active fires. Finally, when test images that were not used in the training dataset were evaluated with the trained model, the results proved quite consistent in detecting active fires and burnt areas and their geographical locations.</p></abstract>
                                                                                    
            
            <kwd-group>
                <kwd>Deep Learning</kwd>
                <kwd>Active Fire Detection</kwd>
                <kwd>Burnt Area Detection</kwd>
                <kwd>CNN</kwd>
                <kwd>Artificial Intelligence</kwd>
            </kwd-group>
                                                        
                                                                                                                                                    </article-meta>
    </front>
    <back>
                            <ref-list>
                                    <ref id="ref1">
                        <label>1</label>
                        <mixed-citation publication-type="journal">Kavzoğlu, T. (2021). Orman yangınları sebepleri, etkileri, izlenmesi, alınması gereken önlemler ve rehabilitasyon faaliyetleri. Türkiye Bilimler Akademisi Yayınları.</mixed-citation>
                    </ref>
                                    <ref id="ref2">
                        <label>2</label>
                        <mixed-citation publication-type="journal">Şeker, M. (2021). Orman yangınları sebepleri, etkileri, izlenmesi, alınması gereken önlemler ve rehabilitasyon faaliyetleri. Türkiye Bilimler Akademisi Yayınları.</mixed-citation>
                    </ref>
                                    <ref id="ref3">
                        <label>3</label>
                        <mixed-citation publication-type="journal">Knopp, L., Wieland, M., Rättich, M., &amp; Martinis, S. (2020). A deep learning approach for burned area segmentation with Sentinel-2 data. Remote Sensing, 12(15), 2422.</mixed-citation>
                    </ref>
                                    <ref id="ref4">
                        <label>4</label>
                        <mixed-citation publication-type="journal">Zhang, Q., Ge, L., Zhang, R., Metternicht, G. I., Liu, C., &amp; Du, Z. (2021). Towards a deep-learning-based framework of Sentinel-2 imagery for automated active fire detection. Remote Sensing, 13(23), 4790.</mixed-citation>
                    </ref>
                                    <ref id="ref5">
                        <label>5</label>
                        <mixed-citation publication-type="journal">Nolde, M., Plank, S., &amp; Riedlinger, T. (2020). An adaptive and extensible system for satellite-based, large scale burnt area monitoring in near-real time. Remote Sensing, 12(13), 2162.</mixed-citation>
                    </ref>
                                    <ref id="ref6">
                        <label>6</label>
                        <mixed-citation publication-type="journal">Chuvieco, E., Mouillot, F., Van der Werf, G. R., San Miguel, J., Tanase, M., Koutsias, N., ... &amp; Giglio, L. (2019). Historical background and current developments for mapping burned area from satellite Earth observation. Remote Sensing of Environment, 225, 45-64.</mixed-citation>
                    </ref>
                                    <ref id="ref7">
                        <label>7</label>
                        <mixed-citation publication-type="journal">Laris, P. S. (2005). Spatiotemporal problems with detecting and mapping mosaic fire regimes with coarse-resolution satellite data in savanna environments. Remote Sensing of Environment, 99(4), 412-424.</mixed-citation>
                    </ref>
                                    <ref id="ref8">
                        <label>8</label>
                        <mixed-citation publication-type="journal">Farhadi, H., Ebadi, H., &amp; Kiani, A. (2023). BADI: A novel burned area detection index for Sentinel-2 imagery using Google Earth Engine platform. ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences, 10, 179-186.</mixed-citation>
                    </ref>
                                    <ref id="ref9">
                        <label>9</label>
                        <mixed-citation publication-type="journal">Pulvirenti, L., Squicciarino, G., Fiori, E., Negro, D., Gollini, A., &amp; Puca, S. (2023). Near real-time generation of a country-level burned area database for Italy from Sentinel-2 data and active fire detections. Remote Sensing Applications: Society and Environment, 29, 100925.</mixed-citation>
                    </ref>
                                    <ref id="ref10">
                        <label>10</label>
                        <mixed-citation publication-type="journal">Gajardo, J., Mora, M., Valdés-Nicolao, G., &amp; Carrasco-Benavides, M. (2022). Burned Area Classification Based on Extreme Learning Machine and Sentinel-2 Images. Applied Sciences, 12(1), 9.</mixed-citation>
                    </ref>
                                    <ref id="ref11">
                        <label>11</label>
                        <mixed-citation publication-type="journal">Kavzoğlu, T., Çölkesen, İ., Tonbul, H., &amp; Öztürk, M. (2021). Uzaktan Algılama Teknolojileri ile Orman Yangınlarının Zamansal Analizi: 2021 Yılı Akdeniz ve Ege Yangınları. Türkiye Bilimler Akademisi Yayınları.</mixed-citation>
                    </ref>
                                    <ref id="ref12">
                        <label>12</label>
                        <mixed-citation publication-type="journal">Musaoğlu, N., Yanalak, M., Güngöroğlu, C., &amp; Özcan, O. (2021). Orman yangınlarının yönetiminde bilgi teknolojilerinin katkıları. Türkiye Bilimler Akademisi Yayınları.</mixed-citation>
                    </ref>
                                    <ref id="ref13">
                        <label>13</label>
                        <mixed-citation publication-type="journal">De Almeida Pereira, G. H., Fusioka, A. M., Nassu, B. T., &amp; Minetto, R. (2021). Active fire detection in Landsat-8 imagery: A large-scale dataset and a deep-learning study. ISPRS Journal of Photogrammetry and Remote Sensing, 178, 171-186.</mixed-citation>
                    </ref>
                                    <ref id="ref14">
                        <label>14</label>
                        <mixed-citation publication-type="journal">Seydi, S. T., Saeidi, V., Kalantar, B., Ueda, N., &amp; Halin, A. A. (2022). Fire-Net: A deep learning framework for active forest fire detection. Journal of Sensors, 2022, 1-14.</mixed-citation>
                    </ref>
                                    <ref id="ref15">
                        <label>15</label>
                        <mixed-citation publication-type="journal">Boothman, R., &amp; Cardille, J. A. (2022). New techniques for old fires: Using deep learning to augment fire maps from the early satellite era. Frontiers in Environmental Science, 1253.</mixed-citation>
                    </ref>
                                    <ref id="ref16">
                        <label>16</label>
                        <mixed-citation publication-type="journal">Khryashchev, V., &amp; Larionov, R. (2020, March). Wildfire segmentation on satellite images using deep learning. In 2020 Moscow Workshop on Electronic and Networking Technologies (MWENT) (pp. 1-5). IEEE.</mixed-citation>
                    </ref>
                                    <ref id="ref17">
                        <label>17</label>
                        <mixed-citation publication-type="journal">Atasever, Ü. H., &amp; Tercan, E. (2024). Deep learning-based burned forest areas mapping via Sentinel-2 imagery: a comparative study. Environmental Science and Pollution Research, 31(4), 5304-5318.</mixed-citation>
                    </ref>
                                    <ref id="ref18">
                        <label>18</label>
                        <mixed-citation publication-type="journal">Fusioka, A. M., Pereira, G. H., Nassu, B. T., &amp; Minetto, R. (2024). Sentinel-2 Active Fire Segmentation: Analyzing Convolutional and Transformer Architectures, Knowledge Transfer, Fine-Tuning and Seam-Lines. IEEE Geoscience and Remote Sensing Letters.</mixed-citation>
                    </ref>
                                    <ref id="ref19">
                        <label>19</label>
                        <mixed-citation publication-type="journal">Escuin, S., Navarro, R., &amp; Fernández, P. (2008). Fire severity assessment by using NBR (Normalized Burn Ratio) and NDVI (Normalized Difference Vegetation Index) derived from LANDSAT TM/ETM images. International Journal of Remote Sensing, 29(4), 1053-1073.</mixed-citation>
                    </ref>
                                    <ref id="ref20">
                        <label>20</label>
                        <mixed-citation publication-type="journal">Trigg, S., &amp; Flasse, S. (2001). An evaluation of different bi-spectral spaces for discriminating burned shrub-savannah. International Journal of Remote Sensing, 22(13), 2641-2647.</mixed-citation>
                    </ref>
                                    <ref id="ref21">
                        <label>21</label>
                        <mixed-citation publication-type="journal">Martín, M. P., Gómez, I., &amp; Chuvieco, E. (2006). Burnt Area Index (BAIM) for burned area discrimination at regional scale using MODIS data. Forest Ecology and Management, (234), S221.</mixed-citation>
                    </ref>
                                    <ref id="ref22">
                        <label>22</label>
                        <mixed-citation publication-type="journal">Petropoulos, G. P., Kontoes, C., &amp; Keramitsoglou, I. (2011). Burnt area delineation from a uni-temporal perspective based on Landsat TM imagery classification using Support Vector Machines. International Journal of Applied Earth Observation and Geoinformation, 13(1), 70-80.</mixed-citation>
                    </ref>
                                    <ref id="ref23">
                        <label>23</label>
                        <mixed-citation publication-type="journal">Ramo, R., &amp; Chuvieco, E. (2017). Developing a random forest algorithm for MODIS global burned area classification. Remote Sensing, 9(11), 1193.</mixed-citation>
                    </ref>
                                    <ref id="ref24">
                        <label>24</label>
                        <mixed-citation publication-type="journal">Roy, D. P., Huang, H., Boschetti, L., Giglio, L., Yan, L., Zhang, H. H., &amp; Li, Z. (2019). Landsat-8 and Sentinel-2 burned area mapping-A combined sensor multi-temporal change detection approach. Remote Sensing of Environment, 231, 111254.</mixed-citation>
                    </ref>
                                    <ref id="ref25">
                        <label>25</label>
                        <mixed-citation publication-type="journal">Kartal, M., &amp; Polat, Ö. (2022). Detection of benign and malignant skin cancer from dermoscopic images using modified deep residual learning model. AITA Journal, 2(2), 10-18.</mixed-citation>
                    </ref>
                                    <ref id="ref26">
                        <label>26</label>
                        <mixed-citation publication-type="journal">Gürkahraman, K., &amp; Karakiş, R.  (2021). Brain tumors classification with deep learning using data augmentation. Journal of the Faculty of Engineering and Architecture of Gazi University, 36(2), 997-1011.</mixed-citation>
                    </ref>
                                    <ref id="ref27">
                        <label>27</label>
                        <mixed-citation publication-type="journal">Maggiori, E., Tarabalka, Y., Charpiat, G., &amp; Alliez, P. (2016). Convolutional neural networks for large-scale remote-sensing image classification. IEEE Transactions on Geoscience and Remote Sensing, 55(2), 645-657.</mixed-citation>
                    </ref>
                                    <ref id="ref28">
                        <label>28</label>
                        <mixed-citation publication-type="journal">Wieland, M., Li, Y., &amp; Martinis, S. (2019). Multi-sensor cloud and cloud shadow segmentation with a convolutional neural network. Remote Sensing of Environment, 230, 111203.</mixed-citation>
                    </ref>
                                    <ref id="ref29">
                        <label>29</label>
                        <mixed-citation publication-type="journal">Wurm, M., Stark, T., Zhu, X. X., Weigand, M., &amp; Taubenböck, H. (2019). Semantic segmentation of slums in satellite images using transfer learning on fully convolutional neural networks. ISPRS Journal of Photogrammetry and Remote Sensing, 150, 59-69.</mixed-citation>
                    </ref>
                                    <ref id="ref30">
                        <label>30</label>
                        <mixed-citation publication-type="journal">Wieland, M., &amp; Martinis, S. (2019). A modular processing chain for automated flood monitoring from multi-spectral satellite data. Remote Sensing, 11(19), 2330.</mixed-citation>
                    </ref>
                                    <ref id="ref31">
                        <label>31</label>
                        <mixed-citation publication-type="journal">Luus, F. P., Salmon, B. P., Van den Bergh, F., &amp; Maharaj, B. T. J. (2015). Multiview deep learning for land-use classification. IEEE Geoscience and Remote Sensing Letters, 12(12), 2448-2452.</mixed-citation>
                    </ref>
                                    <ref id="ref32">
                        <label>32</label>
                        <mixed-citation publication-type="journal">Rustowicz, R. M., Cheong, R., Wang, L., Ermon, S., Burke, M., &amp; Lobell, D. (2019). Semantic segmentation of crop type in Africa: A novel dataset and analysis of deep learning methods. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (pp. 75-82).</mixed-citation>
                    </ref>
                                    <ref id="ref33">
                        <label>33</label>
                        <mixed-citation publication-type="journal">Varul, Y. E., Adıyaman, H., Bakırman, T., Bayram, B., Alkan, E., Karaca, S. Z., &amp; Topaloğlu, R. H. (2023). Preserving human privacy in real estate listing applications by deep learning methods. Mersin Photogrammetry Journal, 5(1), 10-17.</mixed-citation>
                    </ref>
                                    <ref id="ref34">
                        <label>34</label>
                        <mixed-citation publication-type="journal">Körez, A. (2020). Derin öğrenme kullanarak uzaktan algılama görüntülerindeki nesnelerin tespiti. Gazi Üniversitesi.</mixed-citation>
                    </ref>
                                    <ref id="ref35">
                        <label>35</label>
                        <mixed-citation publication-type="journal">Hnatushenko, V., Hnatushenko, V., &amp; Kashtan, V. (2023a). Detection of Forest Fire Consequences on Satellite Images using a Neural Network. 43. Wissenschaftlich-Technische Jahrestagung der DGPF, 31, 29-36.</mixed-citation>
                    </ref>
                                    <ref id="ref36">
                        <label>36</label>
                        <mixed-citation publication-type="journal">Perez, L., &amp; Wang, J. (2017). The effectiveness of data augmentation in image classification using deep learning. arXiv preprint arXiv:1712.04621.</mixed-citation>
                    </ref>
                                    <ref id="ref37">
                        <label>37</label>
                        <mixed-citation publication-type="journal">Shijie, J., Ping, W., Peiyi, J., &amp; Siping, H. (2017). Research on data augmentation for image classification based on convolution neural networks. In 2017 Chinese Automation Congress (CAC) (pp. 4165-4170). IEEE.</mixed-citation>
                    </ref>
                                    <ref id="ref38">
                        <label>38</label>
                        <mixed-citation publication-type="journal">Hnatushenko, V., Soldatenko, D., &amp; Heipke, C. (2023b). Enhancing the quality of CNN-based burned area detection in satellite imagery through data augmentation. International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences (ISPRS Archives); XLVIII-1/W2-2023, 48, 1749-1755.</mixed-citation>
                    </ref>
                                    <ref id="ref39">
                        <label>39</label>
                        <mixed-citation publication-type="journal">Tran, T., Pham, T., Carneiro, G., Palmer, L., &amp; Reid, I. (2017). A Bayesian data augmentation approach for learning deep models. Advances in Neural Information Processing Systems, 30.</mixed-citation>
                    </ref>
                                    <ref id="ref40">
                        <label>40</label>
                        <mixed-citation publication-type="journal">Hnatushenko, V., &amp; Zhernovyi, V. (2020). Method of improving instance segmentation for very high resolution remote sensing imagery using deep learning. In Data Stream Mining &amp; Processing: Third International Conference, DSMP 2020, Lviv, Ukraine, August 21-25, 2020, Proceedings 3 (pp. 323-333). Springer.</mixed-citation>
                    </ref>
                                    <ref id="ref41">
                        <label>41</label>
                        <mixed-citation publication-type="journal">Url-1: https://keras.io/, accessed 01.03.2023.</mixed-citation>
                    </ref>
                                    <ref id="ref42">
                        <label>42</label>
                        <mixed-citation publication-type="journal">Url-2: https://scihub.copernicus.eu/dhus/#/home, accessed 01.03.2023.</mixed-citation>
                    </ref>
                                    <ref id="ref43">
                        <label>43</label>
                        <mixed-citation publication-type="journal">Szegedy, C., Vanhoucke, V., Ioffe, S., Shlens, J., &amp; Wojna, Z. (2016). Rethinking the inception architecture for computer vision. In Proceedings of the IEEE conference on computer vision and pattern recognition (pp. 2818-2826).</mixed-citation>
                    </ref>
                                    <ref id="ref44">
                        <label>44</label>
                        <mixed-citation publication-type="journal">Kingma, D. P., &amp; Ba, J. (2014). Adam: A method for stochastic optimization. arXiv preprint arXiv:1412.6980.</mixed-citation>
                    </ref>
                            </ref-list>
                    </back>
    </article>
