Исследование Земли из космоса / Earth Research from Space (Russian Academy of Sciences)

  • ISSN (Print) 0205-9614
  • ISSN (Online) 3034-5405

Comparative Assessment of Different Architectures of Convolutional Neural Network for Semantic Segmentation of Forest Disturbances from Multi-Temporal Satellite Images

PII
10.31857/S0205961424030013-1
DOI
10.31857/S0205961424030013
Publication type
Article
Status
Published
Authors
Volume / Issue
Issue 3
Pages
3-15
Abstract
Algorithms based on convolutional neural networks are the most efficient for semantic segmentation of images, including segmentation of forest cover disturbances from satellite images. In this study, we assess the applicability of several modifications of the U-Net convolutional neural network architecture for recognizing logged, burnt and windthrow areas in forests from multi-temporal and multi-seasonal Sentinel-2 satellite images. The assessment was carried out on three test sites that differ substantially in forest stand characteristics and forest management. The highest accuracy (average F-measure of 0.59) was obtained with the baseline U-Net model, while the models that showed the best results during training (Attention U-Net and MobileNetV2 U-Net) did not improve segmentation of independent data. The resulting accuracy estimates are close to those previously published for forests with a substantial proportion of selectively logged areas. The characteristics of logged and windthrow areas, namely their size and type, are the main factors determining the accuracy of semantic segmentation. Substantial differences were also revealed between images taken in different seasons, with the highest segmentation accuracy achieved on winter image pairs; on summertime and mixed-season image pairs, the area of forest disturbances is substantially underestimated. Forest species composition has a smaller effect, although for two of the three test sites the accuracy was highest in dark coniferous forests and lowest in deciduous forests. No statistically significant effect of the slope illumination factor, calculated from a digital elevation model, was found on segmentation accuracy for winter image pairs. The accuracy of burnt-area segmentation, assessed on 14 large forest fires of 2021–2022, is unsatisfactory, probably because of the varying degrees of damage to the forest cover within the burnt areas.
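The accuracy values quoted in the abstract (e.g., the average F-measure of 0.59) are F-scores of the predicted disturbance masks against reference masks. The sketch below is a minimal illustration of how such a pixel-wise score can be computed; it is not the authors' evaluation code, and the mask arrays, tile size and the note on channel stacking of the two Sentinel-2 dates are illustrative assumptions.

```python
# Minimal sketch of a pixel-wise F-measure for a binary disturbance mask.
# Assumptions (not taken from the paper): boolean masks, a single class
# "disturbed forest", toy polygon coordinates, 512x512 tile size.
import numpy as np


def f_measure(pred: np.ndarray, ref: np.ndarray, eps: float = 1e-9) -> float:
    """F-measure (F1) of a predicted mask against a reference mask.

    True = disturbed forest (logging / windthrow / burnt area), False = background.
    """
    tp = np.logical_and(pred, ref).sum()    # correctly detected disturbance pixels
    fp = np.logical_and(pred, ~ref).sum()   # false alarms
    fn = np.logical_and(~pred, ref).sum()   # missed disturbance pixels
    precision = tp / (tp + fp + eps)
    recall = tp / (tp + fn + eps)
    return float(2 * precision * recall / (precision + recall + eps))


# A multi-temporal input for a U-Net-type model is typically built by stacking
# the "before" and "after" Sentinel-2 bands along the channel axis; here we
# only score an already predicted mask against a reference mask.
pred_mask = np.zeros((512, 512), dtype=bool)
ref_mask = np.zeros((512, 512), dtype=bool)
pred_mask[100:200, 100:220] = True          # detected clear-cut polygon (toy data)
ref_mask[110:210, 100:200] = True           # reference clear-cut polygon (toy data)
print(f"F-measure: {f_measure(pred_mask, ref_mask):.2f}")
```

In the study itself the F-measure is averaged over test sites and disturbance classes, so reproducing the reported figures would require looping such a function over classes and sites rather than scoring a single mask.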
Keywords
forest cover disturbances, logging, burnt areas, windthrows, segmentation, convolutional neural network, U-net, Sentinel-2 images, F-measure
Date of publication
15.09.2025
Year of publication
2025

References

  1. Барталев С.А., Егоров В.А., Жарко В.О., Лупян Е.А., Плотников Д.Е., Хвостиков С.А., Шабанов Н.В. Спутниковое картографирование растительного покрова России. М.: ИКИ РАН, 2016. 208 с.
  2. Горбачёв В.А., Криворотов И.А., Маркелов А.О., Котлярова Е.В. Семантическая сегментация спутниковых снимков аэропортов с помощью свёрточных нейронных сетей // Компьютерная оптика. 2020. Т. 44. № 4. С. 636–645. DOI: 10.18287/2412-6179-CO-636.
  3. Канев А.И., Тарасов А.В., Шихов А.Н., Подопригорова Н.С., Сафонов Ф.А. Распознавание вырубок и ветровалов по спутниковым снимкам Sentinel-2 с применением свёрточной нейронной сети U-net и факторы, влияющие на его точность // Современные проблемы дистанционного зондирования Земли из космоса. 2023. Т. 20. № 3. С. 136–151. DOI: 10.21046/2070-7401-2023-20-3-136-151.
  4. Лупян Е.А., Барталев С.А., Балашов И.В., Барталев С.С., Бурцев М.А., Егоров В.А., Ефремов В.Ю., Жарко В.О., Кашницкий А.В., Колбудаев П.А., Крамарева Л.С., Мазуров А.А., Оксюкевич А.Ю., Плотников Д.Е., Прошин А.А., Сенько К.С., Уваров И.А., Хвостиков С.А., Ховратович Т.С. Информационная система комплексного дистанционного мониторинга лесов “ВЕГА-Приморье” // Современные проблемы дистанционного зондирования Земли из космоса. 2016. Т. 13. № 5. С. 11–28. DOI: 10.21046/2070-7401-2016-13-5-11-28.
  5. Тарасов А.В., Шихов А.Н., Шабалина Т.В. Распознавание нарушений лесного покрова по спутниковым снимкам Sentinel-2 с помощью свёрточных нейронных сетей // Современные проблемы дистанционного зондирования Земли из космоса. 2021. Т. 18. № 3. С. 51–64. DOI: 10.21046/2070-7401-2021-18-3-51-64.
  6. Шихов А.Н., Герасимов А.П., Пономарчук А.И., Перминова Е.С. Тематическое дешифрирование и интерпретация космических снимков среднего и высокого пространственного разрешения: учебное пособие. Пермь: Перм. гос. нац. иссл. ун-т, 2020. 191 с. Режим доступа: http://www.psu.ru/files/docs/science/books/uchebnie-posobiya/shikhov-gerasimov-ponomarchuk-perminova-tematicheskoe-deshifrovanie-i-interpretaciya-kosmicheskih-snimkov.pdf
  7. Al-Dabbagh A.M., Ilyas M. Uni-temporal Sentinel-2 imagery for wildfire detection using deep learning semantic segmentation models // Geomatics, Nat. Hazards and Risk. 2023. V. 14(1). Art. No. 2196370. DOI: 10.1080/19475705.2023.2196370.
  8. Hansen M.C., Potapov P.V., Moore R., Hancher M., Turubanova S.A., Tyukavina A., Thau D., Stehman S.V., Goetz S.J., Loveland T.R., Kommareddy A., Egorov A., Chini L., Justice C.O., Townshend J.R.G. High-Resolution Global Maps of 21st-Century Forest Cover Change // Science. 2013. V. 342(6160). P. 850–853. DOI: 10.1126/science.1244693.
  9. Hawker L., Uhe P., Paulo L., Sosa J., Savage J., Sampson C., Neal J. A 30 m global map of elevation with forests and buildings removed // Environ. Res. Letters. 2022. V. 17. Art. No. 024016. DOI: 10.1088/1748-9326/ac4d4f.
  10. He K., Zhang X., Ren S., Sun J. Deep residual learning for image recognition // Proc. IEEE Conf. Computer Vision and Pattern Recognition (CVPR), Las Vegas, NV, USA, 27–30 June 2016. Piscataway, NJ: IEEE, 2016. P. 770–778.
  11. John D., Zhang C. An attention-based U-Net for detecting deforestation within satellite sensor imagery // Int. J. Applied Earth Observations Geoinf. 2022. V. 107. Art. No. 102685. DOI: 10.1016/j.jag.2022.102685.
  12. Ibtehaz N., Rahman M.S. MultiResUNet: Rethinking the U-Net architecture for multimodal biomedical image segmentation // Neural Networks. 2020. V. 121. P. 74–87. DOI: 10.1016/j.neunet.2019.08.025.
  13. Isaienkov K., Yushchuk M., Khramtsov V., Seliverstov O. Deep Learning for Regular Change Detection in Ukrainian Forest Ecosystem with Sentinel-2 // IEEE J. Selected Topics in Applied Earth Observations and Rem. Sens. 2021. V. 14. P. 364–376. DOI: 10.1109/JSTARS.2020.3034186.
  14. Kislov D.E., Korznikov K.A. Automatic windthrow detection using very-high-resolution satellite imagery and deep learning // Rem. Sens. 2020. V. 12(7). Art. No. 1145. DOI: 10.3390/rs12071145.
  15. Kislov D.E., Korznikov K.A., Altman J., Vozmishcheva A.S., Krestov P.V. Extending deep learning approaches for forest disturbance segmentation on very high-resolution satellite images // Rem. Sens. Ecol. Conservation. 2021. V. 7(3). P. 355–368. DOI: 10.1002/rse2.194.
  16. Knopp L., Wieland M., Rättich M., Martinis S. A Deep Learning Approach for Burned Area Segmentation with Sentinel-2 Data // Rem. Sens. 2020. V. 12. Art. No. 2422. DOI: 10.3390/rs12152422.
  17. Larabi M., Liu Q., Wang Y. Convolutional neural network features based change detection in satellite images // Proc. 1st Int. Workshop on Pattern Recognition (RRPR 2016), Dec. 4, 2016, Cancún, Mexico. 2016. Art. No. 100110W.
  18. Lee C., Park S., Kim T., Liu S., Md Reba M.N., Oh J., Han Y. Machine Learning-Based Forest Burned Area Detection with Various Input Variables: A Case Study of South Korea // Applied Sci. 2022. V. 12. Art. No. 10077. DOI: 10.3390/app121910077.
  19. Mou L., Bruzzone L., Zhu X.X. Learning spectral-spatial-temporal features via a recurrent convolutional neural network for change detection in multispectral imagery // IEEE Trans. Geosci. Rem. Sens. 2019. V. 57(2). P. 924–935. DOI: 10.1109/TGRS.2018.2863224.
  20. Mountrakis G., Im J., Ogole C. Support vector machines in remote sensing: A review // ISPRS J. Photogram. Rem. Sens. 2011. V. 66(3). P. 247–259. DOI: 10.1016/j.isprsjprs.2010.11.001.
  21. Potapov P., Li X., Hernandez-Serna A., Tyukavina A., Hansen M.C., Kommareddy A., Pickens A., Turubanova S., Tang H., Silva C.E., Armston J., Dubayah R., Blair J.B., Hofton M. Mapping global forest canopy height through integration of GEDI and Landsat data // Rem. Sens. Environ. 2021. V. 253. Art. No. 112165. DOI: 10.1016/j.rse.2020.112165.
  22. Pyo J., Han K.-j., Cho Y., Kim D., Jin D. Generalization of U-Net Semantic Segmentation for Forest Change Detection in South Korea Using Airborne Imagery // Forests. 2022. V. 13. Art. No. 2170. DOI: 10.3390/f13122170.
  23. Rodriguez-Galiano V.F., Ghimire B., Rogan J., Chica-Olmo M., Rigol-Sanchez J.P. An assessment of the effectiveness of a random forest classifier for land-cover classification // ISPRS J. Photogram. Rem. Sens. 2012. V. 67(1). P. 93–104. DOI: 10.1016/j.isprsjprs.2011.11.002.
  24. Ronneberger O., Fischer P., Brox T. U-Net: Convolutional networks for biomedical image segmentation // arXiv preprint arXiv:1505.04597. 2015. 8 p. https://arxiv.org/pdf/1505.04597.pdf.
  25. Sandler M., Howard A.G., Zhu M., Zhmoginov A., Chen L.-C. MobileNetV2: Inverted residuals and linear bottlenecks // Proc. IEEE Conf. Computer Vision and Pattern Recognition (CVPR), 2018. P. 4510–4520.
  26. Shirvani Z., Abdi O., Goodman R.C. High-Resolution Semantic Segmentation of Woodland Fires Using Residual Attention UNet and Time Series of Sentinel-2 // Rem. Sens. 2023. V. 15. Art. No. 1342. DOI: 10.3390/rs15051342.
  27. Scharvogel D., Brandmeier M., Weis M. A Deep Learning Approach for Calamity Assessment Using Sentinel-2 Data // Forests. 2020. V. 11(12). Art. No. 1239. 21 p. DOI: 10.3390/f11121239.
  28. Trier O., Salberg A., Larsen R., Nyvoll O.T. Detection of forest roads in Sentinel-2 images using U-Net // Proc. Northern Lights Deep Learning Workshop. 2022. V. 3. DOI: 10.7557/18.6246.
Indexing

Scopus, Crossref, Higher Attestation Commission (under the Ministry of Education and Science of the Russian Federation), Scientific Electronic Library