CNN-based burned area mapping using radar and optical data
Authors
Belenguer Plomer, Miguel Ángel; Tanase, Mihai Andrei; Chuvieco Salinero, Emilio; Bovolo, Francesca
Identifiers
Permanent link (URI): http://hdl.handle.net/10017/59696
DOI: 10.1016/j.rse.2021.112468
ISSN: 0034-4257
Date
2021-07-01
Funders
European Space Agency
Bibliographic citation
Remote Sensing of Environment, 2021, v. 260, n. 112468
Keywords
Burned area mapping
Convolutional neural networks
Deep learning
SAR
Sentinel-1
Sentinel-2
Wildland fires
Project
info:eu-repo/grantAgreement/ESA//4000126706%2F19%2FI-NB/EU/Fire_cci
Document type
info:eu-repo/semantics/article
Version
info:eu-repo/semantics/publishedVersion
Rights
© the authors
Attribution-NonCommercial-NoDerivatives 4.0 International (CC BY-NC-ND 4.0)
Access rights
info:eu-repo/semantics/openAccess
Abstract
In this paper, we present an in-depth analysis of the use of convolutional neural networks (CNN), a deep learning method widely applied in remote sensing-based studies in recent years, for burned area (BA) mapping combining radar and optical datasets acquired by the Sentinel-1 and Sentinel-2 on-board sensors, respectively. Combining active and passive datasets into a seamless, wall-to-wall, cloud-cover-independent mapping algorithm significantly improves existing methods based on either sensor type. Five areas were used to determine the optimum model settings and sensor integration, whereas five additional ones were utilised to validate the results. The optimum CNN dimension and data normalisation were conditioned by the observed land cover class and data type (i.e., optical or radar). Increasing network complexity (i.e., number of hidden layers) only resulted in rising computing time without any accuracy enhancement when mapping BA. The use of an optimally defined CNN within a joint active/passive data combination allowed for (i) BA mapping with accuracy similar to, or slightly higher than, that achieved in previous approaches based on Sentinel-1 (Dice coefficient, DC, of 0.57) or Sentinel-2 (DC 0.7) only, and (ii) wall-to-wall mapping by eliminating information gaps due to cloud cover, typically observed for optical-based algorithms.
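The Dice coefficient (DC) used as the accuracy measure in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation; the function and the toy masks below are hypothetical, assuming binary burned/unburned masks:

```python
import numpy as np

def dice_coefficient(pred, ref):
    """Dice coefficient between two binary burned-area masks.

    DC = 2 * |pred AND ref| / (|pred| + |ref|); 1.0 means perfect overlap.
    """
    pred = np.asarray(pred, dtype=bool)
    ref = np.asarray(ref, dtype=bool)
    intersection = np.logical_and(pred, ref).sum()
    total = pred.sum() + ref.sum()
    # If neither mask flags any burned pixel, treat the maps as identical.
    return 2.0 * intersection / total if total else 1.0

# Hypothetical 2x2 masks: 1 = burned pixel, 0 = unburned
pred = [[1, 1], [0, 0]]
ref = [[1, 0], [1, 0]]
print(dice_coefficient(pred, ref))  # 0.5
```

One burned pixel overlaps out of two flagged in each mask, giving 2·1/(2+2) = 0.5; a DC of 0.57 (Sentinel-1) or 0.7 (Sentinel-2) therefore indicates moderately strong overlap with the reference perimeters.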
Files in this item
Files | Size | Format
---|---|---
cnn_belenguer_RSE_2021.pdf | 4.765Mb |
Collections
- GEOGRAF - Artículos [73]