1. Introduction
Forest fire is a major threat to ecosystems and is one of the eight major natural disasters identified in a United Nations report [1,2]. In China, a total of 3.52 million hectares of forest burned from 2001 to 2015, according to the 2016–2025 National Forest Fire Prevention Plan. Accurately mapping forest fire burn scars is therefore important for post-disaster assessment.
Satellites are characterized by wide coverage and long time series [3], which enable them to continuously acquire imagery even over primitive forests. Because of the significance of post-disaster assessment, satellite imagery-based forest fire burn scar mapping is a research hotspot [4]. In 2004, Key [5] proposed the Composite Burn Index (CBI) based on ground surveys, which has since become the standard for fire severity field surveys by the US Forest Service. Owing to the rapidity and cost-effectiveness of the CBI approach, Danneyrolles [6] assessed spatial patterns of burn severity in boreal forests of eastern Canada and demonstrated that the CBI method can produce reliable and meaningful burn severity maps. In 2023, Pacheco [7] used the Burned Area Index (BAI), Normalized Burn Ratio (NBR), and other methods to detect burned areas in Brazil and Portugal and found that the BAI and NBR approaches made it possible to discriminate burned areas from Landsat-8 satellite images, corroborating previous studies in different ecosystems. The differenced Normalized Burn Ratio (dNBR) is the difference between the pre-fire NBR and the post-fire NBR; it has been shown that dNBR can extract burn scars more accurately and better reflect the spatial distribution of forest fire intensity than NBR [8,9,10]. With the development of satellite sensors, satellites of increasingly high spatial resolution are being utilized for burn scar mapping. In 2023, Fotakidis [11] appraised the Breaks For Additive Seasonal and Trend (BFAST) Monitor unsupervised algorithm for continuous burned area monitoring and compared it to the NBR, dNBR, and Relativized Burn Ratio (RBR) generated from Landsat-8 images. Burn indices can be used to obtain information such as fire severity for remote sensing assessment and are therefore widely used in fire severity research [12,13,14]. However, Soverel [15] found that quantitative estimation by remote sensing carries uncertainty, and the application of burn indices needs to be validated by further research.
Deep learning (DL) models can differentiate between burns of different severity and have therefore been used in wildfire management applications. In 2023, Hu [16] applied state-of-the-art DL-based methods to map burn severity, and the results demonstrated the high accuracy of DL-based models in assessing burn severity. In 2024, Han [17] proposed a transformer-based change detection model with a multilevel feature fusion mechanism to address spatial resolution degradation in burn severity estimation, and the results showed that the model closely approximated the evaluation dataset labels. However, DL-based approaches require large numbers of samples; for example, the approaches proposed by Hu [16] and Han [17] were built on 1000 major fire events and 3000 samples, respectively. In 2022, Xu [18] compiled a list of well-known openly accessible forest fire burn scar products from around the world, among which those extracted from Landsat satellite imagery have the highest spatial resolution available [19]. Many scientists therefore utilize satellite imagery with higher spatial resolution for forest fire burn scar mapping; for instance, Pulvirenti [20] proposed an object-based validation of a country-level burn scar product derived from Sentinel-2 satellite imagery to assess its thematic and geometric accuracy.
Extracting targets directly at the pixel scale results in omission errors; improving the image's spatial resolution from the pixel scale to the subpixel scale makes targets more distinct and improves detection accuracy.
It is well known that mixed pixels are extremely common in satellite imagery [21], and researchers have proposed many approaches to the mixed pixel problem, such as Fully Constrained Least Squares (FCLS) [22], Vertex Component Analysis (VCA) [23], the Simplex Growing Algorithm (SGA) [24], and Minimum Volume Constraint Nonnegative Matrix Factorization (MVC-NMF) [25]. However, mixed pixel unmixing only solves the problem of endmember type and abundance, not the position of the endmember within the original pixel. To solve this problem, Atkinson [26,27] proposed the concept of subpixel mapping to clarify the specific spatial position of the endmember inside the original pixel. To map forest fire burn scars more accurately, Xu [18] proposed a Burned Area Subpixel Mapping–Feature Extraction Rule-Based (BASM-FERB) method that comprehensively applies the mixed pixel unmixing and subpixel mapping approaches, upgrading forest fire burn scar mapping from the pixel level to the subpixel level; the results showed that the BASM-FERB approach can improve the accuracy and spatial resolution of forest fire burn scar mapping. The BASM-FERB approach converts hard classification into soft classification, attenuates the effect of mixed pixels on spatial resolution, and obtains high-resolution imagery with feature class labels, i.e., it outputs imagery with discrete gray values. Both the BASM-FERB approach and the super-resolution reconstruction approach are designed to improve imagery spatial resolution, while the super-resolution reconstruction approach mitigates imagery blurring caused by the environment and acquisition conditions and reconstructs high-resolution imagery from one or more input images, i.e., it outputs imagery with continuous gray values.
The super-resolution reconstruction approach was initially proposed by Harris [28] as a single-image restoration concept and method, and it has since been investigated by many scholars. Freeman [29] proposed a sample-based learning reconstruction approach that matches prior information between high and low spatial resolution imageries to generate a learning model; the reconstructed high spatial resolution imagery contains high-frequency detailed information, but the approach requires a large number of training samples. Yang [30] proposed image super-resolution reconstruction via sparse representation (SCSR), which fully utilizes prior knowledge and constructs high- and low-resolution dictionary pairs with a sparse coding algorithm to effectively represent the relationship between high and low spatial resolution imageries. Based on the sparse representation concept, the SCSR approach sparsifies the high and low spatial resolution imagery, which reduces the complexity of the algorithm, and the reconstructed high spatial resolution imagery is of high quality, making the approach effective among the many super-resolution reconstruction approaches. Many researchers have recognized the SCSR method, which has attracted, and still attracts, extremely high interest.
In other studies, we have used the SCSR method to carry out forest resource change monitoring and achieved good accuracy. To further expand the application of the SCSR method, we propose an improved method based on it to enhance imagery spatial resolution and reconstruct super-resolution imagery with a complete structural framework and rich detail, named modified image super-resolution reconstruction via sparse representation (MSCSR). Hunan Province, China, has high forest cover, a large stock of forest combustibles, and a high risk of forest fires; when forest fires occur, post-disaster management and assessment require substantial investment and time. Therefore, we take Hunan Province as the research object and compare the MSCSR approach with the BAI, dNBR, Feature Extraction Rule-Based (FERB), and BASM-FERB approaches for forest fire burn scar mapping at the subpixel/pixel level to screen the best method for forest fire burn scar mapping.
2. Materials and Methods
2.1. Study Area
As of 2023, the forest coverage rate of Hunan Province, China, is 59.98% [34]. Hunan Province spans 108°47′~114°15′ E and 24°38′~30°08′ N, with lush vegetation that stays green year-round. The name “Hunan” derives from the fact that most of the province lies south of Dongting Lake. Hunan Province has relatively favorable climatic conditions, with abundant light, heat, and precipitation. Annual temperatures range between 7.78 °C and 33.33 °C. June, the rainiest month, has an average rainfall of 177.80 mm; December, the driest month, has an average rainfall of 35.56 mm. These warm and humid conditions create dense vegetation and an evergreen landscape. The contours of rain, heat, and other climatic conditions break the general rule of running parallel to the latitude lines and instead run roughly parallel to topographic contours. The centers of lower rainfall are the Hengshao Basin, the Dongting Lake Plain, and the river valley areas, and the centers of higher rainfall lie on the windward sides of Xuefeng Mountain, Mufu Mountain, Jiuling Mountain, and the southeastern mountains. High temperature is centered on the Dongting Lake Plain, the Hengshao Basin, and the river valley areas and decreases to the east, west, and south. According to the official website of the People’s Government of Hunan Province, the main natural ecosystem types are forest and wetland ecosystems, with ecological service functions valued at RMB 1.22 trillion. Hunan Province contains 2 of the world’s 200 internationally significant ecological zones, namely the Wuling Xuefeng Mountain Range ecological zone and the Nanling Luoxiao Mountain Range ecological zone, which are regarded as the most valuable ecological zones at the same latitude globally. Hunan Province has high forest coverage, a large mountainous area, favorable climatic conditions, and a sustained increase in combustible material on forest land, which makes forest fires highly likely to occur. Therefore, accurately mapping forest fire burn scars is of practical significance for post-disaster management and assessment (see Figure 1).
2.2. Satellite Imagery and Processing
Sentinel-2 (developed by an industrial consortium led by Astrium GmbH, Germany) and GF satellite imagery are used in this study and are described below.
2.2.1. Sentinel-2 Satellite Imagery
The Sentinel-2 satellite is an Earth observation mission developed by the European Space Agency (ESA) to support environmental services and natural disaster management [35]. Sentinel-2 is equipped with a MultiSpectral Instrument (MSI) with a total of 13 spectral bands covering the visible, near-infrared, and short-wave infrared (SWIR) wavelengths. The parameters of the bands selected in this study are shown in Table 1.
2.2.2. GF Satellite Imagery
The Gaofen-1 (GF-1) satellite (developed by the China Aerospace Science and Technology Corporation, Beijing, China) was successfully launched on 26 April 2013, and it is the first high-resolution Earth observation satellite operated by China [40], carrying one two-camera panchromatic/multispectral system (PMS) and one four-camera multispectral (MS) system to support environmental surveying, natural disaster management, and other applications. The PMS of GF-1 has five spectral bands: panchromatic (PAN) band-1 has a spatial resolution of 2 m, and bands 2, 3, 4, and 5 have a spatial resolution of 8 m. To observe the Earth's surface in coordination with GF-1, the GF-1B, -1C, and -1D satellites carry the same sensor systems as GF-1 and were launched on 31 March 2018. The PMS of GF-2, like that of GF-1, also has five spectral bands but with higher spatial resolution: PAN band-1 has a spatial resolution of 1 m, and MS bands 2, 3, 4, and 5 have a spatial resolution of 4 m [41]. GF satellite imagery is downloaded free of charge from the China Centre for Resources Satellite Data and Application, and the band parameters are listed in Table 2.
2.3. Actual Forest Fire Data
When a forest fire occurs, well-trained personnel confirm and report the forest fire in detail, generating actual forest fire data. These personnel use equipment such as the Global Positioning System (GPS) and CW-15 unmanned aerial vehicles (UAVs; developed by JOUAV, Chengdu, China; a vertical take-off and landing fixed-wing aircraft with a fuselage length of 2.06 m, a wingspan of 3.54 m, an endurance of 180 min, a cruise speed of 61 km/h, a maximum takeoff altitude of 4500 m, a vertical positioning accuracy of 3 cm, and a horizontal positioning accuracy of 1 cm + 1 ppm) equipped with CA-103 aerial cameras (developed by JOUAV, Chengdu, China; sensor size of 35.70 mm × 23.80 mm and 61 effective megapixels) for in situ surveying and wide-area ortho-image or oblique data collection. Actual forest fire data, provided by the Hunan Forest Fire Prevention and Extinguish Command Center after video verification and manual checking, are used as the ground truth data in this paper. The ground truth data contain the forest fire location, meteorological conditions, forest fire starting time, and forest fire extinguishing time. Six forest fire burn scar sample plots are selected as the ground truth data, with the information shown in Table 3. For the selected sample plots, we use eight sets of corresponding satellite imagery, with the information shown in Table 4.
3. Methodology
We improved the SCSR approach proposed by Yang [30] and named it MSCSR; its two-layer flowchart is shown in Figure 2. In each layer, the up-sampled Sentinel-2 satellite imagery and the Sentinel-2 satellite imagery awaiting reconstruction are inputs to the Orthogonal Matching Pursuit (OMP) approach, which outputs the predicted transition images; the GF satellite imagery is down-sampled to provide the high spatial resolution training imagery. The predicted transition images are inputs to the high-pass filter (a fusion of the two transition images and a high spatial resolution image) given by Equation (1) below. The MSCSR approach is described in detail in Section 3.2.
(1)
The MSCSR approach is compared with the BASM-FERB approach proposed by Xu [18] to screen the better method. The flowchart of the BASM-FERB approach is shown in Figure 3 and has three steps. (1) We obtain the forest fire location, download the Sentinel-2 satellite imagery from before and after the forest fire based on the actual forest fire data, and process the Sentinel-2 satellite imagery. (2) The Fully Constrained Least Squares (FCLS) approach is used to detect the forest fire burn scars' abundance [42]. The abundance is input to the Modified Pixel Swapping Algorithm (MPSA), implemented in C# on the Visual Studio 2012 platform, which outputs the forest fire burn scar mapping at the subpixel level. (3) The subpixel-level forest fire burn scar mapping is clumped with the Clump Classes tool (Dilate Kernel Value and Erode Kernel Value of 3) in the ENVI 5.3 software, as sketched below, and the GF satellite imagery is used to evaluate its accuracy. Each pixel of the Sentinel-2 satellite imagery is decomposed into 5 × 5 subpixels, i.e., each subpixel has a spatial resolution of 2 m. The BASM-FERB approach is described in detail in Section 3.3.
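For illustration, the clumping step can be approximated by a morphological closing; the following is a minimal sketch assuming a dilate-then-erode operation with a 3 × 3 kernel approximates ENVI's Clump Classes tool (the tool's internal implementation is not published, and the function name is ours).

```python
import numpy as np
from scipy.ndimage import binary_dilation, binary_erosion

def clump_classes(mask, kernel_size=3):
    # Dilate then erode the binary burn scar mask (a morphological closing),
    # with a kernel of 3 to match the Dilate/Erode Kernel Value in this study.
    structure = np.ones((kernel_size, kernel_size), dtype=bool)
    return binary_erosion(binary_dilation(mask, structure=structure),
                          structure=structure)
```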
3.1. Data Preprocessing
We perform data preprocessing such as band synthesis and geo-referencing on the Sentinel-2 satellite imagery. (1) Band synthesis is performed on the Visual Studio Code (V1.31.0) platform with the Geospatial Data Abstraction Library (GDAL); band synthesis code written in Python stacks the selected bands, as sketched below. (2) Geo-referencing uses the geo-referencing tool in ArcMap 10.4 software to determine the relationship between the Sentinel-2 and GF imageries by adding ten control points and selecting a first-order polynomial (affine) transformation with a root mean square error (RMSE) near or below 0.5 pixels [43,44].
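A minimal sketch of the band synthesis step with GDAL's Python bindings is shown below; the file names and the function are illustrative, not the script used in this study.

```python
from osgeo import gdal

def stack_bands(band_paths, out_path):
    # Stack single-band rasters (e.g., Sentinel-2 B2, B3, B4, B8) into one
    # multiband GeoTIFF, copying georeferencing from the first band.
    src0 = gdal.Open(band_paths[0])
    driver = gdal.GetDriverByName("GTiff")
    dst = driver.Create(out_path, src0.RasterXSize, src0.RasterYSize,
                        len(band_paths), src0.GetRasterBand(1).DataType)
    dst.SetGeoTransform(src0.GetGeoTransform())
    dst.SetProjection(src0.GetProjection())
    for i, path in enumerate(band_paths, start=1):
        dst.GetRasterBand(i).WriteArray(
            gdal.Open(path).GetRasterBand(1).ReadAsArray())
    dst.FlushCache()

# Illustrative usage with hypothetical file names
stack_bands(["B02.tif", "B03.tif", "B04.tif", "B08.tif"], "S2_stack.tif")
```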
Pre-processing such as geometric ortho-rectification, geo-referencing, pan sharpening, and resampling is performed on the GF satellite imagery. (1) Geometric ortho-rectification [45] uses the RPC Orthorectification Batch tool in ENVI 5.3 software together with the ALOS Digital Elevation Model (DEM) with a spatial resolution of 12.5 m [46]. (2) The geo-referencing steps for the GF imageries are the same as the geo-referencing procedure for Sentinel-2. (3) Pan sharpening refers to the fusion of a PAN image and an MS image acquired simultaneously over the same area; the aim is to combine the spatial detail of the PAN image with the multiple spectral bands of the MS image so that the result has the spatial resolution of the former and the spectral resolution of the latter. Pan sharpening uses the Pan Sharpening Batch tool in ENVI 5.3 software to fuse the PAN and MS bands and obtain imagery with a spatial resolution of 2 m [47]. (4) Resampling. In this study, we extract burn scars from Sentinel-2 imagery with a spatial resolution of 10 m and increase the spatial resolution of the extraction results to 2 m; the extraction results are examined against GF satellite imagery with a spatial resolution of 2 m. However, the GF-2 satellite imagery after pan sharpening has a spatial resolution of 1 m and needs to be down-sampled to 2 m. Resampling applies the Resize Data tool in ENVI 5.3 software with Cubic Convolution Interpolation to resize the 1 m GF-2 satellite imagery to a spatial resolution of 2 m [48,49], as sketched below.
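The resampling itself was performed with the Resize Data tool in ENVI 5.3; an equivalent GDAL-based sketch (file names are hypothetical) would be:

```python
from osgeo import gdal

# Down-sample the 1 m pan-sharpened GF-2 imagery to 2 m with cubic
# convolution, mirroring the Cubic Convolution Interpolation setting in ENVI.
gdal.Warp("GF2_2m.tif", "GF2_1m.tif", xRes=2.0, yRes=2.0, resampleAlg="cubic")
```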
3.2. Modified Image Super-Resolution Reconstruction via Sparse Representation
The SCSR approach uses Orthogonal Matching Pursuit (OMP) for sparse representation and the K-SVD algorithm to train over-complete high- and low-resolution dictionary pairs. However, the SCSR approach uses a single-layer network for super-resolution reconstruction, which may produce large prediction errors because of the difference between the high and low spatial resolution imagery. We use two layers to solve this problem; as shown in Figure 2, the first layer increases the spatial resolution of the Sentinel-2 satellite imagery by a factor of 2.5, and the second layer increases it by a further factor of 2. A per-patch sketch of the sparse coding step is given below.
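This is a minimal sketch of the OMP step, assuming coupled low-/high-resolution dictionaries D_low and D_high have already been trained with K-SVD; dictionary training, patch extraction, and overlap averaging are omitted, and the names are illustrative.

```python
import numpy as np
from sklearn.linear_model import orthogonal_mp

def reconstruct_patch(y_low, D_low, D_high, n_nonzero=3):
    # Sparse-code the low-resolution patch y_low over the low-resolution
    # dictionary D_low, then synthesize the high-resolution patch with the
    # same sparse coefficients over the coupled dictionary D_high.
    alpha = orthogonal_mp(D_low, y_low, n_nonzero_coefs=n_nonzero)
    return D_high @ alpha
```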
The Sentinel-2 and GF satellite imageries, which are selected at the same phenological phase, have a high degree of similarity in terms of climatic period and land surface cover type. Assuming that the relationship between the transition imagery and the predicted imagery obeys linear variation, a high-pass filter linear time-varying model is established as shown below.
(2)
(3)
where $(i, j)$ is the pixel location, and $a$ and $b$ are the linear regression coefficients. From Equations (2) and (3), Equation (4) is derived.
(4)
The high-pass filtering is calculated from Equation (5) [50]; high-pass filtering is widely used for PAN and MS imagery and transfers the high-frequency information of one image to another proportionally. Equation (4) can be approximated as Equation (5) to reduce the computational load, given that the two transition images are reconstructed by the same dictionary pair and therefore carry the same reconstruction error.
(5)
For natural images, the SCSR approach uses a color space transformation. According to the characteristics of satellite imagery, the SCSR approach is modified here to operate directly on the satellite imagery pixel values without a color space transformation. Detailed descriptions of the SCSR can be found in [30] and are not repeated in this paper.
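Since Equations (1)–(5) are not reproduced here, the following is only a schematic sketch of the high-pass-filter fusion idea described above, with a simple box low-pass filter standing in for the unspecified filter kernel.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def hpf_fuse(transition, detail_source, size=5):
    # Extract the high-frequency component of detail_source with a simple
    # low-pass (box) filter and inject it into the transition imagery.
    detail_source = detail_source.astype(float)
    high_freq = detail_source - uniform_filter(detail_source, size=size)
    return transition + high_freq
```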
3.3. Burned Area Subpixel Mapping–Feature Extraction Rule-Based Method
3.3.1. Fully Constrained Least Squares
The Fully Constrained Least Squares (FCLS) approach is used for linear spectral mixture analysis to extract the forest fire burn scars' abundance in mixed pixels from the Sentinel-2 satellite imagery. There are two constraints in the FCLS approach: (1) the abundance values sum to 1, i.e., $\sum_{k=1}^{K} a_k = 1$, and (2) the abundance values are non-negative, i.e., $a_k \geq 0$ for all $k$. Three types of pure pixels are extracted from the Sentinel-2 satellite imagery, i.e., bare land, forest, and burn scars, and saved as a spectral library to be used as input to the FCLS approach.
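A minimal sketch of FCLS unmixing for a single pixel, enforcing the two constraints with a generic constrained solver (the implementation used in this study may differ):

```python
import numpy as np
from scipy.optimize import minimize

def fcls(pixel, endmembers):
    # pixel: (n_bands,) spectrum; endmembers: (n_bands, n_endmembers) matrix
    # built from the pure bare land, forest, and burn scar spectra.
    n = endmembers.shape[1]
    result = minimize(
        lambda a: np.sum((endmembers @ a - pixel) ** 2),  # least squares fit
        x0=np.full(n, 1.0 / n),
        bounds=[(0.0, 1.0)] * n,                          # non-negativity
        constraints={"type": "eq", "fun": lambda a: a.sum() - 1.0},  # sum to 1
        method="SLSQP",
    )
    return result.x  # abundance vector
```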
3.3.2. Modified Pixel Swapping Algorithm
The Modified Pixel Swapping Algorithm (MPSA) approach, obtained by modifying the pixel swapping algorithm, consists of a total of 4 steps.
There is noise in the forest fire burn scars' abundance [51,52]. To reduce the effect of noise on forest fire burn scar mapping at the subpixel level and to filter out false burn scars caused by noise, an object-based classifier, the FERB method [53], is added to the MPSA approach. The object-based classifier is compared with three pixel-based classifiers, namely Random Forest (RF) [54], Backpropagation Neural Net (BPNN) [55], and Support Vector Machine (SVM) [56]; the BASM-FERB approach is found to have the best accuracy, and the parameter settings of the FERB classifier are shown in Table 5.
Detailed descriptions of steps 2~4 can be found in [18], and the description is not repeated in this paper.
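For orientation only, a simplified pixel swapping sketch is given below. It omits the FERB-based noise filtering and the other modifications of the full MPSA, so it illustrates the underlying pixel swapping idea rather than reproducing the method of [18].

```python
import numpy as np
from scipy.ndimage import uniform_filter

def pixel_swap(abundance, scale=5, n_iter=10, seed=0):
    # abundance: burn scar fraction per coarse pixel in [0, 1]; each coarse
    # pixel is decomposed into scale x scale subpixels (5 x 5 in this study).
    rng = np.random.default_rng(seed)
    h, w = abundance.shape
    fine = np.zeros((h * scale, w * scale), dtype=np.uint8)
    for i in range(h):                      # initial random allocation that
        for j in range(w):                  # honors each pixel's abundance
            k = int(round(abundance[i, j] * scale * scale))
            block = np.zeros(scale * scale, dtype=np.uint8)
            block[rng.choice(scale * scale, size=k, replace=False)] = 1
            fine[i*scale:(i+1)*scale, j*scale:(j+1)*scale] = \
                block.reshape(scale, scale)
    for _ in range(n_iter):
        # Neighborhood attractiveness: local mean of burn scar labels
        # (recomputed once per iteration for simplicity).
        attract = uniform_filter(fine.astype(float), size=3)
        for i in range(h):
            for j in range(w):
                sl = np.s_[i*scale:(i+1)*scale, j*scale:(j+1)*scale]
                blk, att = fine[sl], attract[sl]
                ones, zeros = np.argwhere(blk == 1), np.argwhere(blk == 0)
                if len(ones) == 0 or len(zeros) == 0:
                    continue
                worst = tuple(ones[np.argmin(att[tuple(ones.T)])])
                best = tuple(zeros[np.argmax(att[tuple(zeros.T)])])
                if att[best] > att[worst]:        # swap inside the coarse
                    blk[worst], blk[best] = 0, 1  # pixel; abundance unchanged
    return fine
```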
3.4. Index-Based Approach
3.4.1. Burned Area Index
The index-based approach BAI is calculated using the quantitative relationship between the Red band and the NIR band, as detailed in Equation (6). Higher BAI values in the burned area indicate more severe forest fires.
$\mathrm{BAI} = \dfrac{1}{(0.1 - \mathrm{Red})^2 + (0.06 - \mathrm{NIR})^2}$ (6)
where Red is the red band of the satellite imagery, and NIR is the near-infrared band of the satellite imagery.
3.4.2. The Differenced Normalized Burn Ratio Approach
The index-based approach dNBR, the difference between the NBR before and after the fire, can reflect the spatial distribution of forest fire intensity. The NBR is calculated from the quantitative relationship between the NIR band and the shortwave infrared (SWIR) band, as detailed in Equation (7); accordingly, the dNBR is calculated using Equation (8). Higher dNBR values in the burned area indicate more severe forest fires.
$\mathrm{NBR} = \dfrac{\mathrm{NIR} - \mathrm{SWIR}}{\mathrm{NIR} + \mathrm{SWIR}}$ (7)
$\mathrm{dNBR} = \mathrm{NBR}_{\mathrm{prefire}} - \mathrm{NBR}_{\mathrm{postfire}}$ (8)
where NIR is the near-infrared band of satellite imagery, SWIR is the shortwave infrared band of satellite imagery, $\mathrm{NBR}_{\mathrm{prefire}}$ is the NBR value of the satellite imagery before the forest fire, and $\mathrm{NBR}_{\mathrm{postfire}}$ is the NBR value of the satellite imagery after the forest fire.
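A direct transcription of Equations (6)–(8), assuming reflectance arrays for the corresponding bands:

```python
def bai(red, nir):
    # Burned Area Index, Equation (6): inverse squared distance to the
    # burned-area reference point (Red = 0.1, NIR = 0.06) in spectral space.
    return 1.0 / ((0.1 - red) ** 2 + (0.06 - nir) ** 2)

def nbr(nir, swir):
    # Normalized Burn Ratio, Equation (7).
    return (nir - swir) / (nir + swir)

def dnbr(nir_pre, swir_pre, nir_post, swir_post):
    # Differenced NBR, Equation (8): pre-fire NBR minus post-fire NBR.
    return nbr(nir_pre, swir_pre) - nbr(nir_post, swir_post)
```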
3.5. Accuracy Assessment
We chose five accuracy assessment indexes, which are the Overall Accuracy (OA) [57,58], User's Accuracy (UA) [59,60], Producer's Accuracy (PA) [61,62], Intersection over Union (IoU) [63,64], and Kappa Coefficient (Kappa) [65]. The OA represents the proportion of correctly predicted pixels to the total pixels. The UA represents the proportion of true forest fire burn scar pixels to the predicted forest fire burn scar pixels. The PA represents the proportion of correctly predicted forest fire burn scar pixels to the true forest fire burn scar pixels. The IoU represents the correct classification accuracy of the model. The Kappa represents the consistency between the prediction and the truth data. The five accuracy assessment indexes are given by Equations (9)–(13) and calculated from the confusion matrix shown in Table 6.
$\mathrm{OA} = \dfrac{TP + TN}{TP + TN + FP + FN}$ (9)
$\mathrm{UA} = \dfrac{TP}{TP + FP}$ (10)
$\mathrm{PA} = \dfrac{TP}{TP + FN}$ (11)
$\mathrm{IoU} = \dfrac{TP}{TP + FP + FN}$ (12)
$\mathrm{Kappa} = \dfrac{p_o - p_e}{1 - p_e}, \quad p_o = \mathrm{OA}, \quad p_e = \dfrac{(TP + FP)(TP + FN) + (FN + TN)(FP + TN)}{(TP + TN + FP + FN)^2}$ (13)
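The five indexes can be computed directly from the confusion matrix counts in Table 6; a minimal sketch, assuming the standard definitions given in Equations (9)–(13):

```python
def accuracy_metrics(tp, fp, fn, tn):
    # OA, UA, PA, IoU, and Kappa from confusion matrix counts (Table 6).
    n = tp + fp + fn + tn
    oa = (tp + tn) / n                                     # Equation (9)
    ua = tp / (tp + fp)                                    # Equation (10)
    pa = tp / (tp + fn)                                    # Equation (11)
    iou = tp / (tp + fp + fn)                              # Equation (12)
    pe = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n ** 2
    kappa = (oa - pe) / (1 - pe)                           # Equation (13)
    return oa, ua, pa, iou, kappa
```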
4. Results
4.1. Comparison of Misclassification Reduction Due to Noise
Based on the Sentinel-2 satellite imagery, we used the MSCSR and BASM-FERB approaches to map forest fire burn scars at the subpixel level and used the index-based approaches BAI and dNBR to map forest fire burn scars at the pixel level, comparing the noise reduction effect of these four approaches on forest fire burn scar mapping at the subpixel/pixel level. The GF satellite imagery and boundary data (boundary data are derived from actual forest fire data) are used to qualitatively assess the forest fire burn scar mapping at the subpixel/pixel level.
As shown in Figure 4B–E, the forest fire burn scar mapping at the subpixel level obtained by the BASM-FERB approach and the mapping at the pixel level obtained by the BAI and dNBR approaches are partially interfered with by noise. However, the subpixel-level mapping obtained using the MSCSR approach has significantly less noise than that obtained using the BASM-FERB approach. In codes 4CZGD, 6ZZLL, 7CZGD, and 8CZGD, only a few pixel-level forest fire burn scars were extracted by the BAI approach. In codes 3YZLS and 4CZGD, the mapping obtained by the dNBR approach presents relatively more noise. Although the subpixel-scale mapping extracted by the BASM-FERB approach is partially noisy, it can maximize the advantages of the mixed pixel unmixing method, i.e., the mapped burn scars thin out and show localized voids according to the proportion of burn scars within each pixel. However, since the MSCSR approach does not have a mixed pixel unmixing step, the subpixel-level mapping it extracts is free from localized voids. Qualitative analysis shows that although the BASM-FERB approach can effectively take advantage of mixed pixel unmixing, the MSCSR approach can more effectively reduce the influence of noise on forest fire burn scar mapping at the subpixel level and can obtain cleaner mapping.
4.2. Forest Fire Burn Scar Mapping at Pixel Level and Subpixel Level
To further illustrate the advantages of forest fire burn scar mapping at the subpixel level, we compare it with pixel-scale forest fire burn scar mapping. Based on the Sentinel-2 satellite imagery, we carry out forest fire burn scar mapping comparison experiments using the BAI, dNBR, FERB, BASM-FERB, and MSCSR approaches and utilize the GF satellite imagery-based forest fire burn scar mapping and boundary data (boundary data are derived from actual forest fire data) as the reference data.
As shown in Figure 5B–F, the forest fire burn scar mapping data extracted using the BAI, dNBR, FERB, BASM-FERB, and MSCSR approaches all have deviations compared to the GF satellite imagery-based forest fire burn scar mapping. In codes 3YZLS, 4CZGD, 6ZZLL, 7CZGD, and 8CZGD, a few pixel-level forest fire burn scars were extracted based on the BAI approach. Compared with the forest fire burn scar mapping extracted using the BAI, dNBR, FERB, and BASM-FERB approaches shown in Figure 5B–E, the forest fire burn scar mapping extracted using the MSCSR approach shown in Figure 5F is closer to the forest fire burn scar spatial distribution map shown in Figure 5A. The analysis shows that pixel-scale forest fire burn scar mapping extracted using the BAI, dNBR, and FERB approaches is unsatisfactory, with obvious and non-smooth jagged edges. The forest fire burn scar mapping at the subpixel level detected using the MSCSR and BASM-FERB approaches has smoother edges than the pixel-scale forest fire burn scar mapping extracted using the BAI, dNBR, and FERB approaches. Comparison of the forest fire burn scar mapping extracted using the BAI, dNBR, FERB, BASM-FERB, and MSCSR approaches shows that the forest fire burn scar mapping extracted using the MSCSR approach is better, with clear, sharp, and natural edge contours and no obvious edge jagged effect, which avoids the loss of a large amount of spatial information.
4.3. Accuracy Assessment of the Forest Fire Burn Scar Mapping
To further describe the advantage of extracting forest fire burn scar mapping at the subpixel level based on the MSCSR approach, we use experimental data for quantitative analysis.
As shown in Table 7, the accuracy of forest fire burn scar mapping at the pixel level extracted by FERB is superior to that extracted by the BAI and dNBR approaches. For instance, for code 3YZLS, the OA, UA, PA, IoU, and Kappa of the pixel-level forest fire burn scar mapping results extracted by the FERB approach are 98.83%, 91.05%, 86.73%, 79.91%, and 88.22%, respectively, which are all greater than those extracted by the BAI approach (95.88%, 66.27%, 47.24%, 38.08%, and 53.06%, respectively) and the dNBR approach (91.78%, 38.32%, 86.61%, 36.18%, and 49.35%, respectively). Some of the accuracy values are low, even 0 or negative; for example, for code 4CZGD, the UA, PA, IoU, and Kappa of the pixel-level results extracted by the BAI approach are 0.00%, 0.00%, 0.00%, and −0.01%, respectively, and for code 7CZGD, the UA, PA, IoU, and Kappa extracted by the BAI approach are all 0.00%.
As shown in Table 8, the accuracy of forest fire burn scar mapping at the subpixel level extracted by the MSCSR approach is superior compared to forest fire burn scar mapping at the subpixel level extracted by the BASM-FERB approach. For example, for code 1CZGY, the OA, UA, PA, IoU, and Kappa of the forest fire burn scar mapping results at the subpixel level extracted by the MSCSR approach are 97.94%, 98.84%, 94.79%, 97.66%, and 88.66%, respectively, which are all greater than those extracted by the BASM-FERB approach (the OA, UA, PA, IoU, and Kappa are 97.92%, 91.44%, 85.57%, 79.23%, and 87.27%, respectively).
As shown in Table 9, the average OA, average UA, average PA, average IoU, and average Kappa of the forest fire burn scar mapping results at the subpixel level extracted by the MSCSR and BASM-FERB approaches are superior compared to the forest fire burn scar mapping results at the pixel level extracted by the BAI, dNBR, and FERB approaches. In particular, the average OA, average UA, average PA, average IoU, and average Kappa of the forest fire burn scar mapping results at the subpixel level extracted by the MSCSR approach (the average OA, average UA, average PA, average IoU, and average Kappa are 98.49%, 99.13%, 92.31%, 95.83%, and 92.81%, respectively) are significantly larger than those obtained by the BAI approach (the average OA, average UA, average PA, average IoU, and average Kappa are 86.95%, 54.19%, 37.07%, 28.48%, and 35.22%, respectively), with differences of 11.54%, 44.94%, 55.24%, 67.35%, and 57.59%, respectively.
The accuracy values in Table 8 for the BASM-FERB approach are subtracted from those for the MSCSR approach to obtain the accuracy differences shown in Table 10. According to Table 10, the OA, UA, IoU, and Kappa of the forest fire burn scar mapping results at the subpixel level extracted by the MSCSR approach are larger than those extracted by the BASM-FERB approach, as is the PA in all but one case. Code 8CZGD has the largest OA difference of 3.62%, and code 1CZGY has the smallest OA difference of 0.01%. Code 4CZGD has the largest UA difference of 27.09%, and code 5CSNX has the smallest UA difference of 4.96%. Code 1CZGY has the largest PA difference of 9.22%. For code 2CZGY, the PA difference is negative, at −0.40%, indicating that the PA of the result extracted by the MSCSR approach is smaller than that extracted by the BASM-FERB approach. Code 4CZGD has the largest IoU difference of 26.29%, and code 6ZZLL has the smallest IoU difference of 9.78%. Code 4CZGD has the largest Kappa difference of 14.22%, and code 2CZGY has the smallest Kappa difference of 1.36%. The average accuracy of the results extracted by the MSCSR approach is always greater than that extracted by the BASM-FERB approach, and the average differences in the OA, UA, PA, IoU, and Kappa are 1.48%, 10.93%, 2.47%, 15.55%, and 5.90%, respectively. The results show that forest fire burn scar mapping at the subpixel level can be extracted with higher accuracy using the MSCSR approach than using the BASM-FERB approach.
4.4. Run Time
As shown in Table 11, the pixel-level forest fire burn scar mapping extracted by the BAI and dNBR approaches is mainly performed in the Google Earth Engine environment, with average times from the start of the run to the output of the results of 12 min and 11 min, respectively. The pixel-level mapping extracted by the FERB approach is primarily performed in the ENVI 5.3 software and took an average of 25 min. The subpixel-level mapping extracted by the BASM-FERB approach is primarily performed on the Visual Studio 2012 platform and took an average of 46 min. The subpixel-level mapping extracted by the MSCSR approach is mainly performed in the Matlab R2016a software, with an average time of 52 min.
The computer is configured with an Intel(R) Xeon(R) CPU E5-2620 v4 @ 2.10 GHz, 16 GB of RAM, Windows 10 Professional, a SAMSUNG MZVPV256HDGL-000L7 239 GB disk, and an NVIDIA Quadro M2000 4 GB GPU.
5. Discussion
Before the popularization of remote sensing imagery, forest fire burn scar mapping usually relied on manual ground survey and in situ sketching [66], which can achieve the purpose of fine mapping, but it is difficult to achieve large area coverage using this method. The popularity of remote sensing imagery, e.g., satellite, aerial plane, etc., has provided scholars with a new means for forest fire burn scar mapping. Since the late 1980s, global-scale forest fire burn scar products based on remote sensing imagery have emerged successively [67]. Recently, forest fire burn scar mapping using remote sensing imagery as a data source has become a research hotspot [68].
Noise reduction is a classical problem in remote sensing imagery processing that has attracted much attention, and many approaches are used for noise reduction, e.g., asymmetrical Gaussian function fitting [69]. The forest fire burn scars' abundance extracted by the FCLS approach is susceptible to noise interference. To address this problem, the BASM-FERB approach traverses the classification results during the pixel swapping process, which effectively avoids the interference of noise on forest fire burn scar mapping at the subpixel level. Currently, many researchers use machine learning approaches for forest fire burn scar mapping based on satellite remote sensing imagery. Ramo [70] conducted a global-scale forest fire burn scar classification study based on MODIS satellite remote sensing imagery using Random Forest, Support Vector Machine, Neural Network, and Decision Tree (C5.0) algorithms. With the excellent performance of deep learning approaches in various fields, scholars have started to adopt deep learning methods to extract burn scars from remote sensing imagery; in 2021, Zhang [68] proposed the Siamese Self-Attention (SSA) classification strategy for multi-sensor burn scar mapping. Burn indices, such as the CBI [5], BAI [71], NBR [8,9], and dNBR [11], can also be used to obtain fire severity and burn scars for remote sensing assessment and are therefore used in burn scar mapping studies. However, most existing research on forest fire burn scar mapping using machine learning, deep learning, and index-based approaches is conducted directly at the pixel level.
Xu [72] conducted a series of studies in the field of forest fire using the subpixel mapping method and proposed the Burned Area Subpixel Mapping (BASM) workflow for forest fire burn scar mapping at the subpixel level, which improves the spatial resolution of forest fire burn scar mapping compared to pixel-scale approaches. However, forest fire burn scar mapping at the subpixel level extracted by the BASM workflow is interfered with by noise. To solve this problem, the MSCSR approach proposed in this paper overcomes the interference of noise on forest fire burn scar mapping at the subpixel level, and the resulting mapping has better performance and higher accuracy. Subpixel-scale forest fire burn scar mapping based on remote sensing imagery is of great practical significance for post-disaster management and assessment of forest fires.
6. Conclusions
Based on the Sentinel-2 satellite remote sensing imagery, we extracted subpixel-scale forest fire burn scar mapping using the MSCSR approach and compared it with that extracted by the BASM-FERB approach. Furthermore, to better illustrate the advantage of forest fire burn scar mapping at the subpixel level over that at the pixel level, the BASM-FERB, MSCSR, BAI, dNBR, and FERB approaches were compared. Compared with the pixel-scale forest fire burn scar mapping extracted by the BAI, dNBR, and FERB approaches, subpixel-scale forest fire burn scar mapping with higher spatial resolution and accuracy can be extracted by the BASM-FERB and MSCSR approaches. In terms of noise reduction, the MSCSR approach can more effectively reduce the interference of noise on forest fire burn scar mapping at the subpixel level than the BASM-FERB approach. Furthermore, the subpixel-level mapping extracted by the MSCSR approach has higher accuracy on all five indexes, with the average OA increasing from 97.01% to 98.49%, the average UA from 88.20% to 99.13%, the average PA from 89.84% to 92.31%, the average IoU from 80.28% to 95.83%, and the average Kappa from 86.91% to 92.81%. We therefore draw three conclusions. (1) Compared with the pixel-scale forest fire burn scar mapping extracted by the BAI, dNBR, and FERB approaches, the subpixel-scale mapping extracted by the BASM-FERB and MSCSR approaches has higher spatial resolution and accuracy. (2) Compared with the BASM-FERB approach, the MSCSR approach more effectively reduces the interference of noise on forest fire burn scar mapping at the subpixel level. (3) The MSCSR approach performs better than the BASM-FERB approach in forest fire burn scar mapping at the subpixel level, extracting mapping with higher accuracy. The MSCSR approach can thus more effectively support forest fire post-disaster management and assessment.
J.Z.: Writing—original draft, Validation, Methodology, Formal analysis, Conceptualization. G.Z.: Supervision, Project administration, Funding acquisition, Conceptualization. H.X.: Writing—review and editing, Funding acquisition. R.C.: Writing—review and editing. Y.Y.: Writing—review and editing. S.W.: Writing—review and editing. All authors have read and agreed to the published version of the manuscript.
Data are contained within the article.
We thank the Hunan Forest Fire Prevention and Extinguish Command Center for their provision of actual forest fire data.
The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.
Figure 1. Study area. (a) The map is a standard Chinese map provided by the Ministry of Natural Resources of China, (b) the map is created using the CGCS_2000 geographical coordinates.
Figure 2. Flowchart of the MSCSR approach. The superscript of each variable indicates the layer of the MSCSR workflow. In the first layer, the high spatial resolution imagery used for training and fusion is obtained from the GF satellite imagery using the bicubic down-sampling method in Matlab 2016a software. The predicted high spatial resolution imagery of the first layer is used as the low spatial resolution input imagery of the second layer.
Figure 3. Flowchart of the BASM-FERB approach. The forest fire burn scars' abundance is extracted from the processed Sentinel-2 imagery using the FCLS approach and input to the MPSA approach, which outputs the forest fire burn scar mapping at the subpixel level. The subpixel-level mapping is then clumped, and the GF satellite imagery is used to evaluate its accuracy.
Figure 4. Comparison of the MSCSR, BAI, and dNBR approaches with the BASM-FERB approach in minimizing misclassification due to noise. The yellow line represents the extent of the burn scars' vector. The red color represents burn scars, but red outside the yellow line represents noise. (A) represents the forest fire burn scar mapping based on the GF satellite imagery. (B) represents the forest fire burn scar mapping at the subpixel level based on the MSCSR approach. (C) represents the forest fire burn scar mapping at the subpixel level based on the BASM-FERB approach. (D) represents the forest fire burn scar mapping at the pixel level based on the BAI approach. (E) represents the forest fire burn scar mapping at the pixel level based on the dNBR approach.
Figure 5. Comparison of forest fire burn scar mapping at the pixel level with that at the subpixel level. The yellow line represents the extent of the forest fire burn scars' vector. The red color represents forest fire burn scars, but red outside the yellow line represents noise. (A) represents the forest fire burn scar mapping based on the GF satellite imagery. (B–F) represent enlarged views of the red-circled area in (A). (B) represents the pixel-scale forest fire burn scar mapping using the BAI approach. (C) represents the pixel-scale forest fire burn scar mapping using the dNBR approach. (D) represents the pixel-scale forest fire burn scar mapping using the FERB approach. (E) represents the forest fire burn scar mapping at the subpixel level using the BASM-FERB approach. (F) represents the forest fire burn scar mapping at the subpixel level using the MSCSR approach.
Table 1. Parameters of the selected Sentinel-2 satellite imagery bands.
Number | Satellite | Sensor | Band Number | Band Width (μm) | Spatial Resolution (m) |
---|---|---|---|---|---|
1 | Sentinel-2 | MSI | B2 Blue | 0.46~0.52 | 10 |
B3 Green | 0.54~0.58 | 10 | |||
B4 Red | 0.65~0.68 | 10 | |||
B8 NIR | 0.79~0.90 | 10 |
Note: NIR represents the near-infrared.
Table 2. GF satellite imagery band parameters.
Number | Satellite | Sensor | Band Number | Band Width (μm) | Spatial Resolution (m) |
---|---|---|---|---|---|
1 | GF-1, -1B, -1C, -1D | PMS | 1 | 0.45~0.90 | 2 |
2 | 0.45~0.52 | 8 | |||
3 | 0.52~0.59 | 8 | |||
4 | 0.63~0.69 | 8 | |||
5 | 0.77~0.89 | 8 | |||
2 | GF-2 | PMS | 1 | 0.45~0.90 | 1 |
2 | 0.45~0.52 | 4 | |||
3 | 0.52~0.59 | 4 | |||
4 | 0.63~0.69 | 4 | |||
5 | 0.77~0.89 | 4 |
Table 3. Sample plot information of forest fire burn scars.
Number | Code | Starting Time | Extinguishing Time | Longitude | Latitude | Extent (ha) |
---|---|---|---|---|---|---|
1 | CZGY | 16 December 2019 | 16 December 2019 | 112°53′ | 25°54′ | 166.82 |
2 | YZLS | 1 October 2019 | 1 October 2019 | 112°8′ | 25°32′ | 95.32 |
3 | CZGD | 4 January 2021 | 4 January 2021 | 113°54′ | 25°58′ | 66.61 |
4 | CSNX | 19 March 2020 | 19 March 2020 | 112°0′ | 28°12′ | 53.06 |
5 | ZZLL | 16 April 2020 | 16 April 2020 | 113°27′ | 27°25′ | 37.99 |
6 | CZGD | 19 January 2021 | 19 January 2021 | 113°48′ | 25°49′ | 21.81 |
Table 4. Corresponding satellite imagery information.
Number | Code | Satellite | Imagery Marking |
---|---|---|---|
1 | CZGY | Sentinel-2 | S2A_MSIL1C_20191211T030121_N0208_R032_T49RFJ_20191211T055310 |
S2B_MSIL1C_20191229T031129_N0208_R075_T49RFJ_20191229T045703 | |||
S2A_MSIL1C_20200130T025941_N0208_R032_T49RFJ_20200130T063238 | |||
GF-2 | GF2_PMS2_E112.9_N26.0_20200131_L1A0004590657 | ||
2 | CZGY | Sentinel-2 | S2A_MSIL1C_20191211T030121_N0208_R032_T49RFJ_20191211T055310 |
S2B_MSIL1C_20200105T030119_N0208_R032_T49RFJ_20200105T054607 | |||
S2A_MSIL1C_20200130T025941_N0208_R032_T49RFJ_20200130T063238 | |||
GF-2 | GF2_PMS2_E112.9_N26.0_20200131_L1A0004590657 | ||
3 | YZLS | Sentinel-2 | S2A_MSIL1C_20190816T030551_N0208_R075_T49RFJ_20190816T064328 |
S2A_MSIL1C_20191005T030601_N0208_R075_T49RFJ_20191005T060401 | |||
S2A_MSIL1C_20191104T030911_N0208_R075_T49RFJ_20191104T060224 | |||
GF-1C | GF1C_PMS_E112.2_N25.8_20191020_L1A1021532933 | ||
4 | CZGD | Sentinel-2 | S2B_MSIL1C_20201110T025939_N0209_R032_T49RGJ_20201110T051727 |
S2B_MSIL1C_20210119T030029_N0209_R032_T49RGJ_20210119T051132 | |||
S2B_MSIL1C_20210218T025749_N0209_R032_T49RGJ_20210218T051052 | |||
GF-1C | GF1C_PMS_E113.9_N25.8_20210218_L1A1021755587 | ||
5 | CSNX | Sentinel-2 | S2B_MSIL1C_20200318T030539_N0209_R075_T49REM_20200318T064248 |
S2B_MSIL1C_20200407T030539_N0209_R075_T49REM_20200407T055301 | |||
S2A_MSIL1C_20200412T030541_N0209_R075_T49REM_20200412T061542 | |||
GF-1C | GF1C_PMS_E112.2_N28.0_20200413_L1A1021569634 | ||
6 | ZZLL | Sentinel-2 | S2A_MSIL1C_20200409T025541_N0209_R032_T49RGL_20200409T063215 |
S2A_MSIL1C_20200429T025551_N0209_R032_T49RGL_20200429T054533 | |||
S2A_MSIL1C_20200519T025551_N0209_R032_T49RGL_20200519T060344 | |||
GF-1 | GF1_PMS1_E113.5_N27.5_20200519_L1A0004809745 | ||
7 | CZGD | Sentinel-2 | S2B_MSIL1C_20210119T030029_N0209_R032_T49RGJ_20210119T051132 |
S2A_MSIL1C_20210124T030021_N0209_R032_T49RGJ_20210124T051042 | |||
S2B_MSIL1C_20210218T025749_N0209_R032_T49RGJ_20210218T051052 | |||
GF-1C | GF1C_PMS_E113.9_N25.8_20210218_L1A1021755587 | ||
8 | CZGD | Sentinel-2 | S2B_MSIL1C_20210119T030029_N0209_R032_T49RGJ_20210119T051132 |
S2B_MSIL1C_20210129T025949_N0209_R032_T49RGJ_20210129T052617 | |||
S2B_MSIL1C_20210218T025749_N0209_R032_T49RGJ_20210218T051052 | |||
GF-1C | GF1C_PMS_E113.9_N25.8_20210218_L1A1021755587 |
Table 5. Parameter setting for the FERB classifier.
Number | Model | Parameter Setting |
---|---|---|
1 | FERB | (1) Custom bands (normalized difference) |
Table 6. Confusion matrix.
Number | Truth | Burn Scars | Background | |
---|---|---|---|---|
Prediction | ||||
1 | Burn scars | True Positive (TP) | False Positive (FP) | |
2 | Background | False Negative (FN) | True Negative (TN) |
Table 7. Accuracy comparison among the BAI, dNBR, and FERB at the pixel scale.
Code | Approach | Truth | Burn Scars (NP) | Background (NP) | OA (%) | UA (%) | PA (%) | IoU (%) | Kappa (%) | |
---|---|---|---|---|---|---|---|---|---|---|
Prediction | ||||||||||
1CZGY | BAI | Burn scars | 9420 | 5874 | 93.57 | 61.59 | 86.79 | 56.31 | 68.53 | |
Background | 1434 | 96,854 | ||||||||
dNBR | Burn scars | 8411 | 510 | 97.40 | 94.28 | 77.49 | 74.01 | 83.66 | ||
Background | 2443 | 102,218 | ||||||||
FERB | Burn scars | 9162 | 1139 | 97.54 | 88.94 | 84.66 | 76.60 | 85.40 | ||
Background | 1660 | 102,003 | ||||||||
2CZGY | BAI | Burn scars | 9265 | 4441 | 94.69 | 67.60 | 85.36 | 60.58 | 72.52 | |
Background | 1589 | 98,287 | ||||||||
dNBR | Burn scars | 8557 | 532 | 97.51 | 94.15 | 78.84 | 75.15 | 84.46 | ||
Background | 2297 | 102,196 | ||||||||
FERB | Burn scars | 9566 | 4730 | 94.73 | 66.91 | 88.04 | 61.34 | 73.14 | ||
Background | 1299 | 98,803 | ||||||||
3YZLS | BAI | Burn scars | 4716 | 2400 | 95.88 | 66.27 | 47.24 | 38.08 | 53.06 | |
Background | 5268 | 173,548 | ||||||||
dNBR | Burn scars | 8668 | 13,953 | 91.78 | 38.32 | 86.61 | 36.18 | 49.35 | ||
Background | 1340 | 161,983 | ||||||||
FERB | Burn scars | 8604 | 846 | 98.83 | 91.05 | 86.73 | 79.91 | 88.22 | ||
Background | 1317 | 174,049 | ||||||||
4CZGD | BAI | Burn scars | 0 | 7 | 96.38 | 0.00 | 0.00 | 0.00 | −0.01 | |
Background | 5885 | 156,942 | ||||||||
dNBR | Burn scars | 5391 | 36,336 | 77.38 | 12.92 | 91.61 | 12.77 | 17.41 | ||
Background | 494 | 120,613 | ||||||||
FERB | Burn scars | 5033 | 1938 | 98.31 | 72.20 | 86.14 | 64.68 | 77.68 | ||
Background | 810 | 155,153 | ||||||||
5CSNX | BAI | Burn scars | 2727 | 308 | 80.55 | 89.85 | 54.01 | 50.91 | 54.79 | |
Background | 2322 | 8166 | ||||||||
dNBR | Burn scars | 4325 | 279 | 92.58 | 93.94 | 85.66 | 81.17 | 83.86 | ||
Background | 724 | 8195 | ||||||||
FERB | Burn scars | 4530 | 238 | 94.61 | 95.01 | 90.31 | 86.22 | 88.37 | ||
Background | 486 | 8188 | ||||||||
6ZZLL | BAI | Burn scars | 711 | 223 | 93.10 | 76.12 | 19.20 | 18.11 | 28.37 | |
Background | 2993 | 42,700 | ||||||||
dNBR | Burn scars | 3333 | 2711 | 93.39 | 55.15 | 89.98 | 51.96 | 64.93 | ||
Background | 371 | 40,212 | ||||||||
FERB | Burn scars | 3227 | 309 | 98.30 | 91.26 | 87.08 | 80.37 | 88.20 | ||
Background | 479 | 42,466 | ||||||||
7CZGD | BAI | Burn scars | 0 | 0 | 70.62 | 0.00 | 0.00 | 0.00 | 0.00 | |
Background | 2373 | 5703 | ||||||||
dNBR | Burn scars | 1952 | 40 | 94.22 | 97.99 | 82.09 | 80.73 | 85.41 | ||
Background | 426 | 5641 | ||||||||
FERB | Burn scars | 2107 | 222 | 94.07 | 90.47 | 89.28 | 81.60 | 85.68 | ||
Background | 253 | 5428 | ||||||||
8CZGD | BAI | Burn scars | 93 | 36 | 70.83 | 72.09 | 3.92 | 3.86 | 4.49 | |
Background | 2280 | 5531 | ||||||||
dNBR | Burn scars | 1997 | 7 | 95.11 | 99.65 | 83.98 | 83.73 | 87.81 | ||
Background | 381 | 5556 | ||||||||
FERB | Burn scars | 2107 | 222 | 93.99 | 90.47 | 89.28 | 81.60 | 85.60 | ||
Background | 253 | 5324 |
Note: NP represents the number of pixels.
Table 8. Accuracy comparison between the MSCSR and BASM-FERB at the subpixel scale.
Code | Approach | Truth | Burn Scars (NP) | Background (NP) | OA (%) | UA (%) | PA (%) | IoU (%) | Kappa (%) | |
---|---|---|---|---|---|---|---|---|---|---|
Prediction | ||||||||||
1CZGY | MSCSR | Burn scars | 248,977 | 2918 | 97.94 | 98.84 | 94.79 | 97.66 | 88.66 | |
Background | 3048 | 2,533,070 | ||||||||
BASM-FERB | Burn scars | 223,999 | 20,964 | 97.92 | 91.44 | 85.57 | 79.23 | 87.27 | ||
Background | 37,764 | 2,549,932 | ||||||||
2CZGY | MSCSR | Burn scars | 223,699 | 3162 | 98.10 | 98.61 | 85.17 | 90.09 | 88.63 | |
Background | 21,453 | 2,562,900 | ||||||||
BASM-FERB | Burn scars | 223,999 | 20,964 | 97.93 | 91.44 | 85.57 | 79.23 | 87.27 | ||
Background | 37,764 | 2,549,932 | ||||||||
3YZLS | MSCSR | Burn scars | 207,475 | 1721 | 99.39 | 99.18 | 91.05 | 95.68 | 93.50 | |
Background | 7643 | 4,393,103 | ||||||||
BASM-FERB | Burn scars | 200,224 | 23,931 | 98.91 | 89.32 | 88.52 | 80.05 | 88.35 | ||
Background | 25,973 | 4,347,147 | ||||||||
4CZGD | MSCSR | Burn scars | 114,386 | 0 | 99.46 | 100.00 | 84.50 | 90.15 | 91.35 | |
Background | 12,500 | 3,912,543 | ||||||||
BASM-FERB | Burn scars | 113,575 | 42,194 | 98.41 | 72.91 | 83.72 | 63.86 | 77.13 | ||
Background | 22,081 | 3,875,836 | ||||||||
5CSNX | MSCSR | Burn scars | 112,908 | 88 | 98.07 | 99.92 | 95.17 | 99.87 | 95.87 | |
Background | 62 | 213,377 | ||||||||
BASM-FERB | Burn scars | 105,881 | 5614 | 94.61 | 94.96 | 89.65 | 85.58 | 88.11 | ||
Background | 12,225 | 207,266 | ||||||||
6ZZLL | MSCSR | Burn scars | 80,139 | 1287 | 99.33 | 98.42 | 96.59 | 97.72 | 95.14 | |
Background | 584 | 1,063,336 | ||||||||
BASM-FERB | Burn scars | 79,636 | 7893 | 99.05 | 90.98 | 96.34 | 87.94 | 93.07 | ||
Background | 3027 | 1,056,437 | ||||||||
7CZGD | MSCSR | Burn scars | 52,745 | 993 | 97.41 | 98.15 | 96.53 | 97.26 | 93.73 | |
Background | 495 | 137,919 | ||||||||
BASM-FERB | Burn scars | 51,663 | 7524 | 94.67 | 87.29 | 94.67 | 83.20 | 87.08 | ||
Background | 2909 | 133,462 | ||||||||
8CZGD | MSCSR | Burn scars | 51,737 | 25 | 98.22 | 99.95 | 94.68 | 98.26 | 95.63 | |
Background | 893 | 138,841 | ||||||||
BASM-FERB | Burn scars | 51,663 | 7533 | 94.60 | 87.27 | 94.67 | 83.19 | 87.01 | ||
Background | 2909 | 131,344 |
Note: NP represents the number of pixels.
Table 9. Accuracy average comparison among the MSCSR, BASM-FERB, BAI, dNBR, and FERB.
Number | Approach | Average | ||||
---|---|---|---|---|---|---|
OA (%) | UA (%) | PA (%) | IoU (%) | Kappa (%) | ||
1 | MSCSR | 98.49 | 99.13 | 92.31 | 95.83 | 92.81 |
2 | BASM-FERB | 97.01 | 88.20 | 89.84 | 80.28 | 86.91 |
3 | BAI | 86.95 | 54.19 | 37.07 | 28.48 | 35.22 |
4 | dNBR | 92.42 | 73.30 | 84.53 | 61.96 | 69.61 |
5 | FERB | 96.30 | 85.79 | 87.69 | 76.54 | 84.04 |
Table 10. Accuracy value difference in forest fire burn scar mapping at the subpixel level.
Number | Code | Accuracy Difference | ||||
---|---|---|---|---|---|---|
OA (%) | UA (%) | PA (%) | IoU (%) | Kappa (%) | ||
1 | 1CZGY | 0.01 | 7.40 | 9.22 | 18.43 | 1.39 |
2 | 2CZGY | 0.17 | 7.17 | −0.40 | 10.86 | 1.36 |
3 | 3YZLS | 0.48 | 9.86 | 2.53 | 15.63 | 5.15 |
4 | 4CZGD | 1.05 | 27.09 | 0.78 | 26.29 | 14.22 |
5 | 5CSNX | 3.46 | 4.96 | 5.52 | 14.29 | 7.76 |
6 | 6ZZLL | 0.29 | 7.44 | 0.25 | 9.78 | 2.07 |
7 | 7CZGD | 2.74 | 10.86 | 1.86 | 14.06 | 6.65 |
8 | 8CZGD | 3.62 | 12.68 | 0.01 | 15.07 | 8.62 |
Average | 1.48 | 10.93 | 2.47 | 15.55 | 5.90 |
Table 11. Comparison of average run time for forest fire burn scar mapping based on the five approaches.
Number | Method | Software/Environment | Average Running Time (Min.) |
---|---|---|---|
1 | BAI | Google Earth Engine | 12 |
2 | dNBR | Google Earth Engine | 11 |
3 | FERB | ENVI 5.3 | 25
4 | BASM-FERB | Visual Studio 2012 | 46 |
5 | MSCSR | Matlab 2016a | 52 |
Note: Min. represents minutes.
References
1. Matin, M.A.; Chitale, V.S.; Murthy, M.S.R.; Uddin, K.; Bajracharya, B.; Pradhan, S. Understanding forest fire patterns and risk in Nepal using remote sensing, geographic information system and historical fire data. Int. J. Wildland Fire; 2017; 26, pp. 276-286. [DOI: https://dx.doi.org/10.1071/WF16056]
2. Zheng, Y.; Zhang, G.; Tan, S.; Feng, L. Research on Progress of Forest Fire Monitoring with Satellite Remote Sensing. Agric. Rural Stud.; 2023; 1, 8. [DOI: https://dx.doi.org/10.59978/ar01020008]
3. Wang, Z.; Ma, T.; Shao, Y.; Sun, L.; Li, Y.; Zhang, X.; Zhang, L.; Zhang, G.; Fan, W.; Feng, Z. Future oriented Smart Forestry in China: Evolution and Development Trends of Observation Instrument Systems. Sci. Silvae Sin.; 2024; 60, pp. 1-15. [DOI: https://dx.doi.org/10.11707/j.1001-7488.LYKX20220903] (In Chinese)
4. Santos, F.L.M.; Libonati, R.; Peres, L.F.; Pereira, A.A.; Narcizo, L.C.; Rodrigues, J.A.; Oom, D.; Pereira, J.M.C.; Schroeder, W.; Setzer, A.W. Assessing VIIRS capabilities to improve burned area mapping over the Brazilian Cerrado. Int. J. Remote Sens.; 2020; 41, pp. 8300-8327. [DOI: https://dx.doi.org/10.1080/01431161.2020.1771791]
5. Key, C.H.; Benson, N. Landscape Assessment (LA) Sampling and Analysis Methods. FIREMON Fire Eff. Monit. Inventory Syst.; 2004; 164, LA-1.
6. Danneyrolles, V.; Smetanka, C.; Fournier, R.; Boucher, J.; Guindon, L.; Waldron, K.; Bourdon, J.-F.; Bonfils, D.; Beaudoin, M.; Ibarzabal, J. et al. Assessing spatial patterns of burn severity for guiding post-fire salvage logging in boreal forests of Eastern Canada. For. Ecol. Manag.; 2024; 556, 121756. [DOI: https://dx.doi.org/10.1016/j.foreco.2024.121756]
7. Pacheco, A.d.P.; da Silva Junior, J.A.; Ruiz-Armenteros, A.M.; Henriques, R.F.F.; de Oliveira Santos, I. Analysis of Spectral Separability for Detecting Burned Areas Using Landsat-8 OLI/TIRS Images under Different Biomes in Brazil and Portugal. Forests; 2023; 14, 663. [DOI: https://dx.doi.org/10.3390/f14040663]
8. van Wagtendonk, J.W.; Root, R.R.; Key, C.H. Comparison of AVIRIS and Landsat ETM+ detection capabilities for burn severity. Remote Sens. Environ.; 2004; 92, pp. 397-408. [DOI: https://dx.doi.org/10.1016/j.rse.2003.12.015]
9. Allen, J.L.; Sorbel, B. Assessing the differenced Normalized Burn Ratio’s ability to map burn severity in the boreal forest and tundra ecosystems of Alaska’s national parks. Int. J. Wildland Fire; 2008; 17, pp. 463-475. [DOI: https://dx.doi.org/10.1071/WF08034]
10. Zhang, X.; Yin, G.; Ma, Y.; Fan, J.; Zhou, J. Comparison between artificial restoration and natural recovery in vegetation regrowth following high-frequency fire disturbances in the Hengduan Mountains, Southwest China. Ecol. Indic.; 2024; 167, 112692. [DOI: https://dx.doi.org/10.1016/j.ecolind.2024.112692]
11. Fotakidis, V.; Chrysafis, I.; Mallinis, G.; Koutsias, N. Continuous burned area monitoring using bi-temporal spectral index time series analysis. Int. J. Appl. Earth Obs. Geoinf.; 2023; 125, 103547. [DOI: https://dx.doi.org/10.1016/j.jag.2023.103547]
12. Chang, Y.; Zhu, Z.; Feng, Y.; Li, Y.; Bu, R.; Hu, Y. The spatial variation in forest burn severity in Heilongjiang Province, China. Nat. Hazards; 2016; 81, pp. 981-1001. [DOI: https://dx.doi.org/10.1007/s11069-015-2116-9]
13. Musyimi, Z.; Said, M.Y.; Zida, D.; Rosenstock, T.S.; Udelhoven, T.; Savadogo, P.; de Leeuw, J.; Aynekulu, E. Evaluating fire severity in Sudanian ecosystems of Burkina Faso using Landsat 8 satellite images. J. Arid Environ.; 2017; 139, pp. 95-109. [DOI: https://dx.doi.org/10.1016/j.jaridenv.2016.11.005]
14. Stambaugh, M.; Hammer, L.; Godfrey, R. Performance of Burn-Severity Metrics and Classification in Oak Woodlands and Grasslands. Remote Sens.; 2015; 7, pp. 10501-10522. [DOI: https://dx.doi.org/10.3390/rs70810501]
15. Soverel, N.O.; Perrakis, D.D.B.; Coops, N.C. Estimating burn severity from Landsat dNBR and RdNBR indices across western Canada. Remote Sens. Environ.; 2010; 114, pp. 1896-1909. [DOI: https://dx.doi.org/10.1016/j.rse.2010.03.013]
16. Hu, X.; Zhang, P.; Ban, Y. Large-scale burn severity mapping in multispectral imagery using deep semantic segmentation models. ISPRS J. Photogramm. Remote Sens.; 2023; 196, pp. 228-240. [DOI: https://dx.doi.org/10.1016/j.isprsjprs.2022.12.026]
17. Han, Y.; Zheng, C.; Liu, X.; Tian, Y.; Dong, Z. Burned Area and Burn Severity Mapping with a Transformer-Based Change Detection Model. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens.; 2024; 17, pp. 13866-13880. [DOI: https://dx.doi.org/10.1109/JSTARS.2024.3435857]
18. Xu, H.; Zhang, G.; Zhou, Z.; Zhou, X.; Zhang, J.; Zhou, C. Development of a Novel Burned-Area Subpixel Mapping (BASM) Workflow for Fire Scar Detection at Subpixel Level. Remote Sens.; 2022; 14, 3546. [DOI: https://dx.doi.org/10.3390/rs14153546]
19. Hawbaker, T.J.; Vanderhoof, M.K.; Schmidt, G.L.; Beal, Y.-J.; Picotte, J.J.; Takacs, J.D.; Falgout, J.T.; Dwyer, J.L. The Landsat Burned Area algorithm and products for the conterminous United States. Remote Sens. Environ.; 2020; 244, 111801. [DOI: https://dx.doi.org/10.1016/j.rse.2020.111801]
20. Pulvirenti, L.; Squicciarino, G.; Negro, D.; Puca, S. Object-Based Validation of a Sentinel-2 Burned Area Product Using Ground-Based Burn Polygons. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens.; 2023; 16, pp. 9154-9163. [DOI: https://dx.doi.org/10.1109/JSTARS.2023.3316303]
21. Wang, Q.; Zhang, C.; Atkinson, P.M. Sub-pixel mapping with point constraints. Remote Sens. Environ.; 2020; 244, 111817. [DOI: https://dx.doi.org/10.1016/j.rse.2020.111817]
22. Heinz, D.C.; Chang, C.-I. Fully Constrained Least Squares Linear Spectral Mixture Analysis Method for Material Quantification in Hyperspectral Imagery. IEEE Trans. Geosci. Remote Sens.; 2001; 39, pp. 529-545.
23. Nascimento, J.M.P.; Dias, J.M.B. Vertex component analysis: A fast algorithm to unmix hyperspectral data. IEEE Trans. Geosci. Remote Sens.; 2005; 43, pp. 898-910. [DOI: https://dx.doi.org/10.1109/TGRS.2005.844293]
24. Chang, C.I.; Wu, C.C.; Liu, W.; Ouyang, Y.C. A New Growing Method for Simplex-Based Endmember Extraction Algorithm. IEEE Trans. Geosci. Remote Sens.; 2006; 44, pp. 2804-2819. [DOI: https://dx.doi.org/10.1109/TGRS.2006.881803]
25. Miao, L.; Qi, H. Endmember Extraction From Highly Mixed Data Using Minimum Volume Constrained Nonnegative Matrix Factorization. IEEE Trans. Geosci. Remote Sens.; 2007; 45, pp. 765-777. [DOI: https://dx.doi.org/10.1109/TGRS.2006.888466]
26. Atkinson, P.M. Mapping Sub-Pixel Boundaries from Remotely Sensed Images. Innovations in GIS; Taylor and Francis: London, UK, 1997; Volume 4, pp. 166-180.
27. Atkinson, P.M.; Cutler, M.E.J.; Lewis, H. Mapping sub-pixel proportional land cover with AVHRR imagery. Int. J. Remote Sens.; 1997; 18, pp. 917-935. [DOI: https://dx.doi.org/10.1080/014311697218836]
28. Harris, J.L. Diffraction and Resolving Power. J. Opt. Soc. Am.; 1964; 54, pp. 931-936. [DOI: https://dx.doi.org/10.1364/JOSA.54.000931]
29. Freeman, W.T.; Pasztor, E.C. Learning low-level vision. Proceedings of the Seventh IEEE International Conference on Computer Vision; Kerkyra, Greece, 20–27 September 1999.
30. Yang, J.; Wright, J.; Huang, T.S.; Ma, Y. Image Super-Resolution via Sparse Representation. IEEE Trans. Image Process.; 2010; 19, pp. 2861-2873. [DOI: https://dx.doi.org/10.1109/TIP.2010.2050625]
31. Zhang, J.; Shao, M.; Yu, L.; Li, Y. Image super-resolution reconstruction based on sparse representation and deep learning. Signal Process. Image Commun.; 2020; 87, 115925. [DOI: https://dx.doi.org/10.1016/j.image.2020.115925]
32. Hu, K.; Liu, Z.; Shao, P.; Ma, K.; Xu, Y.; Wang, S.; Wang, Y.; Wang, H.; Di, L.; Xia, M. et al. A Review of Satellite-Based CO2 Data Reconstruction Studies: Methodologies, Challenges, and Advances. Remote Sens.; 2024; 16, 3818. [DOI: https://dx.doi.org/10.3390/rs16203818]
33. Liu, X.; Zhai, D.; Zhao, D.; Gao, W. Image Super-Resolution via Hierarchical and Collaborative Sparse Representation. Proceedings of the 2013 Data Compression Conference; Snowbird, UT, USA, 20–22 March 2013; pp. 93-102.
34. Peng, Y.; Guo, Y. Hunan, with Nearly 60% Forest Cover, Is Exploring Sustainable Forest Management. Available online: http://www.hunan.gov.cn/hnyw/bmdt/202311/t20231121_32444759.html (accessed on 6 November 2024). (In Chinese)
35. Roteta, E.; Bastarrika, A.; Padilla, M.; Storm, T.; Chuvieco, E. Development of a Sentinel-2 burned area algorithm: Generation of a small fire database for sub-Saharan Africa. Remote Sens. Environ.; 2019; 222, pp. 1-17. [DOI: https://dx.doi.org/10.1016/j.rse.2018.12.011]
36. Yan, L.; Roy, D.P.; Li, Z.; Zhang, H.K.; Huang, H. Sentinel-2A multi-temporal misregistration characterization and an orbit-based sub-pixel registration methodology. Remote Sens. Environ.; 2018; 215, pp. 495-506. [DOI: https://dx.doi.org/10.1016/j.rse.2018.04.021]
37. Morresi, D.; Marzano, R.; Lingua, E.; Motta, R.; Garbarino, M. Mapping burn severity in the western Italian Alps through phenologically coherent reflectance composites derived from Sentinel-2 imagery. Remote Sens. Environ.; 2022; 269, 112800. [DOI: https://dx.doi.org/10.1016/j.rse.2021.112800]
38. Louis, J.; Debaecker, V.; Pflug, B.; Main-Knorn, M.; Bieniarz, J.; Mueller-Wilm, U.; Cadau, E.; Gascon, F. Sentinel-2 sen2cor: L2A Processor for Users. Available online: https://elib.dlr.de/107381/1/LPS2016_sm10_3louis.pdf (accessed on 19 April 2022).
39. Ramo, R.; Roteta, E.; Bistinas, I.; van Wees, D.; Bastarrika, A.; Chuvieco, E.; van der Werf, G.R. African burned area and fire carbon emissions are strongly impacted by small fires undetected by coarse resolution satellite data. Proc. Natl. Acad. Sci. USA; 2021; 118, e2011160118. [DOI: https://dx.doi.org/10.1073/pnas.2011160118]
40. Li, J.; Chen, X.; Tian, L.; Huang, J.; Feng, L. Improved capabilities of the Chinese high-resolution remote sensing satellite GF-1 for monitoring suspended particulate matter (SPM) in inland waters: Radiometric and spatial considerations. ISPRS J. Photogramm. Remote Sens.; 2015; 106, pp. 145-156. [DOI: https://dx.doi.org/10.1016/j.isprsjprs.2015.05.009]
41. Tong, X.-Y.; Xia, G.-S.; Lu, Q.; Shen, H.; Li, S.; You, S.; Zhang, L. Land-cover classification with high-resolution remote sensing images using transferable deep models. Remote Sens. Environ.; 2020; 237, 111322. [DOI: https://dx.doi.org/10.1016/j.rse.2019.111322]
42. Zhang, C.; Ma, L.; Chen, J.; Rao, Y.; Zhou, Y.; Chen, X. Assessing the impact of endmember variability on linear Spectral Mixture Analysis (LSMA): A theoretical and simulation analysis. Remote Sens. Environ.; 2019; 235, 111471. [DOI: https://dx.doi.org/10.1016/j.rse.2019.111471]
43. Parker, B.M.; Lewis, T.; Srivastava, S.K. Estimation and evaluation of multi-decadal fire severity patterns using Landsat sensors. Remote Sens. Environ.; 2015; 170, pp. 340-349. [DOI: https://dx.doi.org/10.1016/j.rse.2015.09.014]
44. Hall, R.J.; Freeburn, J.T.; De Groot, W.J.; Pritchard, J.M.; Lynham, T.J.; Landry, R. Remote sensing of burn severity: Experience from western Canada boreal fires. Int. J. Wildland Fire; 2008; 17, pp. 476-489. [DOI: https://dx.doi.org/10.1071/WF08013]
45. Tong, X.; Liu, S.; Weng, Q. Bias-corrected rational polynomial coefficients for high accuracy geo-positioning of QuickBird stereo imagery. ISPRS J. Photogramm. Remote Sens.; 2010; 65, pp. 218-226. [DOI: https://dx.doi.org/10.1016/j.isprsjprs.2009.12.004]
46. Rabby, Y.W.; Ishtiaque, A.; Rahman, M.S. Evaluating the Effects of Digital Elevation Models in Landslide Susceptibility Mapping in Rangamati District, Bangladesh. Remote Sens.; 2020; 12, 2718. [DOI: https://dx.doi.org/10.3390/rs12172718]
47. Vivone, G.; Alparone, L.; Chanussot, J.; Dalla Mura, M.; Garzelli, A.; Licciardi, G.A.; Restaino, R.; Wald, L. A Critical Comparison Among Pansharpening Algorithms. IEEE Trans. Geosci. Remote Sens.; 2015; 53, pp. 2565-2586. [DOI: https://dx.doi.org/10.1109/TGRS.2014.2361734]
48. Wu, Y.; Wang, N.; Li, Z.; Chen, A.; Guo, Z.; Qie, Y. The effect of thermal radiation from surrounding terrain on glacier surface temperatures retrieved from remote sensing data: A case study from Qiyi Glacier, China. Remote Sens. Environ.; 2019; 231, 111267. [DOI: https://dx.doi.org/10.1016/j.rse.2019.111267]
49. Awada, H.; Ciraolo, G.; Maltese, A.; Provenzano, G.; Moreno Hidalgo, M.A.; Còrcoles, J.I. Assessing the performance of a large-scale irrigation system by estimations of actual evapotranspiration obtained by Landsat satellite images resampled with cubic convolution. Int. J. Appl. Earth Obs. Geoinf.; 2019; 75, pp. 96-105. [DOI: https://dx.doi.org/10.1016/j.jag.2018.10.016]
50. Thomas, C.; Ranchin, T.; Wald, L.; Chanussot, J. Synthesis of Multispectral Images to High Spatial Resolution: A Critical Review of Fusion Methods Based on Remote Sensing Physics. IEEE Trans. Geosci. Remote Sens.; 2008; 46, pp. 1301-1312. [DOI: https://dx.doi.org/10.1109/TGRS.2007.912448]
51. Heylen, R.; Burazerovic, D.; Scheunders, P. Fully Constrained Least Squares Spectral Unmixing by Simplex Projection. IEEE Trans. Geosci. Remote Sens.; 2011; 49, pp. 4112-4122. [DOI: https://dx.doi.org/10.1109/TGRS.2011.2155070]
52. Plaza, A.; Martinez, P.; Perez, R.; Plaza, J. A Quantitative and Comparative Analysis of Endmember Extraction Algorithms from Hyperspectral Data. IEEE Trans. Geosci. Remote Sens.; 2004; 42, pp. 650-663. [DOI: https://dx.doi.org/10.1109/TGRS.2003.820314]
53. Hamedianfar, A.; Shafri, H.Z.M.; Mansor, S.; Ahmad, N. Improving detailed rule-based feature extraction of urban areas from WorldView-2 image and lidar data. Int. J. Remote Sens.; 2014; 35, pp. 1876-1899. [DOI: https://dx.doi.org/10.1080/01431161.2013.879350]
54. Mansaray, L.R.; Yang, L.; Kabba, V.T.S.; Kanu, A.S.; Huang, J.; Wang, F. Optimising rice mapping in cloud-prone environments by combining quad-source optical with Sentinel-1A microwave satellite imagery. GIScience Remote Sens.; 2019; 56, pp. 1333-1354. [DOI: https://dx.doi.org/10.1080/15481603.2019.1646978]
55. Huang, G.; Yan, B.; Mou, Z.; Wu, K.; Lv, X. Surrogate Model for Torsional Behavior of Bundle Conductors and its Application. IEEE Trans. Power Deliv.; 2022; 37, pp. 67-75. [DOI: https://dx.doi.org/10.1109/TPWRD.2021.3053341]
56. Hamilton, D.; Brothers, K.; McCall, C.; Gautier, B.; Shea, T. Mapping Forest Burn Extent from Hyperspatial Imagery Using Machine Learning. Remote Sens.; 2021; 13, 3843. [DOI: https://dx.doi.org/10.3390/rs13193843]
57. Zhan, P.; Zhu, W.; Li, N. An automated rice mapping method based on flooding signals in synthetic aperture radar time series. Remote Sens. Environ.; 2021; 252, 112112. [DOI: https://dx.doi.org/10.1016/j.rse.2020.112112]
58. Sebald, J.; Senf, C.; Seidl, R. Human or natural? Landscape context improves the attribution of forest disturbances mapped from Landsat in Central Europe. Remote Sens. Environ.; 2021; 262, 112502. [DOI: https://dx.doi.org/10.1016/j.rse.2021.112502]
59. Nelson, M.D.; Garner, J.D.; Tavernia, B.G.; Stehman, S.V.; Riemann, R.I.; Lister, A.J.; Perry, C.H. Assessing map accuracy from a suite of site-specific, non-site specific, and spatial distribution approaches. Remote Sens. Environ.; 2021; 260, 112442. [DOI: https://dx.doi.org/10.1016/j.rse.2021.112442]
60. Ji, C.; Bachmann, M.; Esch, T.; Feilhauer, H.; Heiden, U.; Heldens, W.; Hueni, A.; Lakes, T.; Metz-Marconcini, A.; Schroedter-Homscheidt, M. et al. Solar photovoltaic module detection using laboratory and airborne imaging spectroscopy data. Remote Sens. Environ.; 2021; 266, 112692. [DOI: https://dx.doi.org/10.1016/j.rse.2021.112692]
61. Watanabe, M.; Koyama, C.N.; Hayashi, M.; Nagatani, I.; Tadono, T.; Shimada, M. Refined algorithm for forest early warning system with ALOS-2/PALSAR-2 ScanSAR data in tropical forest regions. Remote Sens. Environ.; 2021; 265, 112643. [DOI: https://dx.doi.org/10.1016/j.rse.2021.112643]
62. Foody, G.M. Impacts of ignorance on the accuracy of image classification and thematic mapping. Remote Sens. Environ.; 2021; 259, 112367. [DOI: https://dx.doi.org/10.1016/j.rse.2021.112367]
63. Wu, X.; Shi, Z.; Zou, Z. A geographic information-driven method and a new large scale dataset for remote sensing cloud/snow detection. ISPRS J. Photogramm. Remote Sens.; 2021; 174, pp. 87-104. [DOI: https://dx.doi.org/10.1016/j.isprsjprs.2021.01.023]
64. Hao, Z.; Lin, L.; Post, C.J.; Mikhailova, E.A.; Li, M.; Chen, Y.; Yu, K.; Liu, J. Automated tree-crown and height detection in a young forest plantation using mask region-based convolutional neural network (Mask R-CNN). ISPRS J. Photogramm. Remote Sens.; 2021; 178, pp. 112-123. [DOI: https://dx.doi.org/10.1016/j.isprsjprs.2021.06.003]
65. Bhattarai, R.; Rahimzadeh-Bajgiran, P.; Weiskittel, A.; Meneghini, A.; MacLean, D.A. Spruce budworm tree host species distribution and abundance mapping using multi-temporal Sentinel-1 and Sentinel-2 satellite imagery. ISPRS J. Photogramm. Remote Sens.; 2021; 172, pp. 28-40. [DOI: https://dx.doi.org/10.1016/j.isprsjprs.2020.11.023]
66. Chuvieco, E.; Mouillot, F.; van der Werf, G.R.; San Miguel, J.; Tanase, M.; Koutsias, N.; García, M.; Yebra, M.; Padilla, M.; Gitas, I. et al. Historical background and current developments for mapping burned area from satellite Earth observation. Remote Sens. Environ.; 2019; 225, pp. 45-64. [DOI: https://dx.doi.org/10.1016/j.rse.2019.02.013]
67. Mouillot, F.; Schultz, M.G.; Yue, C.; Cadule, P.; Tansey, K.; Ciais, P.; Chuvieco, E. Ten years of global burned area products from spaceborne remote sensing—A review: Analysis of user needs and recommendations for future developments. Int. J. Appl. Earth Obs. Geoinf.; 2014; 26, pp. 64-79. [DOI: https://dx.doi.org/10.1016/j.jag.2013.05.014]
68. Zhang, Q.; Ge, L.; Zhang, R.; Metternicht, G.I.; Du, Z.; Kuang, J.; Xu, M. Deep-learning-based burned area mapping using the synergy of Sentinel-1&2 data. Remote Sens. Environ.; 2021; 264, 112575. [DOI: https://dx.doi.org/10.1016/j.rse.2021.112575]
69. Hird, J.N.; McDermid, G.J. Noise reduction of NDVI time series: An empirical comparison of selected techniques. Remote Sens. Environ.; 2009; 113, pp. 248-258. [DOI: https://dx.doi.org/10.1016/j.rse.2008.09.003]
70. Ramo, R.; García, M.; Rodríguez, D.; Chuvieco, E. A data mining approach for global burned area mapping. Int. J. Appl. Earth Obs. Geoinf.; 2018; 73, pp. 39-51. [DOI: https://dx.doi.org/10.1016/j.jag.2018.05.027]
71. Benito, P.M.; Torralbo, A. Landsat and MODIS Images for Burned Areas Mapping in Galicia, Spain. Master’s Thesis; Royal Institute of Technology (KTH): Stockholm, Sweden, 2012.
72. Xu, H.; Zhang, G.; Zhou, Z.; Zhou, X.; Zhou, C. Forest Fire Monitoring and Positioning Improvement at Subpixel Level: Application to Himawari-8 Fire Products. Remote Sens.; 2022; 14, 2460. [DOI: https://dx.doi.org/10.3390/rs14102460]
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Abstract
It is of great significance to map forest fire burn scars for post-disaster management and assessment of forest fires. Satellites can acquire imagery even over primitive forests with steep mountainous terrain. However, forest fire burn scar maps extracted directly at the pixel level by the Burned Area Index (BAI), differenced Normalized Burn Ratio (dNBR), and Feature Extraction Rule-Based (FERB) approaches are limited by the spatial resolution of the satellite imagery. To further improve the spatial resolution of forest fire burn scar mapping, we improved image super-resolution reconstruction via sparse representation (SCSR) and named the improved method modified image super-resolution reconstruction via sparse representation (MSCSR). It was compared with the Burned Area Subpixel Mapping–Feature Extraction Rule-Based (BASM-FERB) method to identify the better approach. Using Sentinel-2 satellite imagery, the MSCSR and BASM-FERB approaches were applied to map forest fire burn scars at the subpixel level, and the extraction results were validated against actual forest fire data. The results show that the subpixel-level burn scar maps obtained by the MSCSR and BASM-FERB approaches have higher spatial resolution; in particular, the MSCSR approach more effectively reduces the effect of noise on subpixel-level burn scar mapping. Five accuracy indices, Overall Accuracy (OA), User's Accuracy (UA), Producer's Accuracy (PA), Intersection over Union (IoU), and Kappa Coefficient (Kappa), are used to assess the accuracy of burn scar maps at the pixel/subpixel level produced by the BAI, dNBR, FERB, MSCSR, and BASM-FERB approaches. The average OA, UA, PA, IoU, and Kappa of the subpixel-level maps extracted by the MSCSR and BASM-FERB approaches are superior to those of the pixel-level maps extracted by the BAI, dNBR, and FERB approaches. In particular, the average OA, UA, PA, IoU, and Kappa of the subpixel-level burn scar maps detected by the MSCSR approach are 98.49%, 99.13%, 92.31%, 95.83%, and 92.81%, respectively, which are 1.48%, 10.93%, 2.47%, 15.55%, and 5.90% higher, respectively, than those of the BASM-FERB approach. We conclude that the MSCSR approach maps forest fire burn scars at the subpixel level with higher accuracy and spatial resolution for post-disaster management and assessment of forest fires.
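To make the five accuracy indices concrete, the following is a minimal sketch of how OA, UA, PA, IoU, and Kappa can be computed from a binary (burned/unburned) confusion matrix. The function name and toy inputs are illustrative assumptions, not the paper's evaluation code.

```python
# Minimal sketch of the five accuracy indices for binary burn scar maps.
import numpy as np

def accuracy_indices(pred, ref):
    """OA, UA, PA, IoU, and Kappa; pred/ref are boolean maps, True = burned."""
    pred = np.asarray(pred, dtype=bool).ravel()
    ref = np.asarray(ref, dtype=bool).ravel()

    tp = np.sum(pred & ref)    # burned pixels correctly detected
    fp = np.sum(pred & ~ref)   # false alarms
    fn = np.sum(~pred & ref)   # missed burned pixels
    tn = np.sum(~pred & ~ref)  # unburned pixels correctly detected
    n = tp + fp + fn + tn

    oa = (tp + tn) / n            # Overall Accuracy
    ua = tp / (tp + fp)           # User's Accuracy (precision)
    pa = tp / (tp + fn)           # Producer's Accuracy (recall)
    iou = tp / (tp + fp + fn)     # Intersection over Union
    # Kappa: agreement beyond chance, via the expected accuracy pe.
    pe = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / (n * n)
    kappa = (oa - pe) / (1 - pe)
    return oa, ua, pa, iou, kappa

# Toy example: a 4-pixel map with one false alarm.
print(accuracy_indices([1, 1, 0, 0], [1, 0, 0, 0]))
# -> OA 0.75, UA 0.5, PA 1.0, IoU 0.5, Kappa 0.5
```

Similarly, the core idea behind the SCSR/MSCSR family named above can be illustrated at the patch level: a low-resolution patch is coded sparsely over a low-resolution dictionary, and the same coefficients applied to a coupled high-resolution dictionary yield the high-resolution patch. The sketch below uses random stand-in dictionaries; a real method learns the coupled dictionaries from training patches, and all names here are illustrative.

```python
# Patch-level sketch of super-resolution via sparse representation.
import numpy as np
from sklearn.decomposition import SparseCoder

rng = np.random.default_rng(0)
n_atoms, lo_dim, hi_dim = 64, 9, 36  # e.g., 3x3 low-res and 6x6 high-res patches

# Coupled dictionaries (random placeholders; normally learned jointly).
D_lo = rng.standard_normal((n_atoms, lo_dim))
D_lo /= np.linalg.norm(D_lo, axis=1, keepdims=True)
D_hi = rng.standard_normal((n_atoms, hi_dim))

# Sparse code of a low-res patch: argmin ||y - a·D_lo||^2 + alpha·||a||_1.
coder = SparseCoder(dictionary=D_lo, transform_algorithm='lasso_lars',
                    transform_alpha=0.1)
y = rng.standard_normal((1, lo_dim))  # one low-res patch (toy input)
a = coder.transform(y)                # sparse coefficients over D_lo

# Applying the same code to the high-res dictionary reconstructs the patch.
x_hi = a @ D_hi
print(x_hi.shape)  # (1, 36)
```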