
Normalized Burn Ratio Plus (NBR+): A New Index for Sentinel-2 Imagery

Emanuele Alcaras 1, Domenica Costantino 2, Francesca Guastaferro 3, Claudio Parente 4,* and Massimiliano Pepe 2

1 International PhD Programme “Environment, Resources and Sustainable Development”, Department of Science and Technology, Parthenope University of Naples, Centro Direzionale, Isola C4, 80143 Naples, Italy
2 DICATECH, Polytechnic of Bari, Via E. Orabona 4, 70125 Bari, Italy
3 Almaviva Digitaltec, Centro Direzionale, Isola F8, 80143 Naples, Italy
4 Department of Science and Technology, Parthenope University of Naples, Centro Direzionale, Isola C4, 80143 Naples, Italy
* Author to whom correspondence should be addressed.
Remote Sens. 2022, 14(7), 1727;
Submission received: 28 February 2022 / Revised: 25 March 2022 / Accepted: 29 March 2022 / Published: 3 April 2022
(This article belongs to the Special Issue Recent Advances in GIS Techniques for Remote Sensing)


Abstract

The monitoring of burned areas can easily be performed using satellite multispectral images: several indices are available in the literature for highlighting the differences between healthy vegetation areas and burned areas, in consideration of their different signatures. However, these indices may have limitations determined, for example, by the presence of clouds or water bodies that produce false alarms. To avoid these inaccuracies and optimize the results, this work proposes a new index for detecting burned areas named Normalized Burn Ratio Plus (NBR+), based on the involvement of Sentinel-2 bands. The efficiency of this index is verified by comparing it with five other existing indices, all applied on an area with a surface of about 500 km2 and covering the north-eastern part of Sicily (Italy). To achieve this aim, both a uni-temporal approach (single date image) and a bi-temporal approach (two date images) are adopted. The maximum likelihood classifier (MLC) is applied to each resulting index map to define the threshold separating burned pixels from non-burned ones. To evaluate the efficiency of the indices, confusion matrices are constructed and compared with each other. The NBR+ shows excellent results, especially because it excludes a large part of the areas that other indices incorrectly classify as burned even though they are clouds or water bodies.

1. Introduction

An important research field when using satellite images is the monitoring of active fires [1], their impact on air quality [2], and other traces they leave on the environment [3,4]: burned areas (BA). In fact, accurate and rapid mapping of fire damaged areas is necessary to support fire management, estimate environmental cost, define planning strategies, and monitor the restoration of vegetation [5]. The identification of BA through the use of remote sensing (RS) techniques, instruments, and methods represents a field of research in continuous development [6]. The ability to identify BA using Earth observation satellites with a high geometric, temporal, and spectral resolution makes it possible to monitor and preserve the state of the sites; indeed, very often, fires involve areas of particular naturalistic value [7,8]. Therefore, in order to preserve the natural and man-made landscape, it is necessary to identify suitable methodologies for monitoring burned areas and, at the same time, produce severity estimation maps using satellite data [9].
In recent decades, several satellite-mounted sensors have been used for this activity, such as the Advanced Very High-Resolution Radiometer (AVHRR) [10] and the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) [11]. Another widely used sensor is the Moderate Resolution Imaging Spectroradiometer (MODIS) due to its high temporal resolution, which allows the rapid detection of active fires, the identification of burned areas, or forest fire risk assessment [12,13,14]. However, MODIS has a coarse spatial resolution, which makes detecting the spatial extent of smaller fires more difficult [15]. The launch of Landsat-8 OLI (Operational Land Imager), i.e., the eighth satellite of the Landsat program developed by a collaboration between NASA and the United States Geological Survey (USGS), and Sentinel-2 (Earth observation mission from the Copernicus Program, formerly called the Global Monitoring for Environment and Security-GMES) allowed us to obtain images with a better spatial resolution than previous satellite-based sensors. In particular, Sentinel-2 collects multispectral land surface imagery by two satellites with a revisit cycle of 5 days at 10 m, 20 m, and 60 m spatial resolutions. Their single instrument is the Multispectral Imager (MSI), which collects data in 13 spectral bands using a line-scanner technology with a wide field of view [16].
The processing of remotely sensed data is a prerequisite for generating robust spatial information with scientific quality and is appropriate for fire monitoring at different scales and over time [17]. Indeed, increasingly well-performing indices for the detection and analysis of phenomena such as burnt areas are continuously being developed in the RS field. Therefore, the increasing spatial and spectral resolution of satellite platform sensors, combined with the development of indices in the RS field, means that even small BA can be detected with high accuracy. BA detection is strictly correlated to the monitoring of vegetation. For this purpose, several methodologies have been proposed based on the use of spectral indices, such as the Normalized Difference Vegetation Index (NDVI), which can be defined as the normalized ratio of the difference of the near-infrared and red bands [18]; this index allows us to investigate the relationship between the amount of vegetation consumed and fire severity [19,20]. In addition, specific indices have been developed that record the effects of fire with greater spectral contrast, such as the Normalized Burn Ratio (NBR), which is the normalized ratio of the difference of the near-infrared and shortwave-infrared bands [20,21]. The combination of these latter bands makes it possible to analyze the phenomenon in pre- and post-fire conditions. In the NIR wavelengths, the absorbance of vegetation is low, whereas reflectance and transmittance are high; in the SWIR wavelengths, the reflectance and transmittance of the vegetation are low, and the absorbance is very high. In the post-fire zone, the recently burned areas show a relatively low reflectivity in the near-infrared band and a high reflectance in the short-wave infrared band [22,23]. This index was applied with success to satellite data, such as Sentinel-2 [24] and Landsat-7 and Landsat-8 [25] images in order to identify burned areas.
Recent advancements in remote sensing technology have facilitated new approaches to study fire ecology, including different aspects related to fire risk mapping, fuel mapping, active fire detection, burned area estimates, burn severity assessment, and post-fire vegetation recovery monitoring [26]. Specific indices for BA detection related to the availability of bands of specific sensors have been developed. A large part of current applications is based on a bi-temporal approach that permits us to identify the burned areas as the result of changes due to the vegetation decrease between the pre-fire and post-fire images; however, other land-cover features might present similar responses in some specific bands, such as the NIR and SWIR bands, generating an increase in commission errors [27]. In addition, the peculiarity of the scene can influence the accuracy of BA delineation: for example, water bodies, clouds, and shadows might produce false alarms, so specific precautions and remedies are required [28]. Using SWIRs (short and long SWIR), these errors are mitigated [29], since the spectral signatures of water and burned areas diverge beyond the NIR region, where water tends to absorb longer wavelengths almost completely, whereas burned forest reflectance remains fairly constant or shows a slightly growing trend [30].
The aim of this paper is to improve the performance obtained by the application of the existing indices for the identification of BA on Sentinel-2 images. Starting from the NBR index, this work proposes a new index, called Normalized Burn Ratio Plus (NBR+). The NBR+ is tested on an area located in the north-eastern part of Sicily (Italy) and affected by fires during the summer of 2019; its performances are evaluated by means of accuracy assessment of classification results and compared with five other methods that already exist in the literature and are applicable to Sentinel-2 multispectral geo-data, pre- and post-fire event.

2. Study Area and Dataset

Sentinel-2 satellite images are provided free of charge by ESA through a constellation of two satellites, part of the Copernicus program, which are called Sentinel-2A and Sentinel-2B. For this work, images provided by the Sentinel-2A satellite are used. The features of the bands, with their central wavelengths and resolutions, are reported in Table 1 [31].
The Copernicus program, through the use of Sentinel satellites, deals with the continuous monitoring of the Earth: in the framework of geographic data and their storage [32], Sentinel images are the main acquisition channel of the Copernicus geodatabase of the Risk and Recovery Mapping [33].
Two Sentinel-2A datasets are used for this study: the first dataset was acquired on 24 July 2019 and represents the pre-fire scenario; the second was acquired on 23 August 2019 and represents the post-fire scenario. The images cover the north-eastern part of Sicily (Italy), including the province of Messina (Italy), as shown in Figure 1.
As can be seen in Figure 2, the land is mostly surrounded by the sea, except in the southwestern part. If, on the one hand, the coastal areas are heavily inhabited and urbanized, on the other hand, the inland areas are predominantly rural, characterized by the dense vegetation of the Peloritani mountains [34]: as reported in [35], the vegetation in this area is very diversified and particularly prone to bushfires.

3. Methods

This section presents the new index proposed to identify burnt areas using Sentinel-2 multispectral images and illustrates the methodological approach adopted to establish the level of performance of this index. In particular, some indices already present in the literature and usually applied for the purpose are first recalled, reporting the relative formulas and specifying the bands used, since we apply them in comparison with NBR+. Then, the reasons that determined the formulation of the proposed index are explained. Subsequently, we recall the fundamentals of the supervised classification applied to the maps obtained by the selected indices. Finally, the procedure for verifying the accuracy of the results, based on the use of test areas and the canonical approach applied in remote sensing [36], is described, including the confusion matrix [37] and related indices [38].
The workflow of the whole methodological approach is shown in Figure 3.
All experiments are carried out using the GIS software QGIS, version 3.16: the Raster Calculator tool allows index formulas to be implemented by means of Map Algebra operators.

3.1. Spectral Indices Used for the Identification of Burnt Areas

The Near Infrared (NIR) and Shortwave Infrared (SWIR) spectral regions are relevant for detecting burned areas: NIR highlights changes in canopy cover and brightness of leaf burn [39], whereas SWIR detects changes in landscape dryness [40]. After a fire, with the destruction of vegetation, the NIR reflectance strongly decreases and, on the other hand, the SWIR reflectance increases due to the fire’s removal of water-retaining vegetation [41]. Other important spectral regions for BA detection are the Red and Red-Edge, because they are linked to strong absorption of the chlorophyll content in plants [42,43].
For mapping burned areas, several algorithms use a combination of NIR and SWIR spectral regions, although some spectral indices combining only SWIR spectral regions or Red/Red-edge and NIR spectral regions have been developed as well.
Figure 4 compares the burned area signature with that of healthy vegetation, showing that areas devastated by fire have very high reflectance in the SWIR wavelengths and low reflectance in the NIR.
Among the wide range of existing spectral indices described in the literature, five different approaches are applied and compared in this study: Normalized Burn Ratio (NBR), Normalized Burn Ratio-SWIR (NBRSWIR), Normalized Difference Shortwave Infrared Index (NDSWIR), Mid-Infrared Bi-Spectral Index (MIRBI), and Burnt Area Index for Sentinel 2 (BAIS2). The main characteristics of these indices are summarized below, and the description of the new index proposed in this study is reported in the following subsection.

3.1.1. Normalized Burn Ratio (NBR)

A famous index widely used in the literature to highlight burned areas in large fire zones is the Normalized Burn Ratio (NBR) [21], which is considered a standard for fire severity assessments. The Normalized Burn Ratio formula combines the use of both NIR (B8A) and SWIR (B12) wavelengths, where a high NBR value generally indicates healthy vegetation, and a low value indicates bare ground and recently burned areas.
The NBR formula is similar to that of the NDVI. It is calculated as a normalized ratio between the SWIR and NIR values and can be defined as:

NBR = (B12 − B8A) / (B12 + B8A)

Areas with values close to zero are considered non-burned areas.

3.1.2. Normalized Burn Ratio-SWIR (NBRSWIR)

A new fire index, developed for BA identification with Landsat-8 OLI data, is Normalized Burn Ratio-SWIR (NBRSWIR), designed by Liu et al. in 2020 [27]. The formula is similar to the NBR index, but SWIR1 (B11) and SWIR2 (B12) bands are used. Moreover, in the equation, two small constants are considered: in the numerator, the constant value of 0.02 is subtracted to set the changes in water close to zero or even negative, and in the denominator, 0.1 is added to avoid the positive amplification of some abnormal water changes:
NBRSWIR = (B12 − B11 − 0.02) / (B12 + B11 + 0.1)

3.1.3. Normalized Difference Shortwave Infrared Index (NDSWIR)

As the SWIR was known to be useful for the detection of older fire scars, Gerard et al. used the individual SWIR waveband and NDSWIR to support forest fire scar detection in the Boreal Forest [44]. Particularly, they demonstrate that it is possible to detect fire scars up to ten years old using SPOT-VEGETATION data from a single year and that the use of a vegetation index based on near- and shortwave-infrared reflectance such as NDSWIR is crucial to this success:
NDSWIR = (B11 − B8A) / (B11 + B8A)

3.1.4. Mid-Infrared Bi-Spectral Index (MIRBI)

The Mid-Infrared Bi-Spectral Index (MIRBI), developed by Trigg and Flasse in 2001 [40], is defined in the space of the short-wavelength and long-wavelength Mid-Infrared bands (in Sentinel-2, band 11 and band 12, respectively). This index is highly sensitive to spectral changes caused by burning and relatively insensitive to noise:

MIRBI = 10 × B12 − 9.8 × B11 + 2

3.1.5. Burnt Area Index for Sentinel 2 (BAIS2)

Burnt Area Index for Sentinel 2, BAIS2 [5], adapts the traditional BAI for Sentinel-2 bands, taking advantage of a combination of bands that have been demonstrated to be suitable for post-fire burned area detection (Visible (B4), Red-Edge (B6 and B7), NIR (B8A) and SWIR (B12) bands). The range of values for the BAIS2 is −1 to 1 for burn scars, and 1–6 for active fires. It can be obtained as shown in the following formula:
BAIS2 = (1 − √((B6 × B7 × B8A) / B4)) × ((B12 − B8A) / √(B12 + B8A) + 1)
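As an illustration, the five literature indices recalled above can be sketched in Python with NumPy, assuming each Sentinel-2 band is already available as an array of surface reflectance values resampled to a common grid (function and variable names are ours, not from the original sources; BAIS2 follows Filipponi's published formulation, with square roots in both factors):

```python
import numpy as np

def nbr(b8a, b12):
    # Normalized Burn Ratio, in the form used in this work
    return (b12 - b8a) / (b12 + b8a)

def nbrswir(b11, b12):
    # NBRSWIR (Liu et al., 2020): 0.02 and 0.1 damp water-related changes
    return (b12 - b11 - 0.02) / (b12 + b11 + 0.1)

def ndswir(b8a, b11):
    # Normalized Difference Shortwave Infrared Index
    return (b11 - b8a) / (b11 + b8a)

def mirbi(b11, b12):
    # Mid-Infrared Bi-Spectral Index (Trigg and Flasse, 2001)
    return 10.0 * b12 - 9.8 * b11 + 2.0

def bais2(b4, b6, b7, b8a, b12):
    # Burnt Area Index for Sentinel-2 (Filipponi, 2018)
    return (1.0 - np.sqrt((b6 * b7 * b8a) / b4)) * \
           ((b12 - b8a) / np.sqrt(b12 + b8a) + 1.0)
```

In QGIS, the same formulas can be typed directly into the Raster Calculator; a NumPy version of this kind is convenient only for scripted batch processing.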

3.2. Development of a New Spectral Index for the Identification of BA

In order to correctly use the classical indices for BA detection, it is usually necessary to apply a mask to remove water bodies and clouds [45]. In fact, as reported in numerous articles [46,47,48,49,50], water bodies can be mistaken for BA by indicators such as NBR, since the BA show similar reflection values to the water surfaces [24]. Generally, a mask is created to remove water bodies by using the Normalized Difference Water Index (NDWI), an index that highlights the water in relation to the soil [51]. As is well known, water typically has strong reflectance in blue-green wavelengths [52]; therefore, we propose an enhanced version of NBR that takes into account the reflectance of water, which we call NBR+:

NBR+ = (B12 − B8A − B3 − B2) / (B12 + B8A + B3 + B2)

This formula also provides negative values for clouds, since their reflectance in the B12 band is significantly lower than the sum of their reflectances in the other three bands. Since the NBR+ values can vary in a range between −1 and 1, the pixels related to clouds in the resulting NBR+ image certainly present low (negative) values and therefore cannot be confused with those related to the BA, which present high values and tend to white.
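As a minimal sketch (our own, with illustrative reflectance values rather than measured signatures), the behavior of NBR+ on a burned pixel versus a water pixel can be shown as follows:

```python
import numpy as np

def nbr_plus(b2, b3, b8a, b12):
    # NBR+: the blue (B2) and green (B3) reflectances, where water is bright,
    # are subtracted so that water pixels are pushed towards negative values
    return (b12 - b8a - b3 - b2) / (b12 + b8a + b3 + b2)

# Illustrative pixels (assumed reflectances, for demonstration only):
burned = nbr_plus(b2=0.03, b3=0.04, b8a=0.10, b12=0.35)  # bright in SWIR
water = nbr_plus(b2=0.10, b3=0.12, b8a=0.02, b12=0.01)   # bright in blue/green
# burned is positive; water falls clearly below zero
```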

3.3. Maximum Likelihood Classification

For this work, we chose to use the Maximum Likelihood Classification (MLC) in order to obtain two distinct classes, i.e., burned areas and unburned areas. This classifier falls into the category of supervised classification methods, which involve the use of training sites (TS) to have a priori knowledge of the samples of the investigated image [53]. The image is classified by starting from the reflectance values of each pixel and assigning them to the class for which there is the greatest similarity [54]. It is of fundamental importance that the TS are selected as accurately as possible [55], and that they comply with certain guidelines [56], such as being representative of the classes investigated, including a significant number of pixels for each class and containing sufficient information to describe the classes based on the behavior of its spectral signatures [57].
The MLC is based on Bayes’ Theorem [58], and it is considered one of the most accurate methods in the literature [59]. Generally, MLC is applied on multiple bands, but it can also be applied on a single band [60], for example, a synthetic band such as NBR. Particularly, in this study, TS relating the two researched classes are identified by means of visual analysis on RGB true color and RGB false color compositions of the Sentinel 2-A dataset.
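A minimal single-band MLC sketch (our simplification, assuming equal priors and one Gaussian per class; an operational GIS implementation also handles multiple bands and prior probabilities) could look like this:

```python
import numpy as np

def fit_class(pixels):
    # estimate the Gaussian parameters of one class from its training-site pixels
    return float(np.mean(pixels)), float(np.var(pixels)) + 1e-12

def log_likelihood(x, mean, var):
    # log of the univariate normal density
    return -0.5 * np.log(2.0 * np.pi * var) - (x - mean) ** 2 / (2.0 * var)

def classify(image, params_ba, params_non_ba):
    # assign each pixel to the class with the greater likelihood (1 = BA, 0 = Non-BA)
    ll_ba = log_likelihood(image, *params_ba)
    ll_non = log_likelihood(image, *params_non_ba)
    return np.where(ll_ba > ll_non, 1, 0)
```

Here the single "band" would be a synthetic one, such as an NBR or NBR+ map, and the training pixels would come from the visually delineated training sites.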

3.4. Burned Area Mapping

Burned areas are identified by applying the MLC to a synthetic band (i.e., the index map). In particular, in this work both a single-date and a bi-temporal approach are used for each index: a first classification is carried out on the post-burn images (single-date approach); a second classification is carried out by comparing pre- and post-burn images (bi-temporal approach), as proposed in several papers [61,62,63].
Since the single-date approach consists of classifying only the post-burn images, it is faster to apply and does not present the errors related to the bi-temporal approach, such as those caused by differences in phenology [64], misregistration of image pixels [65], differences in sensor calibration [66], sun-sensor geometry, and atmospheric effects [67]. Furthermore, it is not always easy to find images useful for BA detection at two different moments in time that are not partially covered by clouds [68]: clouds and their shadows can easily be mistaken for BA when classification is carried out [69]. The single-date approach avoids all these problems, but the lack of a pre-burn reference image can generate difficulties in mapping areas whose spectral signature has remained constant, such as water and aging vegetation, which can be mistaken for burned areas [70].
The bi-temporal analysis is carried out by calculating the indices listed above both on the pre-burn and post-burn images and then making the difference by generating the delta index:
ΔIndex = PostBurn Index − PreBurn Index
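In code, the bi-temporal step reduces to a per-pixel difference of the two index maps (a trivial sketch, assuming both maps share the same grid and resolution):

```python
import numpy as np

def delta_index(index_post, index_pre):
    # bi-temporal change image: the difference enhances pixels whose index
    # moved towards the burned signature between the two acquisitions
    return np.asarray(index_post) - np.asarray(index_pre)
```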

3.5. Thematic Accuracy Assessment

In order to evaluate the thematic accuracy of the results obtained, test sites are used. These are different from the training sites but, like the latter, they are representative of the two classes (BA/Non-BA) if a sufficient and significant sample is taken [71]. We consider two groups of test sites: group A, including water bodies and clouds, and group B, excluding them. In this way, we test each method both for its ability not to attribute water and cloud pixels to the burnt area class (commission errors) and for its ability to distinguish the two clusters (BA/Non-BA).
The test sites are obtained from the visual analysis of the multispectral images; they allow us to know how many pixels are classified correctly and how many are wrongly classified. Particularly, we apply the widely used approach based on the confusion matrix, a simple and powerful tool that denotes the accuracy of the remotely sensed image classification. A confusion matrix is a table that shows the correspondence between the classification result and the ground truth data, as derived by visual inspection of the same remotely sensed images or other information layers (such as maps) or acquired in situ and recorded with a GNSS (Global Navigation Satellite System) receiver [72]. Instead of considering the whole images, the confusion matrices are usually constructed from the values taken from the test sites, which allow us to evaluate the thematic accuracy of the adopted classification method [73]. In particular, in this study, two confusion matrices are created for each method used: one representative of the post-fire image only and one representative of the “delta” image concerning the bi-temporal approach. The numerical analysis of the thematic accuracy of each classified image is summarized by the values of the indices named Producer Accuracy (PA), User Accuracy (UA), and Overall Accuracy (OA) [74].
PA indicates the probability that a certain land cover of an area on the ground is classified as such. It is computed by dividing the number of pixels classified accurately in each category by the total number of reference pixels for that category.
UA indicates the probability that a certain area classified into a given category actually represents that category on the ground. It is computed by dividing the number of correctly classified pixels in each category by the total number of pixels that were classified in that category.
OA represents the probability that all categories are correctly classified. It is computed by dividing the total number of correctly classified pixels by the total number of reference pixels.
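The three accuracy indices can be computed directly from the 2×2 confusion matrix; a sketch for the binary BA/Non-BA case (naming is ours) is:

```python
import numpy as np

def accuracy_indices(reference, predicted):
    # entries of the 2x2 confusion matrix (1 = BA, 0 = Non-BA)
    tp = np.sum((reference == 1) & (predicted == 1))  # BA correctly detected
    fn = np.sum((reference == 1) & (predicted == 0))  # BA missed (omission)
    fp = np.sum((reference == 0) & (predicted == 1))  # false alarms (commission)
    tn = np.sum((reference == 0) & (predicted == 0))  # Non-BA correctly detected
    return {
        "PA_BA": tp / (tp + fn),       # producer accuracy, burned class
        "PA_NonBA": tn / (tn + fp),    # producer accuracy, non-burned class
        "UA_BA": tp / (tp + fp),       # user accuracy, burned class
        "UA_NonBA": tn / (tn + fn),    # user accuracy, non-burned class
        "OA": (tp + tn) / (tp + fn + fp + tn),  # overall accuracy
    }
```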

4. Results and Discussion

A first assessment of each resulting BA map can be made by means of visual inspection: image comparison between NBR+ and NBR is shown in Figure 5 and ΔNBR+ and ΔNBR in Figure 6.
Comparing the images in Figure 5, it is evident that NBR presents noise around the coasts, or more generally, in the shallow waters, which NBR+ does not present at all; areas affected by this noise are incorrectly classified as BA, since they have very high radiance values.
A second disturbing element in this case is the clouds and the shadows they generate: these not only hide the burned areas from the sensor, but in the case of NBR, they can be mistaken for BA. The wrong classification of clouds and shallow waters has been resolved in NBR+; in fact, these areas have very low radiance values compared to the high BA radiance values.
Comparing the images in Figure 6, in the case of NBR, the noise relating to shallow waters has been slightly reduced (although it continues to generate disturbance); however, the noise relating to clouds has increased. In fact, when the bi-temporal analysis is carried out, if the two images (pre- and post-fire) both have clouds, these will be represented on the Δ image in a different way: the clouds inherent to the pre-fire image are represented with low values of radiance, whereas the post-fire image clouds are represented with high radiance values.
Therefore, even in this case, the clouds and shallow waters (to a lesser extent) can be mistakenly classified as BAs if NBR is applied. In the image related to NBR+, the BAs immediately catch the eye due to the high radiance compared to the rest of the image, even if there is a slight noise on the edges of the clouds, which, however, is rarely identified as BA by the MLC.
The different results supplied by each index in terms of BA/NON BA detections are compared and shown in Figure 7 for single-date analysis and Figure 8 for bi-temporal analysis.
Furthermore, Figure 9 shows details to highlight the impact of clouds on classification, and Figure 10 shows detail to highlight the impact of water bodies on the classification of the images.
The quantitative analysis of the performance of each approach is based on test sites and the confusion matrix approach. The results obtained from the confusion matrices are shown in Table 2 and Table 3 for the post-fire analysis and in Table 4 and Table 5 for the bi-temporal analysis. Particularly, Table 2 and Table 4 concern test sites including water bodies and clouds (Group A), and Table 3 and Table 5 concern test sites excluding water bodies and clouds (Group B). The following indices are considered:
  • Producer Accuracy of the Burned Areas (PA-BA);
  • Producer Accuracy of the Non-Burned Areas (PA-NON BA);
  • User Accuracy of the Burned Areas (UA-BA);
  • User Accuracy of the Non-Burned Areas (UA-NON BA);
  • Overall Accuracy (OA).
The results obtained are very encouraging for the proposed index. In fact, the closer the accuracy index values are to 1, the more satisfactory the results, as this means that the proposed method is effective and able to correctly classify the areas considered, distinguishing between burnt and unburnt.
NBR+ is always the best performing index, reaching in any case the highest value of OA. The results obtained for the already existing indices in the literature are in line with the results presented by other authors [75,76,77].
The statistical values obtained in the post-fire images tend to be lower than those obtained in the bi-temporal analysis, confirming the higher efficacy of the latter method as already stated in the literature [78].
Analyzing the values shown in Table 2, the MIRBI is always the least well-performing index; NBR, NBRSWIR and NDSWIR present results that are very close to each other; BAIS2 is the only one among the “traditional” indices to present values that exceed 0.9 (PA-NON BA and UA-BA) and a much higher OA value than the others (0.798). The NBR+ has very high values compared to the other indices; it is the only one to exceed the value of 0.9 in terms of OA and the only one to have all the values higher than 0.8.
Table 3 confirms the high performance of the proposed index in the case of single-date thematic accuracy assessment on the test sites excluding water bodies and clouds. In fact, NBR+ supplies the highest value of OA (0.925) and very good results in terms of PA and UA.
The values reported in Table 4 are generally high for all the indices; the OA is always higher than 0.85, confirming the validity of each method tested in this work. NDSWIR and MIRBI present OA values very close to each other (as well as being the lowest), and the PA and UA values are similar but opposite. In fact, NDSWIR has very high values of PA-BA and UA-NON BA (0.968 and 0.960) and lower values of PA-NON BA and UA-BA (0.766 and 0.805); on the contrary, the MIRBI has very high values of PA-NON BA and UA-BA (0.992 and 0.990, respectively) and lower values of PA-BA and UA-NON BA (0.749 and 0.798). BAIS2 has a slightly higher OA value (0.881) than NDSWIR and MIRBI. Very good results are obtained from the application of NBR and NBRSWIR, with OA values higher than 0.9 for both. The best-performing index is NBR+; in fact, it always presents excellent results, reaching an accuracy of 100% in the PA-BA and UA-NON BA, and the highest OA value overall (0.974).
Table 5 testifies that NBR+ is also the best-performing index in the case of bi-temporal thematic accuracy assessment on test sites excluding water bodies and clouds.

5. Conclusions

The results of this research confirm the capability to precisely map burned areas using remote sensing science: multispectral images provided by satellite sensors such as Sentinel-2 are very useful for the scope but require an appropriate selection and combination of the bands. In fact, several indices are available in the literature and allow us to achieve good results in many cases. However, the specificity of the scene can reduce the level of accuracy of the final map: the presence of water bodies or clouds induces commission errors for some indices and produces false alarms with an unjustified increase in the number of pixels classified as burnt. Even if the disturbance of water bodies and clouds can be removed from the observed scene by resorting to specific remedies, such as the use of masks, an index able to exclude these commission errors by itself is certainly welcome.
The new index proposed in this work for Sentinel-2 images allows us to delineate fire scars accurately, similar to other indices, but, better than these, it guarantees high performance even in the presence of water bodies and clouds. Compared with NBR, NBRSWIR, NDSWIR, MIRBI, and BAIS2, NBR+ is the best-performing index, reaching very high values of PA, UA, and OA. The excellent performances of NBR+ are evident both in the uni-temporal (post-fire) and in the bi-temporal (pre-fire and post-fire in comparison) approach. The other indices analyzed show a more limited accuracy of the results in the case of application on a single image (post-fire); on the contrary, NBR+ provides excellent performance even in this situation.
The genesis of the index is simple and effective, aimed at enhancing the performance provided by the NBR. Using only bands B12 and B8A, water bodies tend to produce high pixel brightness values, so we decided to also introduce bands B2 and B3: by subtracting the energy reflected in these bands, the pixels that reflect strongly in them (water) tend to go dark, so that finally only the burnt areas remain light and are easily identifiable. In addition, NBR+ supplies negative values for clouds, which therefore cannot be confused with BA pixels.
Concerning the future developments of this work, further studies will be focused on the possibility of applying the NBR+ efficiently to other types of satellite images. Of course, the presence of meaningful bands for detecting burned areas must be found. In other words, we need images acquired in specific bands such as Blue, Green, NIR, and SWIR that allow us to apply NBR+ effectively. Tests will certainly be conducted on Landsat 9 images, also considering in this case the availability of bands similar to those of Sentinel-2 used in this study.

Author Contributions

C.P. conceived the article and designed NBR+; D.C. designed the experiments; F.G. and M.P. conducted the bibliographic research and organized data collection; E.A. carried out experiments; C.P. and D.C. supervised the applications; E.A., F.G. and M.P. carried out the accuracy tests; all authors took part in results analysis and in writing the paper. All authors have read and agreed to the published version of the manuscript.


Funding

The APC was funded by the Polytechnic of Bari and the Parthenope University of Naples.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.


Figure 1. Study area in WGS 1984 Web Mercator Auxiliary Sphere projection (EPSG:3857).
Figure 2. Satellite image in UTM projection of the scene in pre- and post-fire scenario: color representation (composite bands 4-3-2) pre-fire (a); false color representation (composite bands 12-8A-4) pre-fire (b); color representation (composite bands 4-3-2) post-fire (c); false color representation (composite bands 12-8A-4) post-fire (d).
Figure 3. The workflow of the whole methodological approach for testing the performance of the proposed index in comparison with other five indices.
Figure 4. Comparison of the spectral response of healthy vegetation and burned areas (Source: USDA Forest Service).
Figure 5. Comparison between NBR+ (left) and NBR (right) post-burn.
Figure 6. Comparison between ΔNBR+ (left) and ΔNBR (right).
Figure 7. Maps of burned areas produced by the adopted indices using single-date approach: NBR+ (a), NBR (b), NDSWIR (c), NBRSWIR (d), BAIS2 (e), MIRBI (f).
Figure 8. Maps of burned areas produced by the adopted indices using bi-temporal approach: NBR+ (a), NBR (b), NDSWIR (c), NBRSWIR (d), BAIS2 (e), MIRBI (f).
Figure 9. The impact of clouds on BA/NON BA detections supplied by each index: RGB true color composition (a), NBR+ (b), NDSWIR (c), BAIS2 (d), false color composition (e), NBR (f), NBRSWIR (g), MIRBI (h).
Figure 10. The impact of water bodies on BA/NON BA detections supplied by each index: RGB true color composition (a), NBR+ (b), NDSWIR (c), BAIS2 (d), false color composition (e), NBR (f), NBRSWIR (g), MIRBI (h).
Table 1. Features of Sentinel-2A images.
Bands                 Central Wavelength (µm)   Resolution (m)
B1—Coastal Aerosol    0.443                     60
B5—Red Edge1          0.705                     20
B6—Red Edge2          0.740                     20
B7—Red Edge3          0.783                     20
B8A—Narrow NIR        0.865                     20
B9—Water Vapor        0.945                     60
B10—SWIR Cirrus       1.375                     60
Table 2. Results of single-date thematic accuracy assessment on test sites including also water bodies and clouds (Group A).
Table 3. Results of single-date thematic accuracy assessment on test sites excluding water bodies and clouds (Group B).
Table 4. Results of bi-temporal thematic accuracy assessment on test sites including water bodies and clouds (Group A).
Table 5. Results of bi-temporal thematic accuracy assessment on test sites excluding water bodies and clouds (Group B).
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
