Review

Sentinel-2 Data for Land Cover/Use Mapping: A Review

1 Department of Plant and Environmental Sciences, School of Natural Resources, Copperbelt University, Kitwe 10101, Zambia
2 Scion, 49 Sala Street, Private Bag 3020, Rotorua 3046, New Zealand
3 Department of Zoology and Aquatic Sciences, School of Natural Resources, Copperbelt University, Kitwe 10101, Zambia
4 Faculty of Life and Environmental Sciences, University of Tsukuba, 1-1-1, Tennodai, Tsukuba, Ibaraki 305-8572, Japan
5 Department of Environmental Management, Faculty of Social Sciences and Humanities, Rajarata University of Sri Lanka, Mihintale 50300, Sri Lanka
* Author to whom correspondence should be addressed.
Remote Sens. 2020, 12(14), 2291; https://doi.org/10.3390/rs12142291
Submission received: 19 April 2020 / Revised: 2 July 2020 / Accepted: 4 July 2020 / Published: 16 July 2020
(This article belongs to the Section Urban Remote Sensing)

Abstract

The advancement of satellite remote sensing technology has revolutionised the approaches to monitoring the Earth’s surface. The development of the Copernicus Programme by the European Space Agency (ESA) and the European Union (EU) has contributed to the effective monitoring of the Earth’s surface by producing the Sentinel-2 multispectral products. Sentinel-2 satellites are the second constellation of the ESA Sentinel missions and carry onboard multispectral scanners. The primary objective of the Sentinel-2 mission is to provide high-resolution satellite data for land cover/use monitoring, climate change and disaster monitoring, as well as complementing other satellite missions such as Landsat. Since the launch of the Sentinel-2 multispectral instruments in 2015, there have been many studies on land cover/use classification which use Sentinel-2 images. However, no review studies have been dedicated to the application of ESA Sentinel-2 to land cover/use monitoring. Therefore, this review focuses on two aspects: (1) assessing the contribution of ESA Sentinel-2 to land cover/use classification, and (2) exploring the performance of Sentinel-2 data in different applications (e.g., forest, urban area and natural hazard monitoring). The present review shows that Sentinel-2 has a positive impact on land cover/use monitoring, specifically in the monitoring of crops, forests, urban areas, and water resources. The wide adoption and application of Sentinel-2 can be attributed to its higher spatial resolution (10 m) compared with other medium spatial resolution images, its high temporal resolution of 5 days and the availability of red-edge bands with multiple applications. The ability to integrate Sentinel-2 data with other remotely sensed data, as part of data analysis, improves the overall accuracy (OA) when working with Sentinel-2 images. The free access policy drives the increasing use of Sentinel-2 data, especially in developing countries where financial resources for the acquisition of remotely sensed data are limited. The literature also shows that the use of Sentinel-2 data produces high accuracies (>80%) with machine-learning classifiers such as support vector machines (SVM) and random forests (RF). However, other classifiers such as maximum likelihood classification are also common. Although Sentinel-2 offers many opportunities for land cover/use classification, there are challenges, which include misalignment with Landsat-8 OLI data, the lack of thermal bands, and the differences in spatial resolution among the Sentinel-2 bands. Sentinel-2 data show promise and have the potential to contribute significantly towards land cover/use monitoring.

Graphical Abstract

1. Introduction

The global land cover is rapidly changing due to anthropogenic activities (e.g., agricultural expansion and urbanisation) and natural processes (e.g., flooding) [1,2,3]. These changes impact human life, and hence effective monitoring mechanisms are needed for the sustainable management and utilisation of natural resources (e.g., forests, water). The development of satellite remote sensing technology has revolutionised the approaches to monitoring natural and human resources on the Earth’s surface, and this technology makes it possible to monitor large areas [4]. Since the launch of Landsat 1, the first satellite dedicated to monitoring the Earth’s surface, on 23 July 1972 [5], the scientific community has seen several satellites operating under both commercial (e.g., IKONOS, SPOT) and non-commercial (e.g., Landsat, Sentinel) business models. These satellites produce different remotely sensed data for different applications, such as forest, urban, natural hazard and agricultural monitoring. The remotely sensed data available under free access policies (e.g., Landsat) have been playing an important role in monitoring natural resources and various ecosystem processes, such as forest dynamics, especially in developing countries where financial resources for the acquisition of remotely sensed data are limited [6,7].
In 2014, the Copernicus Programme, which is under the European Space Agency (ESA), launched the first Sentinel satellite, Sentinel-1A. So far, the Copernicus Programme has launched several satellite missions, including Sentinel-1, -2, -3 and -5. One significant contribution of the Copernicus Programme was the launch of the multispectral instruments onboard the Sentinel-2 satellites. The Sentinel-2 constellation is made up of twin satellites: Sentinel-2A and Sentinel-2B (https://sentinel.esa.int/web/sentinel/missions/sentinel-2). After the launch of Sentinel-2A on 23 June 2015, the first images were received a few days later [8,9]. Sentinel-2B was then launched on 7 March 2017. Sentinel-2 satellites carry onboard a multispectral instrument (MSI) capable of recording 13 spectral bands over a wide swath [9]. The primary objective of the Sentinel-2 mission is to provide high-resolution satellite data for land cover/use monitoring, climate change and disaster monitoring [9,10]. The other important objective of Sentinel-2 is to complement the other global satellite programmes, such as the Landsat and SPOT (Satellite Pour l’Observation de la Terre) programmes, by ensuring continuity in monitoring the dynamics of the Earth’s surface [8,11,12,13].
The scientific community, government agencies and the private sector have used Sentinel-2 data for different applications, such as agricultural, urban development and forest monitoring [12,14,15]. For example, Bruzzone, et al. [16] cited land cover/use monitoring as one of the essential applications for Sentinel-2 data. Other examples of important applications of Sentinel-2 include the development of a high spatial resolution (20 m) land cover map of Africa for 2016 (i.e., CCI Land Cover—S2 prototype Land Cover 20m map of Africa) [17], the Copernicus Land Monitoring Service high spatial resolution layers (https://land.copernicus.eu/pan-european/high-resolution-layers), and the new pan-European high spatial resolution land cover/use maps (http://s2glc.cbk.waw.pl/) [18]. Countrywide high spatial resolution maps based on Sentinel-2 data have also been produced for Germany, Belgium, Bulgaria and Greece [19,20].
Since the beginning of the 18th century [21], the Earth’s surface has experienced rapid changes, which are driven by agricultural expansion [22], climate change [23] and rapid urbanisation [24]. Monitoring these changes requires instruments such as Sentinel-2 remotely sensed data to assess the status of the Earth’s surface continuously and to inform decision-makers about future changes. Moreover, long-term (>5 years) land use/cover change monitoring with Sentinel-2 has the potential of strengthening existing policies by providing accurate and timely information [8,25]. For example, Sentinel-2 data is playing an important role in monitoring progress towards the Sustainable Development Goals (SDGs) [26]. Like Landsat images [27,28], Sentinel data from all missions can be accessed free of charge on the Copernicus Open Access Hub (https://scihub.copernicus.eu/). Hence, this data has the potential to contribute to land cover/use monitoring in many parts of the world, especially in countries where financial resources for acquiring remotely sensed data are limited [29].
There have been many studies based on Sentinel-2 data since the launch of these satellites in 2015 [30,31,32]. However, to the best of our knowledge, there has not been a review study dedicated to the importance of Sentinel-2 for land cover/use monitoring, highlighting its uses and effectiveness. Therefore, the objectives of this review are to (1) assess the contribution of Sentinel-2 data to land cover/use monitoring, and (2) explore the utilisation and opportunities of Sentinel-2 images. At the end of this review, the best practices for using Sentinel-2 data are recommended. This review will be useful to new users of Sentinel-2 images, especially because Sentinel-2 data is relatively new (i.e., five years) compared to other free access images such as Landsat (i.e., over four decades of operation).

2. Methods for Searching Literature

A systematic database search was used to explore the literature in three databases: Google Scholar, Scopus and ScienceDirect, using the approaches suggested by Blaschke [33] and Ma, et al. [34]. The literature search focused on articles on Sentinel-2 and land cover mapping. The search terms were combined using Boolean operators (OR, AND) to search for specific literature relating to Sentinel-2. The initial search was done using the terms “Sentinel-2” AND “land cover” or “Sentinel-2” AND “landcover.” A further search was done on the specific applications of Sentinel-2 images using search terms such as “Sentinel-2” AND “Forest,” “Sentinel-2” AND “Agriculture”, and “Sentinel-2” AND “urban” (Table 1). Only literature published between 2015 and 2020 was considered because Sentinel-2 was launched in 2015. Other records were identified through other means, such as recommendations by experts, and these records mainly included reports on current applications of Sentinel-2.
The literature search was refined by removing duplicate records from the three databases. Articles that did not contain the search terms in the title, keywords or abstract were also removed from the list. The literature considered in this study includes published articles, conference proceedings and book chapters. Because many important applications have not been published in peer-reviewed journals, some reports were also considered. Some articles that did not meet these criteria were also consulted, especially on the background and characteristics of the Copernicus Programme and Sentinel-2 images.

3. Results

The initial literature search returned 4990 articles. The first refinement, which retained articles with the search terms in the title, abstract or keywords, returned 1154 articles. To be concise and reduce the number of articles under focus, a total of 204 articles remained after considering only articles that directly address the topic of Sentinel-2 land cover/use mapping. Since the same articles appeared in more than one database, 36 duplicate records were removed, leaving 168 articles for the final analysis. With the addition of 9 records from other sources, the total number of articles considered was 177 (Figure 1).

3.1. Characteristics of the Reviewed Studies

About 60% of the literature accessed was from journals such as Remote Sensing of Environment, Applied Earth Observation and Geoinformation, Remote Sensing, and Photogrammetry and Remote Sensing. The other studies (40%) were from journals that do not directly deal with remote sensing (e.g., Applied Geography, Forest Ecology and Management) and from other sources, including conference proceedings and technical reports. Conference papers were an important component of this study, considering that Sentinel-2 is a newer satellite programme compared to other programmes such as Landsat, and therefore much of the debate is still presented at scientific conferences.

3.2. Trends of Published Articles on Sentinel-2

Figure 2 compares the publication trends for Sentinel-2 and Landsat-8, a satellite that was launched two years before Sentinel-2. The general search without refinement showed an upward trend for both Sentinel-2 and Landsat-8. Landsat consistently had higher numbers of published articles; however, the trends were similar for the two sensors. Blaschke [33] indicated that it is common for the number of published materials to increase for new sensors or contemporary processing methods due to increasing usage.
The studies considered in this review were conducted in many countries across the world, including countries in Asia, the Americas, Europe and Africa (Figure 3). The distribution of the articles shows that most of the articles on Sentinel-2 were produced in Europe, specifically in countries such as Germany, Romania, France, Bulgaria and Turkey. These studies covered different topics on Sentinel-2 land cover classification, ranging from pre-processing to practical applications (e.g., forest and urban area monitoring).

4. Discussion

4.1. Background of ESA Copernicus Sentinel Programme

The ESA Sentinel missions are coordinated by the Copernicus Programme, the European Union’s Earth observation programme [13]. The operations of the programme are funded by the European Commission in partnership with ESA, EU member states and EU agencies. In 1998, the ESA and EU introduced the Global Monitoring for Environment and Security (GMES) programme, which was renamed the Copernicus Programme in 2014 [13]. The ESA and the EU established a funding programme for Copernicus to provide financial support for the period between 2014 and 2020 to manage the satellite networks and to launch new satellites [10,25]. The Copernicus Programme has three main objectives: (1) to produce and disseminate information to support EU global policies for environment and security, (2) to provide a platform for stakeholders, providers and users for dialogue and collaboration, and (3) to provide a legal, financial, organisational and institutional framework for the smooth functioning of the ESA satellite missions.
The Copernicus Programme has strategic plans for developing seven satellite missions; four of these constellations (Sentinel-1, -2, -3 and -5) have already been launched [8,9], while Sentinel-4 is still under development, with its two satellites due to be launched in 2023 and 2030. The first Sentinel satellite, Sentinel-1A, was launched on 3 April 2014 and carries a C-band synthetic aperture radar (SAR) instrument. The remotely sensed data collected by the Sentinel-1 satellites have a wide range of applications, which include sea and land monitoring, emergency response to environmental disasters, and economic applications (e.g., urban expansion). Sentinel-3 is dedicated to oceanography, and the first satellite of the Sentinel-3 constellation (Sentinel-3A) was launched on 16 February 2016. In 2017, the Copernicus Programme launched the Sentinel-5 Precursor to monitor air pollution.
Each Sentinel mission is based on a constellation of two satellites, which reduces the revisit time and hence provides data in the shortest possible time [10,25]. The Sentinel programme has been implemented in three phases: pre-operation (2008–2010), initial operation (2011–2013), and full operation (2014 and beyond) [25]. Under the strategic plans for the Copernicus Programme, other satellites will be launched in the near future, starting with Sentinel-4 and going beyond Sentinel-6 [25].

4.2. Overview of Sentinel-2 Mission

Sentinel-2 Earth observation satellites carry multispectral imaging systems and acquire optical images [8,35]. Sentinel-2 satellites are operated by ESA, and the satellites were manufactured by a consortium led by Airbus Defence and Space (Airbus DS). The mission supports several services and applications such as agricultural monitoring, disaster management and land cover/use classification [15,36,37,38].

4.2.1. Properties of Sentinel-2 Data

Sentinel-2 data provides global coverage of the Earth’s land surfaces from 56° S to 84° N, coastal waters, and the whole Mediterranean Sea [8,28]. Compared to the 185 km swath width of the Landsat missions [29], the Sentinel-2 mission has a wide swath of 290 km [30]. The orbit of Sentinel-2 is sun-synchronous at an altitude of 786 km, with 14.3 revolutions per day and a 10:30 a.m. descending node at the equator [17]. This local overpass time was selected to minimise cloud cover and ensure suitable sun illumination. The overpass time for Sentinel-2 matches that of Landsat and SPOT, allowing Sentinel-2 data to be combined with historical images to build the long-term time series [8] that are necessary for natural resource monitoring.
Sentinel-2 offers improved data compared to other low to medium spatial resolution satellite images (e.g., Landsat), especially in temporal and spatial resolution [39]. The 13 bands of Sentinel-2 images have spatial resolutions ranging from 10 to 60 m (Table 2) [8,32]. The visible and near-infrared (NIR) bands have a spatial resolution of 10 m, the red-edge and shortwave infrared bands have a 20 m spatial resolution, and the remaining atmospheric bands have a 60 m spatial resolution (Table 2). The 10 m spatial resolution gives Sentinel-2 data the potential for detailed exploration of the Earth’s surface (e.g., urban sprawl and agriculture). The other valuable characteristic of Sentinel-2 data is its high temporal resolution of 5 days [40]. This temporal resolution improved from 10 to 5 days after the launch of the second twin satellite, Sentinel-2B, as the two satellites operate 180° apart in the same orbit [8,9].
Due to the high temporal resolution (i.e., 5 days), land cover/use changes that take place within a short period (e.g., fire incidences, floods, volcanic eruptions) can be monitored effectively. For example, Phiri, et al. [41] used Sentinel-2 images to monitor floods in the Beira region of Mozambique, while Verhegghen, et al. [42] monitored fire burnt areas using Sentinel-2 images in the Congo Basin. The application of Sentinel-2 data to monitor these incidences, which happen over a short period, makes the images more useful in countries where floods (e.g., Malawi, Mozambique and Zimbabwe), cyclones and fire incidences are common [41,42,43]. Furthermore, other programmes, such as the UN-Spider initiative have helped estimate the extent of flooding on a large-scale using Sentinel-2 products [44].
The Sentinel-2 sensor has a low radiometric calibration uncertainty, which means that the image radiance produces reliable results. Gorroño, et al. [45] and Gorroño, et al. [46] reported radiometric uncertainties ranging from 0.03 to 0.4% for Sentinel-2 images. These values are comparable to those of other sensors such as Landsat-8 [45], and thus Sentinel-2 images have the potential to produce highly accurate information to support different applications.
The other characteristics of Sentinel-2 data that make the monitoring of the Earth’s surface more effective include the wide swath and the free access data policy. The wide swath of 290 km makes the processing of large areas easier and more accurate, with less need for data normalisation and merging [29]. Sentinel-2 data is free, making it easy for resource-constrained researchers to use the data and complement it with other free access data such as Landsat [6,30,32,47]. With many developing countries (e.g., African countries) facing challenges in securing financial resources for commercial remotely sensed images, Sentinel-2 offers a good alternative source of high spatial resolution images [6]. Sentinel-2 images have already contributed to the 20 m land cover maps for Africa [17] and other regional land cover maps, which is notable given that most regional land cover/use maps have a spatial resolution of 30 m [1,48].

4.2.2. Sentinel-2 Data Products

Sentinel-2 data is available in different processed forms [9,25,50], because Sentinel-2 MSI products undergo different stages of processing before reaching a level that can be accessed by users. The main stages include Level-0, Level-1A, Level-1B, Level-1C and Level-2A (Figure 4). Level-0 and Level-1A products are not released to users and are in the form of compressed raw image data in the instrument source packet (ISP) format. The Level-1B product is made up of granules of approximately 25 km by 23 km, each about 27 MB in size. The Level-1B product provides radiometrically corrected imagery with top-of-atmosphere (TOA) radiance values, and includes the refined geometry used to produce the user-accessible Level-1C products. Level-1C products consist of 100 × 100 km tiles in an orthorectified format in the UTM/WGS 84 projection. Using digital elevation models, Level-1C is produced in cartographic geometry (i.e., a visualisable model) [9]. Level-2A products can also be generated from Level-1C products using the Sentinel-2 Toolbox [9]. Of all these data products, Level-1C (top-of-atmosphere reflectance) and Level-2A (bottom-of-atmosphere reflectance) are the most commonly used products in land cover/use mapping.
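To illustrate the Level-2A product structure described above, the following minimal Python sketch reads the four 10 m surface reflectance bands from a Level-2A .SAFE directory with rasterio; the folder layout, file-name pattern and the nominal 1/10,000 reflectance scaling are assumptions that should be checked against the product baseline in use, and the example path is hypothetical.

```python
# A minimal sketch of reading the 10 m bands of a Level-2A product with rasterio;
# the .SAFE layout and file names are illustrative assumptions.
import glob
import numpy as np
import rasterio

def read_l2a_10m_bands(safe_dir):
    """Read B02, B03, B04 and B08 (10 m) from a Level-2A .SAFE directory."""
    stack = []
    for band in ("B02", "B03", "B04", "B08"):
        # Level-2A stores the 10 m bands as JPEG2000 files under GRANULE/*/IMG_DATA/R10m/
        pattern = f"{safe_dir}/GRANULE/*/IMG_DATA/R10m/*_{band}_10m.jp2"
        path = glob.glob(pattern)[0]
        with rasterio.open(path) as src:
            stack.append(src.read(1).astype(np.float32))
    # Scale digital numbers to surface reflectance (nominal 1/10,000 quantification)
    return np.stack(stack, axis=0) / 10000.0

# Example (hypothetical product name):
# reflectance = read_l2a_10m_bands("S2A_MSIL2A_20200101T075321_N0213_R135_T35LQD.SAFE")
```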

4.3. Pre-processing of Sentinel-2 Images

4.3.1. Geometric Correction

Geometric correction adjusts the position of the images in line with their ground position [9,13,28,51]. Sentinel-2 images use a physical model for geometric correction that employs ground control points (GCPs), which are known geographical locations used for image referencing [25]. This model combines position, altitude and transformation information to carry out the geometric correction. An automated correlation process between reference images and GCPs is employed to geometrically correct Sentinel-2 data into a cartographic model. The reference data used for geometric correction belong to a worldwide geo-referenced dataset based on Sentinel-2 mono-spectral images [25]. The Shuttle Radar Topography Mission (SRTM) Digital Elevation Model (DEM) is also used to improve the geometric accuracy [9].
Although Sentinel-2 has a low geometric error, Storey, et al. [52] reported that Sentinel-2 has a geolocation error of 12.5 m, which is higher than the 12 m geolocation error of Landsat-8 [53]. Storey, et al. [52] also reported a sensor-to-sensor misalignment of 38 m between Sentinel-2 and Landsat-8. To reduce this error, the National Aeronautics and Space Administration (NASA) has developed a robust harmonisation programme (Harmonized Landsat and Sentinel-2) for the two datasets [54,55]. So far, no known studies have focused on assessing the consistency of Sentinel-2 data with other satellite data, including the earlier Landsat images (i.e., Landsat 1–7).
The atmospheric/topographic correction (ATCOR) software available through the Sentinel-2 Toolbox handles the other part of the geometric and topographic correction. Topographic correction focuses on reducing effects due to shadows and surface irregularities [56]. The ESA also holds validation meetings (Validation Team Meetings) to address different application challenges and to improve the accuracy of Sentinel-2 products [57].

4.3.2. Atmospheric Correction

Pre-processing improves the quality of the images by reducing the errors associated with data acquisition. Like other spaceborne optical sensors, Sentinel-2 is affected by atmospheric, topographic, shadow and cloud cover effects [58,59]. These effects have the potential to reduce classification accuracy during land cover/use mapping [60]. It is important to note that the classification accuracy reported in this manuscript is the overall accuracy (OA). Different atmospheric correction methods have been applied to Sentinel-2 data. For example, Pflug, et al. [58] tested the performance of ATCOR on Sentinel-2 images and reported that the results were similar to those for Landsat-8 and RapidEye images.
Other studies used different methods for the atmospheric correction of Sentinel-2, including Fast Line-of-sight Atmospheric Analysis of Hypercubes (FLAASH), 6S and Dark Object Subtraction (DOS), which have the potential to reduce atmospheric effects on Sentinel-2 [61]. A unique aspect of Sentinel-2 pre-processing is the availability of the Sentinel Application Platform (SNAP) and the Sen2Cor processor provided by ESA [62,63]. This platform offers different pre-processing options, including atmospheric and topographic corrections [9,59].
Due to the importance of image correction for atmospheric effects, many studies have reported the increasing development of new algorithms for Sentinel-2 atmospheric correction. Vanhellemont [64] tested the use of dark spectrum fitting atmospheric correction on the aquatic environment, and high overall classification accuracy was obtained. In a separate study, Hagolle, et al. [65] developed multi-temporal and multispectral methods for atmospheric correction to estimate aerosol optical thickness over land. Furthermore, image correction for atmospheric effects (iCOR) has also been developed, and it is effective on Sentinel-2 [66].
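As a simple illustration of the Dark Object Subtraction (DOS) approach mentioned above, the following sketch applies a per-band dark-object offset to a top-of-atmosphere reflectance stack; it is a minimal approximation for illustration only, not a replacement for Sen2Cor, FLAASH or 6S, and the input array is a synthetic placeholder.

```python
# A minimal, hedged sketch of Dark Object Subtraction (DOS): assumes `toa` is a
# (bands, rows, cols) array of top-of-atmosphere reflectance and that the darkest
# valid pixels approximate the atmospheric (path radiance) contribution.
import numpy as np

def dark_object_subtraction(toa, percentile=0.01):
    """Subtract a per-band dark-object value estimated from a low percentile."""
    corrected = np.empty_like(toa, dtype=np.float32)
    for b in range(toa.shape[0]):
        band = toa[b].astype(np.float32)
        dark_value = np.nanpercentile(band, percentile)  # near-minimum as the dark object
        corrected[b] = np.clip(band - dark_value, 0.0, None)
    return corrected

# Example with synthetic data:
# toa = np.random.rand(13, 100, 100).astype(np.float32)
# boa_approx = dark_object_subtraction(toa)
```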

4.3.3. Cloud Cover Masking

Similar to other spaceborne optical sensors, cloud cover is common in Sentinel-2 imagery [67]. Many studies have developed and tested different methods for cloud masking of Sentinel-2 images, including Fmask and fusion with thermal bands from other sensors [67,68]. Fmask is the most common method used for cloud masking and has been used in many studies to improve the quality of Sentinel-2 images. For example, Frantz, et al. [68] developed an improved version of the Fmask algorithm using a Cloud Displacement Index. Although different methods for cloud masking have been tested on Sentinel-2 data, studies focusing on how these methods improve the land cover/use classification accuracy of Sentinel-2 are still lacking. Other institutions, such as the Vlaamse Instelling voor Technologisch Onderzoek (VITO) (https://remotesensing.vito.be/hubspot-topics/Sentinel-2) in Belgium, have focused on improving the application of Sentinel-2 by developing cloud masking tools to improve the accuracy of Sentinel-2 data for different applications such as forest, agricultural and disaster monitoring [69,70].
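As an alternative illustration to Fmask, the following minimal sketch masks cloud-affected pixels using the scene classification layer (SCL) delivered with Level-2A products; the SCL class codes used here (3 = cloud shadow, 8/9 = medium/high probability cloud, 10 = thin cirrus) follow the Sen2Cor convention and should be verified against the current product documentation.

```python
# A minimal sketch of cloud masking with the Level-2A scene classification layer (SCL);
# the class codes below are assumptions to verify against the Sen2Cor documentation.
import numpy as np

CLOUD_CLASSES = {3, 8, 9, 10}  # cloud shadow, cloud medium/high probability, thin cirrus

def mask_clouds(reflectance, scl):
    """Set cloud-affected pixels to NaN.

    reflectance: (bands, rows, cols) array; scl: (rows, cols) SCL resampled to the
    same grid as the reflectance bands.
    """
    cloud_mask = np.isin(scl, list(CLOUD_CLASSES))
    masked = reflectance.astype(np.float32)
    masked[:, cloud_mask] = np.nan
    return masked
```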

4.4. Land Cover/Use Classification with Sentinel-2

The Sentinel-2 satellites were launched at a time when many advanced classification methods had already been developed. These methods are based on both pixels [30,71] and objects [39,72]. High computing capabilities have also contributed to the advancement of land cover/use classification using Sentinel-2. Land cover/use classification of Sentinel-2 has been dominated by machine-learning approaches (Table 2), including random forests (RF), k-nearest neighbour (KNN), support vector machines (SVM) and Bayes classifiers. The RF classifier is more commonly used than the other classifiers [7,36,73].
Sophisticated machine-learning techniques such as convolutional neural networks (CNN) have also been used on Sentinel-2 images [74]. For example, Segal-Rozenhaimer, et al. [75] applied a CNN to land cover classification and achieved a high classification accuracy of 91%. In addition, cloud-based computing has contributed to improved land cover/use monitoring because large datasets can be analysed quickly [76,77]. For example, Hiestermann, et al. [77] employed cloud-based computing using Google Earth Engine to map crops in South Africa. RF and maximum likelihood classifiers (MLC) have also been used to produce the high spatial resolution (20 m) land cover map of Africa based on Sentinel-2 data—the CCI Land Cover—S2 prototype [17].
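The following minimal scikit-learn sketch illustrates the pixel-based machine-learning workflow (here with an RF classifier) that dominates the reviewed studies; the band values, class labels and parameter choices are synthetic placeholders rather than settings from any cited study.

```python
# A minimal sketch of pixel-based random forest (RF) classification with scikit-learn;
# reflectance values and labels are synthetic placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n_samples, n_bands = 1000, 10          # e.g., the 10 m and 20 m Sentinel-2 bands
X = rng.random((n_samples, n_bands))   # per-pixel reflectance (placeholder)
y = rng.integers(0, 5, n_samples)      # 5 land cover classes (placeholder)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

rf = RandomForestClassifier(n_estimators=500, random_state=0)
rf.fit(X_train, y_train)
print("Test accuracy:", rf.score(X_test, y_test))

# To classify a whole image, reshape a (bands, rows, cols) stack to (rows*cols, bands),
# call rf.predict(), and reshape the result back to (rows, cols).
```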

4.4.1. Supervised and Unsupervised

Many studies on Sentinel-2 data have shown that supervised classification approaches are applied more often than unsupervised approaches [31]. The major reason is that many classification algorithms have been developed for supervised classification, while unsupervised classification relies on the Iterative Self-Organizing Data Analysis Technique (ISODATA) and k-means clustering as its major algorithms [78,79]. Supervised classification is often applied by combining object-based image analysis (OBIA) with machine-learning classifiers, a combination that has shown the potential to produce high classification accuracy. However, recent studies show that many researchers still use pixel-based classification (Table 2) [71,76,80].

4.4.2. Pixel-Based Image Analysis

Pixel-based land cover/use classification is one of the most common classification approaches applied to Sentinel-2 [31,81,82]. The literature shows that RF is the most common classifier used in pixel-based approaches (Table 3). Due to the limitations of pixel-based approaches, such as the salt-and-pepper effect (spectral noise), sub-pixel methods have been applied to Sentinel-2 data to improve the classification accuracy [83,84]. Spectral mixture analysis (SMA) is one of the robust methods used for land cover/use classification with Sentinel-2 to reduce mixed-pixel effects. For example, Degerickx, et al. [84] applied Multiple Endmember Spectral Mixture Analysis (MESMA) to urban land cover/use mapping and achieved an accuracy of 85%.
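The following sketch illustrates the basic principle of linear spectral mixture analysis (SMA) with non-negative least squares; the endmember spectra are made-up placeholders, and a full MESMA implementation such as that in [84] would additionally search over multiple endmember sets per pixel.

```python
# A minimal sketch of linear spectral unmixing for sub-pixel fractions; the two
# endmember spectra are synthetic placeholders.
import numpy as np
from scipy.optimize import nnls

# Endmember reflectance spectra, shape (n_bands, n_endmembers): e.g., vegetation, soil
endmembers = np.array([
    [0.03, 0.12],
    [0.05, 0.18],
    [0.04, 0.22],
    [0.45, 0.30],
], dtype=np.float64)

def unmix_pixel(pixel):
    """Non-negative least-squares unmixing of one pixel spectrum."""
    fractions, _ = nnls(endmembers, pixel)
    total = fractions.sum()
    return fractions / total if total > 0 else fractions  # normalise fractions to sum to 1

mixed = 0.7 * endmembers[:, 0] + 0.3 * endmembers[:, 1]  # synthetic 70/30 mixture
print(unmix_pixel(mixed))  # approximately [0.7, 0.3]
```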

4.4.3. Object-Based Image Analysis

Due to the high spatial resolution of Sentinel-2 data, OBIA is often preferred for land cover/use classification (see Figure 5) [8,93], and a growing number of studies report that OBIA produces higher classification accuracy than pixel-based approaches [87,94]. Nevertheless, a considerable number of studies have still used the pixel-based classification approach (Table 3 and Figure 5). OBIA has been used to classify different land cover/use types, including water, agriculture, forests and urban areas (Table 4). The development of machine-learning approaches in land cover classification has contributed to the efficient performance of OBIA [95]. For Sentinel-2 data, many studies have focused on combining OBIA with machine-learning classifiers for land cover/use classification, due to the high spatial resolution [39,93]. Other important issues, such as the effects of segmentation parameters, have not been fully tested on Sentinel-2 data [28,96,97].
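The following minimal sketch illustrates the generic OBIA workflow (segmentation, per-object features, classification, and mapping predictions back to pixels); it uses SLIC superpixels from scikit-image (version ≥ 0.19 assumed) as a stand-in for the segmentation tools used in the reviewed studies, and synthetic data in place of real Sentinel-2 bands and training objects.

```python
# A minimal OBIA sketch: segment, compute object features, classify, map back to pixels.
import numpy as np
from skimage.segmentation import slic
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
image = rng.random((100, 100, 4))  # (rows, cols, bands): e.g., B02, B03, B04, B08

# 1. Segment the image into objects (superpixels)
segments = slic(image, n_segments=200, compactness=10, channel_axis=-1)

# 2. Compute one feature vector (mean reflectance per band) per object
labels = np.unique(segments)
features = np.array([image[segments == lab].mean(axis=0) for lab in labels])

# 3. Train and apply a classifier on object features (class labels are placeholders)
object_classes = rng.integers(0, 3, len(labels))
rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(features, object_classes)
predicted = rf.predict(features)

# 4. Map object predictions back to the pixel grid
classified = predicted[np.searchsorted(labels, segments)]
```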

4.4.4. Accuracy of Sentinel Land Cover/Use Mapping

High accuracies have been reported for land cover/use classification of Sentinel-2 data (Table 2 and Figure 6), and most of the classification accuracies achieved are above 80%. Other methods, including pixel-based approaches using the maximum likelihood classifier (MLC), have also produced high classification accuracies [31,38,112]. A comparison of accuracies based on different machine-learning classifiers has shown that RF and SVM outperform the other classifiers (Figure 6).
The comparison of accuracies for different classifiers needs to be interpreted with caution because the performance of these classifiers depends on several factors, such as the number of training samples [113,114], the number of land cover classes [112], the type of terrain [115] and the pre-processing techniques applied to the images [60]. Since the studies reported here were conducted under different conditions, they are not directly comparable.
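For reference, the overall accuracy (OA) values compared above are typically derived from a confusion matrix of validation samples, as in the following minimal sketch; the reference and predicted labels are synthetic placeholders.

```python
# A minimal sketch of accuracy assessment from a confusion matrix.
import numpy as np
from sklearn.metrics import confusion_matrix, accuracy_score, cohen_kappa_score

reference = np.array([0, 0, 1, 1, 2, 2, 2, 1, 0, 2])  # ground reference classes
predicted = np.array([0, 1, 1, 1, 2, 2, 1, 1, 0, 2])  # classifier output

cm = confusion_matrix(reference, predicted)
oa = accuracy_score(reference, predicted)              # overall accuracy
kappa = cohen_kappa_score(reference, predicted)        # chance-corrected agreement

# Per-class producer's accuracy (recall) and user's accuracy (precision)
producers = np.diag(cm) / cm.sum(axis=1)
users = np.diag(cm) / cm.sum(axis=0)
print(f"OA = {oa:.2f}, kappa = {kappa:.2f}")
```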

4.5. Integration of Sentinel-2 with Other Remotely Sensed Data

Sentinel-2 has been successfully fused/integrated with different remotely sensed data to improve its applicability in land cover/use mapping and analysis. Fusion among the Sentinel-2 bands themselves has also been successful because of their different spatial resolutions. For example, Wang, et al. [116] fused the 20 m bands with the 10 m bands to improve the spatial resolution. The standard and simplest fusion method for Sentinel-2 data is the pan-sharpening approach. Gašparović, et al. [87] indicated that Brovey, Intensity Hue Saturation (IHS) and Principal Component Analysis (PCA) fusion improve the accuracy of Sentinel-2 land cover/use classification. Fusion with other data, such as the Landsat thermal bands, has been implemented to reduce cloud cover effects on Sentinel-2 data [67,116].
Sentinel-2 data has also been integrated with other datasets such as synthetic aperture radar (SAR), Light Detection and Ranging (LiDAR) and higher spatial resolution images, including unmanned aerial vehicle (UAV) images [10,87,117]. The success of the Sentinel missions is also shown by the complementary use of data from different Sentinel missions. Many studies have reported the integration of Sentinel-2 with Sentinel-1 data in different applications, including urban land cover/use mapping, wetland mapping and biomass assessment [7,10]. Sentinel-1 SAR data offers several advantages when integrated with optical Sentinel-2 data, mainly an improvement in land cover/use classification accuracy. Sentinel-2 has also been successfully used with Sentinel-3 data (i.e., topographic and surface temperature data), and the results have shown promise [95]. da Silveira, et al. [117] integrated Sentinel-2 with LiDAR data to classify seven vegetation types in northeast Brazil, and the results showed a significant improvement in accuracy from 49% to 61%. This was mainly attributed to the complementary roles played by Sentinel-2 reflectance information and LiDAR metrics, especially in differentiating forest succession stages based on vertical attributes.
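A common integration strategy reported in these studies is to resample the datasets to a common grid and stack them as additional classification features; the following minimal sketch illustrates this layer-stacking step with synthetic stand-ins for co-registered Sentinel-2 reflectance and Sentinel-1 backscatter on a 10 m grid.

```python
# A minimal sketch of stacking co-registered optical and SAR layers for classification;
# the arrays are synthetic placeholders for rasters on the same 10 m grid.
import numpy as np

rows, cols = 100, 100
s2_bands = np.random.rand(4, rows, cols)   # e.g., B02, B03, B04, B08 reflectance
s1_vv = np.random.rand(rows, cols)         # Sentinel-1 VV backscatter (resampled)
s1_vh = np.random.rand(rows, cols)         # Sentinel-1 VH backscatter (resampled)

# Stack optical and SAR layers into a single (features, rows, cols) cube
feature_cube = np.concatenate([s2_bands, s1_vv[None], s1_vh[None]], axis=0)

# Reshape to (pixels, features) for use with any scikit-learn classifier
X = feature_cube.reshape(feature_cube.shape[0], -1).T
print(X.shape)  # (10000, 6)
```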

4.6. Opportunities and Challenges of Sentinel-2 Data

The past five years of working with Sentinel-2 data, since 2015, have shown great potential in land cover/use mapping and analysis [15,118]. The major factors driving the success of Sentinel-2 data are the free access to the data, the high spatial resolution (10–20 m), the short revisit time, and the presence of the red-edge bands. Turner, et al. [6] indicated that free and open access data contributes to the increasing application of remotely sensed data. This is similar to the NASA Landsat data, which has become a global tool for land cover/use mapping and analysis at the regional and global scales because of the free access policy [27,28]. With the availability of the high spatial resolution, the scientific community and practitioners are looking forward to land cover/use maps based on the high spatial resolution (10 m) Sentinel-2 images at the national, regional and global scales.
The high resolution of Sentinel-2 data offers opportunities for detailed land cover/use mapping at a fine scale. The visible and near-infrared bands have been used for land cover/use analyses that need high resolution, such as urban land cover/use classification, crop monitoring and plantation forest mapping. For example, Yang, et al. [119] investigated seasonal flooding in urban areas using Sentinel-2 data. There has also been an increase in studies on crop monitoring, ranging from yield assessment to crop identification [36,90]. The monitoring of small-scale forest plantations using Sentinel-2 was also reported to be successful in the Tanintharyi region of Myanmar [38]. Given that the 10 m spatial resolution is available for only four bands (visible and near-infrared), integrating the other bands, such as the shortwave infrared and red-edge bands, which need to be resampled to the same spatial resolution, poses a challenge. Many studies have therefore proposed methods for downscaling the lower spatial resolution bands to 10 m through image fusion [63,87].
The three red-edge bands have shown high applicability to land cover/use mapping, especially for vegetated land covers/uses [62,120,121]. The red-edge bands have been used to improve land cover/use classification [62], to map specific land covers/uses such as grasslands [15], and to map land cover/use disturbances such as oil spills [121]. Kussul, et al. [90] compared classification results between Landsat-8 OLI, which has no red-edge bands, and Sentinel-2 images, and the results showed that the red-edge bands improved the classification results by 4–5%.
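As an illustration of how the red-edge bands are typically exploited, the following minimal sketch computes a red-edge normalised difference index from the NIR band (B8) and the first red-edge band (B5, resampled to 10 m); this particular band combination is one common option, not the specific formulation used in the cited studies.

```python
# A minimal sketch of a red-edge normalised difference index.
import numpy as np

def red_edge_ndvi(b8, b5):
    """Normalised difference of NIR (B8, 10 m) and red edge 1 (B5, resampled to 10 m)."""
    b8 = b8.astype(np.float32)
    b5 = b5.astype(np.float32)
    return (b8 - b5) / (b8 + b5 + 1e-6)  # small constant avoids division by zero

# Example with synthetic reflectance:
# index = red_edge_ndvi(np.random.rand(100, 100), np.random.rand(100, 100))
```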
Besides the opportunities offered by Sentinel-2 data, there are several challenges associated with its use. The primary limitations include misalignment with other remotely sensed data (e.g., Landsat-8), the absence of panchromatic (i.e., black and white) and thermal bands, and the variation in spatial resolution among the bands. Storey, et al. [52] reported that Landsat-8 OLI and Sentinel-2 images have a sensor-to-sensor misalignment of 38 m due to errors in the ground control points (GCPs) and the Global Land Survey (GLS) framework. On a global scale, there were plans to correct the misalignment by 2018; however, users need to be cautious and apply manual correction to align the images. Furthermore, automated methods have been developed to solve the misalignment problem through different image pre-processing steps before integrating Landsat-8 OLI and Sentinel-2 images [30]. NASA is also undertaking the harmonisation of Landsat-8 OLI and Sentinel-2 (HLS), which aims to obtain corrected images through atmospheric correction, cloud and shadow masking, co-registration, bidirectional reflectance normalisation and bandpass adjustment [54,55].
The 13 Sentinel-2 bands do not include a panchromatic or a thermal band. Applications that need a panchromatic band, such as pan-sharpening, therefore cannot be applied directly to the original Sentinel-2 data. Alternative methods, such as using single bands or averaging the high spatial resolution bands (i.e., the four 10 m bands) as a panchromatic substitute, have shown promise in improving Sentinel-2 land cover/use monitoring [116,122]. Thermal bands are essential for applications such as cloud cover masking [67,68]. With Sentinel-2 imagery lacking a thermal band, some studies have suggested using thermal bands from the Sentinel-3 mission [123]. However, the thermal bands from Sentinel-3 have a low spatial resolution and hence need to be pan-sharpened [123].

4.7. Best Practices for Optimal Classification Accuracy with Sentinel-2

To produce the desired classification accuracy with Sentinel-2 data, several technical aspects need to be taken into consideration. These include pre-processing and selecting appropriate classification methods. Pre-processing, such as atmospheric and topographic correction, needs to be carried out before classification. Atmospheric correction is particularly important when multiple images are used, especially for time-series analysis. There are many approaches to atmospheric correction; however, the Sentinel toolbox offers atmospheric correction software dedicated to Sentinel-2 pre-processing. Topographic correction aims to normalise the effects of surface irregularities and can also be carried out using the Sentinel-2 Toolbox.
Due to the increasing number of available methods for land cover/use classification, careful consideration is needed to choose an appropriate classification method. Many studies have shown that supervised OBIA using a machine-learning classifier produces the desired results [34,61,124]. However, several aspects have not been tested, especially the scale parameters for OBIA [8,28,110]. Other methods, such as spectral mixture analysis (SMA), also have potential, especially for reducing mixed-pixel effects [83,125].

4.8. Specific Applications of Sentinel-2 in Land Cover/Land Use Monitoring

Sentinel-2 multispectral images have been used for land cover/use monitoring in different ways across the world. These applications include forest mapping [126,127,128], carbon assessment [129,130], urban area mapping [131,132] and natural hazard monitoring [41,133,134]. Other vital applications include agricultural [135,136] and water resource monitoring [77,137]. Studies on the use of Sentinel-2 for land cover/use monitoring have been reported in many countries across Europe, Asia, the Americas and Africa. Many studies in developed countries, especially in Europe, integrated Sentinel-2 data with contemporary datasets such as LiDAR and UAV data [86,117,138], while studies in developing countries combined Sentinel-2 data with other freely available data such as Sentinel-1 SAR and Landsat [139,140].

4.8.1. Sentinel-2 for Forest Monitoring

In the forest sector, Sentinel-2 products have been a powerful tool because they are useful for different applications, including mapping forest area [126,128], establishing the boundaries of specific forest types [127,131], discriminating forest types [132,141], and other applications such as leaf area index (LAI) analysis [142,143]. In all these applications (Table 5), Sentinel-2 data have proved to be more useful than other lower spatial resolution images (e.g., Landsat) because of the high spatial resolution (10 m) [144] and the availability of the red-edge bands [145].
The application of Sentinel-2 images differs from region to region; however, the major difference is between developing and developed countries. In developed countries, such as Finland and Germany, Sentinel-2 data is increasingly applied for specific applications (e.g., forest inventory) requiring detailed analysis, whereas most applications in developing countries focus on describing the extent of forests (land cover/use). For example, Puliti, et al. [146] reported the use of Sentinel-2 in forest inventories in Norway. The increasing combination of UAV [146,147] and LiDAR [86,117] datasets with Sentinel-2 has shown promise and high accuracy in describing different forest attributes.
Invasive plant species have also been monitored using Sentinel-2 images. Kattenborn, et al. [147] reported the use of Sentinel-2 in combination with Sentinel-1 and UAVs to monitor three invasive species (Pinus radiata, Ulex europaeus and Acacia dealbata) in Chile. In Baringo County, Kenya, Ng, et al. [148] compared Sentinel-2 and Pléiades (2 m spatial resolution) data to produce a highly accurate vegetation map that differentiates an invasive tree species (Prosopis) from native forest trees and mixed vegetation classes, information that is useful for effective forest management strategies. Although they observed that the higher spatial resolution of Pléiades contributed to higher accuracy, they concluded that Pléiades data are costly and that the free Sentinel-2 data provide a viable alternative, as their higher spectral resolution compensates for the lower spatial resolution.
Forest fire (wildfire) monitoring has been one of the crucial applications of Sentinel-2 imagery. With forest fires being common in most tropical regions [149], Sentinel-2 has proved to be a valuable tool due to its high temporal resolution of five days. Navarro, et al. [150] reported the application of Sentinel-2 multispectral images for post-fire monitoring using spectral indices on Madeira Island. Sentinel-2 images have also been used for mapping burn scars, fire severity and soil erosion susceptibility in southern France [151]. Continental-level fire maps for Africa have also been developed based on Sentinel-2 multispectral images [149]. The combination of multispectral Sentinel-2 and SAR Sentinel-1 data improves the accuracy of fire monitoring [42,149]. Besides forest fires, Sentinel-2 has also been used to monitor the quality of foliage in national parks, such as the Kruger National Park in South Africa, especially after fire incidences.
Other sensitive areas requiring adequate monitoring include wetland ecosystems, which have been affected by anthropogenic activities such as the expansion of urban and agricultural areas [97]. Whyte, et al. [144] highlighted that wetland areas are sensitive to climate change, especially to increasing temperatures and changing rainfall patterns. Using Sentinel-2, wetland maps have been developed for Newfoundland (Canada) using Google Earth Engine. The combination of Sentinel-2 and Sentinel-1 for wetland monitoring was tested in China, and that study achieved high accuracy (70–90%). In the iSimangaliso Wetland Park, South Africa, the mapping of wetland areas was enhanced by combining Sentinel-2 and Sentinel-1 [144]. Closely related to wetland areas, mangrove forests are an important component of the wetland ecosystem and have also been affected by human activities [152,153]. Therefore, Sentinel-2 data have the potential to enhance the effective monitoring of these mangrove ecosystems. For example, Mondal, et al. [152] mapped the mangrove forests along the coastlines of Senegal and the Gambia (West Africa) with high accuracy (>80%).
Sentinel-2 data has also contributed to climate change monitoring through biomass and carbon assessment. Most studies have reported using Sentinel-2 data to monitor above-ground biomass [14,154,155]. In northern Vietnam, Pham, et al. [154] reported the application of Sentinel-2 to map above-ground biomass in mangrove forests. Shoko, et al. [156] also characterised the above-ground biomass of C3 and C4 grasses in the Drakensberg region of South Africa using Sentinel-2 data. Although most studies on biomass focus on above-ground biomass, Bulut, et al. [130] determined total biomass by considering both above- and below-ground biomass in Turkey. Many studies that focus on determining carbon quantities have used regression analysis (e.g., multivariate regression) to relate the spectral properties of Sentinel-2 to carbon quantities [129,130,157].
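The following minimal sketch illustrates the regression approach mentioned above, relating field-measured above-ground biomass to Sentinel-2 spectral predictors; the predictors and biomass values are synthetic placeholders, not data from the cited studies.

```python
# A minimal sketch of relating above-ground biomass (AGB) to spectral predictors
# with multivariate regression; all values are synthetic placeholders.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_plots = 60
X = rng.random((n_plots, 3))  # e.g., NDVI, red-edge NDVI, B11 reflectance (placeholders)
agb = 120 * X[:, 0] + 40 * X[:, 1] + rng.normal(0, 5, n_plots)  # synthetic AGB (t/ha)

model = LinearRegression().fit(X, agb)
r2_cv = cross_val_score(model, X, agb, cv=5, scoring="r2")
print("Cross-validated R^2:", r2_cv.mean())
```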

4.8.2. Sentinel-2 for Agricultural Monitoring

Sentinel-2 has become an important tool for monitoring agricultural activities (Table 6). This is evidenced by the various studies that have focused on developing global products to support agricultural activities [178,179]. For example, the global agricultural maps produced by the OneSoil project in Belarus using Sentinel-2 products and machine learning provide useful information to farmers [180]. Bellemans, et al. [181] also reported on a global project (Sentinel-2 for Agriculture) involving African countries such as Burkina Faso, South Africa, Morocco and Madagascar. The specific applications of Sentinel-2 in agriculture include crop production monitoring [77,135,182], crop type mapping [183,184], irrigation agriculture monitoring [137], nitrogen content assessment [185] and assessment of crop health [186]. These studies range from small-scale (i.e., field-level) monitoring [182,183] to continental-level monitoring [76].
Due to the high demand for information on agricultural production, Sentinel-2 data has been used for real-time monitoring of agricultural activities [77]. An exciting development in agricultural applications of Sentinel-2 is the increasing use of cloud computing (e.g., Google Earth Engine) [178]. It is important to note that cloud-based computing has also become common in other applications, such as forest and wetland monitoring. Xiong, et al. [76] and Hiestermann, et al. [77] highlighted that cloud computing is useful for monitoring extensive areas (e.g., at the national and continental level). In addition, machine-learning algorithms produce high accuracies in agricultural monitoring, especially in discriminating crop types [187] and identifying specific types of agricultural systems (e.g., irrigation farming) [188].
Sentinel-2 images are also commonly used for monitoring crop diseases. Zheng, et al. [189] used Sentinel-2 data to monitor wheat yellow rust in China, and heavy metal-induced stress on rice has also been assessed using Sentinel-2 multispectral data in China. The results from these studies show high accuracies of over 85%, indicating that Sentinel-2 data has potential for crop health monitoring. Dhau, et al. [186] assessed the ability of Sentinel-2 multispectral images to detect maize grey leaf spot disease in Durban, South Africa; including all 13 Sentinel-2 bands and using the RF classifier produced a high accuracy of 83%. Other applications of Sentinel-2 images related to plant health monitoring include detecting crop residues [82,190] and assessing chlorophyll in crops [191].
Accurate estimates of biophysical parameters are important for a number of applications, such as precision agriculture, crop productivity and soil hydrology [169,192]. Sentinel-2 images have been used for estimating biophysical parameters [193,194,195]. For example, Xie, et al. [192] used Sentinel-2 images to retrieve chlorophyll content, leaf area index, nitrogen content and leaf chlorophyll index. One of the advantages of Sentinel-2 images is the availability of the red-edge bands, which are reliable for retrieving biophysical parameters [193,195].

4.8.3. Sentinel-2 for Urban Area Monitoring

The characteristics of Sentinel-2 are driving the increasing use of these images for monitoring urban areas (Table 7). The applications of Sentinel-2 in urban areas include urban expansion [204], urban heat islands [205], rural–urban transition [206], informal settlements [207], and urban ecosystems (e.g., urban forests/green spaces) [140]. In addition, Sentinel-2 images have been used to monitor surface water in urban areas; for example, Yang, et al. [208] extracted surface water in Beijing, China using Sentinel-2. For most countries in Africa, urban expansion (e.g., slum settlements) due to population growth has been a major challenge [209]; hence, Sentinel-2 provides a reliable tool for urban land use planning [210,211]. The choice of Sentinel-2 data has been attributed to its high spatial resolution (10 m), which allows informal settlements to be identified accurately, and in most cases the combination of Sentinel-2 and Sentinel-1 has produced highly accurate results.
Although Sentinel-2 data offers an opportunity for monitoring urban activities, few studies have been dedicated to urban monitoring using Sentinel-2. Most of the studies that employ Sentinel-2 data in urban monitoring mainly focus on urban expansion [206,212]. Therefore, there is a need for studies on different urban monitoring aspects (e.g., road networks, access to facilities, waste management) using contemporary high spatial resolution data (i.e., Sentinel-2).

4.8.4. Sentinel-2 for Natural Hazards

Globally, there are different natural hazards (Table 7) which affect both flora and fauna. Phiri, et al. [41] reported that natural hazards have negative impacts on infrastructure (i.e., built-up areas). These natural hazards include floods [134], droughts [172], earthquakes [213] and volcanic eruptions [214]. Floods affect human life, especially in coastal and riparian areas [41,134]. For example, Phiri, et al. [41] attempted to help policymakers make informed decisions on pre- and post-flood management by employing Sentinel-2 data in Beira, Mozambique. Studies on droughts mainly focus on vegetation growth in drought-prone regions. For example, Munyati [171] and Dotzler, et al. [172] mapped drought stress in deciduous forest communities in South Africa and Germany, respectively.
Sentinel-2 data has also been used to map the effects of more destructive natural disasters such as earthquakes [215] and volcanic eruptions [214]. For example, Jelének, et al. [215] investigated post-earthquake landslide distribution using Sentinel-2 and Sentinel-1 data in New Zealand. On Saunders Island in the South Sandwich Islands, Gray, et al. [214] investigated volcanic activity using Landsat-8 and Sentinel-2 data. The application of Sentinel-2 data to monitoring natural disasters is in line with one of the main aims of the Copernicus Programme: monitoring the Earth’s natural disasters [25].

5. Conclusions

This study aimed at understanding the contribution of ESA Sentinel-2 data to land cover/use monitoring. Most of the studies reviewed indicated that Sentinel-2 data has great potential for land cover/use monitoring across the world. Many studies have also reported the superiority of Sentinel-2 over similar sensors such as Landsat-8. The application of Sentinel-2 data differs from region to region, especially in the type of data that is integrated with Sentinel-2. In developed countries, Sentinel-2 is integrated with contemporary datasets such as LiDAR and UAV data, while in developing countries, Sentinel-2 is combined with free access data. The major strengths of Sentinel-2 are its high spatial resolution, high temporal resolution and the availability of the red-edge bands [8,25]. Many classification methods have been applied to Sentinel-2 data, including both pixel- and object-based approaches. However, the use of OBIA with machine-learning classifiers (e.g., RF and SVM) has proved to have great potential for improving land cover classification. The studies have also shown that, due to the high spatial resolution, Sentinel-2 data can achieve higher accuracies than other medium spatial resolution satellite images, such as Landsat. Like other optical satellite images, Sentinel-2 images are affected by cloud cover, which limits their applicability in cloud-prone areas. Since Sentinel-2 images are relatively new (approximately five years of operation), their consistency with earlier Landsat images has not been tested in many regions.
Moving forward, Sentinel-2 offers new opportunities for the private sector, government organisations, the scientific community and practitioners to increase the availability of regional, national, continental and global land cover/use maps based on high spatial resolution Sentinel-2 data. Future review studies could explore the applications of Sentinel data in specific regions of the world (e.g., Africa, Asia).

Author Contributions

Conceptualization, D.P. and S.S.; methodology, D.P.; writing—original draft preparation, D.P.; writing—review and editing, M.S., V.R.N., Y.M. and M.R.; funding acquisition, Y.M. All authors have read and agreed to the published version of the manuscript.

Funding

This study was partly supported by the JSPS grant 18H00763 (2018-20).

Acknowledgments

We would like to acknowledge the anonymous reviewers for their valuable suggestions that have helped to improve this manuscript as they worked tirelessly during the difficult time of COVID-19.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Hansen, M.C.; DeFries, R.; Townshend, J.R.; Sohlberg, R. Global land cover classification at 1 km spatial resolution using a classification tree approach. Int. J. Remote Sens. 2000, 21, 1331–1364. [Google Scholar] [CrossRef]
  2. Hosonuma, N.; Herold, M.; De Sy, V.; De Fries, R.S.; Brockhaus, M.; Verchot, L.; Angelsen, A.; Romijn, E. An assessment of deforestation and forest degradation drivers in developing countries. Environ. Res. Lett. 2012, 7, 044009. [Google Scholar] [CrossRef]
  3. Phiri, D.; Morgenroth, J.; Xu, C. Long-term land cover change in Zambia: An assessment of driving factors. Sci. Total Environ. 2019, 697, 134206. [Google Scholar] [CrossRef] [PubMed]
  4. Jucker, T.; Caspersen, J.; Chave, J.; Antin, C.; Barbier, N.; Bongers, F.; Dalponte, M.; van Ewijk, K.Y.; Forrester, D.I.; Haeni, M.J.G.C.B. Allometric equations for integrating remote sensing imagery into forest monitoring programmes. Glob. Chang. Biol. 2017, 23, 177–190. [Google Scholar] [CrossRef] [PubMed]
  5. Haack, B.N. Landsat: A tool for development. World Dev. 1982, 10, 899–909. [Google Scholar] [CrossRef]
  6. Turner, W.; Rondinini, C.; Pettorelli, N.; Mora, B.; Leidner, A.K.; Szantoi, Z.; Buchanan, G.; Dech, S.; Dwyer, J.; Herold, M. Free and open-access satellite data are key to biodiversity conservation. Biol. Conserv. 2015, 182, 173–176. [Google Scholar] [CrossRef] [Green Version]
  7. Denize, J.; Hubert-Moy, L.; Corgne, S.; Betbeder, J.; Pottier, E. Identification of winter land use in temperate agricultural landscapes based on Sentinel-1 and 2 Times-Series. In Proceedings of the IGARSS 2018-2018 IEEE International Geoscience and Remote Sensing Symposium, Valencia, Spain, 22–27 July 2018; pp. 8271–8274. [Google Scholar]
  8. Immitzer, M.; Vuolo, F.; Atzberger, C. First Experience with Sentinel-2 data for crop and tree species classifications in Central Europe. Remote Sens. 2016, 8, 166. [Google Scholar] [CrossRef]
  9. ESA. Sentinel-2 Missions-Sentinel Online; ESA: Paris, France, 2014. [Google Scholar]
  10. Malenovský, Z.; Rott, H.; Cihlar, J.; Schaepman, M.E.; García-Santos, G.; Fernandes, R.; Berger, M. Sentinels for science: Potential of Sentinel-1, -2, and -3 missions for scientific observations of ocean, cryosphere, and land. Remote Sens. Environ. 2012, 120, 91–101. [Google Scholar] [CrossRef]
  11. Korhonen, L.; Packalen, P.; Rautiainen, M. Comparison of Sentinel-2 and Landsat 8 in the estimation of boreal forest canopy cover and leaf area index. Remote Sens. Environ. 2017, 195, 259–274. [Google Scholar] [CrossRef]
  12. Pesaresi, M.; Corbane, C.; Julea, A.; Florczyk, A.J.; Syrris, V.; Soille, P. Assessment of the added-value of Sentinel-2 for detecting built-up areas. Remote Sens. 2016, 8, 299. [Google Scholar] [CrossRef] [Green Version]
  13. Drusch, M.; Del Bello, U.; Carlier, S.; Colin, O.; Fernandez, V.; Gascon, F.; Hoersch, B.; Isola, C.; Laberinti, P.; Martimort, P.; et al. Sentinel-2: ESA’s Optical High-Resolution Mission for GMES Operational Services. Remote Sens. Environ. 2012, 120, 25–36. [Google Scholar] [CrossRef]
  14. Sibanda, M.; Mutanga, O.; Rouget, M. Examining the potential of Sentinel-2 MSI spectral resolution in quantifying above ground biomass across different fertilizer treatments. ISPRS J. Photogramm. Remote Sens. 2015, 110, 55–65. [Google Scholar] [CrossRef]
  15. Otunga, C.; Odindi, J.; Mutanga, O.; Adjorlolo, C. Evaluating the potential of the Red Edge channel for C3 (Festuca spp.) grass discrimination using Sentinel-2 and Rapid Eye satellite image data. Geocarto Int. 2018, 1–21. [Google Scholar] [CrossRef]
  16. Bruzzone, L.; Bovolo, F.; Paris, C.; Solano-Correa, Y.T.; Zanetti, M.; Fernández-Prieto, D. Analysis of multitemporal Sentinel-2 images in the framework of the ESA Scientific Exploitation of Operational Missions. In Proceedings of the 2017 9th International Workshop on the Analysis of Multitemporal Remote Sensing Images (MultiTemp), Brugge, Belgium, 27–29 June 2017; pp. 1–4. [Google Scholar]
  17. Xu, Y.; Yu, L.; Feng, D.; Peng, D.; Li, C.; Huang, X.; Lu, H.; Gong, P. Comparisons of three recent moderate resolution African land cover datasets: CGLS-LC100, ESA-S2-LC20, and FROM-GLC-Africa30. Int. J. Remote Sens. 2019, 40, 6185–6202. [Google Scholar] [CrossRef]
  18. Gromny, E.; Lewiński, S.; Rybicki, M.; Malinowski, R.; Krupiński, M.; Nowakowski, A.; Jenerowicz, M. Creation of training dataset for Sentinel-2 land cover classification. In Proceedings of the Photonics Applications in Astronomy, Communications, Industry, and High-Energy Physics Experiments 2019, Wilga, Poland, 6 November 2019; p. 111763D. [Google Scholar]
  19. ESA. Mapping Germany’s Agricultural Landscape. ESA, Ed.; 2017. Available online: https://www.esa.int/ESA_Multimedia/Images/2017/2008/Mapping_Germany_s_agricultural_landscape (accessed on 28 April 2020).
  20. Sitokonstantinou, V.; Papoutsis, I.; Kontoes, C.; Lafarga Arnal, A.; Armesto Andrés, A.P.; Garraza Zurbano, J.A. Scalable Parcel-Based Crop Identification Scheme Using Sentinel-2 Data Time-Series for the Monitoring of the Common Agricultural Policy. Remote Sens. 2018, 10, 911. [Google Scholar] [CrossRef] [Green Version]
  21. Goldewijk, K.K.; Ramankutty, N.J.L.U. Land use changes during the past 300 years. In Land-Use, Land Cover and Soil Sciences-Volume I: Land Cover, Land-Use and the Global Change; EOLSS: Paris, France, 2009; pp. 147–168. [Google Scholar]
  22. DeFries, R.S.; Rudel, T.; Uriarte, M.; Hansen, M. Deforestation driven by urban population growth and agricultural trade in the twenty-first century. Nat. Geosci. 2010, 3, 178–181. [Google Scholar] [CrossRef]
  23. Hansen, M.C.; Potapov, P.V.; Moore, R.; Hancher, M.; Turubanova, S.A.; Tyukavina, A.; Thau, D.; Stehman, S.V.; Goetz, S.J.; Loveland, T.R.; et al. High-Resolution Global Maps of 21st-Century Forest Cover Change. Science 2013, 342, 850–853. [Google Scholar] [CrossRef] [Green Version]
  24. Sloan, S.; Sayer, J.A. Forest Resources Assessment of 2015 shows positive global trends, but forest loss and degradation persist in poor tropical countries. For. Ecol. Manag. 2015, 352, 134–145. [Google Scholar] [CrossRef] [Green Version]
  25. Spoto, F.; Martimort, P.; Drusch, M.J.E. Sentinel-2: ESA’s Optical High-Resolution Mission for GMES Operational Services. In Proceedings of the First Sentinel-2 Preparatory Symposium, Frascati, Italy, 23–27 April 2012. [Google Scholar]
  26. Helber, P.; Bischke, B.; Hees, J.; Dengel, A. Towards a sentinel-2 based human settlement layer. In Proceedings of the IGARSS 2019-2019 IEEE International Geoscience and Remote Sensing Symposium, Yokohama, Japan, 28 July–2 August 2019; pp. 5936–5939. [Google Scholar]
  27. Woodcock, C.E.; Allen, R.; Anderson, M.; Belward, A.; Bindschadler, R.; Cohen, W.; Gao, F.; Goward, S.N.; Helder, D.; Helmer, E.; et al. Free Access to Landsat Imagery. Science 2008, 320, 1011. [Google Scholar] [CrossRef]
  28. Phiri, D.; Morgenroth, J. Developments in Landsat land cover classification methods: A review. Remote Sens. 2017, 9, 967. [Google Scholar] [CrossRef] [Green Version]
  29. Hansen, M.C.; Loveland, T.R. A review of large area monitoring of land cover change using Landsat data. Remote Sens. Environ. 2012, 122, 66–74. [Google Scholar] [CrossRef]
  30. Yan, L.; Roy, D.; Zhang, H.; Li, J.; Huang, H. An Automated Approach for Sub-Pixel Registration of Landsat-8 Operational Land Imager (OLI) and Sentinel-2 Multi Spectral Instrument (MSI) Imagery. Remote Sens. 2016, 8, 520. [Google Scholar] [CrossRef] [Green Version]
  31. Miranda, E.; Mutiara, A.B.; Wibowo, W.C. Classification of land cover from Sentinel-2 imagery using supervised classification technique (preliminary study). In Proceedings of the 2018 International Conference on Information Management and Technology (ICIMTech), Jakarta, Indonesia, 3–5 September 2018; pp. 69–74. [Google Scholar]
  32. Chastain, R.; Housman, I.; Goldstein, J.; Finco, M. Empirical cross sensor comparison of Sentinel-2A and 2B MSI, Landsat-8 OLI, and Landsat-7 ETM+ top of atmosphere spectral characteristics over the conterminous United States. Remote Sens. Environ. 2019, 221, 274–285. [Google Scholar] [CrossRef]
  33. Blaschke, T. Object based image analysis for remote sensing. ISPRS J. Photogramm. Remote Sens. 2010, 65, 2–16. [Google Scholar] [CrossRef] [Green Version]
  34. Ma, L.; Li, M.; Ma, X.; Cheng, L.; Du, P.; Liu, Y. A review of supervised object-based land-cover image classification. ISPRS J. Photogramm. Remote Sens. 2017, 130, 277–293. [Google Scholar] [CrossRef]
  35. Mandanici, E.; Bitelli, G. Preliminary comparison of Sentinel-2 and Landsat 8 imagery for a combined use. Remote Sens. 2016, 8, 1014. [Google Scholar] [CrossRef] [Green Version]
  36. Inglada, J.; Arias, M.; Tardy, B.; Morin, D.; Valero, S.; Hagolle, O.; Dedieu, G.; Sepulcre, G.; Bontemps, S.; Defourny, P. Benchmarking of algorithms for crop type land-cover maps using Sentinel-2 image time series. In Proceedings of the 2015 IEEE International Geoscience and Remote Sensing Symposium (IGARSS), Milan, Italy, 26–31 July 2015; pp. 3993–3996. [Google Scholar]
  37. Cai, W.; Zhao, S.; Zhang, Z.; Peng, F.; Xu, J. Comparison of different crop residue indices for estimating crop residue cover using field observation data. In Proceedings of the 2018 7th International Conference on Agro-geoinformatics (Agro-geoinformatics), Hangzhou, China, 6–9 August 2018; pp. 1–4. [Google Scholar]
  38. Nomura, K.; Mitchard, E. More than meets the eye: Using Sentinel-2 to map small plantations in complex forest landscapes. Remote Sens. 2018, 10, 1693. [Google Scholar] [CrossRef] [Green Version]
  39. Novelli, A.; Aguilar, M.A.; Nemmaoui, A.; Aguilar, F.J.; Tarantino, E. Performance evaluation of object-based greenhouse detection from Sentinel-2 MSI and Landsat 8 OLI data: A case study from Almería (Spain). Int. J. Appl. Earth Obs. Geoinf. 2016, 52, 403–411. [Google Scholar] [CrossRef] [Green Version]
  40. Vuolo, F.; Neuwirth, M.; Immitzer, M.; Atzberger, C.; Ng, W.-T. How much does multi-temporal Sentinel-2 data improve crop type classification? Int. J. Appl. Earth Obs. Geoinf. 2018, 72, 122–130. [Google Scholar] [CrossRef]
  41. Phiri, D.; Simwanda, M.; Nyirenda, V. Mapping the impacts of Cyclone Idai in Mozambique using Sentinel-2 and OBIA Approach. S. Afr. J. Geogr. 2020, 1–22. [Google Scholar] [CrossRef]
  42. Verhegghen, A.; Eva, H.; Ceccherini, G.; Achard, F.; Gond, V.; Gourlet-Fleury, S.; Cerutti, P. The potential of Sentinel satellites for burnt area mapping and monitoring in the Congo Basin forests. Remote Sens. 2016, 8, 986. [Google Scholar] [CrossRef] [Green Version]
  43. Hoque, M.A.-A.; Phinn, S.; Roelfsema, C.; Childs, I. Tropical cyclone disaster management using remote sensing and spatial analysis: A review. Int. J. Disaster Risk Reduct. 2017, 22, 345–354. [Google Scholar] [CrossRef] [Green Version]
  44. UN-Spider. Recommended Practice: Flood Mapping and Damage Assessment using Sentinel-2 (S2) Optical Data; UN: Queensland, Australia, 2017; Available online: http://www.un-spider.org/advisory-support/recommended-practices/recommended-practice-flood-mapping-and-damage-assessment (accessed on 5 May 2020).
  45. Gorroño, J.; Banks, A.C.; Fox, N.P.; Underwood, C. Radiometric inter-sensor cross-calibration uncertainty using a traceable high accuracy reference hyperspectral imager. ISPRS J. Photogramm. Remote Sens. 2017, 130, 393–417. [Google Scholar] [CrossRef] [Green Version]
  46. Gorroño, J.; Fomferra, N.; Peters, M.; Gascon, F.; Underwood, C.I.; Fox, N.P.; Kirches, G.; Brockmann, C.J.R.S. A radiometric uncertainty tool for the Sentinel 2 mission. Remote Sens. 2017, 9, 178. [Google Scholar]
  47. Phiri, D.; Morgenroth, J.; Xu, C. Four decades of land cover and forest connectivity study in Zambia—An object-based image analysis approach. Int. J. Appl. Earth Obs. Geoinf. 2019, 79, 97–109. [Google Scholar] [CrossRef]
  48. Chen, J.; Chen, J.; Liao, A.; Cao, X.; Chen, L.; Chen, X.; He, C.; Han, G.; Peng, S.; Lu, M. Global land cover mapping at 30 m resolution: A POK-based operational approach. ISPRS J. Photogramm. Remote Sens. 2015, 103, 7–27. [Google Scholar] [CrossRef] [Green Version]
  49. Clevers, J.G.; Gitelson, A.A. Remote estimation of crop and grass chlorophyll and nitrogen content using red-edge bands on Sentinel-2 and -3. Int. J. Appl. Earth Obs. Geoinf. 2013, 23, 344–351. [Google Scholar] [CrossRef]
  50. Martimor, P.; Arino, O.; Berger, M.; Biasutti, R.; Carnicero, B.; Del Bello, U.; Fernandez, V.; Gascon, F.; Silvestrin, P.; Spoto, F. Sentinel-2 optical high-resolution mission for GMES operational services. In Proceedings of the 2007 IEEE International Geoscience and Remote Sensing Symposium, Barcelona, Spain, 23–28 July 2007; pp. 2677–2680. [Google Scholar]
  51. Young, N.E.; Anderson, R.S.; Chignell, S.M.; Vorster, A.G.; Lawrence, R.; Evangelista, P.H. A survival guide to Landsat preprocessing. Ecology 2017, 98, 920–932. [Google Scholar] [CrossRef] [Green Version]
  52. Storey, J.; Roy, D.P.; Masek, J.; Gascon, F.; Dwyer, J.; Choate, M.J.R.S.O.E. A note on the temporary misregistration of Landsat-8 Operational Land Imager (OLI) and Sentinel-2 Multi Spectral Instrument (MSI) imagery. Remote Sens. Environ. 2016, 186, 121–122. [Google Scholar] [CrossRef] [Green Version]
  53. Storey, J.; Choate, M.; Lee, K.J.R.S. Landsat 8 operational land imager on-orbit geometric calibration and performance. Remote Sens. 2014, 6, 11127–11152. [Google Scholar] [CrossRef] [Green Version]
  54. Masek, J.; Ju, J.; Roger, J.-C.; Skakun, S.; Claverie, M.; Dungan, J. Harmonized Landsat/Sentinel-2 products for land monitoring. In Proceedings of the IGARSS 2018-2018 IEEE International Geoscience and Remote Sensing Symposium, Valencia, Spain, 22–27 July 2018; pp. 8163–8165. [Google Scholar]
  55. Claverie, M.; Ju, J.; Masek, J.G.; Dungan, J.L.; Vermote, E.F.; Roger, J.-C.; Skakun, S.V.; Justice, C. The harmonized Landsat and Sentinel-2 surface reflectance data set. Remote Sens. Environ. 2018, 219, 145–161. [Google Scholar] [CrossRef]
  56. Pahlevan, N.; Sarkar, S.; Franz, B.A.; Balasubramanian, S.V.; He, J. Sentinel-2 MultiSpectral instrument (MSI) data processing for aquatic science applications: Demonstrations and validations. Remote Sens. Environ. 2017, 201, 47–56. [Google Scholar] [CrossRef]
  57. ESA. 4th Sentinel-2 validation team meeting. In ESA Abstract Book; ESA: Paris, France, 2020. [Google Scholar]
  58. Pflug, B.; Makarau, A.; Richter, R. Processing Sentinel-2 data with ATCOR. In Proceedings of the EGU General Assembly Conference Abstracts, Vienna, Austria, 17–22 April 2016; p. 15488. [Google Scholar]
  59. Main-Knorn, M.; Pflug, B.; Louis, J.; Debaecker, V.; Müller-Wilm, U.; Gascon, F. Sen2Cor for sentinel-2. In Proceedings of the Image and Signal Processing for Remote Sensing XXIII, Warsaw, Poland, 4 October 2017; p. 1042704. [Google Scholar]
  60. Phiri, D.; Morgenroth, J.; Xu, C.; Hermosilla, T. Effects of pre-processing methods on Landsat OLI-8 land cover classification using OBIA and random forests classifier. Int. J. Appl. Earth Obs. Geoinf. 2018, 73, 170–178. [Google Scholar] [CrossRef]
  61. Lantzanakis, G.; Mitraka, Z.; Chrysoulakis, N. Comparison of physically and image based atmospheric correction methods for Sentinel-2 satellite imagery. In Perspectives on Atmospheric Sciences; Springer: Cham, Switzerland, 2017; pp. 255–261. [Google Scholar]
  62. Forkuor, G.; Dimobe, K.; Serme, I.; Tondoh, J.E. Landsat-8 vs. Sentinel-2: Examining the added value of sentinel-2’s red-edge bands to land-use and land-cover mapping in Burkina Faso. GIScience Remote Sens. 2018, 55, 331–354. [Google Scholar] [CrossRef]
  63. Wu, M.; Yang, C.; Song, X.; Hoffmann, W.C.; Huang, W.; Niu, Z.; Wang, C.; Li, W.; Yu, B. Monitoring cotton root rot by synthetic Sentinel-2 NDVI time series using improved spatial and temporal data fusion. Sci. Rep. 2018, 8, 2016. [Google Scholar] [CrossRef] [Green Version]
  64. Vanhellemont, Q. Adaptation of the dark spectrum fitting atmospheric correction for aquatic applications of the Landsat and Sentinel-2 archives. Remote Sens. Environ. 2019, 225, 175–192. [Google Scholar] [CrossRef]
  65. Hagolle, O.; Huc, M.; Villa Pascual, D.; Dedieu, G. A multi-temporal and multi-spectral method to estimate aerosol optical thickness over Land, for the Atmospheric Correction of FormoSat-2, LandSat, VENμS and Sentinel-2 Images. Remote Sens. 2015, 7, 2668–2691. [Google Scholar] [CrossRef] [Green Version]
  66. De Keukelaere, L.; Sterckx, S.; Adriaensen, S.; Knaeps, E.; Reusen, I.; Giardino, C.; Bresciani, M.; Hunter, P.; Neil, C.; Van der Zande, D.J.E.J.O.R.S. Atmospheric correction of Landsat-8/OLI and Sentinel-2/MSI data using iCOR algorithm: Validation for coastal and inland waters. Eur. J. Remote Sens. 2018, 51, 525–542. [Google Scholar] [CrossRef] [Green Version]
  67. Zhu, Z.; Wang, S.; Woodcock, C.E. Improvement and expansion of the Fmask algorithm: Cloud, cloud shadow, and snow detection for Landsats 4–7, 8, and Sentinel 2 images. Remote Sens. Environ. 2015, 159, 269–277. [Google Scholar] [CrossRef]
  68. Frantz, D.; Haß, E.; Uhl, A.; Stoffels, J.; Hill, J. Improvement of the Fmask algorithm for Sentinel-2 images: Separating clouds from bright surfaces based on parallax effects. Remote Sens. Environ. 2018, 215, 471–481. [Google Scholar] [CrossRef]
  69. Goor, E.; Dries, J.; Daems, D.; Paepen, M.; Niro, F.; Goryl, P.; Mougnaud, P.; Della Vecchia, A. PROBA-V Mission Exploitation Platform. Remote Sens. 2016, 8, 564. [Google Scholar] [CrossRef] [Green Version]
  70. Coluzzi, R.; Imbrenda, V.; Lanfredi, M.; Simoniello, T.J.R.S.O.E. A first assessment of the Sentinel-2 Level 1-C cloud mask product to support informed surface analyses. Remote Sens. Environ. 2018, 217, 426–443. [Google Scholar] [CrossRef]
  71. Sekertekin, A.; Marangoz, A.; Akcin, H. Pixel-Based Classification Analysis of Land Use Land Cover Using SENTINEL-2 and LANDSAT-8 Data. Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci. 2017, 42, 91–93. [Google Scholar] [CrossRef] [Green Version]
  72. Kaplan, G.; Avdan, U. Object-based water body extraction model using Sentinel-2 satellite imagery. Eur. J. Remote Sens. 2017, 50, 137–143. [Google Scholar] [CrossRef] [Green Version]
  73. Thanh Noi, P.; Kappas, M. Comparison of Random Forest, k-Nearest Neighbor, and Support Vector Machine Classifiers for Land Cover Classification Using Sentinel-2 Imagery. Sensors 2018, 18, 18. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  74. Suresh, R.; Sneghalatha, R.; Devishree, S.; Pavethera, K. A Survey on Hyperspectral Image classification Using Machine Learning. Available online: https://www.semanticscholar.org/paper/A-Survey-of-Hyperspectral-Image-Classification-in-Ablin-Sulochana/8e6b723e0c971eafd5151030de7fc4ec18edbee5 (accessed on 7 May 2019).
  75. Segal-Rozenhaimer, M.; Li, A.; Das, K.; Chirayath, V. Cloud detection algorithm for multi-modal satellite imagery using convolutional neural-networks (CNN). Remote Sens. Environ. 2020, 237, 111446. [Google Scholar] [CrossRef]
  76. Xiong, J.; Thenkabail, P.S.; Tilton, J.C.; Gumma, M.K.; Teluguntla, P.; Oliphant, A.; Congalton, R.G.; Yadav, K.; Gorelick, N.J.R.S. Nominal 30-m cropland extent map of continental Africa by integrating pixel-based and object-based algorithms using Sentinel-2 and Landsat-8 data on Google Earth Engine. Remote Sens. 2017, 9, 1065. [Google Scholar] [CrossRef] [Green Version]
  77. Hiestermann, J.; Ferreira, S.L. Cloud-based agricultural solution: A case study of near real-time regional agricultural crop growth information in South Africa. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2017, 42, 79–82. [Google Scholar] [CrossRef] [Green Version]
  78. Lu, D.; Weng, Q. A survey of image classification methods and techniques for improving classification performance. Int. J. Remote Sens. 2007, 28, 823–870. [Google Scholar] [CrossRef]
  79. Olaode, A.; Naghdy, G.; Todd, C.J.I.J.O.I.P. Unsupervised classification of images: A review. Int. J. Image Process. 2014, 8, 325–342. [Google Scholar]
  80. Derksen, D.; Inglada, J.; Michel, J. Spatially precise contextual features based on Superpixel Neighborhoods for land cover mapping with high resolution satellite image time series. In Proceedings of the IGARSS 2018-2018 IEEE International Geoscience and Remote Sensing Symposium, Valencia, Spain, 22–27 July 2018; pp. 200–203. [Google Scholar]
  81. Rujoiu-Mare, M.-R.; Olariu, B.; Mihai, B.-A.; Nistor, C.; Săvulescu, I. Land cover classification in Romanian Carpathians and Subcarpathians using multi-date Sentinel-2 remote sensing imagery. Eur. J. Remote Sens. 2017, 50, 496–508. [Google Scholar] [CrossRef] [Green Version]
  82. Andersson, J.; Bontemps, M.S. Detecting crop residues burning using Sentinel-2 imagery: Conservation agriculture promotion in Central Malawi. Master’s Thesis, Catholic University of Louvain, Louvain-la-Neuve, Belgium, 2018. Available online: http://hdl.handle.net/2078.1/thesis:17258 (accessed on 7 May 2020).
  83. Clark, M.L. Comparison of simulated hyperspectral HyspIRI and multispectral Landsat 8 and Sentinel-2 imagery for multi-seasonal, regional land-cover mapping. Remote Sens. Environ. 2017, 200, 311–325. [Google Scholar] [CrossRef]
  84. Degerickx, J.; Roberts, D.A.; Somers, B. Enhancing the performance of Multiple Endmember Spectral Mixture Analysis (MESMA) for urban land cover mapping using airborne lidar data and band selection. Remote Sens. Environ. 2019, 221, 260–273. [Google Scholar] [CrossRef]
  85. Colkesen, I.; Kavzoglu, T. Ensemble-based canonical correlation forest (CCF) for land use and land cover classification using sentinel-2 and Landsat OLI imagery. Remote Sens. Lett. 2017, 8, 1082–1091. [Google Scholar] [CrossRef]
  86. Fragoso-Campón, L.; Quirós, E.; Mora, J.; Gutiérrez, J.A.; Durán-Barroso, P. Accuracy enhancement for land cover classification using LiDAR and multitemporal Sentinel 2 images in a forested watershed. Multidiscip. Digit. Publ. Inst. Proc. 2018, 2, 1280. [Google Scholar] [CrossRef] [Green Version]
  87. Gašparović, M.; Jogun, T. The effect of fusing Sentinel-2 bands on land-cover classification. Int. J. Remote Sens. 2018, 39, 822–841. [Google Scholar] [CrossRef]
  88. Glinskis, E.A.; Gutiérrez-Vélez, V.H. Quantifying and understanding land cover changes by large and small oil palm expansion regimes in the Peruvian Amazon. Land Use Policy 2019, 80, 95–106. [Google Scholar] [CrossRef]
  89. Khaliq, A.; Peroni, L.; Chiaberge, M. Land cover and crop classification using multitemporal sentinel-2 images based on crops phenological cycle. In Proceedings of the 2018 IEEE Workshop on Environmental, Energy, and Structural Monitoring Systems (EESMS), Salerno, Italy, 21–22 June 2018; pp. 1–5. [Google Scholar]
  90. Kussul, N.; Lavreniuk, M.; Skakun, S.; Shelestov, A. Deep learning classification of land cover and crop types using remote sensing data. IEEE Geosci. Remote Sens. Lett. 2017, 14, 778–782. [Google Scholar] [CrossRef]
  91. Steinhausen, M.J.; Wagner, P.D.; Narasimhan, B.; Waske, B. Combining Sentinel-1 and Sentinel-2 data for improved land use and land cover mapping of monsoon regions. Int. J. Appl. Earth Obs. Geoinf. 2018, 73, 595–604. [Google Scholar] [CrossRef]
  92. Weinmann, M.; Weidner, U. Land-cover and land-use classification based on multitemporal Sentinel-2 data. In Proceedings of the IGARSS 2018-2018 IEEE International Geoscience and Remote Sensing Symposium, Valencia, Spain, 22–27 July 2018; pp. 4946–4949. [Google Scholar]
  93. Zheng, H.; Du, P.; Chen, J.; Xia, J.; Li, E.; Xu, Z.; Li, X.; Yokoya, N. Performance evaluation of downscaling Sentinel-2 imagery for land use and land cover classification by Spectral-Spatial features. Remote Sens. 2017, 9, 1274. [Google Scholar] [CrossRef] [Green Version]
  94. Gómez, C.; White, J.C.; Wulder, M.A. Optical remotely sensed time series data for land cover classification: A review. ISPRS J. Photogramm. Remote Sens. 2016, 116, 55–72. [Google Scholar] [CrossRef] [Green Version]
  95. Verrelst, J.; Muñoz, J.; Alonso, L.; Delegido, J.; Rivera, J.P.; Camps-Valls, G.; Moreno, J. Machine-learning regression algorithms for biophysical parameter retrieval: Opportunities for Sentinel-2 and -3. Remote Sens. Environ. 2012, 118, 127–139. [Google Scholar] [CrossRef]
  96. Belgiu, M.; Csillik, O. Sentinel-2 cropland mapping using pixel-based and object-based time-weighted dynamic time warping analysis. Remote Sens. Environ. 2018, 204, 509–523. [Google Scholar] [CrossRef]
  97. Dronova, I. Object-based image analysis in wetland research: A review. Remote Sens. 2015, 7, 6380–6413. [Google Scholar] [CrossRef] [Green Version]
  98. Dong, Q.; Chen, X.; Chen, J.; Zhang, C.; Liu, L.; Cao, X.; Zang, Y.; Zhu, X.; Cui, X.J.R.S. Mapping Winter Wheat in North China Using Sentinel 2A/B Data: A Method Based on Phenology-Time Weighted Dynamic Time Warping. Remote Sens. 2020, 12, 1274. [Google Scholar] [CrossRef] [Green Version]
  99. Csillik, O.; Belgiu, M. Cropland mapping from Sentinel-2 time series data using object-based image analysis. In Proceedings of the 20th AGILE International Conference on Geographic Information Science Societal Geo-Innovation Celebrating, Wageningen, The Netherlands, 9–12 May 2017. [Google Scholar]
  100. Delalay, M.; Tiwari, V.; Ziegler, A.D.; Gopal, V.; Passy, P. Land-use and land-cover classification using Sentinel-2 data and machine-learning algorithms: Operational method and its implementation for a mountainous area of Nepal. J. Appl. Remote Sens. 2019, 13, 014530. [Google Scholar] [CrossRef]
  101. Gómez, V.P.; Medina, V.D.B.; Bengoa, J.L.; García, D.A.N. Accuracy assessment of a 122 classes land cover map based on Sentinel-2, Landsat 8 and Deimos-1 images and Ancillary data. In Proceedings of the IGARSS 2018-2018 IEEE International Geoscience and Remote Sensing Symposium, Valencia, Spain, 22–27 July 2018; pp. 5453–5456. [Google Scholar]
  102. Heryadi, Y.; Miranda, E. Land cover classification based on Sentinel-2 satellite imagery using Convolutional Neural Network model: A case study in Semarang Area, Indonesia. In Asian Conference on Intelligent Information and Database Systems; Springer: Cham, Switzerland, 2019; pp. 191–206. [Google Scholar]
  103. Kaplan, G.; Avdan, U. Mapping and monitoring wetlands using sentinel-2 satellite imagery. In Proceedings of the ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences, Volume IV-4/W4, 2017 4th International GeoAdvances Workshop, Safranbolu, Karabuk, Turkey, 14–15 October 2017. [Google Scholar]
  104. Kolokoussis, P.; Karathanassi, V. Oil spill detection and mapping using sentinel 2 imagery. J. Mar. Sci. Eng. 2018, 6, 4. [Google Scholar] [CrossRef] [Green Version]
  105. Labib, S.M.; Harris, A. The potentials of Sentinel-2 and LandSat-8 data in green infrastructure extraction, using object-based image analysis (OBIA) method. Eur. J. Remote Sens. 2018, 51, 231–240. [Google Scholar] [CrossRef]
  106. Laurent, V.C.E.; Schaepman, M.E.; Verhoef, W.; Weyermann, J.; Chávez, R.O. Bayesian object-based estimation of LAI and chlorophyll from a simulated Sentinel-2 top-of-atmosphere radiance image. Remote Sens. Environ. 2014, 140, 318–329. [Google Scholar] [CrossRef]
  107. Lu, L.; Tao, Y.; Di, L. Object-Based Plastic-Mulched Landcover Extraction Using Integrated Sentinel-1 and Sentinel-2 Data. Remote Sens. 2018, 10, 1820. [Google Scholar] [CrossRef] [Green Version]
  108. Marangoz, A.M.; Sekertekin, A.; Akçin, H. Analysis of land use land cover classification results derived from sentinel-2 image. In Proceedings of the 17th International Multidisciplinary Scientific GeoConference Surveying Geology and Mining Ecology Management, SGEM, Albena, Bulgaria, 29 June–5 July 2017; pp. 25–32. [Google Scholar]
  109. Sánchez-Espinosa, A.; Schröder, C.J.J. Land use and land cover mapping in wetlands one step closer to the ground: Sentinel-2 versus landsat 8. J. Environ. Manag. 2019, 247, 484–498. [Google Scholar] [CrossRef]
  110. Mongus, D.; Žalik, B. Segmentation schema for enhancing land cover identification: A case study using Sentinel 2 data. Int. J. Appl. Earth Obs. Geoinf. 2018, 66, 56–68. [Google Scholar] [CrossRef]
  111. Popescu, A.; Faur, D.; Vaduva, C.; Datcu, M. Enhanced classification of land cover through joint analysis of Sentinel-1 and Sentinel-2 data. In Proceedings of the ESA Living Planet Symposium, Prague, Czech Republic, 9–13 May 2016; pp. 9–13. [Google Scholar]
  112. Li, C.; Wang, J.; Wang, L.; Hu, L.; Gong, P. Comparison of classification algorithms and training sample sizes in urban land classification with Landsat thematic mapper imagery. Remote Sens. 2014, 6, 964–983. [Google Scholar] [CrossRef] [Green Version]
  113. Ruppert, G.; Hussain, M.; Müller, H.J.A.J.O.S. Accuracy assessment of satellite image classification depending on training sample. Austrian J. Stat. 1999, 28, 195–201. [Google Scholar] [CrossRef]
  114. Topaloğlu, R.H.; Sertel, E.; Musaoğlu, N. Assessment of classification accuracies of sentinel-2 and landsat-8 data for land cover/use mapping. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2016, 41, 1055–1059. [Google Scholar] [CrossRef]
  115. Blesius, L.; Weirich, F. The use of the Minnaert correction for land-cover classification in mountainous terrain. Int. J. Remote Sens. 2005, 26, 3831–3851. [Google Scholar] [CrossRef]
  116. Wang, Q.; Shi, W.; Li, Z.; Atkinson, P.M. Fusion of Sentinel-2 images. Remote Sens. Environ. 2016, 187, 241–252. [Google Scholar] [CrossRef] [Green Version]
  117. da Silveira, H.L.F.; Galvão, L.S.; Sanches, I.D.A.; de Sá, I.B.; Taura, T.A. Use of MSI/Sentinel-2 and airborne LiDAR data for mapping vegetation and studying the relationships with soil attributes in the Brazilian semi-arid region. Int. J. Appl. Earth Obs. Geoinf. 2018, 73, 179–190. [Google Scholar] [CrossRef]
  118. Mura, M.; Bottalico, F.; Giannetti, F.; Bertani, R.; Giannini, R.; Mancini, M.; Orlandini, S.; Travaglini, D.; Chirici, G. Exploiting the capabilities of the Sentinel-2 multi spectral instrument for predicting growing stock volume in forest ecosystems. Int. J. Appl. Earth Obs. Geoinf. 2018, 66, 126–134. [Google Scholar] [CrossRef]
  119. Yang, X.; Qin, Q.; Grussenmeyer, P.; Koehl, M. Urban surface water body detection with suppressed built-up noise based on water indices from Sentinel-2 MSI imagery. Remote Sens. Environ. 2018, 219, 259–270. [Google Scholar] [CrossRef]
  120. Delegido, J.; Verrelst, J.; Alonso, L.; Moreno, J. Evaluation of sentinel-2 red-edge bands for empirical estimation of green LAI and chlorophyll content. Sensors 2011, 11, 7063–7081. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  121. Ozigis, M.; Kaduk, J.; Jarvis, C. Synergistic application of Sentinel 1 and Sentinel 2 derivatives for terrestrial oil spill impact mapping. In Proceedings of the Active and Passive Microwave Remote Sensing for Environmental Monitoring II, Berlin, Germany, 20 November 2018; p. 107880R. [Google Scholar]
  122. Wang, Q.; Atkinson, P.M. Spatio-temporal fusion for daily sentinel-2 images. Remote Sens. Environ. 2018, 204, 31–42. [Google Scholar] [CrossRef] [Green Version]
  123. Guzinski, R.; Nieto, H. Evaluating the feasibility of using Sentinel-2 and Sentinel-3 satellites for high-resolution evapotranspiration estimations. Remote Sens. Environ. 2019, 221, 157–172. [Google Scholar] [CrossRef]
  124. Szantoi, Z.; Strobl, P. Copernicus Sentinel-2 Calibration and Validation; Taylor & Francis: New York, NY, USA, 2019. [Google Scholar]
  125. Pardo-Pascual, J.E.; Sánchez-García, E.; Almonacid-Caballer, J.; Palomar-Vázquez, J.M.; Priego De Los Santos, E.; Fernández-Sarría, A.; Balaguer-Beser, Á.J.R.S. Assessing the accuracy of automatically extracted shorelines on microtidal beaches from Landsat 7, Landsat 8 and Sentinel-2 imagery. Remote Sens. 2018, 10, 326. [Google Scholar] [CrossRef] [Green Version]
  126. Szostak, M.; Hawryło, P.; Piela, D. Using of Sentinel-2 images for automation of the forest succession detection. Eur. J. Remote Sens. 2018, 51, 142–149. [Google Scholar] [CrossRef]
  127. Wang, D.; Wan, B.; Qiu, P.; Su, Y.; Guo, Q.; Wang, R.; Sun, F.; Wu, X.J.R.S. Evaluating the performance of sentinel-2, landsat 8 and pléiades-1 in mapping mangrove extent and species. Remote Sens. 2018, 10, 1468. [Google Scholar] [CrossRef] [Green Version]
  128. Nzimande, N.; Mutanga, O.; Kiala, Z.; Sibanda, M.J.S.A.G.J. Mapping the spatial distribution of the yellowwood tree (Podocarpus henkelii) in the Weza-Ngele forest using the newly launched Sentinel-2 multispectral imager data. S. Afr. Geogr. J. 2020, 1–19. [Google Scholar] [CrossRef]
  129. Naidoo, L.; Van Deventer, H.; Ramoelo, A.; Mathieu, R.; Nondlazi, B.; Gangat, R. Estimating above ground biomass as an indicator of carbon storage in vegetated wetlands of the grassland biome of South Africa. Int. J. Appl. Earth Obs. Geoinf. 2019, 78, 118–129. [Google Scholar] [CrossRef]
  130. Bulut, S.; Gunlut, A. Determination of total carbon storage using Sentinel-2 and geographic information systems in mixed forests. Anadolu Orman Araştırmaları Dergisi 2019, 5, 127–135. [Google Scholar]
  131. Adjognon, G.S.; Rivera-Ballesteros, A.; van Soest, D.J.D.E. Satellite-based tree cover mapping for forest conservation in the drylands of Sub Saharan Africa (SSA): Application to Burkina Faso gazetted forests. Dev. Eng. 2019, 4, 100039. [Google Scholar] [CrossRef]
  132. Laurin, G.V.; Puletti, N.; Hawthorne, W.; Liesenberg, V.; Corona, P.; Papale, D.; Chen, Q.; Valentini, R.J.R.S.O.E. Discrimination of tropical forest types, dominant species, and mapping of functional guilds by hyperspectral and simulated multispectral Sentinel-2 data. Remote Sens. Environ. 2016, 176, 163–176. [Google Scholar] [CrossRef] [Green Version]
  133. Shikwambana, L.; Ncipha, X.; Malahlela, O.E.; Mbatha, N.; Sivakumar, V. Characterisation of aerosol constituents from wildfires using satellites and model data: A case study in Knysna, South Africa. Int. J. Remote Sens. 2019, 40, 4743–4761. [Google Scholar] [CrossRef]
  134. Caballero, I.; Ruiz, J.; Navarro, G.J.W. Sentinel-2 satellites provide Near-Real time evaluation of catastrophic floods in the West Mediterranean. Water 2019, 11, 2499. [Google Scholar] [CrossRef] [Green Version]
  135. Al-Gaadi, K.A.; Hassaballa, A.A.; Tola, E.; Kayad, A.G.; Madugundu, R.; Alblewi, B.; Assiri, F.J.P.O. Prediction of potato crop yield using precision agriculture techniques. PLoS ONE 2016, 11, e0162219. [Google Scholar] [CrossRef] [PubMed]
  136. Lebourgeois, V.; Dupuy, S.; Vintrou, É.; Ameline, M.; Butler, S.; Bégué, A.J.R.S. A combined random forest and OBIA classification scheme for mapping smallholder agriculture at different nomenclature levels using multisource data (simulated Sentinel-2 time series, VHRS and DEM). Remote Sens. 2017, 9, 259. [Google Scholar] [CrossRef] [Green Version]
  137. Vogels, M.F.; De Jong, S.; Sterk, G.; Douma, H.; Addink, E. Spatio-temporal patterns of smallholder irrigated agriculture in the horn of Africa using GEOBIA and Sentinel-2 imagery. Remote Sens. 2019, 11, 143. [Google Scholar] [CrossRef] [Green Version]
  138. Estrada, J.; Sánchez, H.; Hernanz, L.; Checa, M.J.; Roman, D. Enabling the Use of Sentinel-2 and LiDAR Data for Common Agriculture Policy Funds Assignment. ISPRS Int. J. Geo-Inf. 2017, 6, 255. [Google Scholar] [CrossRef] [Green Version]
  139. Gao, Q.; Zribi, M.; Escorihuela, M.J.; Baghdadi, N.J.S. Synergetic use of Sentinel-1 and Sentinel-2 data for soil moisture mapping at 100 m resolution. Sensors 2017, 17, 1966. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  140. Haas, J.; Ban, Y. Sentinel-1A SAR and sentinel-2A MSI data fusion for urban ecosystem service mapping. Remote Sens. Appl. Soc. Environ. 2017, 8, 41–53. [Google Scholar] [CrossRef]
  141. Puletti, N.; Chianucci, F.; Castaldi, C. Use of Sentinel-2 for forest classification in Mediterranean environments. Ann. Silvic. Res. 2017, 42, 32–38. [Google Scholar]
  142. Sibanda, M.; Mutanga, O.; Dube, T.; S Vundla, T.; L Mafongoya, P. Estimating LAI and mapping canopy storage capacity for hydrological applications in wattle infested ecosystems using Sentinel-2 MSI derived red edge bands. GIScience Remote Sens. 2019, 56, 68–86. [Google Scholar] [CrossRef]
  143. Clasen, A.; Somers, B.; Pipkins, K.; Tits, L.; Segl, K.; Brell, M.; Kleinschmit, B.; Spengler, D.; Lausch, A.; Förster, M.J.R.S. Spectral unmixing of forest crown components at close range, airborne and simulated Sentinel-2 and EnMAP spectral imaging scale. Remote Sens. 2015, 7, 15361–15387. [Google Scholar] [CrossRef] [Green Version]
  144. Whyte, A.; Ferentinos, K.P.; Petropoulos, G.P. A new synergistic approach for monitoring wetlands using Sentinels-1 and 2 data with object-based machine learning algorithms. Environ. Model. Softw. 2018, 104, 40–54. [Google Scholar] [CrossRef] [Green Version]
  145. Richter, K.; Hank, T.B.; Vuolo, F.; Mauser, W.; D’Urso, G.J.R.S. Optimal exploitation of the Sentinel-2 spectral capabilities for crop leaf area index mapping. Remote Sens. 2012, 4, 561–582. [Google Scholar] [CrossRef] [Green Version]
  146. Puliti, S.; Saarela, S.; Gobakken, T.; Ståhl, G.; Næsset, E. Combining UAV and Sentinel-2 auxiliary data for forest growing stock volume estimation through hierarchical model-based inference. Remote Sens. Environ. 2018, 204, 485–497. [Google Scholar] [CrossRef]
  147. Kattenborn, T.; Lopatin, J.; Förster, M.; Braun, A.C.; Fassnacht, F.E. UAV data as alternative to field sampling to map woody invasive species based on combined Sentinel-1 and Sentinel-2 data. Remote Sens. Environ. 2019, 227, 61–73. [Google Scholar] [CrossRef]
  148. Ng, W.-T.; Rima, P.; Einzmann, K.; Immitzer, M.; Atzberger, C.; Eckert, S.J.R.S. Assessing the potential of Sentinel-2 and Pléiades data for the detection of Prosopis and Vachellia spp. in Kenya. Remote Sens. 2017, 9, 74. [Google Scholar] [CrossRef] [Green Version]
  149. Roteta, E.; Bastarrika, A.; Padilla, M.; Storm, T.; Chuvieco, E.J.R.S.O.E. Development of a Sentinel-2 burned area algorithm: Generation of a small fire database for sub-Saharan Africa. Remote Sens. Environ. 2019, 222, 1–17. [Google Scholar] [CrossRef]
  150. Navarro, G.; Caballero, I.; Silva, G.; Parra, P.-C.; Vázquez, Á.; Caldeira, R. Evaluation of forest fire on Madeira Island using Sentinel-2A MSI imagery. Int. J. Appl. Earth Obs. Geoinf. 2017, 58, 97–106. [Google Scholar] [CrossRef] [Green Version]
  151. Martinis, S.; Caspard, M.; Plank, S.; Clandillon, S.; Haouet, S. Mapping burn scars, fire severity and soil erosion susceptibility in Southern France using multisensoral satellite data. In Proceedings of the 2017 IEEE International Geoscience and Remote Sensing Symposium (IGARSS), Fort Worth, TX, USA, 23–28 July 2017; pp. 1099–1102. [Google Scholar]
  152. Mondal, P.; Liu, X.; Fatoyinbo, T.E.; Lagomasino, D.J.R.S. Evaluating combinations of Sentinel-2 data and Machine-Learning Algorithms for Mangrove mapping in West Africa. Remote Sens. 2019, 11, 2928. [Google Scholar] [CrossRef] [Green Version]
  153. Gress, S.K.; Huxham, M.; Kairo, J.G.; Mugi, L.M.; Briers, R.A.J.G.C.B. Evaluating, predicting and mapping belowground carbon stores in Kenyan mangroves. Glob. Chang. Biol. 2017, 23, 224–234. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  154. Pham, T.D.; Xia, J.; Baier, G.; Le, N.N.; Yokoya, N. Mangrove species mapping using Sentinel-1 and Sentinel-2 Data in North Vietnam. In Proceedings of the IGARSS 2019-2019 IEEE International Geoscience and Remote Sensing Symposium, Yokohama, Japan, 28 July–2 August 2019; pp. 6102–6105. [Google Scholar]
  155. Majasalmi, T.; Rautiainen, M. The potential of Sentinel-2 data for estimating biophysical variables in a boreal forest: A simulation study. Remote Sens. Lett. 2016, 7, 427–436. [Google Scholar] [CrossRef]
  156. Shoko, C.; Mutanga, O.; Dube, T.; Slotow, R. Characterizing the spatio-temporal variations of C3 and C4 dominated grasslands aboveground biomass in the Drakensberg, South Africa. Int. J. Appl. Earth Obs. Geoinf. 2018, 68, 51–60. [Google Scholar] [CrossRef] [Green Version]
  157. Gholizadeh, A.; Žižala, D.; Saberioon, M.; Borůvka, L. Soil organic carbon and texture retrieving and mapping using proximal, airborne and Sentinel-2 spectral imaging. Remote Sens. Environ. 2018, 218, 89–103. [Google Scholar] [CrossRef]
  158. Suresh, G.; Hovenbitzer, M. Quantification of forest extent in Germany by combining multi-temporal stacks of Sentinel-1 and Sentinel-2 images. In Proceedings of the Sixth International Conference on Remote Sensing and Geoinformation of the Environment (RSCy2018), Paphos, Cyprus, 6 August 2018; p. 107730N. [Google Scholar]
  159. Filchev, L.J.A.R.I.B. Land-use/land-cover change of Bistrishko Branishte biosphere reserve using Sentinel-2 simulated data. Aerosp. Res. Bulg. 2015, 27, 54–65. [Google Scholar]
  160. Konko, Y.; Rudant, J.P.; Akpamou, G.K.; Noumonvi, K.D.; Kokou, K. Spatio-Temporal Distribution of Southeastern Community Forests in Togo (West Africa). Geosci. Environ. Prot. 2018, 6, 51. [Google Scholar] [CrossRef] [Green Version]
  161. Mutowo, G.; Mutanga, O.; Masocha, M. Mapping foliar N in miombo woodlands using sentinel-2 derived chlorophyll and structural indices. J. Appl. Remote Sens. 2018, 12, 046028. [Google Scholar] [CrossRef]
  162. Ramoelo, A.; Cho, M.; Mathieu, R.; Skidmore, A.K. Potential of Sentinel-2 spectral configuration to assess rangeland quality. J. Appl. Remote Sens. 2015, 9, 094096. [Google Scholar] [CrossRef] [Green Version]
  163. Darvishzadeh, R.; Skidmore, A.; Abdullah, H.; Cherenet, E.; Ali, A.; Wang, T.; Nieuwenhuis, W.; Heurich, M.; Vrieling, A.; O’Connor, B.; et al. Mapping leaf chlorophyll content from Sentinel-2 and RapidEye data in spruce stands using the invertible forest reflectance model. Int. J. Appl. Earth Obs. Geoinf. 2019, 79, 58–70. [Google Scholar] [CrossRef] [Green Version]
  164. Chrysafis, I.; Mallinis, G.; Siachalou, S.; Patias, P. Assessing the relationships between growing stock volume and Sentinel-2 imagery in a Mediterranean forest ecosystem. Remote Sens. Lett. 2017, 8, 508–517. [Google Scholar] [CrossRef]
  165. Astola, H.; Häme, T.; Sirro, L.; Molinier, M.; Kilpi, J. Comparison of Sentinel-2 and Landsat 8 imagery for forest variable prediction in boreal region. Remote Sens. Environ. 2019, 223, 257–273. [Google Scholar] [CrossRef]
  166. Wittke, S.; Yu, X.; Karjalainen, M.; Hyyppä, J.; Puttonen, E. Comparison of two-dimensional multitemporal Sentinel-2 data with three-dimensional remote sensing data sources for forest inventory parameter estimation over a boreal forest. Int. J. Appl. Earth Obs. Geoinf. 2019, 76, 167–178. [Google Scholar] [CrossRef]
  167. Yesou, H.; Pottier, E.; Mercier, G.; Grizonnet, M.; Haouet, S.; Giros, A.; Faivre, R.; Huber, C.; Michel, J. Synergy of Sentinel-1 and Sentinel-2 imagery for wetland monitoring information extraction from continuous flow of sentinel images applied to water bodies and vegetation mapping and monitoring. In Proceedings of the 2016 IEEE International Geoscience and Remote Sensing Symposium (IGARSS), Beijing, China, 10–15 July 2016; pp. 162–165. [Google Scholar]
  168. Mahdianpari, M.; Salehi, B.; Mohammadimanesh, F.; Homayouni, S.; Gill, E. The first wetland inventory map of newfoundland at a spatial resolution of 10 m using sentinel-1 and sentinel-2 data on the google earth engine cloud computing platform. Remote Sens. 2019, 11, 43. [Google Scholar] [CrossRef] [Green Version]
  169. Dimitrov, P.; Kamenova, I.; Roumenina, E.; Filchev, L.; Ilieva, I.; Jelev, G.; Gikov, A.; Banov, M.; Krasteva, V.; Kolchakov, V.; et al. Estimation of biophysical and biochemical variables of winter wheat through Sentinel-2 vegetation indices. Bulg. J. Agric. Sci. 2019, 25, 819–832. [Google Scholar]
  170. Nedkov, R. Quantitative assessment of forest degradation after fire using ortogonalized satellite images from SENTINEL-2. Comptes Rendus de l’Academie Bulgare Sci. 2018, 71, 83–86. [Google Scholar]
  171. Munyati, C. The potential for integrating Sentinel 2 MSI with SPOT 5 HRG and Landsat 8 OLI imagery for monitoring semi-arid savannah woody cover. Int. J. Remote Sens. 2017, 38, 4888–4913. [Google Scholar] [CrossRef]
  172. Dotzler, S.; Hill, J.; Buddenbaum, H.; Stoffels, J.J.R.S. The potential of EnMAP and Sentinel-2 data for detecting drought stress phenomena in deciduous forest communities. Remote Sens. 2015, 7, 14227–14258. [Google Scholar] [CrossRef] [Green Version]
  173. Sothe, C.; Almeida, C.M.D.; Liesenberg, V.; Schimalski, M.B.J.R.S. Evaluating Sentinel-2 and Landsat-8 data to map sucessional forest stages in a subtropical forest in Southern Brazil. Remote Sens. 2017, 9, 838. [Google Scholar] [CrossRef] [Green Version]
  174. Hojas-Gascon, L.; Belward, A.; Eva, H.; Ceccherini, G.; Hagolle, O.; Garcia, J.; Cerutti, P. Potential improvement for forest cover and forest degradation mapping with the forthcoming Sentinel-2 program. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2015, 40, 417. [Google Scholar] [CrossRef] [Green Version]
  175. Hawryło, P.; Bednarz, B.; Wężyk, P.; Szostak, M. Estimating defoliation of Scots pine stands using machine learning methods and vegetation indices of Sentinel-2. Eur. J. Remote Sens. 2018, 51, 194–204. [Google Scholar] [CrossRef] [Green Version]
  176. Lange, M.; Dechant, B.; Rebmann, C.; Vohland, M.; Cuntz, M.; Doktor, D.J.S. Validating MODIS and sentinel-2 NDVI products at a temperate deciduous forest site using two independent ground-based sensors. Sensors 2017, 17, 1855. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  177. Pandit, S.; Tsuyuki, S.; Dube, T. Estimating above-ground biomass in sub-tropical buffer zone community forests, Nepal, using Sentinel 2 data. Remote Sens. 2018, 10, 601. [Google Scholar] [CrossRef] [Green Version]
  178. Bontemps, S.; Arias, M.; Cara, C.; Dedieu, G.; Guzzonato, E.; Hagolle, O.; Inglada, J.; Matton, N.; Morin, D.; Popescu, R.J.R.S. Building a data set over 12 globally distributed sites to support the development of agriculture monitoring applications with Sentinel-2. Remote Sens. 2015, 7, 16062–16090. [Google Scholar] [CrossRef] [Green Version]
  179. Bontemps, S.; Arias, M.; Cara, C.; Dedieu, G.; Guzzonato, E.; Hagolle, O.; Inglada, J.; Morin, D.; Rabaute, T.; Savinaud, M. “Sentinel-2 for agriculture”: Supporting global agriculture monitoring. In Proceedings of the 2015 IEEE International Geoscience and Remote Sensing Symposium (IGARSS), Milan, Italy, 26–31 July 2015; pp. 4185–4188. [Google Scholar]
  180. OneSoil. An AgriTech Start-up from Belarus Demonstrates That Societal and Economic Benefits of Copernicus go Beyond the Borders of the European Union; ESA: Paris, France, 2019. [Google Scholar]
  181. Bellemans, N.; Bontemps, S.; Defourny, P. Sentinel-2 for Agriculture project: Preparing Sentinel-2 operational exploitation for supporting national and global crop monitoring. In Proceedings of the 6th Digital Earth Summit, ISDE, Beijing, China, 7–8 July 2016. [Google Scholar]
  182. Lambert, M.-J.; Traoré, P.C.S.; Blaes, X.; Baret, P.; Defourny, P. Estimating smallholder crops production at village level from Sentinel-2 time series in Mali’s cotton belt. Remote Sens. Environ. 2018, 216, 647–657. [Google Scholar] [CrossRef]
  183. Taona, M.T. Crop Type Mapping in a Highly Heterogeneous Agriculture Landscape: A Case of Marble Hall Using Multi-Temporal Landsat 8 and Sentinel 2 Imageries. Master’s Thesis, University of the Witwatersrand, Johannesburg, South Africa, 2019. [Google Scholar]
  184. Kussul, N.; Kolotii, A.; Shelestov, A.; Lavreniuk, M.; Bellemans, N.; Bontemps, S.; Defourny, P.; Koetz, B. Sentinel-2 for agriculture national demonstration in Ukraine: Results and further steps. In Proceedings of the 2017 IEEE International Geoscience and Remote Sensing Symposium (IGARSS), Fort Worth, TX, USA, 23–28 July 2017; pp. 5842–5845. [Google Scholar]
  185. Schlemmer, M.; Gitelson, A.; Schepers, J.; Ferguson, R.; Peng, Y.; Shanahan, J.; Rundquist, D. Remote estimation of nitrogen and chlorophyll contents in maize at leaf and canopy levels. Int. J. Appl. Earth Obs. Geoinf 2013, 25, 47–54. [Google Scholar] [CrossRef] [Green Version]
  186. Dhau, I.; Adam, E.; Mutanga, O.; Ayisi, K.; Abdel-Rahman, E.M.; Odindi, J.; Masocha, M. Testing the capability of spectral resolution of the new multispectral sensors on detecting the severity of grey leaf spot disease in maize crop. Geocarto Int. 2018, 33, 1223–1236. [Google Scholar] [CrossRef]
  187. Chemura, A.; Mutanga, O.; Odindi, J. Empirical modeling of leaf chlorophyll content in coffee (coffea arabica) plantations with sentinel-2 msi data: Effects of spectral settings, spatial resolution, and crop canopy cover. Remote Sens. 2017, 10, 5541–5550. [Google Scholar] [CrossRef]
  188. Vogels, M.; De Jong, S.; Sterk, G.; Addink, E. Mapping irrigated agriculture in complex landscapes using object-based image analysis. In Proceedings of the GEOBIA 2018-From Pixels to Ecosystems and Global Sustainability, Montpellier, France, 4–7 June 2018. [Google Scholar]
  189. Zheng, Q.; Huang, W.; Cui, X.; Shi, Y.; Liu, L.J.S. New spectral index for detecting wheat yellow rust using sentinel-2 multispectral imagery. Sensors 2018, 18, 868. [Google Scholar] [CrossRef] [Green Version]
  190. Mutanga, O.; Dube, T.; Galal, O. Remote sensing of crop health for food security in Africa: Potentials and constraints. Remote Sens. Appl. Soc. Environ. 2017, 8, 231–239. [Google Scholar] [CrossRef]
  191. Chemura, A.; Mutanga, O.; Dube, T.J.P.A. Separability of coffee leaf rust infection levels with machine learning methods at Sentinel-2 MSI spectral resolutions. Precis. Agric. 2017, 18, 859–881. [Google Scholar] [CrossRef]
  192. Xie, Q.; Dash, J.; Huete, A.; Jiang, A.; Yin, G.; Ding, Y.; Peng, D.; Hall, C.C.; Brown, L.; Shi, Y.; et al. Retrieval of crop biophysical parameters from Sentinel-2 remote sensing imagery. Int. J. Appl. Earth Obs. Geoinf. 2019, 80, 187–195. [Google Scholar] [CrossRef]
  193. Upreti, D.; Huang, W.; Kong, W.; Pascucci, S.; Pignatti, S.; Zhou, X.; Ye, H.; Casa, R. A comparison of hybrid machine learning algorithms for the retrieval of wheat biophysical variables from Sentinel-2. Remote Sens. 2019, 11, 481. [Google Scholar] [CrossRef] [Green Version]
  194. Frampton, W.J.; Dash, J.; Watmough, G.; Milton, E.J. Evaluating the capabilities of Sentinel-2 for quantitative estimation of biophysical variables in vegetation. ISPRS J. Photogramm. Remote Sens. 2013, 82, 83–92. [Google Scholar] [CrossRef] [Green Version]
  195. Verrelst, J.; Rivera, J.P.; Leonenko, G.; Alonso, L.; Moreno, J. Optimizing LUT-Based RTM Inversion for Semiautomatic Mapping of Crop Biophysical Parameters from Sentinel-2 and -3 Data: Role of Cost Functions. IEEE Trans. Geosci. Remote Sens. 2014, 52, 257–269. [Google Scholar] [CrossRef]
  196. Zheng, B.; Campbell, J.B.; Serbin, G.; Galbraith, J.M.J.S.; Research, T. Remote sensing of crop residue and tillage practices: Present capabilities and future prospects. Soil Tillage Res. 2014, 138, 26–34. [Google Scholar] [CrossRef]
  197. Veloso, A.; Mermoz, S.; Bouvet, A.; Le Toan, T.; Planells, M.; Dejoux, J.-F.; Ceschia, E. Understanding the temporal behavior of crops using Sentinel-1 and Sentinel-2-like data for agricultural applications. Remote Sens. Environ. 2017, 199, 415–426. [Google Scholar] [CrossRef]
  198. Dimitrov, P.; Dong, Q.; Eerens, H.; Gikov, A.; Filchev, L.; Roumenina, E.; Jelev, G. Sub-Pixel crop type classification using PROBA-V 100 m NDVI time series and reference data from Sentinel-2 classifications. Remote Sens. 2019, 11, 1370. [Google Scholar] [CrossRef] [Green Version]
  199. Delloye, C.; Weiss, M.; Defourny, P. Retrieval of the canopy chlorophyll content from Sentinel-2 spectral bands to estimate nitrogen uptake in intensive winter wheat cropping systems. Remote Sens. Environ. 2018, 216, 245–261. [Google Scholar] [CrossRef]
  200. Zhang, T.; Su, J.; Liu, C.; Chen, W.-H.; Liu, H.; Liu, G. Band selection in Sentinel-2 satellite for agriculture applications. In Proceedings of the 2017 23rd International Conference on Automation and Computing (ICAC), Huddersfield, UK, 7–8 September 2017; pp. 1–6. [Google Scholar]
  201. Valero, S.; Morin, D.; Inglada, J.; Sepulcre, G.; Arias, M.; Hagolle, O.; Dedieu, G.; Bontemps, S.; Defourny, P.; Koetz, B.J.R.S. Production of a dynamic cropland mask by processing remote sensing image series at high temporal and spatial resolutions. Remote Sens. 2016, 8, 55. [Google Scholar] [CrossRef] [Green Version]
  202. El Hajj, M.; Baghdadi, N.; Zribi, M.; Bazzi, H.J.R.S. Synergic use of Sentinel-1 and Sentinel-2 images for operational soil moisture mapping at high spatial resolution over agricultural areas. Remote Sens. 2017, 9, 1292. [Google Scholar] [CrossRef] [Green Version]
  203. Sadeghi, M.; Babaeian, E.; Tuller, M.; Jones, S.B. The optical trapezoid model: A novel approach to remote sensing of soil moisture applied to Sentinel-2 and Landsat-8 observations. Remote Sens. Environ. 2017, 198, 52–68. [Google Scholar] [CrossRef] [Green Version]
  204. Tavares, P.A.; Beltrão, N.E.S.; Guimarães, U.S.; Teodoro, A.C.J.S. Integration of sentinel-1 and sentinel-2 for classification and LULC mapping in the urban area of Belém, eastern Brazilian Amazon. Sensors 2019, 19, 1140. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  205. Chunping, Q.; Schmitt, M.; Lichao, M.; Xiaoxiang, Z. Urban local climate zone classification with a residual convolutional Neural Network and multi-seasonal Sentinel-2 images. In Proceedings of the 2018 10th IAPR Workshop on Pattern Recognition in Remote Sensing (PRRS), Beijing, China, 19–20 August 2018; pp. 1–5. [Google Scholar]
  206. Møller-Jensen, L. Mapping the rural-urban transition zone: Peri-urban development in Accra, Ghana. In Proceedings of the EARSEL-SIG 5th Joint Workshop, “Urban Remote Sensing–Challenges & Solutions”, Bochum, Germany, 4–26 September 2018. [Google Scholar]
  207. Gibson, L.; Engelbrecht, J.; Rush, D. Detecting historic informal settlement fires with sentinel 1 and 2 satellite data-Two case studies in Cape Town. Fire Saf. J. 2019, 108, 102828. [Google Scholar] [CrossRef]
  208. Yang, X.; Chen, L. Evaluation of automated urban surface water extraction from Sentinel-2A imagery using different water indices. J. Appl. Remote Sens. 2017, 11, 026016. [Google Scholar] [CrossRef]
  209. Simwanda, M.; Murayama, Y. Integrating geospatial techniques for urban land use classification in the developing sub-Saharan African city of Lusaka, Zambia. ISPRS Int. J. Geo-Inf. 2017, 6, 102. [Google Scholar] [CrossRef] [Green Version]
  210. Haas, J.; Ban, Y. Urban Land Cover and Ecosystem Service Changes based on Sentinel-2A MSI and Landsat TM Data. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2018, 11, 485–497. [Google Scholar] [CrossRef]
  211. Kranjčić, N.; Medak, D.; Župan, R.; Rezo, M.J.R.S. Support Vector Machine Accuracy Assessment for Extracting Green Urban Areas in Towns. Remote Sens. 2019, 11, 655. [Google Scholar]
  212. Gombe, K.E.; Asanuma, I.; Park, J.-G. Quantification of annual urban growth of Dar es Salaam Tanzania from Landsat time Series data. Adv. Remote Sens. 2017, 6, 175–191. [Google Scholar] [CrossRef] [Green Version]
  213. Kääb, A.; Altena, B.; Mascaro, J. Coseismic displacements of the 14 November 2016 Mw 7.8 Kaikoura, New Zealand, earthquake using the Planet optical cubesat constellation. Nat. Hazards Earth Syst. Sci. 2017, 17, 627–639. [Google Scholar] [CrossRef] [Green Version]
  214. Gray, D.M.; Burton-Johnson, A.; Fretwell, P.T. Evidence for a lava lake on Mt. Michael volcano, Saunders Island (South Sandwich Islands) from Landsat, Sentinel-2 and ASTER satellite imagery. J. Volcanol. Geotherm. Res. 2019, 379, 60–71. [Google Scholar] [CrossRef]
  215. Jelének, J.; Kopačková, V.; Fárová, K. Post-earthquake landslide distribution assessment using sentinel-1 and-2 data: The example of the 2016 mw 7.8 earthquake in New Zealand. Multidiscip. Digit. Publ. Inst. Proc. 2018, 2, 361. [Google Scholar] [CrossRef] [Green Version]
  216. Iannelli, G.C.; Gamba, P. Jointly exploiting Sentinel-1 and Sentinel-2 for urban mapping. In Proceedings of the IGARSS 2018-2018 IEEE International Geoscience and Remote Sensing Symposium, Valencia, Spain, 22–27 July 2018; pp. 8209–8212. [Google Scholar]
  217. Yang, X.; Zhao, S.; Qin, X.; Zhao, N.; Liang, L.J.R.S. Mapping of urban surface water bodies from Sentinel-2 MSI imagery at 10 m resolution via NDWI-based image sharpening. Remote Sens. 2017, 9, 596. [Google Scholar] [CrossRef] [Green Version]
  218. Sekertekin, A.; Cicekli, S.Y.; Arslan, N. Index-based identification of surface water resources using Sentinel-2 satellite imagery. In Proceedings of the 2018 2nd International Symposium on Multidisciplinary Studies and Innovative Technologies (ISMSIT), Ankara, Turkey, 11–13 October 2018; pp. 1–5. [Google Scholar]
  219. Qiu, C.; Schmitt, M.; Ghamisi, P.; Zhu, X. Effect of the training set configuration on sentinel-2-based urban local climate zone classification. In Proceedings of the International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences Symposium, Riva del Garda, Italy, 4–7 June 2018. [Google Scholar]
  220. Daudt, R.C.; Le Saux, B.; Boulch, A.; Gousseau, Y. Urban change detection for multispectral earth observation using convolutional neural networks. In Proceedings of the IGARSS 2018-2018 IEEE International Geoscience and Remote Sensing Symposium, Valencia, Spain, 22–27 July 2018; pp. 2115–2118. [Google Scholar]
  221. Recanatesi, F.; Giuliani, C.; Ripa, M.N.J.S. Monitoring Mediterranean Oak decline in a peri-urban protected area using the NDVI and Sentinel-2 images: The case study of Castelporziano State Natural Reserve. Sustainability 2018, 10, 3308. [Google Scholar] [CrossRef] [Green Version]
  222. Kaloustian, N.; Bechtel, B.J.P.E. Local climatic zoning and urban heat island in Beirut. Procedia Eng. 2016, 169, 216–223. [Google Scholar] [CrossRef]
  223. Qiu, C.; Mou, L.; Schmitt, M.; Zhu, X.X. Local climate zone-based urban land cover classification from multi-seasonal Sentinel-2 images with a recurrent residual network. ISPRS J. Photogramm. Remote Sens. 2019, 154, 151–162. [Google Scholar] [CrossRef] [PubMed]
  224. Stumpf, A.; Michéa, D.; Malet, J.-P. Improved co-registration of Sentinel-2 and Landsat-8 imagery for earth surface motion measurements. Remote Sens. 2018, 10, 160. [Google Scholar] [CrossRef] [Green Version]
  225. Valade, S.; Ley, A.; Massimetti, F.; D’Hondt, O.; Laiolo, M.; Coppola, D.; Loibl, D.; Hellwich, O.; Walter, T.R. Towards global volcano monitoring using multisensor sentinel missions and artificial intelligence: The mounts monitoring system. Remote Sens. 2019, 11, 1528. [Google Scholar] [CrossRef] [Green Version]
Figure 1. The number of published articles for land cover monitoring using Sentinel-2 images and the refinement processes.
Figure 2. The general trend of articles published from 2015 to 2020, based on the cumulative number of published articles. Note that Sentinel-2 was launched in 2015, while Landsat-8 was launched in 2013.
Figure 3. Representative distribution of the articles published on Sentinel-2 land cover mapping across the world.
Figure 4. The processing levels and the format of Sentinel-2 data. Note that the data undergo several processing stages before reaching Level-2A, the form accessible to and utilised by all users [25].
Figure 5. The cumulative number of published articles in Google Scholar which mentioned pixel-based, sub-pixel and object-based approaches. Search terms included “Sentinel-2” AND “sub-pixel”, “Sentinel-2” AND “pixel-based”, and “Sentinel-2” AND “object-based”.
Figure 6. Accuracy of land cover/use mapping with different classifiers across 50 reviewed case studies (25 using Sentinel-2 and 25 using Landsat-8); five studies were considered for each classifier. Note that RF refers to random forest, MLC to maximum likelihood classifier, SVM to support vector machine, CT to classification tree and k-NN to k-nearest neighbour.
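The comparison in Figure 6 rests on accuracy figures reported per classifier. As a minimal, generic sketch of how such figures are typically produced, the Python snippet below trains random forest, SVM and k-NN models on the same labelled Sentinel-2 pixel samples and scores each on a held-out test set; the feature and label arrays here are random placeholders standing in for real band reflectances and reference labels.

```python
# Generic classifier comparison on the same labelled samples (placeholder data).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# X: (n_samples, n_bands) Sentinel-2 reflectances; y: integer land cover labels
rng = np.random.default_rng(0)
X = rng.random((500, 10))           # placeholder for real band values
y = rng.integers(0, 4, 500)         # placeholder for reference labels

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

classifiers = {
    "RF": RandomForestClassifier(n_estimators=500, random_state=0),
    "SVM": SVC(kernel="rbf", C=10, gamma="scale"),
    "k-NN": KNeighborsClassifier(n_neighbors=5),
}
for name, clf in classifiers.items():
    clf.fit(X_train, y_train)
    oa = accuracy_score(y_test, clf.predict(X_test))
    print(f"{name}: overall accuracy = {oa:.2%}")
```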
Table 1. The criteria used for the systematic literature search.

Criteria | Details
Keywords | “Sentinel-2” AND “land cover”, “Sentinel-2” AND “landcover”, “Sentinel-2” AND “forest”, “Sentinel-2” AND “agriculture”, “Sentinel-2” AND “urban”
Document type | Journal articles, book chapters, conference proceedings, reports
Language | English
Publication period | 2015 to 2020
Table 2. Characteristics of ESA Sentinel-2A and -2B satellite images [9,49].

Spatial Resolution (m) | Band | Sentinel-2A Central Wavelength (nm) | Sentinel-2A Bandwidth (nm) | Sentinel-2B Central Wavelength (nm) | Sentinel-2B Bandwidth (nm)
10 | Band 2—Blue | 492.4 | 66 | 492.1 | 66
10 | Band 3—Green | 559.8 | 36 | 559.0 | 36
10 | Band 4—Red | 664.6 | 31 | 664.9 | 31
10 | Band 8—NIR | 832.8 | 106 | 832.9 | 106
20 | Band 6—Red edge | 740.5 | 15 | 739.1 | 15
20 | Band 7—Red edge | 782.8 | 20 | 779.7 | 20
20 | Band 8A—Narrow NIR | 864.7 | 21 | 864.0 | 22
20 | Band 11—SWIR | 1613.7 | 91 | 1610.4 | 94
20 | Band 12—SWIR | 2202.4 | 175 | 2185.7 | 185
60 | Band 1—Coastal aerosol | 442.7 | 21 | 442.2 | 21
60 | Band 9—Water vapour | 945.1 | 20 | 943.2 | 21
60 | Band 10—SWIR Cirrus | 1373.5 | 31 | 1376.9 | 30
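Because the bands in Table 2 come at three native resolutions, the 20 m and 60 m bands are commonly resampled to 10 m before all bands are stacked for analysis. The sketch below, using hypothetical file names, shows one way to do this with rasterio by resampling Band 11 (20 m) onto the grid of Band 4 (10 m).

```python
# Resample a 20 m Sentinel-2 band onto the 10 m grid of another band (hypothetical files).
import rasterio
from rasterio.enums import Resampling

with rasterio.open("T35LNH_B04_10m.tif") as b4:        # hypothetical 10 m red band
    target_height, target_width = b4.height, b4.width
    red = b4.read(1)

with rasterio.open("T35LNH_B11_20m.tif") as b11:       # hypothetical 20 m SWIR band
    swir_10m = b11.read(
        1,
        out_shape=(target_height, target_width),       # resample on read
        resampling=Resampling.bilinear,
    )

print(red.shape, swir_10m.shape)   # both arrays now share the 10 m grid
```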
Table 3. Selected studies on pixel- and sub-pixel-based land cover/use classification of Sentinel-2 images.

Study | Land Cover/Use | Classification Method | Classifier | Accuracy (%)
Clark [83] | Bareland, built-up area, vegetation, crops | Sub-pixel | MESMA, RF | 74–84
Colkesen, et al. [85] | Forest, soil, water, corn, barren, impervious surfaces | Pixel-based | CCF | 87–95
Degerickx, et al. [84] | Roof, pavement, soil, shrub, tree | Sub-pixel | MESMA | 57–90
Denize, et al. [7] | Crop residues, bare soil, winter crop, grassland | Supervised pixel-based | SVM, RF | 81
Forkuor, et al. [62] | Agriculture, urban | Supervised pixel-based | RF, SVM, ANN | 87–92
Fragoso-Campón, et al. [86] | Forest, shrubs, water, rocks | Supervised pixel-based | RF | 73–79
Gašparović, et al. [87] | Water, built-up, bare soil, forests | Supervised classification | MLC, ANN | 83–91
Glinskis, et al. [88] | Oil palm | Supervised pixel-based | MLC | 60–70
Immitzer, et al. [8] | Maize, onion, sunflower, sugar beet | Pixel-based | RF | 65–76
Khaliq, et al. [89] | Water, cabbage, maize, built-up | Supervised pixel-based | RF | 91
Kussul, et al. [90] | Crops, bareland, water | Unsupervised pixel-based | ANN | 88–94
Miranda, et al. [31] | Water, forest, urban, bareland | Supervised pixel-based | MLC | 100
Pesaresi, et al. [12] | Built-up area | Supervised pixel-based | SML | 60
Rujoiu-Mare, et al. [81] | Forest, waterbodies, built-up | Supervised pixel-based | MLC, SVM | 92–98
Sekertekin, et al. [71] | Waterbody, settlement, bareland, vegetation | Supervised pixel-based | MLC | 78–85
Steinhausen, et al. [91] | Cropland, forest, grassland, urban areas, water | Supervised pixel-based | RF | 89–91
Thanh Noi, et al. [73] | Residential, impervious surface, agriculture, bareland, forest, water | Supervised pixel-based | RF, SVM, KNN | 90–95
Vuolo, et al. [40] | Carrot, maize, potato, pumpkin | Supervised pixel-based | RF | 91–95
Weinmann, et al. [92] | Forest, garden, fields, settlements | Supervised pixel-based | SVM | 72–80
Note that accuracy covers the range of producer’s, user’s and overall accuracies. In Table 3, RF refers to random forest, MLC to maximum likelihood classifier, SVM to support vector machine, ANN to artificial neural network, KNN to k-nearest neighbour, MESMA to multiple endmember spectral mixture analysis, SML to symbolic machine learning and CCF to canonical correlation forest.
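Most of the pixel-based studies in Table 3 follow the same broad workflow: train a supervised classifier on labelled pixels, classify every pixel of the image, and assess the result with a confusion matrix. The sketch below is a generic illustration of that workflow with a random forest; the stack file name and the training/test samples are placeholders, not data from any cited study.

```python
# Generic supervised pixel-based classification of a Sentinel-2 band stack.
import numpy as np
import rasterio
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import confusion_matrix

with rasterio.open("s2_stack_10m.tif") as src:           # hypothetical multi-band stack
    stack = src.read()                                    # (bands, rows, cols)
bands, rows, cols = stack.shape
pixels = stack.reshape(bands, -1).T                       # (n_pixels, n_bands)

# Training and test samples would normally come from digitised reference data;
# the indices and labels below are placeholders for illustration only.
rng = np.random.default_rng(1)
train_idx = rng.choice(pixels.shape[0], 2000, replace=False)
train_labels = rng.integers(0, 5, train_idx.size)
test_idx = rng.choice(pixels.shape[0], 500, replace=False)
test_labels = rng.integers(0, 5, test_idx.size)

rf = RandomForestClassifier(n_estimators=500, n_jobs=-1, random_state=0)
rf.fit(pixels[train_idx], train_labels)
classified = rf.predict(pixels).reshape(rows, cols)       # per-pixel class map

# Accuracy measures from the confusion matrix (rows = reference, columns = classified)
cm = confusion_matrix(test_labels, rf.predict(pixels[test_idx]))
overall = np.trace(cm) / cm.sum()
producers = np.diag(cm) / cm.sum(axis=1)   # row totals = reference class sizes
users = np.diag(cm) / cm.sum(axis=0)       # column totals = classified class sizes
print(overall, producers, users)
```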
Table 4. Selected studies on the object-based land cover/use classification of Sentinel-2 images.

Study | Land Cover/Use | Classification Method | Classifier | Accuracy (%)
Dong, et al. [98] | Cropland | OBIA-Classifier | RF | 78–96
Clark [83] | Bareland, built-up area, vegetation, crops | OBIA-Classifier | RF | 75–84
Csillik, et al. [99] | Wheat, maize, rice, sunflower, forest, unclassified | OBIA-Rule based | Ruleset | 78–98
Delalay, et al. [100] | Settlement, industry, water, forest | OBIA-Classifier | RF, CT | 61–95
Derksen, et al. [80] | Crops, road, orchards | OBIA-Contexture | Contextual | 80–90
Gašparović, et al. [87] | Water, built-up, bare soil, forests | OBIA-Classifier | ANN | 83–91
Gašparović, et al. [87] | Water, built-up, bare soil, low vegetation, forest | OBIA-Rule based | Ruleset | 84–91
Gómez, et al. [101] | Winter wheat, others, built-up | OBIA-Classifier | RF | 84–98
Heryadi, et al. [102] | Forest, water body, urban, bare land | OBIA-Classifier | k-NN | 80–98
Immitzer, et al. [8] | Maize, onion, sunflower, sugar beet | OBIA-Classifier | RF | 65–76
Kaplan, et al. [103] | Water, forest, wetland, urban, green field, dry fields | OBIA-Rule based | Ruleset | 89–90
Kaplan, et al. [72] | Water, forest, wetland, urban, green field, dry fields | OBIA-Rule based | Ruleset | 88–90
Kolokoussis, et al. [104] | Land, seawater, oil spill, possibly dissolved oil spill | OBIA-Rule based | Ruleset | 72–91
Labib, et al. [105] | Built-up, water, vegetation, shadow | OBIA-Rule based | Ruleset | 67–71
Laurent, et al. [106] | Canopy, brown leaves, green leaves | OBIA-Classifier | Bayesian | 96–98
Lu, et al. [107] | Plastic-mulched land cover, crops | OBIA-Classifier | CT | 88–90
Marangoz, et al. [108] | Forest, water body, urban | OBIA-Rule based | Ruleset | 80–88
Marangoz, et al. [109] | Bare land, forest, settlement, vegetation, water | OBIA-Rule based | Ruleset | 66–76
Mongus, et al. [110] | Agriculture, forest, water, grassland | OBIA-Classifier | Naïve Bayes | 88–95
Novelli, et al. [39] | Greenhouse | OBIA-Classifier | RF | 89–93
Phiri, et al. [41] | Water, built-up area, forests | OBIA-Classifier | RF | 67–91
Popescu, et al. [111] | Urban area, water, forest, agriculture | OBIA-Classifier | Latent Dirichlet Allocation (LDA) | –
Weinmann, et al. [92] | Settlement, industry, water, forest | OBIA-Classifier | RF | 80–83
Xiong, et al. [76] | Cropland | OBIA-Google Earth Engine | SVM, RF | 68–85
Zheng, et al. [93] | Roads, bareland, forest | OBIA-Classifier | KNN, ANN, RF, SVM | 70–90
Note that accuracy covers the range of producer’s, user’s and overall accuracies. In the table, RF refers to random forest, SVM to support vector machine, ANN to artificial neural network, k-NN/KNN to k-nearest neighbour, LDA to latent Dirichlet allocation and CT to classification tree.
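The OBIA-Classifier entries in Table 4 share a common pattern: segment the image into objects, summarise the spectral properties of each object, and classify the objects. The sketch below approximates that pattern using SLIC segmentation from scikit-image (0.19 or later for the channel_axis argument) and a random forest; the input file, the Level-2A scale factor of 10,000 and the object labels are assumptions for illustration only.

```python
# Approximate OBIA workflow: segmentation -> per-object features -> object classification.
import numpy as np
import rasterio
from skimage.segmentation import slic
from sklearn.ensemble import RandomForestClassifier

with rasterio.open("s2_subset_10m.tif") as src:        # hypothetical 4-band (B2, B3, B4, B8) subset
    img = src.read().transpose(1, 2, 0).astype(float) / 10000.0   # assumed L2A scale factor

segments = slic(img, n_segments=5000, compactness=0.1,
                start_label=0, channel_axis=-1)        # object ID per pixel

# Per-object mean reflectance in each band as the object feature vector
seg_flat = segments.ravel()
n_obj = segments.max() + 1
counts = np.bincount(seg_flat, minlength=n_obj)
features = np.stack(
    [np.bincount(seg_flat, weights=img[:, :, b].ravel(), minlength=n_obj) / counts
     for b in range(img.shape[2])],
    axis=1,
)

# Labels for a subset of objects would come from reference data; placeholders here
rng = np.random.default_rng(0)
train_obj = rng.choice(n_obj, 300, replace=False)
train_lab = rng.integers(0, 4, train_obj.size)

rf = RandomForestClassifier(n_estimators=300, random_state=0)
rf.fit(features[train_obj], train_lab)
object_classes = rf.predict(features)
classified_map = object_classes[segments]              # map object labels back to pixels
```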
Table 5. Summary of major forest applications of Sentinel-2 imagery across the world.

Application | Specific Application | Country | Methods | Accuracy | Reference
Forest | Forest extent | Poland, China, Burkina Faso, South Africa, Madagascar, Zimbabwe, Bulgaria | Machine learning, cloud computing | 80–90% | Suresh, et al. [158]; Wang, et al. [127]; Szostak, et al. [126]; Adjognon, et al. [131]; Nzimande, et al. [128]; Filchev [159]
Forest | Forest types | Italy, Ghana, South Africa, Togo | Linear discriminant analysis, spectral indices, machine learning | 88–90% | Laurin, et al. [132]; Konko, et al. [160]; Puletti, et al. [141]; Laurin, et al. [132]
Forest | Species identification | Germany, Italy | OBIA-RF, stepwise regression | 65–76% | Immitzer, et al. [8]; Laurin, et al. [132]
Forest | Forest productivity | Germany, South Africa, Southern Africa | Machine learning (random forest), invertible forest reflectance model | 90–92% | Mutowo, et al. [161]; Ramoelo, et al. [162]; Darvishzadeh, et al. [163]
Forest | Growing stock | Norway, Greece, Italy, Finland | Fusion with UAV data, linear regression | SE = 3.4–5.8% | Puliti, et al. [146]; Chrysafis, et al. [164]; Mura, et al. [118]
Forest | Forest inventory | Finland, Norway | Fusion with UAV data, multivariable models | SE = 3.4–5.8% | Puliti, et al. [146]; Astola, et al. [165]; [166]
Forest | Wetland mapping | China, Canada, South Africa, Senegal, Ghana | Machine learning, Google Earth Engine, OBIA | 83–90% | Yesou, et al. [167]; Mahdianpari, et al. [168]; Whyte, et al. [144]; Mondal, et al. [152]
Forest | Leaf area index (LAI) | Finland, Germany, South Africa, Bulgaria | Red-edge bands with partial least squares regression (PLSR), spectral indices | R² = 91% | Clasen, et al. [143]; Korhonen, et al. [11]; Sibanda, et al. [142]; Dimitrov, et al. [169]
Forest | Forest fires/wildfire | Madeira Island, Bulgaria, DR Congo, Africa (continent) | Active fire products with SAR data fusion, new algorithm | 80–89% | Verhegghen, et al. [42]; Roteta, et al. [149]; Navarro, et al. [150]; Nedkov [170]; Filchev [159]
Forest | Dryland mapping | Germany, South Africa | Sub-pixel classification, Biome-BGC simulations | 82% | Munyati [171]; Dotzler, et al. [172]
Forest | Grassland mapping | South Africa | Sparse partial least squares regression (SPLSR) | R² = 59% | Shoko, et al. [156]
Forest | Canopy cover | Finland, Germany | Generalised additive models, spectral unmixing and UAVs | RMSE = 0.05–0.42 | Korhonen, et al. [11]; Clasen, et al. [143]
Forest | Forest succession | Brazil, Poland | SVM, RF, OBIA | 90–97% | Sothe, et al. [173]; Szostak, et al. [126]
Forest | Forest degradation | Bulgaria, Tanzania | OBIA-RF | R² = 0.97, 95% | Hojas-Gascon, et al. [174]; Nedkov [170]
Forest | Forest health | Poland | Machine learning | 75–78% | Hawryło, et al. [175]
Forest | Forest phenology | Germany | Correlation with ground sensors | R² = 0.99 | Lange, et al. [176]
Biomass assessment | Above-ground biomass | Vietnam, Finland, South Africa, Zimbabwe, Italy | Machine learning, SPLSR, PARAS, regression analysis | 80–91% | Pham, et al. [154]; Pandit, et al. [177]; Shoko, et al. [156]; Majasalmi, et al. [155]; Laurin, et al. [132]
Biomass assessment | Below-ground biomass | Turkey | Regression analysis, supervised classification | R² = 97% | Bulut, et al. [130]
Carbon assessment | Carbon assessment | Czech Republic, Turkey, South Africa | Bootstrapped random forest, regression analysis, multivariate regression models | R² = 30–75% | Bulut, et al. [130]; Naidoo, et al. [129]; Gholizadeh, et al. [157]
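Several of the forest applications in Table 5 (e.g., LAI, canopy cover and forest health) rely on vegetation indices computed from the red, NIR and red-edge bands. The sketch below shows the standard NDVI formula and a red-edge variant; the toy arrays stand in for Sentinel-2 surface reflectance (Bands 8, 4 and 6).

```python
# NDVI and a red-edge NDVI variant from Sentinel-2 surface reflectance arrays.
import numpy as np

def ndvi(nir, red):
    """Normalised Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    return (nir - red) / (nir + red + 1e-9)

def red_edge_ndvi(nir, red_edge):
    """Red-edge NDVI: the red band is replaced by a red-edge band."""
    return (nir - red_edge) / (nir + red_edge + 1e-9)

# Toy reflectance arrays standing in for Band 8 (NIR), Band 4 (red) and Band 6 (red edge)
b8 = np.array([[0.35, 0.40], [0.30, 0.45]])
b4 = np.array([[0.05, 0.06], [0.08, 0.04]])
b6 = np.array([[0.20, 0.25], [0.22, 0.28]])

print(ndvi(b8, b4))
print(red_edge_ndvi(b8, b6))
```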
Table 6. Summary of agricultural applications of Sentinel-2 imagery globally.

Application | Country | Methods | Accuracy | Reference
Crop diseases | China, South Africa, Zimbabwe | Random forest | 77–94% | Zheng, et al. [189]; Dhau, et al. [186]; Chemura, et al. [191]
Crop residue | Spain, Malawi | Maximum likelihood, OBIA, regression | 90–97% | Andersson, et al. [82]; Estrada, et al. [138]; Zheng, et al. [196]
Crop type detection | Ukraine, France, Austria, Zimbabwe, Ethiopia, Bulgaria | Deep learning, random forest, support vector regression | 77–96% | Kussul, et al. [184]; Chemura, et al. [187]; Vogels, et al. [188]; Vuolo, et al. [40]; Veloso, et al. [197]; Dimitrov, et al. [198]
Crop yield forecasting/productivity | Belgium, Saudi Arabia, Ukraine, Zimbabwe, Mali | Deep learning, random forest, maximum likelihood, spectral indices | 35–96% | Lambert, et al. [182]; Hiestermann, et al. [77]; Al-Gaadi, et al. [135]; Delloye, et al. [199]
Cropland extent | United Kingdom, Madagascar, Ukraine, global dataset (Burkina Faso, South Africa, Morocco, Madagascar) | Cloud-based computing, machine learning, OBIA | 64.4–96% | Zhang, et al. [200]; Bontemps, et al. [178]; Inglada, et al. [201]; Lebourgeois, et al. [136]; Kussul, et al. [184]
Irrigated crops | Ethiopia, global dataset (Burkina Faso, South Africa, Morocco, Madagascar) | Object-based | 94% | Vogels, et al. [137]; Vogels, et al. [188]
Nitrogen content | Belgium, Bulgaria | Multivariate regression | 65–90%, RMSE = 0.25 | Clevers, et al. [185]; Dimitrov, et al. [169]
Real-time crop monitoring | South Africa | Cloud-based computing (Google Earth Engine) | – | Hiestermann, et al. [77]
Smallholder crop monitoring | Mali, Ethiopia | Supervised pixel-based, object-based | 80–94% | Lambert, et al. [182]; Vogels, et al. [137]
Soil properties | Spain, France, USA | Multivariate analysis, neural network, trapezoid model | 64–88% | Gao, et al. [139]; El Hajj, et al. [202]; Sadeghi, et al. [203]
Biophysical parameter estimates | France, Spain, Bulgaria | Neural networks (NN), support vector regression (SVR), kernel ridge regression (KRR) and Gaussian processes regression (GPR) | RMSE = 0.1–0.2 | Upreti, et al. [193]; Xie, et al. [192]; Dimitrov, et al. [169]
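Cloud-based computing with Google Earth Engine appears in several rows of Table 6 (cropland extent, real-time crop monitoring). The sketch below outlines how such a workflow can look with the Earth Engine Python API: build a cloud-filtered Sentinel-2 median composite, sample it at labelled reference points and classify it with a random forest. The study-area coordinates and the reference-point asset path are hypothetical, and the snippet requires an authenticated Earth Engine account.

```python
# Cloud-based cropland mapping sketch with the Earth Engine Python API (hypothetical inputs).
import ee
ee.Initialize()

aoi = ee.Geometry.Rectangle([27.8, -13.1, 28.4, -12.6])          # hypothetical study area
bands = ['B2', 'B3', 'B4', 'B8', 'B11']

composite = (ee.ImageCollection('COPERNICUS/S2_SR')
             .filterBounds(aoi)
             .filterDate('2019-01-01', '2019-12-31')
             .filter(ee.Filter.lt('CLOUDY_PIXEL_PERCENTAGE', 20))
             .median()
             .select(bands))

points = ee.FeatureCollection('users/example/crop_reference')    # hypothetical labelled points
samples = composite.sampleRegions(collection=points, properties=['class'], scale=10)

classifier = ee.Classifier.smileRandomForest(200).train(
    features=samples, classProperty='class', inputProperties=bands)
cropland_map = composite.classify(classifier)
print(cropland_map.bandNames().getInfo())
```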
Table 7. Summary of urban and natural hazard applications of Sentinel-2 imagery.

Application | Specific Application | Country | Methods | Accuracy | Reference
Urban | Urban expansion | Brazil, China, Tanzania, Kenya | Spectral indices, RF | 75–92% | Gombe, et al. [212]; Ng, et al. [148]; Iannelli, et al. [216]; Tavares, et al. [204]
Urban | Urban extent | China, Brazil | Fusion | 83% | Iannelli, et al. [216]; Tavares, et al. [204]
Urban | Rural-urban transition | Ghana | Principal components analysis (PCA) | – | Møller-Jensen [206]
Urban | Informal settlements | South Africa | Cloud-based computing (Google Earth Engine) | – | Gibson, et al. [207]
Urban | Urban surface water | China, Macedonia | Pixel-based/OBIA | 80–92% | Yang, et al. [217]; Yang, et al. [208]; Sekertekin, et al. [218]
Urban | Urban climate | France, Germany | Canonical correlation forests | 69–75% | Qiu, et al. [219]
Urban | Urban change | France | Convolutional neural networks (CNN) | 60–91% | Daudt, et al. [220]
Urban | Urban ecosystem/forest/green space | Slovakia, Switzerland | SVM, maximum likelihood | 73–90% | Haas, et al. [140]; Recanatesi, et al. [221]
Urban | Urban heat island | Lebanon, France, Germany | MLC, neural network | 82–84% | Kaloustian, et al. [222]; Qiu, et al. [223]; Chunping, et al. [205]
Natural hazards | Floods | Spain, Mozambique | Spectral indices and OBIA | 64–85% | Caballero, et al. [134]; Phiri, et al. [41]
Natural hazards | Droughts | Germany, South Africa | Spectral mixture analysis, Biome-BGC simulations | 73–82% | Munyati [171]; Dotzler, et al. [172]
Natural hazards | Earthquakes | New Zealand, France | Cross-correlation | RMSE = 0.025–0.20 | Kääb, et al. [213]; Jelének, et al. [215]; Stumpf, et al. [224]
Natural hazards | Volcanic eruptions | Saunders Island, Germany | Correlation, visual assessment, convolutional neural network (CNN) | RMSE = 0.03 | Gray, et al. [214]; Valade, et al. [225]
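The urban surface water entries in Table 7 are largely index based. The sketch below applies the McFeeters NDWI, (Green - NIR) / (Green + NIR), to a two-band Sentinel-2 subset; the file name and the 0.2 threshold are illustrative assumptions rather than values taken from the cited studies.

```python
# NDWI-based surface water masking from Sentinel-2 green and NIR bands (hypothetical file).
import numpy as np
import rasterio

with rasterio.open("s2_green_nir_10m.tif") as src:     # hypothetical 2-band file: B3, B8
    green = src.read(1).astype(float)
    nir = src.read(2).astype(float)

ndwi = (green - nir) / (green + nir + 1e-9)
water_mask = ndwi > 0.2                                # illustrative threshold
print(f"water pixels: {water_mask.sum()} of {water_mask.size}")
```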
