Review

Technology and Data Fusion Methods to Enhance Site-Specific Crop Monitoring

1 Department of Agricultural, Environmental and Food Sciences, University of Molise, 86100 Campobasso, Italy
2 Department of Agricultural and Biosystems Engineering, University of Kassel, 37213 Witzenhausen, Germany
* Authors to whom correspondence should be addressed.
Agronomy 2022, 12(3), 555; https://doi.org/10.3390/agronomy12030555
Submission received: 16 December 2021 / Revised: 18 February 2022 / Accepted: 21 February 2022 / Published: 23 February 2022
(This article belongs to the Special Issue Crop Yield Prediction in Precision Agriculture)

Abstract: The digital farming approach merges new technologies and sensor data to optimize the quality of crop monitoring in agriculture. The successful fusion of technology and data is highly dependent on the parameter collection, the modeling adoption, and the technology integration being accurately implemented according to the specified needs of the farm. This fusion technique has not yet been widely adopted due to several challenges; however, our study here reviews current methods and applications for fusing technologies and data. First, the study highlights different sensors that can be merged with other systems to develop fusion methods, such as optical, thermal infrared, multispectral, hyperspectral, light detection and ranging (LiDAR), and radar. Second, data fusion using the internet of things is reviewed. Third, the study shows different platforms that can be used as a source for the fusion of technologies, such as ground-based (tractors and robots), space-borne (satellites) and aerial (unmanned aerial vehicle) monitoring platforms. Finally, the study presents data fusion methods for site-specific crop parameter monitoring, such as nitrogen, chlorophyll, leaf area index, and aboveground biomass, and shows how the fusion of technologies and data can improve the monitoring of these parameters. The study further reveals limitations of the previous technologies and provides recommendations on how to improve their fusion with the best available sensors. The study reveals that, among the different data fusion methods, sensors, and technologies, the airborne and terrestrial LiDAR fusion method for crop, canopy, and ground data may be considered a futuristic, easy-to-use, and low-cost solution to enhance the site-specific monitoring of crop parameters.

1. Introduction

Crop monitoring supports management techniques to optimize agricultural production according to field parameters [1]. The role of crop monitoring is to improve the quality of production and to reduce the environmental impact by optimizing the management strategies. Similarly, agricultural technology efficiently monitors crops to classify crop type, plant status, growth, and growth parameters in real-time conditions [2]. In these conditions, technology and data fusion support a wide variety of methods for analyzing crop traits over time in spatially explicit and nondestructive ways [3].
Crop production per unit area can be substantially improved to meet future crop demand and thus increase food security on a global scale. This can be achieved by optimizing site-specific resources using precision agriculture methods. Research shows that current methods are inadequate for the management of crop nutrients in arable lands [4]. Due to spatial variation within fields, crop monitoring remains suboptimal and consequently limits crop growth and development, reduces yield, increases management costs [5,6,7], causes a drop in net economic income, and increases environmental risks. Therefore, there is a need for an improved site-specific crop monitoring solution capable of addressing such serious agronomic, environmental, and economic issues.
These issues can be addressed by the proper estimation of crop status, which depends on different factors, such as climate, soil, crop nutrients, growing cycles, pests, and diseases. Crop status and plant health indirectly influence productivity, which is the focus of the current study. Nitrogen (N), chlorophyll, leaf area index (LAI), and aboveground biomass (AGB) are some of the most important factors for monitoring crop status and health.
The management of N to meet production targets is difficult, as the crop response to N is highly variable due to changes in soil properties [8]. The potential to simultaneously estimate N dynamics and environmental damage could be a significant step toward increasing crop production [9,10,11,12,13]. Chlorophyll pigment is a major parameter for quantifying the photosynthetic rate and productivity [14]. The retrieval of chlorophyll information relies on the relationship between spectral reflectance and biochemical contents [15]. While chlorophyll pigment largely governs the spectral reflectance of leaves and canopy, other parameters, such as canopy architecture, the spatial distribution of vegetation components, LAI, and background reflectance, also contribute to it [14,16,17]. LAI is an important index of crop growth conditions that supports water regulation and yield prediction [18]. Therefore, accurate estimation of LAI can be used to precisely forecast crop-related factors, such as the crop coefficient (kc) and crop yield factor (ky) [19]. AGB content is strongly correlated with crop yield. Accurate and timely monitoring of AGB is important for developing site-specific management strategies [20]. Due to this direct correlation, AGB is of high importance to crop production during the crop growth stage [21].
Crop monitoring in precision farming is segmented into technologies (e.g., guidance system, remote sensing, and variable rate technology (VRT)), solutions (e.g., hardware, software, and services), and applications (e.g., yield monitoring, field mapping, and crop scouting).
Crop monitoring can be supported by multiple sensors that estimate environmental features, plant canopy, and leaf indices using the internet of things (IoT). Given the huge volume of data collected, such techniques are required to structure the data so that crop growing conditions and development can be analyzed. Moreover, big data fusion has experienced significant progress, and when used for agriculture, it can have a high impact on crop monitoring. For this reason, different multi-sensor and IoT-based big data fusion techniques have been used in agriculture for crop monitoring purposes [22].
Monitoring tools using IoT and big data have led to extensive research in the field of digital agriculture. Studies have proposed big data methods for agriculture mainly based on one of several data sources, such as IoT [23]. Other studies have addressed different crop monitoring problems, particularly acquisition techniques using multispectral, thermal, hyperspectral and satellite sensors [24].
Precision agriculture challenges require comprehensive solutions through advanced technologies, and Agriculture 5.0 is considered to be the paradigm for the initial phase of the 21st century. Agriculture 5.0 is the implementation of precision agriculture systems and equipment consisting of intelligent operations and autonomous decision support systems; it is therefore considered the implementation of robots and artificial intelligence (AI) [25]. Generally, crop fields require substantial labor for harvesting and high productivity. However, society has changed, with more people now living in urban environments, and fields are consequently facing labor shortages. A potential solution is the implementation of Agriculture 5.0 tools, such as robots and sensors integrated with AI features. These systems replace part of the labor workforce and manage crop growth and harvest at a more reliable pace [26].
An important method for crop monitoring is the use of optical sensors [27]. Optical sensors are positioned either in contact with or near the crop [27,28]. They provide indirect, nondestructive, less laborious, and nutrient-efficient methods for crop monitoring [29]. However, they are sensitive to weather conditions, are time consuming, and provide low temporal resolution. Thermal infrared (TIR) sensing captures variation in leaf and canopy dynamics for leaf temperature estimation using thermal imagery [30]. It is a noncontact, less laborious, and nondestructive method; however, the system shows inaccuracies in yield estimation, crop water stress, and plant growth determination [31,32,33,34]. A multispectral system integrated with visible bands collects digital imagery and supports crop monitoring analysis [34]. However, the system only uses qualitative data, and within-field color variations do not allow accurate quantitative estimation of crop parameters. Multispectral sensors are widely adopted for deriving crop vegetation indices (VIs) to predict crop yield [35,36]. However, the system is not able to capture the rapid variations that occur in leaf chlorophyll [37]. Light detection and ranging (LiDAR) reconstructs the crop as 3D images [38] and supports the assessment of geometrical characteristics and water and nutrient management [38,39,40,41,42,43]. However, LiDAR has not been extensively tested for crop studies, which may lead to problems in canopy estimation [44], higher costs, and complexity in data management under diverse weather conditions [45]. In this respect, synthetic aperture radar (SAR) efficiently delivers crop geometrical properties using its highly sensitive microwave signals under all weather conditions [46]. However, topsoil moisture estimation is not always accurately acquired [47]. The latest research shows that fusion of diverse sensor systems to acquire crop and canopy structure, texture, and spectral and thermal information significantly improves the data related to plant trait determination in diverse agricultural systems [48].
These systems provide efficient data when integrated on ground-based platforms, such as tractors. Studies have been performed on autonomous tractor systems for video surveillance and mapping. Klaina et al. [49] studied the routing protocols for semi-autonomous tractors in real-time conditions. An autonomous multi-tractor system was analyzed by Vasconez et al. [50], Zhou et al. [51], and Wang et al. [52]. These systems allow for the installation of multiple subsystems, such as GPS, sensors, and computer vision systems. However, studies on autonomous and semi-autonomous tractors for crop monitoring are limited. Robots, in turn, are designed for guidance, detection, action, and mapping [53] and for improving field resource efficiency. However, robot GPS has low accuracy due to signal blockage or occasional multipath reflection. Space-borne platforms such as satellites (Sentinel-2/Sentinel-1 10–60 m, Landsat 30 m and MODIS 250 m) provide real-time crop information and agrometeorological data for global food production estimates [54]. They provide early warnings for crop monitoring and food supply by assessing crops, farming activities, and rural developments. However, their coarse spatial resolution does not optimally classify the crops within mixed pixels [55] and thus introduces errors. Unmanned aerial vehicles (UAVs) provide high spatiotemporal resolution, cost-efficiency, and flexibility for crop monitoring at fine scales [2,3]. However, UAV-collected imagery may contain a degree of geometric distortion, inaccurate radiometric estimation, and effects of environmental conditions on the quality of derived data, such as vegetation indices [56].
Technology and data fusion provide more accurate and robust analysis: through data fusion, several data sources are joined to create more useful and accurate results. There are several methods to perform data fusion, such as the installation of field sensors for crop parameter and moisture estimation. The objective of this study is to present the latest data fusion methods for the efficient fusion of model data with different technologies. Considering this, the study highlights digital farming technologies and establishes a connection to show how a developed system (with technology and data fusion) in turn influences crop monitoring.
This article is structured as follows: Section 2 highlights the best sensors for crop monitoring. Section 3 reviews artificial intelligence and internet of things (IoT) data fusion. Section 4 presents platforms with new fusion methods for ground-based, space-borne and aerial platforms. Section 5 shows data fusion methods for site-specific crop monitoring of N, chlorophyll, LAI, and AGB. Finally, Section 6 concludes the study and points out the current challenges, opportunities, and future prospects of technology and data fusion methods.

2. Sensor Types

The latest developments in sensor technologies have supported in-season crop monitoring [57]. Studies performed over the past few years have improved the potential of sensor data acquisition to inform agronomic management decisions. A single sensor is not always able to provide satisfactory results for these decisions; this can instead be achieved by multiple sensors whose complementary functions improve and provide the required results. This section provides an overview of the most important sensors, such as optical, thermal infrared, multispectral, hyperspectral, LiDAR, and radar (Table 1). Furthermore, their limitations and possible solutions are also discussed.

2.1. Optical Sensors

Optical sensors measure crop reflectance at specific wavelengths within a short distance and identify the crops at a large scale. They analyze crop attributes and estimate crop production during the harvest. Optical sensors, such as OptRx [58], ClorofiLOG [59], Dualex [27], Multiplex [60], Isaria, Crop Circle [61], SPAD-502 [10], CropSpec [62], GreenSeeker [63], and ALS-2 N [10], provide crop monitoring under day and night conditions [64], as shown in Table 1. The method of analysis involves collecting a large quantity of recordings per field with many measurements per second [65]. These recordings are analyzed to collect data from different crops, such as Gossypium arboreum (L.) [66], Triticum aestivum (L.) [67], Triticale rimpaui (Wittm.) Muntz [68], Zea mays (L.) [69], Hordeum vulgare (L.) [70], Saccharum officinarum (L.) [64], and Solanum tuberosum (L.) [71].
Optical sensors have been successfully implemented for N prediction in a variety of crops, such as T. aestivum [27], S. tuberosum [63], Oryza sativa (L.) [72], Z. mays [73], Capsicum annuum (L.) [74], and Vitis vinifera (L.) [75].
Optical sensors are used to estimate leaf chlorophyll [76]. Detailed studies were performed by Dong et al. [77], Zhang et al. [78], Latifah et al. [69] and Jiang et al. [31] on T. aestivum, Z. mays, O. sativa, and Glycine max (L.) Merr. Studies by Dong et al. [76] and Canata et al. [64] for estimating chlorophyll reported results similar to those of Latifah et al. [69] and Jiang et al. [31]. However, Dualex and Isaria sensors are not able to classify crops due to the lack of classification features [77] and to soil type and luminosity effects [67]. In addition, these sensors experience multiple problems, such as data damage and redundancy, interference, and data generation differences [78,79].

2.2. Thermal Infrared Sensors

Thermal infrared sensors are a widely adopted technology for crop monitoring. They have shown good results in crop and parameter monitoring (Table 1) and, in particular, deliver nondestructive estimation. Thermal infrared sensors may be manually operated devices [80], controlled through mobile systems [81], or installed in aircraft [82], satellites [83] or UAVs [84]. Aircraft- and satellite-installed thermal infrared sensors can estimate crop parameters remotely. They can spatially map crop parameters in a thermal image to estimate delicate and heterogeneous features of the crop [85]. The imagery data are composed of pixel arrays representing the pixel resolution of the sensor. Crop parameter observation with thermographic features allows thermal sensing to be used for the estimation of particular features [86,87,88,89,90], the estimation of individual plant temperatures [91], and the development of variable rate irrigation strategies [92,93,94]. However, these applications do not generally provide high spatial or temporal resolution [95]. The reduced resolution of the sensor and the distance to the target considerably diminish the quality of the data [96,97]. Thermal infrared sensors are sensitive to atmospheric conditions. Despite this sensitivity, they offer high potential for precision agriculture, especially in irrigation strategies, to estimate the right time, the right place and the right quantity of water to apply in order to minimize the quantity of water used per unit yield [98].
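A minimal sketch of how canopy temperature from a thermal image can be turned into an irrigation-relevant indicator is shown below. It uses the empirical crop water stress index (CWSI), a common formulation in the thermal remote sensing literature rather than a method from the reviewed studies; the temperature values and the wet/dry reference surfaces are purely illustrative.

```python
import numpy as np

def crop_water_stress_index(canopy_temp, t_wet, t_dry):
    """Empirical CWSI: 0 = well watered, 1 = fully stressed.

    canopy_temp : 2D array of canopy temperatures from a thermal image (deg C)
    t_wet       : temperature of a fully transpiring (wet) reference surface
    t_dry       : temperature of a non-transpiring (dry) reference surface
    """
    cwsi = (canopy_temp - t_wet) / (t_dry - t_wet)
    return np.clip(cwsi, 0.0, 1.0)  # bound to the physically meaningful range

# Example: a hypothetical 3x3 thermal patch with wet/dry references of 24 and 36 deg C
thermal_patch = np.array([[27.5, 28.1, 30.2],
                          [29.0, 31.4, 33.8],
                          [26.9, 28.7, 32.5]])
print(crop_water_stress_index(thermal_patch, t_wet=24.0, t_dry=36.0))
```

Pixels with high CWSI would then be flagged for site-specific irrigation, which is one way the "right time, right place, right quantity" objective can be operationalized.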

2.3. Multispectral Sensors

The launch of satellite missions by governments and commercial companies has resulted in further progress and shorter revisit periods for multispectral systems [99]. The development and use of lightweight and compact multispectral sensors was rapidly followed by their wide adoption for the collection of high-spatial-resolution data [100]. A multispectral camera array (Parrot SA France, MicaSense, and Tetracam, among others) contains standalone cameras for the efficient acquisition of reflected radiation from the target source [101]. The data collected are then processed to deliver a single compound image with different spectral bands. This system is exclusively developed for crop parameter monitoring and offers benefits such as low cost, simple solutions, and reliable software choices [102]. Multispectral satellite sensors, such as the RapidEye, Sentinel-2 and WorldView-2 series, are considered a trade-off between spatial detail and the operational support delivered by remotely sensed data. Dhau et al. [103] applied this approach and tested whether field spectrometry measurements resampled to various multispectral sensor resolutions can be used in maize production.

2.4. Hyperspectral Sensors

Hyperspectral sensors are moderately adopted for simplifying crop monitoring methods [104]. In a hyperspectral system, the camera acquires data at many more wavelengths than the three commonly measured broad bands, covering ranges such as 400–1300, 400–2350, and 1400–2350 nm [105]. Due to decreasing technology prices, the method is now economical and has thus become accessible to users globally. New methods have been introduced that push the limits of hyperspectral imaging systems for broad-range analysis. Hyperspectral imaging technology has the capacity to identify small changes in plant growth and development. To accurately derive crop parameter data, the basic problem of atmospheric correction must first be solved, since the estimated spectral radiance includes both the radiance emitted by the source and thermal radiation produced by the surroundings that is reflected from the surface of the source. Thus far, hyperspectral estimation for crop monitoring has gained little attention. New hyperspectral systems are therefore needed to address these limitations by using narrowband measurements, which provide precise emissivity retrieval and, therefore, better parametric estimations compared to thermal cameras [106].
Hyperspectral sensors with hundreds of bands can collect a wide variety of spectral responses and capture subtle variations of ground covers and their fluctuations over time. Thus, they can be used to address challenges such as spectral resolution and to support accurate and timely estimation [107]. Studies have also shown a moderate performance of hyperspectral data in monitoring crop vegetation parameters, such as LAI [108], crop types [109] and biomass [110]. Hyperspectral sensors have been tested comparatively less in agricultural applications due to the high cost of the system and of related applications [111]. Terrestrial hyperspectral data can be reliably estimated with the help of a spectroradiometer, which has already been used in many studies for observing canopy- and leaf-level spectral characteristics [112]. However, such terrestrial estimations are limited to a small number of samples and are not able to capture spatial distribution in large trials.

2.5. LiDAR Sensors

LiDAR detects signals from the crop, canopy, and ground. It is capable of penetrating the vegetation canopy, with the goal of measuring height, terrain, and plant diameters [113]. The LiDAR system estimates plant parameters with high precision in all types of environmental conditions [114]. It accurately maps the 3D coordinates of a plant, in contrast to two-dimensional data [115], as shown in Table 1.
The airborne LiDAR system provides optimal results for estimating crop parametric data. Various features of LiDAR have proven to be highly accurate in field tests. Fang et al. [114] studied the ratio of the sum of the intensity of ground returns to the sum of the intensity of total returns for the estimation of crop parameters. The results showed significant values (R2 = 0.610) with accurate estimation [116]. However, the type of data and the management methods interrupt this accuracy when there is a high amount of variation. Overestimation was also noted where vegetation is dense. Weiss et al. [117] reported that the accuracy of the system was higher for some parameters and lower for others. This problem can be addressed by analyzing the effects of occlusion by dense vegetation. Another major problem of the system is inaccurate estimation of the model in time and space. This creates an urgent need to optimize processing methods for new geographic locations and systems based on an understanding of the effects on particular crop parameters.
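As an illustration of the intensity-ratio feature described above, the following sketch computes the ratio of summed ground-return intensity to summed total-return intensity from a classified point cloud. It is not the processing chain of Fang et al. [114]; the point intensities, class codes, and the LAS ground-class convention used here are assumptions made for the example.

```python
import numpy as np

def ground_intensity_ratio(intensity, classification, ground_class=2):
    """Ratio of summed ground-return intensity to summed total-return intensity.

    intensity      : 1D array of per-return LiDAR intensities
    classification : 1D array of point classes (2 = ground in the LAS convention)
    """
    intensity = np.asarray(intensity, dtype=float)
    ground = intensity[np.asarray(classification) == ground_class]
    total = intensity.sum()
    return ground.sum() / total if total > 0 else np.nan

# Hypothetical point cloud: mostly canopy returns with a few ground hits
intensities = [120, 95, 60, 180, 150, 40, 30]
classes     = [  1,  1,  2,   2,   1,  2,  1]   # 2 = ground
print(ground_intensity_ratio(intensities, classes))  # fraction of energy reaching the ground
```

A low ratio indicates a dense, closed canopy, which is also where the overestimation and occlusion problems noted above are most likely to occur.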

2.6. Radar Sensors

Radar sensors use microwave signals that have the potential to collect crop data in heavy clouds and/or haze conditions. The SAR estimates crop parameters and biophysical status using backscattered signals, the accuracy of which depends on sensor characteristics, such as wavelength, incident angle, and polarization. The collection procedure is based on the following four methods: (1) SAR backscattering technique; (2) SAR polarimetry technique; (3) SAR interferometry technique; and (4) polarimetry SAR interferometry technique. The main principles, challenges and solutions of each technique are given in the following section.

2.6.1. SAR Backscattering Technique

SAR backscattering data show the strength of the signal returned from the source, which depends on the source characteristics. The acquired data are recorded as digital numbers, which are then processed into backscattering measurements for analysis; this process is known as radiometric calibration. A major benefit is that the estimation method can be easily transferred to another sensor [118]. However, this transfer from one sensor to another requires the same system model and is not compatible with all models. For continuous and accurate calibration without interruption, there is a need to develop an application that is compatible with all models.
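The conversion from digital numbers to backscattering measurements can be sketched as below, assuming the generic calibration form sigma0 = DN²/K. The calibration constant and digital numbers are illustrative only; real products require the sensor-specific calibration metadata rather than this simplified form.

```python
import numpy as np

def dn_to_sigma0_db(dn, calibration_constant):
    """Convert SAR digital numbers to calibrated backscatter (sigma naught, dB).

    Follows the generic form sigma0 = DN^2 / K, then converts to decibels.
    The calibration constant K is sensor- and product-specific.
    """
    dn = np.asarray(dn, dtype=float)
    sigma0_linear = dn ** 2 / calibration_constant
    return 10.0 * np.log10(np.maximum(sigma0_linear, 1e-10))  # avoid log(0)

# Hypothetical digital numbers from a SAR intensity image and an illustrative constant
dns = np.array([[310.0, 287.0], [402.0, 198.0]])
print(dn_to_sigma0_db(dns, calibration_constant=4.5e5))
```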

2.6.2. SAR Polarimetry Technique

The SAR return signal from the source depends on various properties, such as crop density, moisture availability, canopy structure, and soil moisture. Advantages of using SAR polarimetry include the delivery of large amounts of quantitative data in single and dual polarization among all four linear polarization combinations. Several decomposition methods for estimating scattering have been proposed. Coherent decomposition is beneficial for analyzing deterministic targets and cannot be applied to natural distributed targets, whereas incoherent decomposition is considered more suitable for crop monitoring purposes; its scattering components provide canopy orientation, structure, and crop moisture data [46]. To address these issues, Gella et al. [119], Cui et al. [120], Adrian et al. [121] and Verma et al. [122] mapped crop types using time series of SAR imagery, phenological data, and time analysis for Euclidean and angular distance estimation.

2.6.3. SAR Interferometry Technique

The SAR interferometry technique is based on a normalized cross-correlation of two complex signals. The benefit of this method is the high interferometric coherence for stable features, such as settlements, which is low for unstable features, such as irrigation water. In the past, this technique was adopted for digital elevation model production and surface movement analysis, while researchers have since shown the unique potential of SAR interferometry, which is currently being used in data collection for a diverse set of crop parameters, including plant height, biomass density, and surface irrigation, among others [123].

2.6.4. Polarimetry SAR Interferometry Technique

The polarimetric SAR interferometry technique estimates crop models using vegetation water content, which is derived from the collected imagery [124]. Polarimetric SAR data are beneficial for identifying and separating the scattering mechanisms of natural targets, while SAR interferometry data are sensitive to vertically distributed parameters, allowing data collection with high precision. Polarimetric SAR interferometry uses both polarimetry and interferometry to deliver sensitivity to the vertical structure of the source target. The technique significantly extends the application capacity of SAR analysis [125].

3. Artificial Intelligence and Internet of Things (IoT) Data Fusion

The fusion of artificial intelligence and the internet of things (IoT) can be a significant approach to address the issue of crop monitoring. A wireless sensor network (WSN) is an infrastructure for artificial intelligence and IoT systems that can be used to monitor crop parameters and manage irrigation properly, while an intelligent system can indicate when to initiate harvest. The fusion of both pieces of information helps in making crop and crop parameter-related decisions. It can therefore suggest to farmers the precise quantity of water for irrigation and avoid wasting irrigation water due to poor irrigation management and strategies [136].
An efficient IoT system for field conditions, such as agricultural fields, consists of low-cost devices with a series of limitations, i.e., small batteries, low processing power, and limited storage. Moreover, it needs to deal with sensor data errors and low accuracy caused by the atmosphere, weather, traditional sensors, communication problems, and data noise. Data fusion can help address these issues, as it is one of the most widely accepted methods for improving sensor accuracy, providing a global perspective, and delivering highly accurate decisions [137].
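One simple data fusion scheme that such an IoT system might apply to redundant, noisy field sensors is inverse-variance weighting, sketched below. The soil moisture readings and error variances are hypothetical, and this is only one of many fusion options discussed in [137].

```python
def fuse_readings(readings, variances):
    """Inverse-variance weighted fusion of redundant noisy sensor readings.

    readings  : list of measurements of the same quantity (e.g., soil moisture, %)
    variances : list of each sensor's error variance (smaller = more trusted)
    Returns the fused estimate and its variance.
    """
    weights = [1.0 / v for v in variances]
    fused = sum(w * r for w, r in zip(weights, readings)) / sum(weights)
    fused_variance = 1.0 / sum(weights)
    return fused, fused_variance

# Three hypothetical soil moisture probes reporting on the same plot
readings  = [23.5, 25.1, 24.2]   # volumetric water content, %
variances = [1.0, 4.0, 2.0]      # probe 1 is the most accurate
estimate, var = fuse_readings(readings, variances)
print(f"fused moisture = {estimate:.2f} % (variance {var:.2f})")
```

The fused variance is always smaller than that of the best single sensor, which is the basic reason data fusion improves accuracy despite low-cost, error-prone devices.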
Different terms, such as artificial intelligence fusion, sensor fusion and data fusion, are used to describe particular fusion characteristics, yet these terms are often used interchangeably [138]. The concepts are widely accepted but may vary depending on the study. Nakamura et al. [138] defined them as the combination of multiple sources to achieve an improved form of data. This study adopts this definition of sensor and data fusion and reviews how it improves data quality and the decision-making process.

4. Data Fusion by Platforms

Varying the scale of platforms allows for precise monitoring and options for fusion methods in field trials. These platforms include ground-based [139], space-borne [140] and aerial [3] platforms, which are outlined in the following subsections.

4.1. Ground-Based Platforms

Tractors are an important part of automated crop monitoring and can operate accurately for longer durations than a manual resource. Using a tractor, the operator is able to oversee both the vehicle and the harvester. The fusion of machine vision with the tractor could potentially improve its guidance capacity, as shown in Table 2. Its long range provides support for rapid decision making before a difficult turn has to be made. This gives the tractor the ability to rapidly detect variations along its way [141]. However, the challenge of detecting centimeter-level variation at a close-range path boundary or an obstacle is still present. The fusion of LiDAR is useful in addressing minute variations; however, noisy data are generated if there are slight differences at the path boundary. Therefore, the fusion of an autonomous vehicle guidance system with machine vision is a good option for improving the quality of collected data.
An important aspect of automatic guidance is the ability to recognize the path between crop rows. The fusion of machine vision with the autonomous vehicle and its GPS is a good option for improving the guidance system. As crop rows are established under field conditions, a significant step toward this objective is the development of a row recognition system. However, a weak GPS signal may cause loss of track, particularly where plant or tree density is high. To address this problem, fused data provided by an inertial measurement unit (IMU) are used to reinforce the GPS signal over short periods (Table 1). A few drawbacks, such as limited estimation potential, charging issues, and weather restrictions, still need to be addressed. The fusion of stationary platforms with the autonomous vehicle could provide high precision, resolution, and calibration [142]; however, in general, such a system has a high cost and only allows operations over a limited area. Therefore, movable systems are considered the most efficient solution but are not compatible with all crops and environments.
A study conducted by Benet et al. [143] reported the fusion of a robot vehicle with other technologies for the improvement of crop monitoring. A ranging device, an inertial measurement unit (IMU) device, and a color camera were used in fusion mode to better understand the natural positions of the objects in real-time field conditions. Soil variations impacted autonomous navigation in the field, while the IMU data supported the precise positioning of the robot. A fusion method is applied between the corrected points and a color camera in order to recognize every single point [143].
The fusion method was performed between a ranging device and a color camera in different tests to estimate tree trunks and to guide the autonomous vehicle in the field [144]. The method improved the detection of the source target with the combination of a diverse and colorful illustration, delivered in the form of images.
The fusion of a 3D laser scanner mounted on a robot arm provides better flexibility in crop monitoring. However, data from behind the leaves could not be collected (Table 1) [145], whereas its fusion with a robotic manipulator makes it possible to position the camera at the best viewpoint for collecting data from behind the leaves [146]. This support is, however, further challenged by the dependence of its performance on the software application; the utilization of a suitable software application could address this challenge in a timely manner.

4.2. Space-Borne Platforms

Crop monitoring is a process of systematically analyzing multiple agricultural factors using space-borne and aerial platforms such as satellites and UAVs. These platforms have rapidly transformed traditional agricultural monitoring and have become one of the most widely adopted methods of quickly obtaining information. It is therefore necessary to create new methods to address the remaining issues. This led to the development of the technology and data fusion method, which has thus far shown efficient results [147].

Satellites

Currently, field-based experiments and data collection and analysis are implemented for crop monitoring, but they are laborious, time-consuming, destructive, and are not reliable in large-scale tests [48]. Conversely, satellite monitoring has been extensively applied for crop monitoring. However, limited revisiting frequency and spatial resolution [148] weaken the application of satellite monitoring at fine spatial and temporal scales. Moreover, atmospheric conditions, soil background properties [149], lack of 3D canopy [150], and asymptotic saturation data [151] further restrict its application for crop monitoring, particularly among dense and diverse vegetation crops.
In order to address the saturation issue of the spectral data, the fusion of two broadband vegetation indices (VIs) with three spectral bands, narrow spectral bands, or red-edge spectral bands could be a potential solution for improvement [8,152,153], as shown in Table 2. Moreover, the fusion of microwave satellite-based metrics with a variety of VIs of three spectral bands has also been tested for reducing saturation effects over dense and diverse vegetation coverage [154].
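The saturation argument can be illustrated with a short sketch comparing a broadband index (NDVI) with a red-edge index (NDRE) computed from Sentinel-2-like reflectances; the band values are invented and the indices are standard formulations, not the specific fusion models of [8,152,153,154].

```python
import numpy as np

def ndvi(nir, red):
    """Broadband NDVI, which tends to saturate over dense canopies."""
    return (nir - red) / (nir + red + 1e-9)

def ndre(nir, red_edge):
    """Red-edge NDRE, often less prone to saturation at high biomass."""
    return (nir - red_edge) / (nir + red_edge + 1e-9)

# Hypothetical Sentinel-2 surface reflectances (B4 = red, B5 = red edge, B8 = NIR)
red      = np.array([0.04, 0.03])
red_edge = np.array([0.12, 0.10])
nir      = np.array([0.45, 0.52])
print("NDVI:", ndvi(nir, red))       # both dense-canopy pixels are close to saturation
print("NDRE:", ndre(nir, red_edge))  # retains more contrast between the two pixels
```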
The fusion of satellite data with ranging systems has been used for crop monitoring. The fusion of WorldView-2 imagery with a terrestrial ranging system and the fusion of Gaofen-1 satellite system with the airborne ranging system recorded better accuracy in analyzing crop parameters (Table 2). The airborne ranging system has not been widely tested for crop research. This may be caused by its poor capacity of point density and canopy penetration when applied to dense vegetation in addition to its high costs and related system complications in data collection and analysis [45].
Image fusion is a compelling method because it allows the combination of information from multiple satellites with different spatial, spectral, and temporal resolutions to achieve image products with improved results (Table 2) [155]. Satellite applications benefit from the use of image fusion in many ways, such as higher spatial and temporal coverage of the source target along with improved reliability and robustness. Therefore, this study provides more information on data, analysis and fusion methods to improve crop monitoring results.
Image fusion provides important spectral information, which is utilized for quantitative applications such as changes in reflectance over time. However, some image fusion methods provide high-quality reflectance data. Recently, image fusion methods based on spectral unmixing have shown a greater importance for delivering spectrally consistent merged images, while reducing the problem of pixel fusion (Table 2) [156]. The spectral unmixing technique results in the decomposition of mixed pixels into a collection of pure spectra and a set of fractional abundances that show the amount of each spectrum [157].
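A minimal sketch of linear spectral unmixing is given below: a mixed pixel is expressed as a weighted sum of pure endmember spectra, and the fractional abundances are recovered by least squares. The endmember spectra are invented for the example, and operational unmixing adds constraints (strict non-negativity and sum-to-one) beyond this simplified form.

```python
import numpy as np

def unmix_pixel(pixel_spectrum, endmembers):
    """Estimate fractional abundances of pure spectra (endmembers) in a mixed pixel.

    pixel_spectrum : (bands,) mixed reflectance of one coarse pixel
    endmembers     : (bands, n_endmembers) matrix of pure-class spectra
    Returns abundances normalized to sum to one (non-negativity enforced crudely).
    """
    abundances, *_ = np.linalg.lstsq(endmembers, pixel_spectrum, rcond=None)
    abundances = np.clip(abundances, 0.0, None)
    return abundances / abundances.sum()

# Two illustrative endmembers (crop, bare soil) over four spectral bands
endmembers = np.array([[0.05, 0.20],
                       [0.08, 0.25],
                       [0.45, 0.30],
                       [0.50, 0.35]])
mixed_pixel = 0.7 * endmembers[:, 0] + 0.3 * endmembers[:, 1]
print(unmix_pixel(mixed_pixel, endmembers))   # recovers roughly [0.7, 0.3]
```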
Mixed pixels collected from medium-resolution satellites, such as MODIS (250 m to 1 km) or MERIS (250 m), are analyzed with the spatial distribution of land covers at the finer scale of Landsat (30 m) (Table 2). Similarly, data fusion between Landsat and MERIS-FR was performed by Sisheber et al. [158], Zurita-Milla et al. [159] and Zurita-Milla et al. [160] to improve the spatial resolution of MERIS. Pignatti et al. [19] advanced these methods by analyzing the spatial data of Landsat to improve the MERIS data (Table 2); these data are acquired by classifying the Landsat images to characterize the mixed pixels. The Sentinel-2 and Sentinel-1 satellites provide near-real-time images with moderate spatial (10–60 m) and temporal (1–5 days) resolution. Studies have shown good performance of these data for crop monitoring over different cropping systems and environmental conditions [161]. Moreover, the latest research has reported benefits of fusing Sentinel-2 with Sentinel-1, SAR and Landsat data for improving model quality and performance in crop assessment. The major limitation of these methods is poor spectral variability and characterization. To address this problem, images are fused to improve their class variability (Table 2); however, intraclass variability is removed in this process. Another solution is the utilization of soft clustering, which delivers land cover proportions for every pixel according to the Landsat resolution, resulting in high-quality pixels [162].

4.3. Aerial Platforms

Unmanned Aerial Vehicle (UAV)

The development of unmanned aerial vehicles (UAVs) has improved the application of crop monitoring at fine scales [2,3]. In particular, the fusion of a lightweight UAV with red green blue (RGB) imagery is flexible, simple, and cost-efficient [163]. Its fusion with digital photogrammetry based on the structure from motion (SfM) method provides a detailed 3D canopy, which further leverages its cost-efficiency (Table 2) [164]. Although hindered by insufficient canopy penetration [165], this method provides point clouds with high density and precision when RGB imagery and digital photogrammetry are combined using the fusion method [166].
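A common way to exploit the SfM-derived 3D canopy is to difference a digital surface model against a digital terrain model to obtain a per-pixel canopy height model, as sketched below; the elevation grids are hypothetical and the simple subtraction stands in for the full photogrammetric workflow.

```python
import numpy as np

def canopy_height_model(dsm, dtm):
    """Canopy height = digital surface model minus digital terrain model.

    dsm : 2D array of surface elevations from SfM photogrammetry (m)
    dtm : 2D array of bare-ground elevations on the same grid (m)
    """
    chm = dsm - dtm
    return np.clip(chm, 0.0, None)   # negative heights are treated as noise

# Hypothetical 2x3 elevation grids (metres)
dsm = np.array([[101.2, 101.5, 101.9],
                [101.1, 101.8, 102.3]])
dtm = np.array([[100.4, 100.4, 100.5],
                [100.5, 100.5, 100.6]])
print(canopy_height_model(dsm, dtm))   # per-pixel crop canopy height
```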
The fusion of UAV data with satellite imagery has been applied for crop volume and physiological stress estimation (Table 2) [167]. Moreover, its fusion with the satellite imagery, ranging system, and RGB data significantly develops the estimation accuracy of crop properties [168], as shown in Table 2. However, the UAV-based high-resolution canopy data with the satellite-based diverse spectral data are less tested and need further investigation [167].
The fusion of UAV and satellite imagery could develop the potential of both platforms. Fusing UAV imagery (0.10 m) with WorldView-2 satellite imagery (1.85 m) using the modified hyperspherical color, FuzeGo, Gram–Schmidt, and criteria-based fusion methods provided good results. The criteria-based fusion is considered highly successful in preserving the original color of the images and delivered the highest spatial resolution of all the compared methods. However, the cost of WorldView-2 data hinders their application for many users.
The fusion of UAV data with Sentinel-2A data provides fine crop parametric data (Table 2) [169]. Although research shows that this type of fusion improves data quality, analysis of the resolution at a fine scale remains a challenge, and further development in this direction is needed to establish high-quality fusion methods. The spatial resolution of a UAV depends on the type of sensor and the flight height. During tests, fusion at high spatial resolution takes longer and generates a large amount of data, which ultimately implies the need for more storage space, transmission time, and processing resources.
Many studies have demonstrated the development of UAV workflows using new types of image data, including methodologies for radiometric correction of images or the development of entire processing chains integrating photogrammetric and quantitative approaches [170]. Generally, UAVs offer cheaper applications and higher spatial resolution with high flexibility [171]. The main technology is based on multispectral systems, which provide the individual bands needed for the estimation of broadband vegetation indices [172].
Sentinel-2A data fusion with this system provides the integrated benefits of both technologies (Table 2). The fusion of ultra-high-spatial-resolution texture information with the diverse spectral imagery of Sentinel-2A produces data of the highest quality. Zhao et al. [173] analyzed the fusion of UAV and satellite data and reported that the fusion method provides good results in comparison with single-data methods. The free access to Sentinel-2A, with its high spatial (10 m) and spectral resolution, would certainly encourage its fusion with UAV data. However, in-depth analyses of fusion methods between UAV and satellite data (Table 2) are still lacking. The type of data also affects the fusion method, which in turn affects the crop monitoring values (Table 2).
To obtain high-resolution imagery, the fusion of lower-spatial-resolution data with high-spatial-resolution panchromatic imagery is recommended, while the fusion of SPOT data with L-band PALSAR and C-band RADARSAT data achieves better land cover mapping by exploiting the separate functions of each wavelength. Hong et al. [174] reported a new method for the fusion of high-resolution with moderate-resolution images using an image decomposition technique to minimize color distortion.
The random field method is a fusion method that exploits spatial contextual data for crop classification. The Markov random field (MRF) method was initially tested for image restoration [175] and is now widely used in solving crop classification problems [176]. The Markovian support vector classifier (MSVC) is a new fusion technique that combines the support vector machine (SVM) and MRF methods and delivers high spatial contextual classification accuracy [177]. The SVM is a widely adopted method for the fusion of spatial information [178], which promisingly addresses the problem of noise during crop classification.
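As a sketch of the SVM part of such a classifier, the example below trains a support vector machine on a fused optical-SAR feature stack and predicts crop classes for new pixels. The feature values and class labels are invented, and the spatial-contextual (MRF) smoothing step of the MSVC is not included.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Hypothetical per-pixel feature stack fusing optical VIs with SAR backscatter:
# columns = [NDVI, NDRE, VV backscatter (dB), VH backscatter (dB)]
X_train = np.array([[0.82, 0.41, -9.5, -15.2],   # maize
                    [0.85, 0.44, -9.1, -14.8],   # maize
                    [0.61, 0.28, -11.8, -18.4],  # wheat
                    [0.58, 0.25, -12.3, -18.9],  # wheat
                    [0.15, 0.07, -14.9, -22.1],  # bare soil
                    [0.12, 0.05, -15.4, -22.6]]) # bare soil
y_train = np.array(["maize", "maize", "wheat", "wheat", "soil", "soil"])

# SVM on the fused feature space; an MRF/MSVC step would further smooth labels spatially
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
clf.fit(X_train, y_train)

X_new = np.array([[0.80, 0.40, -9.8, -15.5],
                  [0.14, 0.06, -15.0, -22.0]])
print(clf.predict(X_new))   # expected: ['maize', 'soil']
```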

5. Data Fusion for Site-Specific Crop Parameter Monitoring

A wide variety of data sources collected by modern technology, such as sensor technology and ground-based, space-borne and aerial platform outputs, can be used as supplementary information to enhance a sparsely sampled variable. A data fusion method can exploit the complementary features of different data sources and merge them in a logical and rapid manner [187]. Data fusion can then support improved estimation of site-specific variables, such as N, chlorophyll, LAI, and AGB.

5.1. Nitrogen (N)

Adequate availability of N is important for crop growth, and N is one of the major limiting factors after water [188]. Crop production responds to N up to a threshold; below the required level, deficiency may result in a loss of quality and, in extreme cases, crop death [189]. Puntel et al. [190] tested the fusion of crop growth models while considering variations in soil properties and the uncertainty of future weather conditions. The study found that the crop models provided a 77–81% accuracy rate using real-time and 35 years of historical weather data. They reported that the model provided good estimates of N content with acceptable economic values. Chen et al. [191] and Wang et al. [192] reported the fusion of real-time weather data for a specific crop stage with historical weather data as a new fusion method for exploring uncertainty and making predictions to guide N management decisions (Table 1). Hammad et al. [193] tested the CERES model to estimate plant growth and development and determined the optimal N content under semiarid conditions. Malik et al. [189] used particular integrative functions in the fusion method and improved N monitoring practices. Their results illustrated that farmers could decrease the N input by 30% to 50% under Mediterranean environments. The N monitoring studies performed by many researchers report that the method could be improved by integrating multiple crop modeling strategies that support increasing crop yield and resource-use efficiency and decreasing the negative environmental impacts under current and future climate conditions [194,195,196,197]. Wang et al. [192] provided an optimal basis for the fusion of different methods and showed that the CERES model, out of many, is the best fusion model for improving the N monitoring strategy (Table 2).

5.2. Chlorophyll

Chlorophyll is positively related to the plant photosynthetic rate and productivity and provides important information about the physiological status of plants [198]. Chlorophyll is analyzed using extract-based and digestion or combustion methods, or a fusion of the two [199].
The fusion of a low-cost multispectral sensor using vegetation indices (VIs) and near infrared (NIR) data can precisely estimate the chlorophyll content of a diverse set of crops (Table 1). Hyperspectral sensors offer the additional ability to fuse thousands of spectral bands with each other, which has shown highly effective performance in chlorophyll estimation [200]. Mao et al. [201], Zheng et al. [202], Chen et al. [203], Srivastava et al. [204], and Sun et al. [205] have significantly contributed to the development of chlorophyll estimation methods using hyperspectral sensors integrated with spectral bands (Table 2).
The fusion of images acquired by a UAV carrying different sensors has become a widely accepted trend in recent years because the data fusion of images improves chlorophyll estimation by integrating the benefits of the diverse spectral, spatial, structural and thermal data available from different sensor technologies [205]. For example, the fusion of spectral and LiDAR structural information has been studied for chlorophyll estimation. However, the fusion of spectral index data collected from multispectral sensors and canopy temperature data collected from thermal sensors is considered an improved method. This method can be used for either a single crop or several crops and is considered the best choice for chlorophyll analysis [204].
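A minimal sketch of fusing a spectral index with canopy temperature for chlorophyll estimation is shown below, using a simple multiple linear regression; the calibration data and the linear model form are illustrative assumptions, not the retrieval models of the cited studies.

```python
import numpy as np

# Hypothetical calibration plots: red-edge index, canopy temperature (deg C),
# and laboratory-measured chlorophyll (ug/cm^2). All values are illustrative.
ndre        = np.array([0.22, 0.31, 0.38, 0.45, 0.52])
canopy_t    = np.array([31.5, 30.2, 29.4, 28.1, 27.3])
chlorophyll = np.array([21.0, 28.5, 34.0, 41.2, 47.8])

# Multiple linear regression fusing the spectral and thermal predictors
X = np.column_stack([ndre, canopy_t, np.ones_like(ndre)])
coeffs, *_ = np.linalg.lstsq(X, chlorophyll, rcond=None)

# Predict chlorophyll for a new plot observed by both sensors
new_plot = np.array([0.40, 29.0, 1.0])
print("predicted chlorophyll:", new_plot @ coeffs)
```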

5.3. LAI

LAI is a measure of the photosynthetically active area, which is also the area subject to transpiration. Estimating LAI with the help of remote sensing systems has been a key objective in precision agriculture [206]. The most common method for the estimation of LAI is the fusion of empirical relationships between vegetation indices (VIs) and LAI data (Table 1) [207,208,209,210,211,212]. The fusion of empirical models has been widely adopted for LAI monitoring; however, these models do not perform well in all environmental conditions, as their parameters are limited to particular regions [213,214,215].
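A sketch of such an empirical VI-LAI relationship is given below, using an inverted Beer's-law style formula; the bare-soil NDVI, asymptotic NDVI, and extinction coefficient are placeholders that would have to be calibrated for a specific crop and region, which is exactly the limitation noted above.

```python
import numpy as np

def lai_from_ndvi(ndvi, ndvi_soil=0.10, ndvi_inf=0.92, k=0.60):
    """Empirical LAI from NDVI using an inverted Beer's-law style relation.

    LAI = -(1/k) * ln((NDVI_inf - NDVI) / (NDVI_inf - NDVI_soil))
    All parameters (bare-soil NDVI, asymptotic NDVI, extinction k) must be
    calibrated for the crop and region; the values here are placeholders.
    """
    ndvi = np.clip(ndvi, ndvi_soil, ndvi_inf - 1e-3)
    return -np.log((ndvi_inf - ndvi) / (ndvi_inf - ndvi_soil)) / k

print(lai_from_ndvi(np.array([0.30, 0.60, 0.85])))  # low, medium, high canopy cover
```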
Although the use of remote sensing systems to provide LAI data has been widely adopted, the resulting LAI shows discontinuous patterns in space and time due to clouds, snow, and technological problems. This has restricted the use of LAI in crop monitoring, process simulations, and agricultural research [216]. Methods involving the fusion of images from multiple sensors provide spatial, spectral, and temporal information on LAI (Table 2) [217]. The discontinuity in spatial and temporal data is caused by gaps in remote sensing data. These gaps are addressed by the fusion of multisource remote sensing data of high spatial and temporal resolution, which generates high-spatiotemporal-resolution imagery. The approaches are classified into two categories: the enhanced spatial and temporal adaptive reflectance fusion model (ESTARFM) [218] and the unmixing method. ESTARFM was tested by Bei et al. [219], Tao et al. [220] and Zeng et al. [221], who all reported the enhancement of the fusion method for complex heterogeneous regions. Bei et al. [219] tested the fusion of the spatial and temporal adaptive reflectance fusion model (STARFM) with ESTARFM for different spatiotemporal features.
The unmixing method is the process in which coarse-resolution images are disaggregated. However, the disaggregation of coarse images cannot be applied in all conditions due to the spatial variability of surface reflectance. Several studies have reported that the solution is to analyze the spectral properties of land covers that show little similarity [222].

5.4. AGB

AGB is the total quantity of plant-based living and dead organic matter. Remote sensing systems are widely adopted to estimate AGB rapidly. The fusion of images acquired from a UAV using different sensors has shown efficient performance because data fusion improves the acquisition process by merging the benefits of the different spatiotemporal, spectral, and thermal information of the sensing systems (Table 1). Spectral and LiDAR data fusion was tested for AGB estimation [192]; spectral indices from RGB images were acquired for AGB estimation [223,224,225,226,227]. These studies showed that data fusion significantly improved the estimation [228,229,230,231,232] and addressed saturation and disaggregation problems in acquiring AGB, particularly for higher-density biomass [233]. Marino and Alvino [234,235] used the fusion of data from several vegetation indices for dynamic monitoring and classification of crop traits. Studies showed that the fusion of SAR with optical sensor data improves AGB estimation, as optical data offset the saturation impact. Therefore, AGB monitoring has improved as a result of the latest developments in satellite systems [236,237,238,239]. However, SAR works for a particular range of AGB, beyond which the system provides coarse data [240]. To address this problem, the fusion of multisource data (optical images) and SAR is suggested to improve the accuracy of AGB estimation (Table 2) [241,242]. This process is based on two methods: (i) the fusion of SAR with optical image data to generate a new dataset, for example via principal component analysis (PCA) [243], and (ii) the fusion of all the imagery data together [242]. However, this fusion method is still undergoing various developmental phases. Yu et al. [244], Choudhury et al. [245], Che et al. [246], Choudhury et al. [247], Baath et al. [248], Cooper et al. [249], and Vahtmäe et al. [250] tested the fusion of hyperspectral and multispectral data collected by different sensors to achieve images of high spatial and spectral resolution. Thus, the best solution for highly accurate estimation of AGB is the fusion of Sentinel-1 SAR and Sentinel-2A imagery, and it has been reported that this fusion produced a significant trend for the estimation of AGB [245,246]. However, studies on the fusion of ALOS-2, PALSAR-2, and Sentinel-2A data for estimating AGB are not yet fully established. Therefore, further research on the fusion between SAR products is needed to collect more data and reach logical conclusions. A minimal sketch of the PCA-based fusion idea in method (i) is given after this paragraph.
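As a sketch of method (i), the example below stacks optical indices with SAR backscatter, compresses them with principal component analysis (PCA), and regresses field-measured AGB on the components; all feature values, biomass figures, and the linear regressor are illustrative assumptions rather than the workflow of the cited studies.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

# Hypothetical per-plot feature stack fusing optical and SAR layers:
# columns = [NDVI, red-edge index, VV backscatter (dB), VH backscatter (dB)]
X = np.array([[0.55, 0.30, -12.0, -18.5],
              [0.68, 0.36, -10.8, -17.0],
              [0.74, 0.41, -10.1, -16.2],
              [0.81, 0.47,  -9.4, -15.3],
              [0.86, 0.52,  -8.9, -14.6]])
agb = np.array([2.1, 3.4, 4.6, 5.9, 7.2])   # field-measured biomass, t/ha (illustrative)

# PCA compresses the correlated optical/SAR layers before regression on AGB
model = make_pipeline(PCA(n_components=2), LinearRegression())
model.fit(X, agb)
print(model.predict([[0.70, 0.38, -10.5, -16.8]]))   # AGB estimate for a new plot
```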

6. Conclusions: Challenges, Opportunities and Future Prospects

Technology and data fusion play an important role in increasing the efficiency of crop monitoring. Spatiotemporal data fusion is based on combining data of fine spatial and coarse temporal resolution with fine temporal resolution and coarse spatial resolution to achieve the objective of creating fine spatiotemporal resolution data. The fusion of the sensor technologies efficiently classifies the crop type, status, and parameters for site-specific crop monitoring.
Optical, thermal infrared, multispectral, hyperspectral, LiDAR, and radar sensors are widely adopted for crop N, chlorophyll, LAI, and AGB estimation.
Multispectral, hyperspectral, and thermal systems are confirmed to be beneficial in crop monitoring, acquiring precise spectral measurements in different stages of crop growth and production. Furthermore, LiDAR may be used effectively for data fusion of crop parameters. Among the different sensors, the vis-NIR spectroscopy sensor provides good to excellent crop monitoring accuracy. This also supports the final conclusion of this study that the multi-sensor and data fusion approach is a useful method for high crop production and for optimizing the quantity and frequency of crop irrigation.
Crop monitoring by space-borne and aerial vehicles has experienced significant improvement. Different types of sensors were used to achieve this objective.
The most widely adopted and successful fusion approaches in precision agriculture are the fusion of multi-sensor imagery from UAVs, of multiresolution imagery from satellites, and of UAV and satellite imagery for improved results. The fusion of these types of data improves the estimation of crop monitoring processes. Therefore, for particular tasks such as yield prediction, the fusion of other data sources (e.g., field sensors and ground vehicles) may further improve crop growth stage estimation performance.
Space-borne platforms now have a great ability to deliver suitable resolution for crop mapping and monitoring, weather condition assessment, as well as conservation and documentation. Airborne LiDAR serves the purpose of estimation below the vegetation canopy and conducts field crop analysis better than spectral estimation or SAR. UAVs can be used to acquire centimeter-level crop-related data, mostly site-specific or at site scale, to support detailed crop monitoring and mapping.
The major constraint of data fusion is the multimodality of data from different types of sensors and systems, along with noise, poor resolution, or flaws. However, the assessment of these advanced models for crop monitoring is still in the development stages. Future developments, such as intelligent and autonomous systems, are a major necessity for the expansion of technology and data fusion to provide precise data over the forecasted periods.

Author Contributions

Conceptualization, U.A., S.M. and A.N.; methodology, U.A., S.M.; validation, U.A., S.M. and A.N.; investigation, U.A.; resources, U.A., S.M.; writing—original draft preparation and development, U.A., S.M.; writing—review and editing, U.A., S.M., A.N. and O.H.; visualization, U.A.; supervision, U.A., S.M., A.N. and O.H. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

Not applicable.

Acknowledgments

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Mohamed, E.S.; Belal, A.A.; Abd-Elmabod, S.K.; El-Shirbeny, M.A.; Gad, A.; Zahran, M.B. Smart farming for improving agricultural management. Egypt. J. Remote Sens. Space Sci. 2021, in press. [Google Scholar] [CrossRef]
  2. Khan, N.; Ray, R.L.; Sargani, G.R.; Ihtisham, M.; Khayyam, M.; Ismail, S. Current Progress and Future Prospects of Agriculture Technology: Gateway to Sustainable Agriculture. Sustainability 2021, 13, 4883. [Google Scholar] [CrossRef]
  3. Maimaitijiang, M.; Sagan, V.; Sidike, P.; Daloye, A.M.; Erkbol, H.; Fritschi, F.B. Crop Monitoring Using Satellite/UAV Data Fusion and Machine Learning. Remote Sens. 2020, 12, 1357. [Google Scholar] [CrossRef]
  4. Samantha, M.; Carolina, C.S.; Philip, R.G. Restoring Soil Fertility on Degraded Lands to Meet Food, Fuel, and Climate Security Needs via Perennialization. Front. Sust. Food Syst. 2021, 5, 356. Available online: https://www.frontiersin.org/article/10.3389/fsufs.2021.706142 (accessed on 15 December 2021).
  5. Ziliani, M.G.; Parkes, S.D.; Hoteit, I.; McCabe, M.F. Intra-Season Crop Height Variability at Commercial Farm Scales Using a Fixed-Wing UAV. Remote Sens. 2018, 10, 2007. [Google Scholar] [CrossRef] [Green Version]
  6. Kganyago, M.; Mhangara, P.; Adjorlolo, C. Estimating Crop Biophysical Parameters Using Machine Learning Algorithms and Sentinel-2 Imagery. Remote Sens. 2021, 13, 4314. [Google Scholar] [CrossRef]
  7. Benami, E.; Jin, Z.; Carter, M.R.; Ghosh, A.; Hijmans, R.J.; Hobbs, A.; Kenduiywo, B.; Lobell, D.B. Uniting remote sensing, crop modelling and economics for agricultural risk management. Nat. Rev. Earth Environ. 2021, 2, 140–159. [Google Scholar] [CrossRef]
  8. Pranga, J.; Borra-Serrano, I.; Aper, J.; De Swaef, T.; Ghesquiere, A.; Quataert, P.; Roldán-Ruiz, I.; Janssens, I.A.; Ruysschaert, G.; Lootens, P. Improving Accuracy of Herbage Yield Predictions in Perennial Ryegrass with UAV-Based Structural and Spectral Data Fusion and Machine Learning. Remote Sens. 2021, 13, 3459. [Google Scholar] [CrossRef]
  9. Zha, H.; Miao, Y.; Wang, T.; Li, Y.; Zhang, J.; Sun, W.; Feng, Z.; Kusnierek, K. Improving Unmanned Aerial Vehicle Remote Sensing-Based Rice Nitrogen Nutrition Index Prediction with Machine Learning. Remote Sens. 2020, 12, 215. [Google Scholar] [CrossRef] [Green Version]
  10. Padilla, F.M.; Gallardo, M.; Peña-Fleitas, M.T.; De Souza, R.; Thompson, R.B. Proximal Optical Sensors for Nitrogen Management of Vegetable Crops: A Review. Sensors 2018, 18, 2083. [Google Scholar] [CrossRef] [Green Version]
  11. Bramley, R.G.V.; Ouzman, J. Farmer attitudes to the use of sensors and automation in fertilizer decision-making: Nitrogen fertilization in the Australian grains sector. Precis. Agric. 2019, 20, 157–175. [Google Scholar] [CrossRef]
  12. Gabriel, J.L.; Zarco-Tejada, P.J.; López-Herrera, P.J.; Pérez-Martín, E.; Alonso-Ayuso, M.; Quemada, M. Airborne and ground level sensors for monitoring nitrogen status in a maize crop. Biosyst. Eng. 2017, 160, 124–133. [Google Scholar] [CrossRef]
  13. Li, S.; Ding, X.; Kuang, Q.; Tahir, A.U.K.S.; Tao, C.; Xiaojun, L.; Yongchao, T.; Yan, Z.; Weixing, C.; Qiang, C. Potential of UAV-Based Active Sensing for Monitoring Rice Leaf Nitrogen Status. Front. Plant Sci. 2018, 9, 1834. Available online: https://www.frontiersin.org/article/10.3389/fpls.2018.01834 (accessed on 15 December 2021). [CrossRef] [Green Version]
  14. Santaga, F.S.; Benincasa, P.; Toscano, P.; Antognelli, S.; Ranieri, E.; Vizzari, M. Simplified and Advanced Sentinel-2-Based Precision Nitrogen Management of Wheat. Agronomy 2021, 11, 1156. [Google Scholar] [CrossRef]
  15. Messina, G.; Peña, J.; Vizzari, M.; Modica, G. A Comparison of UAV and Satellites Multispectral Imagery in Monitoring Onion Crop. An application in the ‘Cipolla Rossa di Tropea’ (Italy). Remote Sens. 2020, 12, 3424. [Google Scholar] [CrossRef]
  16. Gara, T.W.; Darvishzadeh, R.; Skidmore, A.K.; Wang, T. Impact of Vertical Canopy Position on Leaf Spectral Properties and Traits across Multiple Species. Remote Sens. 2018, 10, 346. [Google Scholar] [CrossRef] [Green Version]
  17. Prudnikova, E.; Savin, I.; Vindeker, G.; Grubina, P.; Shishkonakova, E.; Sharychev, D. Influence of Soil Background on Spectral Reflectance of Winter Wheat Crop Canopy. Remote Sens. 2019, 11, 1932. [Google Scholar] [CrossRef] [Green Version]
  18. Noda, H.M.; Muraoka, H.; Nasahara, K.N. Plant ecophysiological processes in spectral profiles: Perspective from a deciduous broadleaf forest. J. Plant Res. 2021, 134, 737–751. [Google Scholar] [CrossRef] [PubMed]
  19. Pignatti, S.; Casa, R.; Laneve, G.; Li, Z.; Liu, L.; Marzialetti, P.; Mzid, N.; Pascucci, S.; Silvestro, P.C.; Tolomio, M.; et al. Sino–EU Earth Observation Data to Support the Monitoring and Management of Agricultural Resources. Remote Sens. 2021, 13, 2889. [Google Scholar] [CrossRef]
  20. Jin, X.; Li, Z.; Yang, G.; Yang, H.; Feng, H.; Xu, X.; Wang, J.; Li, X.; Luo, J. Winter wheat yield estimation based on multi-source medium resolution optical and radar imaging data and the AquaCrop model using the particle swarm optimization algorithm. ISPRS J. Photogramm. Remote Sens. 2017, 126, 24–37. [Google Scholar] [CrossRef]
  21. Tianhai, W.; Yadong, L.; Minghui, W.; Qing, F.; Hongkun, T.; Xi, Q.; Yanzhou, L. Applications of UAS in Crop Biomass Monitoring: A Review. Front. Plant Sci. 2021, 12, 595. Available online: https://www.frontiersin.org/article/10.3389/fpls.2021.616689 (accessed on 15 December 2021).
  22. Liao, W.; Chanussot, J.; Philips, W. Remote sensing data fusion: Guided filter-based hyperspectral pansharpening and graph-based feature-level fusion. In Mathematical Models for Remote Sensing Image Processing; Moser, G., Zerubia, J., Eds.; Springer: Berlin/Heidelberg, Germany, 2017; pp. 243–275. [Google Scholar]
  23. Kochhar, A.; Kumar, N. Wireless sensor networks for greenhouses: An end-to-end review. Comput. Electron. Agric. 2019, 163, 104877. [Google Scholar] [CrossRef]
  24. Shi, X.; An, X.; Zhao, Q.; Liu, H.; Xia, L.; Sun, X.; Guo, Y. State-of-the-art internet of things in protected agriculture. Sensors 2019, 19, 1833. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  25. Zambon, I.; Cecchini, M.; Egidi, G.; Saporito, M.G.; Colantoni, A. Revolution 4.0: Industry vs. Agriculture in a Future Development for SMEs. Processes 2019, 7, 36. [Google Scholar] [CrossRef] [Green Version]
  26. Shamshiri, R.R.; Weltzien, C.; Hameed, I.A.; Yule, I.J.; Grift, T.E.; Balasundram, S.K.; Pitonakova, L.; Ahmad, D.; Chowdhary, G. Research and development in agricultural robotics: A perspective of digital farming. Int. J. Agric. Biol. Eng. 2018, 11, 1–14. [Google Scholar] [CrossRef]
  27. van der Meij, B.; Kooistra, L.; Suomalainen, J.; Barel, J.M.; De Deyn, G.B. Remote sensing of plant trait responses to field-based plant–soil feedback using UAV-based optical sensors. Biogeosciences 2017, 14, 733–749. [Google Scholar] [CrossRef] [Green Version]
  28. Thompson, R.B.; Tremblay, N.; Fink, M.; Gallardo, M.; Padilla, F.M. Tools and strategies for sustainable nitrogen management of vegetable crops. In Advances in Research on Fertilization Management in Vegetable Crops; Tei, F., Nicola, S., Benincasa, P., Eds.; Springer: Berlin/Heidelberg, Germany, 2017; pp. 11–63. [Google Scholar]
  29. Domingues Franceschini, M.H.; Bartholomeus, H.; Van Apeldoorn, D.; Suomalainen, J.; Kooistra, L. Intercomparison of Unmanned Aerial Vehicle and Ground-Based Narrow Band Spectrometers Applied to Crop Trait Monitoring in Organic Potato Production. Sensors 2017, 17, 1428. [Google Scholar] [CrossRef]
  30. Fahey, T.; Pham, H.; Gardi, A.; Sabatini, R.; Stefanelli, D.; Goodwin, I.; Lamb, D.W. Active and Passive Electro-Optical Sensors for Health Assessment in Food Crops. Sensors 2021, 21, 171. [Google Scholar] [CrossRef]
  31. Jiang, J.; Wang, C.; Wang, H.; Fu, Z.; Cao, Q.; Tian, Y.; Zhu, Y.; Cao, W.; Liu, X. Evaluation of Three Portable Optical Sensors for Non-Destructive Diagnosis of Nitrogen Status in Winter Wheat. Sensors 2021, 21, 5579. [Google Scholar] [CrossRef]
  32. Ahmad, U.; Alvino, A.; Marino, S. A Review of Crop Water Stress Assessment Using Remote Sensing. Remote Sens. 2021, 13, 4155. [Google Scholar] [CrossRef]
  33. Mahya, T.; Benjamin, W.; Graham, B.; Sigfredo, F.; Alexis, P.; Dorin, G. Optimizing Sensor-Based Irrigation Management in a Soilless Vertical Farm for Growing Microgreens. Front. Sustain. Food Sys. 2021, 4, 313. Available online: https://www.frontiersin.org/article/10.3389/fsufs.2020.622720 (accessed on 15 December 2021).
  34. Alexandris, S.; Psomiadis, E.; Proutsos, N.; Philippopoulos, P.; Charalampopoulos, I.; Kakaletris, G.; Papoutsi, E.-M.; Vassilakis, S.; Paraskevopoulos, A. Integrating Drone Technology into an Innovative Agrometeorological Methodology for the Precise and Real-Time Estimation of Crop Water Requirements. Hydrology 2021, 8, 131. [Google Scholar] [CrossRef]
  35. Quemada, C.; Pérez-Escudero, J.M.; Gonzalo, R.; Ederra, I.; Santesteban, L.G.; Torres, N.; Iriarte, J.C. Remote Sensing for Plant Water Content Monitoring: A Review. Remote Sens. 2021, 13, 2088. [Google Scholar] [CrossRef]
  36. Ludovisi, R.; Tauro, F.; Salvati, R.; Khoury, S.; Mugnozza, G.S.; Harfouche, A. UAV-based thermal imaging for high-throughput field phenotyping of black poplar response to drought. Front. Plant Sci. 2017, 8, 1–18. [Google Scholar] [CrossRef] [PubMed]
  37. Radoglou-Grammatikis, P.; Sarigiannidis, P.; Lagkas, T.; Moscholios, I. A compilation of UAV applications for precision agriculture. Comput. Netw. 2020, 172. [Google Scholar] [CrossRef]
  38. Neupane, K.; Baysal-Gurel, F. Automatic Identification and Monitoring of Plant Diseases Using Unmanned Aerial Vehicles: A Review. Remote Sens. 2021, 13, 3841. [Google Scholar] [CrossRef]
  39. Traore, A.; Ata-Ul-Karim, S.T.; Duan, A.; Soothar, M.K.; Traore, S.; Zhao, B. Predicting Equivalent Water Thickness in Wheat Using UAV Mounted Multispectral Sensor through Deep Learning Techniques. Remote Sens. 2021, 13, 4476. [Google Scholar] [CrossRef]
  40. Maes, W.H.; Steppe, K. Perspectives for remote sensing with unmanned aerial vehicles in precision agriculture. Trends Plant Sci. 2019, 24, 152–164. [Google Scholar] [CrossRef]
  41. Zhou, J.-J.; Zhang, Y.-H.; Han, Z.-M.; Liu, X.-Y.; Jian, Y.-F.; Hu, C.-G.; Dian, Y.-Y. Evaluating the Performance of Hyperspectral Leaf Reflectance to Detect Water Stress and Estimation of Photosynthetic Capacities. Remote Sens. 2021, 13, 2160. [Google Scholar] [CrossRef]
  42. Colaço, A.F.; Schaefer, M.; Bramley, R.G.V. Broadacre Mapping of Wheat Biomass Using Ground-Based LiDAR Technology. Remote Sens. 2021, 13, 3218. [Google Scholar] [CrossRef]
  43. Yuan, W.; Li, J.; Bhatta, M.; Shi, Y.; Baenziger, P.S.; Ge, Y. Wheat height estimation using LiDAR in comparison to ultrasonic sensor and UAS. Sensors 2018, 18, 3731. [Google Scholar] [CrossRef] [Green Version]
  44. Jimenez-Berni, J.A.; Deery, D.M.; Rozas-Larraondo, P.; Condon, A.T.G.; Rebetzke, G.J.; James, R.A.; Bovill, W.D.; Furbank, R.T.; Sirault, X.R.R. High throughput determination of plant height, ground cover, and above-ground biomass in wheat with LiDAR. Front. Plant Sci. 2018, 9, 237. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  45. Walter, J.D.C.; Edwards, J.; McDonald, G.; Kuchel, H. Estimating biomass and canopy height with LiDAR for field crop breeding. Front. Plant Sci. 2019, 10, 1145. [Google Scholar] [CrossRef] [PubMed]
  46. Deery, D.M.; Rebetzke, G.J.; Jimenez-Berni, J.A.; Condon, A.G.; Smith, D.J.; Bechaz, K.M.; Bovill, W.D. Ground-based lidar improves phenotypic repeatability of above-ground biomass and crop growth rate in wheat. Plant Phenomics 2020, 1–11. [Google Scholar] [CrossRef]
  47. Shendryk, Y.; Sofonia, J.; Garrard, R.; Rist, Y.; Skocaj, D.; Thorburn, P. Fine-scale prediction of biomass and leaf nitrogen content in sugarcane using UAV LiDAR and multispectral imaging. Int. J. Appl. Earth Obs. Geoinf. 2020, 92, 102177. [Google Scholar] [CrossRef]
  48. Bates, J.S.; Montzka, C.; Schmidt, M.; Jonard, F. Estimating canopy density parameters time-series for winter wheat using UAS mounted lidar. Remote Sens. 2021, 13, 710. [Google Scholar] [CrossRef]
  49. Klaina, H.; Guembe, I.P.; Lopez-Iturri, P.; Campo-Bescós, M.A.; Azpilicueta, L.; Aghzout, O.; Alejos, A.V.; Falcone, F. Analysis of low power wide area network wireless technologies in smart agriculture for large-scale farm monitoring and tractor communications. Measurement 2022, 187, 110231. [Google Scholar] [CrossRef]
  50. Vasconez, J.P.; Kantor, G.A.; Cheein, F.A.A. Human–robot interaction in agriculture: A survey and current challenges. Biosyst. Eng. 2019, 179, 35–48. [Google Scholar] [CrossRef]
  51. Zhou, S.; Zhao, H.; Chen, W.; Miao, Z.; Liu, Z.; Wang, H.; Liu, Y.H. Robust Path Following of the Tractor-Trailers System in GPS-Denied Environments. IEEE Rob. Autom. Lett. 2020, 5, 500–507. [Google Scholar] [CrossRef]
  52. Wang, H.; Ren, Y.; Meng, Z. A Farm Management Information System for Semi-Supervised Path Planning and Autonomous Vehicle Control. Sustainability 2021, 13, 7497. [Google Scholar] [CrossRef]
  53. Abdulazeez, A.M.; Faizi, F.S. Vision-Based Mobile Robot Controllers: A Scientific Review. Turkish J. Comp. Math. Edu. 2021, 12, 1563–1580. [Google Scholar]
  54. Nakalembe, C.; Becker-Reshef, I.; Bonifacio, R.; Hu, G.; Humber, M.L.; Justice, C.J.; Keniston, J.; Mwangi, K.; Rembold, F.; Shukla, S.; et al. A review of satellite-based global agricultural monitoring systems available for Africa. Glob. Food Secur. 2021, 29, 100543. [Google Scholar] [CrossRef]
  55. Munier, S.; Carrer, D.; Planque, C.; Camacho, F.; Albergel, C.; Calvet, J.-C. Satellite Leaf Area Index: Global Scale Analysis of the Tendencies Per Vegetation Type Over the Last 17 Years. Remote Sens. 2018, 10, 424. [Google Scholar] [CrossRef] [Green Version]
  56. Delavarpour, N.; Koparan, C.; Nowatzki, J.; Bajwa, S.; Sun, X. A Technical Study on UAV Characteristics for Precision Agriculture Applications and Associated Practical Challenges. Remote Sens. 2021, 13, 1204. [Google Scholar] [CrossRef]
  57. Fernández-Novales, J.; Saiz-Rubio, V.; Barrio, I.; Rovira-Más, F.; Cuenca-Cuenca, A.; Santos Alves, F.; Valente, J.; Tardaguila, J.; Diago, M.P. Monitoring and Mapping Vineyard Water Status Using Non-Invasive Technologies by a Ground Robot. Remote Sens. 2021, 13, 2830. [Google Scholar] [CrossRef]
  58. Serrano, J.; Shahidian, S.; Carapau, Â.; Rato, A.E. Near-Infrared Spectroscopy (NIRS) and Optical Sensors for Estimating Protein and Fiber in Dryland Mediterranean Pastures. AgriEngineering 2021, 3, 73–91. [Google Scholar] [CrossRef]
  59. Martins, R.N.; de Carvalho Pinto, F.A.; Rosas, J.T.F.; dos Santos, F.F.L.; Viana, L.A. Comparison of optical sensors in assessing the nitrogen (N) status in corn. IDESIA Chile 2020, 38, 67–73. Available online: https://www.scielo.cl/pdf/idesia/v38n1/0718-3429-idesia-38-01-67.pdf (accessed on 15 December 2021). [CrossRef]
  60. Huang, S.; Miao, Y.; Yuan, F.; Cao, Q.; Ye, H.; Lenz-Wiedemann, V.I.S.; Bareth, G. In-Season Diagnosis of Rice Nitrogen Status Using Proximal Fluorescence Canopy Sensor at Different Growth Stages. Remote Sens. 2019, 11, 1847. [Google Scholar] [CrossRef] [Green Version]
  61. Cao, Q.; Miao, Y.; Shen, J.; Yuan, F.; Cheng, S.; Cui, Z. Evaluating Two Crop Circle Active Canopy Sensors for In-Season Diagnosis of Winter Wheat Nitrogen Status. Agronomy 2018, 8, 201. [Google Scholar] [CrossRef] [Green Version]
  62. Zecha, C.W.; Peteinatos, G.G.; Link, J.; Claupein, W. Utilisation of Ground and Airborne Optical Sensors for Nitrogen Level Identification and Yield Prediction in Wheat. Agriculture 2018, 8, 79. [Google Scholar] [CrossRef] [Green Version]
  63. Satognon, F.; Lelei, J.J.; Owido, S.F.O. Use of GreenSeeker and CM-100 as manual tools for nitrogen management and yield prediction in irrigated potato (Solanum tuberosum) production. Arch. Agric. Environ. Sci. 2021, 6, 121–128. [Google Scholar] [CrossRef]
  64. Canata, T.F.; Wei, M.C.F.; Maldaner, L.F.; Molin, J.P. Sugarcane Yield Mapping Using High-Resolution Imagery Data and Machine Learning Technique. Remote Sens. 2021, 13, 232. [Google Scholar] [CrossRef]
  65. Khorramifar, A.; Rasekh, M.; Karami, H.; Malaga-Toboła, U.; Gancarz, M. A Machine Learning Method for Classification and Identification of Potato Cultivars Based on the Reaction of MOS Type Sensor-Array. Sensors 2021, 21, 5836. [Google Scholar] [CrossRef] [PubMed]
  66. Momin, M.A.; Rahman, M.J.; Mieno, T. Foot pressure sensor system made from MWCNT coated cotton fibers to monitor human activities. Surf. Coat. Technol. 2020, 394, 125749. [Google Scholar] [CrossRef]
  67. Apolo-Apolo, O.E.; Pérez-Ruiz, M.; Castro-Valdecantos, P.; Egea, G. Evaluation of a portable sensor suite for real time CWSI monitoring in wheat. In Precision Agriculture ’21; Stafford, J.V., Ed.; Wageningen Academic Publishers: Wageningen, The Netherlands, 2021; pp. 267–273. [Google Scholar] [CrossRef]
  68. Shan, G.; Maack, C.; Buescher, W.; Glenz, G.; Milimonka, A.; Deeken, H.; Grantz, D.A.; Wang, Y.; Sun, Y. Multi-sensor measurement of O2, CO2 and reheating in triticale silage: An extended approach from aerobic stability to aerobic microbial respiration. Biosyst. Eng. 2021, 207, 1–11. [Google Scholar] [CrossRef]
  69. Latifah, A.; Ramdhani, W.; Nasrulloh, M.R.; Elsen, R. Ultrasonic sensor for monitoring corn growth based on Raspberry Pi. In IOP Conference Series: Materials Science and Engineering; IOP Publishing: Bristol, UK, 2021; Volume 1098, p. 042087. [Google Scholar] [CrossRef]
  70. Dong, Y.; Wang, J.; Huang, W.; Ye, H.; Zhu, Y. Monitoring Barley Growth Condition with Multi-scale Remote Sensing Images. In Proceedings of the 2021 9th International Conference on Agro-Geoinformatics (Agro-Geoinformatics), Shenzhen, China, 26–29 July 2021; pp. 1–4. [Google Scholar] [CrossRef]
  71. Pal, P.; Sharma, R.P.; Tripathi, S.; Kumar, C.; Ramesh, D. Genetic algorithm optimized node deployment in IEEE 802.15.4 potato and wheat crop monitoring infrastructure. Sci. Rep. 2021, 11, 8231. [Google Scholar] [CrossRef]
  72. Ali, A.M.; Ibrahim, S.M. Wheat grain yield and nitrogen uptake prediction using at Leaf and GreenSeeker portable optical sensors at jointing growth stage. Inf. Proc. Agric. 2020, 7, 375–383. [Google Scholar] [CrossRef]
  73. Cummings, C.; Miao, Y.; Paiao, G.D.; Kang, S.; Fernández, F.G. Corn Nitrogen Status Diagnosis with an Innovative Multi-Parameter Crop Circle Phenom Sensing System. Remote Sens. 2021, 13, 401. [Google Scholar] [CrossRef]
  74. da Silva, J.M.; Fontes, P.C.R.; Milagres, C.D.C.; Junior, E.G. Application of Proximal Optical Sensors to Assess Nitrogen Status and Yield of Bell Pepper Grown in Slab. J. Soil Sci. Plant Nutr. 2021, 21, 229–237. [Google Scholar] [CrossRef]
  75. Walker, H.V.; Jones, J.E.; Swarts, N.D.; Rodemann, T.; Kerslake, F.; Dambergs, R.G. Predicting grapevine canopy nitrogen status using proximal sensors and near-infrared reflectance spectroscopy. J. Soil Sci. Plant Nutr. 2021, 184, 204–304. [Google Scholar] [CrossRef]
  76. Dong, R.; Miao, Y.; Wang, X.; Chen, Z.; Yuan, F. Improving maize nitrogen nutrition index prediction using leaf fluorescence sensor combined with environmental and management variables. Field Crops Res. 2021, 269, 108180. [Google Scholar] [CrossRef]
  77. Dong, T.; Shang, J.; Chen, J.M.; Liu, J.; Qian, B.; Ma, B.; Morrison, M.J.; Zhang, C.; Liu, Y.; Shi, Y. Assessment of Portable Chlorophyll Meters for Measuring Crop Leaf Chlorophyll Concentration. Remote Sens. 2019, 11, 2706. [Google Scholar] [CrossRef] [Green Version]
  78. Zhang, K.; Liu, X.; Ma, Y.; Zhang, R.; Cao, Q.; Zhu, Y.; Cao, W.; Tian, Y. A Comparative Assessment of Measures of Leaf Nitrogen in Rice Using Two Leaf-Clip Meters. Sensors 2020, 20, 175. [Google Scholar] [CrossRef] [Green Version]
  79. Rovira-Más, F.; Saiz-Rubio, V.; Cuenca-Cuenca, A. Sensing Architecture for Terrestrial Crop Monitoring: Harvesting Data as an Asset. Sensors 2021, 21, 3114. [Google Scholar] [CrossRef] [PubMed]
  80. Fuentes, S.; de Bei, R.; Pech, J.; Tyerman, S. Computational water stress indices obtained from thermal image analysis of grapevine canopies. Irrig. Sci. 2012, 30, 523–536. [Google Scholar] [CrossRef]
  81. Petrie, P.R.; Wang, Y.; Liu, S.; Lam, S.; Whitty, M.A.; Skewes, M.A. The accuracy and utility of a low cost thermal camera and smartphone-based system to assess grapevine water status. Biosyst. Eng. 2019, 179, 126–139. [Google Scholar] [CrossRef]
  82. Bellvert, J.; Zarco-Tejada, P.J.; Marsal, J.; Girona, J.; González-Dugo, V.; Fereres, E. Vineyard irrigation scheduling based on airborne thermal imagery and water potential thresholds. Aust. J. Grape Wine Res. 2016, 22, 307–315. [Google Scholar] [CrossRef] [Green Version]
  83. Anderson, M.C.; Allen, R.G.; Morse, A.; Kustas, W.P. Use of landsat thermal imagery in monitoring evapotranspiration and managing water resources. Remote Sens. Environ. 2012, 122, 50–65. [Google Scholar] [CrossRef]
  84. Egea, G.; Padilla-Díaz, C.M.; Martinez-Guanter, J.; Fernández, J.E.; Pérez-Ruiz, M. Assessing a crop water stress index derived from aerial thermal imaging and infrared thermometry in super-high density olive orchards. Agric. Water Manag. 2017, 187, 210–221. [Google Scholar] [CrossRef] [Green Version]
  85. Liu, Y.; Subhash, C.; Yan, J.; Song, C.; Zhao, J.; Li, J. Maize leaf temperature responses to drought: Thermal imaging and quantitative trait loci (QTL) mapping. Environ. Exp. Bot. 2011, 71, 158–165. [Google Scholar] [CrossRef]
  86. Alves, I.; Pereira, L.S. Non-water-stressed baselines for irrigation scheduling with infrared thermometers: A new approach. Irrig. Sci. 2000, 19, 101–106. [Google Scholar] [CrossRef]
  87. Ayeneh, A.; van Ginkel, M.; Reynolds, M.P.; Ammar, K. Comparison of leaf, spike, peduncle and canopy temperature depression in wheat under heat stress. Field Crops Res. 2002, 79, 173–184. [Google Scholar] [CrossRef]
  88. Berni, J.A.; Zarco-Tejada, P.J.; Suárez, L.; Fereres, E. Thermal and narrowband multispectral remote sensing for vegetation monitoring from an unmanned aerial vehicle. IEEE Trans. Geosci. Remote Sens. 2009, 47, 722–738. [Google Scholar] [CrossRef] [Green Version]
  89. Taghvaeian, S.; Chávez, J.L.; Altenhofen, J.; Trout, T.; DeJonge, K. Remote Sensing for Evaluating Crop Water Stress at Field Scale Using Infrared Thermography: Potential and Limitations. Ph.D. Dissertation, Colorado State University, Fort Collins, CO, USA, 2013. [Google Scholar]
  90. Wang, X.; Yang, W.; Wheaton, A.; Cooley, N.; Moran, B. Automated canopy temperature estimation via infrared thermography: A first step towards automated plant water stress monitoring. Comput. Electron. Agric. 2010, 73, 74–83. [Google Scholar] [CrossRef]
  91. Leinonen, I.; Jones, H.G. Combining thermal and visible imagery for estimating canopy temperature and identifying plant stress. J. Exp. Bot. 2004, 55, 1423–1431. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  92. Cohen, Y.; Alchanatis, V.; Meron, M.; Saranga, Y.; Tsipris, J. Estimation of leaf water potential by thermal imagery and spatial analysis. J. Exp. Bot. 2005, 56, 1843–1852. [Google Scholar] [CrossRef] [Green Version]
  93. Colaizzi, P.D.; O’Shaughnessy, S.O.; Evett, S.R.; Howell, T.A. Using Plant Canopy Temperature to Improve Irrigated Crop Management; U.S. Department of Agriculture: Colby, KS, USA, 2012; pp. 203–223.
  94. Fitzgerald, G.J.; Rodriguez, D.; Christensen, L.K.; Belford, R.; Sadras, V.O.; Clarke, T.R. Spectral and thermal sensing for nitrogen and water status in rainfed and irrigated wheat environments. Precis. Agric. 2006, 7, 233–248. [Google Scholar] [CrossRef]
  95. Crawford, K.E. Remote Sensing of Almond and Walnut Tree Canopy Temperatures Using An Inexpensive Infrared Sensor on a Small Unmanned Aerial Vehicle; University of California Davis: Davis, CA, USA, 2012. [Google Scholar]
  96. Gago, J.; Douthe, C.; Coopman, R.E.; Gallego, P.P.; Ribas-Carbo, M.; Flexas, J.; Escalona, J.; Medrano, H. UAVs challenge to assess water stress for sustainable agriculture. Agric. Water Manag. 2015, 153, 9–19. [Google Scholar] [CrossRef]
  97. Sepúlveda-Reyes, D.; Ingram, B.; Bardeen, M.; Zúñiga, M.; Ortega-Farías, S.; Poblete-Echeverría, C. Selecting canopy zones and thresholding approaches to assess grapevine water status by using aerial and ground-based thermal imaging. Remote Sens. 2016, 8, 822. [Google Scholar] [CrossRef] [Green Version]
  98. Sepulcre-Cantó, G.; Zarco-Tejada, P.J.; Jiménez-Muñoz, J.C.; Sobrino, J.A.; de Miguel, E.; Villalobos, F.J. Detection of water stress in an olive orchard with thermal remote sensing imagery. Agric. For. Meteorol. 2006, 136, 31–44. [Google Scholar] [CrossRef]
  99. ESA. Resolution and Swath. Available online: Earth.esa.int/web/sentinel/missions/sentinel-2/instrument-payload/resolution-and-swath (accessed on 16 September 2021).
  100. Wang, S.; Garcia, M.; Ibrom, A.; Bauer-gottwein, P. Temporal interpolation of land surface fluxes derived from remote sensing—Results with an Unmanned Aerial System. Hydrol. Earth Syst. Sci. 2020, 24, 3643–3661. [Google Scholar] [CrossRef]
  101. Easterday, K.; Kislik, C.; Dawson, T.; Hogan, S.; Kelly, M. Remotely Sensed Water Limitation in Vegetation: Insights from an Experiment with Unmanned Aerial Vehicles (UAVs). Remote Sens. 2019, 11, 1853. [Google Scholar] [CrossRef] [Green Version]
  102. Padró, J.C.; Muñoz, F.J.; Ávila, L.Á.; Pesquer, L.; Pons, X. Radiometric correction of Landsat-8 and Sentinel-2A scenes using drone imagery in synergy with field spectroradiometry. Remote Sens. 2018, 10, 1687. [Google Scholar] [CrossRef] [Green Version]
  103. Dhau, I.; Adam, E.; Mutanga, O.; Ayisi, K.; Abdel-Rahman, E.M.; Odindi, J.; Masocha, M. Testing the capability of spectral resolution of the new multispectral sensors on detecting the severity of grey leaf spot disease in maize crop. Geocarto Int. 2018, 33, 1223–1236. [Google Scholar] [CrossRef]
  104. Mahlein, A.-K.; Kuska, M.T.; Behmann, J.; Polder, G.; Walter, A. Hyperspectral Sensors and Imaging Technologies in Phytopathology: State of the Art. Annu. Rev. Phytopathol. 2018, 56, 535–558. [Google Scholar] [CrossRef]
  105. Liu, N.; Townsend, P.A.; Naber, M.R.; Bethke, P.C.; Hills, W.B.; Wang, Y. Hyperspectral imagery to monitor crop nutrient status within and across growing seasons. Remote Sens. Environ. 2021, 255, 112303. [Google Scholar] [CrossRef]
  106. Camino, C.; Gonzalez-Dugo, V.; Hernandez, P.; Zarco-Tejada, P.J. Radiative transfer Vcmax estimation from hyperspectral imagery and SIF retrievals to assess photosynthetic performance in rainfed and irrigated plant phenotyping trials. Remote Sens. Environ. 2019, 231, 05005. [Google Scholar] [CrossRef]
  107. Lu, B.; Dao, P.D.; Liu, J.; He, Y.; Shang, J. Recent Advances of Hyperspectral Imaging Technology and Applications in Agriculture. Remote Sens. 2020, 12, 2659. [Google Scholar] [CrossRef]
  108. Lee, K.; Cohen, W.B.; Kennedy, R.E.; Maiersperger, T.K.; Gower, S.T. Hyperspectral versus multispectral data for estimating leaf area index in four different biomes. Remote Sens. Environ. 2004, 91, 508–520. [Google Scholar] [CrossRef]
  109. Mariotto, I.; Thenkabail, P.S.; Huete, A.; Slonecker, E.T.; Platonov, A. Hyperspectral versus multispectral crop-productivity modeling and type discrimination for the HyspIRI mission. Remote Sens. Environ. 2013, 139, 291–305. [Google Scholar] [CrossRef]
  110. Marshall, M.; Thenkabail, P. Advantage of hyperspectral EO-1 Hyperion over multispectral IKONOS, GeoEye-1, WorldView-2, Landsat ETM+, and MODIS vegetation indices in crop biomass estimation. ISPRS J. Photogramm. 2015, 108, 205–218. [Google Scholar] [CrossRef] [Green Version]
  111. Lodhi, V.; Chakravarty, D.; Mitra, P. Hyperspectral Imaging System: Development Aspects and Recent Trends. Sens. Imaging 2019, 20, 1–24. [Google Scholar] [CrossRef]
  112. Mahajan, G.R.; Pandey, R.N.; Sahoo, R.N.; Gupta, V.K.; Datta, S.C.; Kumar, D. Monitoring nitrogen, phosphorus and sulphur in hybrid rice (Oryza sativa L.) using hyperspectral remote sensing. Precis. Agric. 2017, 18, 736–761. [Google Scholar] [CrossRef]
  113. Neuville, R.; Bates, J.S.; Jonard, F. Estimating Forest structure from UAV-mounted LiDAR point cloud using machine learning. Remote. Sens. 2021, 13, 352. [Google Scholar] [CrossRef]
  114. Fang, H.; Baret, F.; Plummer, S.; Schaepman-Strub, G. An overview of global leaf area index (LAI): Methods, products, validation, and applications. Rev. Geophys. 2019, 57, 739–799. [Google Scholar] [CrossRef]
  115. Liu, Z.; Shen, Y.; Lakshminarasimhan, V.B.; Liang, P.P.; Zadeh, A.B.; Morency, L.-P. Efficient low-rank multimodal fusion with modality-specific factors. Cornell University, Department of Computer Science. arXiv 2018, arXiv:1806.00064. Available online: https://arxiv.org/abs/1806.00064 (accessed on 15 December 2021).
  116. Luo, S.; Chen, J.M.; Wang, C.; Gonsamo, A.; Xi, X.; Lin, Y.; Qian, M.; Peng, D.; Nie, S.; Qin, H. Comparative performances of airborne LiDAR height and intensity data for leaf area index estimation. IEEE J. Sel. Top. Appl. Earth Observ. Remote Sens. 2018, 11, 300–310. [Google Scholar] [CrossRef]
  117. Weiss, M.; Jacob, F.; Duveiller, G. Remote sensing for agricultural applications: A meta-review. Remote Sens. Environ. 2020, 236, 111402. [Google Scholar] [CrossRef]
  118. Nasirzadehdizaji, R.; Cakir, Z.; Sanli, F.B.; Abdikan, S.; Pepe, A.; Calò, F. Sentinel-1 interferometric coherence and backscattering analysis for crop monitoring. Comput. Electron. Agric. 2021, 185, 106118. [Google Scholar] [CrossRef]
  119. Gella, G.W.; Bijker, W.; Belgiu, M. Mapping crop types in complex farming areas using SAR imagery with dynamic time warping. ISPRS J. Photogramm. Remote Sens. 2021, 175, 171–183. [Google Scholar] [CrossRef]
  120. Cui, J.; Zhang, X.; Wang, W.; Wang, L. Integration of optical and SAR remote sensing images for crop-type mapping based on a novel object-oriented feature selection method. Int. J. Agric. Biol. Eng. 2020, 13, 178–190. [Google Scholar] [CrossRef]
  121. Adrian, J.; Sagan, V.; Maimaitijiang, M. Sentinel SAR-optical fusion for crop type mapping using deep learning and Google Earth Engine. ISPRS J. Photogramm. Remote Sens. 2021, 175, 215–235. [Google Scholar] [CrossRef]
  122. Verma, A.; Kumar, A.; Lal, K. Kharif crop characterization using combination of SAR and MSI Optical Sentinel Satellite datasets. J. Earth Syst. Sci. 2019, 128, 1–13. [Google Scholar] [CrossRef] [Green Version]
  123. D’Aranno, P.J.V.; Di Benedetto, A.; Fiani, M.; Marsella, M.; Moriero, I.; Palenzuela Baena, J.A. An Application of Persistent Scatterer Interferometry (PSI) Technique for Infrastructure Monitoring. Remote Sens. 2021, 13, 1052. [Google Scholar] [CrossRef]
  124. Vreugdenhil, M.; Wagner, W.; Bauer-Marschallinger, B.; Pfeil, I.; Teubner, I.; Rüdiger, C.; Strauss, P. Sensitivity of Sentinel-1 Backscatter to Vegetation Dynamics: An Austrian Case Study. Remote Sens. 2018, 10, 1396. [Google Scholar] [CrossRef] [Green Version]
  125. Khabbazan, S.; Vermunt, P.; Steele-Dunne, S.; Ratering Arntz, L.; Marinetti, C.; van der Valk, D.; Iannini, L.; Molijn, R.; Westerdijk, K.; van der Sande, C. Crop Monitoring Using Sentinel-1 Data: A Case Study from The Netherlands. Remote Sens. 2019, 11, 1887. [Google Scholar] [CrossRef] [Green Version]
  126. Xu, X.; Fan, L.; Li, Z.; Meng, Y.; Feng, H.; Yang, H.; Xu, B. Estimating Leaf Nitrogen Content in Corn Based on Information Fusion of Multiple-Sensor Imagery from UAV. Remote Sens. 2021, 13, 340. [Google Scholar] [CrossRef]
  127. Sun, J.; Shi, S.; Gong, W.; Yang, J.; Du, L.; Song, S.; Chen, B.; Zhang, Z. Evaluation of hyperspectral LiDAR for monitoring rice leaf nitrogen by comparison with multispectral LiDAR and passive spectrometer. Sci. Rep. 2017, 7, 40362. [Google Scholar] [CrossRef]
  128. Li, S.; Jiao, J.; Wang, C. Research on Polarized Multi-Spectral System and Fusion Algorithm for Remote Sensing of Vegetation Status at Night. Remote Sens. 2021, 13, 3510. [Google Scholar] [CrossRef]
  129. Du, L.; Jin, Z.; Chen, B.; Chen, B.; Gao, W.; Yang, J.; Shi, S.; Song, S.; Wang, M.; Gong, W.; et al. Application of Hyperspectral LiDAR on 3-D Chlorophyll-Nitrogen Mapping of Rohdea japonica in Laboratory. IEEE J. Sel. Top. Appl. Earth Observ. Remote Sens. 2021, 14, 9667–9679. [Google Scholar] [CrossRef]
  130. Hosoi, F.; Umeyama, S.; Kuo, K. Estimating 3D Chlorophyll Content Distribution of Trees Using an Image Fusion Method Between 2D Camera and 3D Portable Scanning Lidar. Remote Sens. 2019, 11, 2134. [Google Scholar] [CrossRef] [Green Version]
  131. Pipia, L.; Muñoz-Marí, J.; Amin, E.; Belda, S.; Camps-Valls, G.; Verrelst, J. Fusing optical and SAR time series for LAI gap filling with multioutput Gaussian processes. Remote Sens. Environ. 2019, 235. [Google Scholar] [CrossRef]
  132. Qi, H.; Zhu, B.; Wu, Z.; Liang, Y.; Li, J.; Wang, L.; Chen, T.; Lan, Y.; Zhang, L. Estimation of Peanut Leaf Area Index from Unmanned Aerial Vehicle Multispectral Images. Sensors 2020, 20, 6732. [Google Scholar] [CrossRef] [PubMed]
  133. Räsänen, A.; Juutinen, S.; Kalacska, M.; Aurela, M.; Heikkinen, P.; Mäenpää, K.; Rimali, A.; Virtanen, T. Peatland leaf-area index and biomass estimation with ultra-high resolution remote sensing. GISci. Remote Sens. 2020, 57, 943–964. [Google Scholar] [CrossRef]
  134. Banerjee, B.P.; Spangenberg, G.; Kant, S. Fusion of Spectral and Structural Information from Aerial Images for Improved Biomass Estimation. Remote Sens. 2020, 12, 3164. [Google Scholar] [CrossRef]
  135. Debastiani, A.; Sanquetta, C.; Corte, A.; Pinto, N.; Rex, F. Evaluating SAR-optical sensor fusion for aboveground biomass estimation in a Brazilian tropical forest. Ann. For. Res. 2019, 62, 109–122. [Google Scholar] [CrossRef]
  136. Torres, A.B.; da Rocha, A.R.; da Silva, T.L.C.; de Souza, J.N.; Gondim, R.S. Multilevel data fusion for the internet of things in smart agriculture. Comput. Electron. Agric. 2020, 171, 105309. [Google Scholar] [CrossRef]
  137. Adamchuk, V.I.; Hummel, J.W.; Morgan, M.T.; Upadhyaya, S.K. On-the-go soil sensors for precision agriculture. Comput. Electron. Agric. 2004, 44, 71–91. [Google Scholar] [CrossRef] [Green Version]
  138. Nakamura, E.F.; Loureiro, A.A.F.; Frery, A.C. Information fusion for wireless sensor networks: Methods, models, and classifications. ACM Comput. Surv. 2007, 39, 9-es. [Google Scholar] [CrossRef]
  139. Zhou, Z.; Majeed, Y.; Naranjo, G.D.; Gambacorta, E.M.T. Assessment for crop water stress with infrared thermal imagery in precision agriculture: A review and future prospects for deep learning applications. Comput. Electron. Agric. 2021, 182, 1060149. [Google Scholar] [CrossRef]
  140. Somkuti, P.; Bösch, H.; Feng, L.; Palmer, P.I.; Parker, R.J.; Quaife, T. A new space-borne perspective of crop productivity variations over the US Corn Belt. Agric. For. Meteorol. 2020, 281, 107826. [Google Scholar] [CrossRef]
  141. Ji, J.; Sang, Y.; He, Z.; Jin, X.; Wang, S. Designing an intelligent monitoring system for corn seeding by machine vision and Genetic Algorithm-optimized Back Propagation algorithm under precision positioning. PLoS ONE 2021, 16, e0254544. [Google Scholar] [CrossRef] [PubMed]
  142. Kirchgessner, N.; Liebisch, F.; Yu, K.; Pfeifer, J.; Friedli, M.; Hund, A.; Walter, A. The ETH field phenotyping platform FIP: A cable-suspended multi-sensor system. Funct. Plant Biol. 2017, 44, 154–168. [Google Scholar] [CrossRef] [PubMed]
  143. Benet, B.; Lenain, R.; Rousseau, V. Development of a sensor fusion method for crop row tracking operations. Adv. Anim. Biosci. 2017, 8, 583–589. [Google Scholar] [CrossRef]
  144. Asvadi, A.; Garrote, L.; Premebida, C.; Peixoto, P.; Nunes, U.J. Multimodal vehicle detection: Fusing 3D-LIDAR and color camera data. Pattern Recognit. Lett. 2018, 115, 20–29. [Google Scholar] [CrossRef]
  145. Chaudhury, A.; Ward, C.; Talasaz, A.; Ivanov, A.G.; Brophy, M.; Grodzinski, B. IEEE/ACM Transactions Comput. Biol. Bioinf. 2018, 16, 2009–2022. [Google Scholar] [CrossRef] [Green Version]
  146. Abbas, A.; Yufeng, G.; Santosh, P.; James, S. Robotic Technologies for High-Throughput Plant Phenotyping: Contemporary Reviews and Future Perspectives. Front. Plant Sci. 2021, 12, 1082. Available online: https://www.frontiersin.org/article/10.3389/fpls.2021.611940 (accessed on 15 December 2021).
  147. Tang, H.J. Progress and Prospect of Agricultural Remote Sensing Research. J. Agric. 2018, 8, 167–171. [Google Scholar]
  148. Schut, A.G.; Traore, P.C.S.; Blaes, X.; Rolf, A. Assessing yield and fertilizer response in heterogeneous smallholder fields with UAVs and satellites. Field Crop. Res. 2018, 221, 98–107. [Google Scholar] [CrossRef]
  149. Moeckel, T.; Safari, H.; Reddersen, B.; Fricke, T.; Wachendorf, M. Fusion of ultrasonic and spectral sensor data for improving the estimation of biomass in grasslands with heterogeneous sward structure. Remote Sens. 2017, 9, 98. [Google Scholar] [CrossRef] [Green Version]
  150. Wang, C.; Nie, S.; Xi, X.H.; Luo, S.Z.; Sun, X.F. Estimating the Biomass of Maize with Hyperspectral and LiDAR Data. Remote Sens. 2017, 9, 11. [Google Scholar] [CrossRef] [Green Version]
  151. Puliti, S.; Saarela, S.; Gobakken, T.; Ståhl, G.; Næsset, E. Combining UAV and Sentinel-2 auxiliary data for forest growing stock volume estimation through hierarchical model-based inference. Remote Sens. Environ. 2018, 204, 485–497. [Google Scholar] [CrossRef]
  152. Grüner, E.; Astor, T.; Wachendorf, M. Prediction of Biomass and N Fixation of Legume–Grass Mixtures Using Sensor Fusion. Front. Plant Sci. 2021, 11, 1–13. [Google Scholar] [CrossRef] [PubMed]
  153. de Alckmin, G.T.; Kooistra, L.; Rawnsley, R.; Lucieer, A. Comparing methods to estimate perennial ryegrass biomass: Canopy height and spectral vegetation indices. Precis. Agric. 2021, 22, 205–225. [Google Scholar] [CrossRef]
  154. Laurin, G.V.; Pirotti, F.; Callegari, M.; Chen, Q.; Cuozzo, G.; Lingua, E.; Notarnicola, C.; Papale, D. Potential of ALOS2 and NDVI to estimate forest above-ground biomass, and comparison with lidar-derived estimates. Remote Sens. 2017, 9, 18. [Google Scholar] [CrossRef] [Green Version]
  155. Belgiu, M.; Stein, A. Spatiotemporal Image Fusion in Remote Sensing. Remote Sens. 2019, 11, 818. [Google Scholar] [CrossRef] [Green Version]
  156. Zurita-Milla, R.; Clevers, J.; Van Gijsel, J.; Schaepman, M. Using MERIS fused images for land-cover mapping and vegetation status assessment in heterogeneous landscapes. Int. J. Remote Sens. 2011, 32, 973–991. [Google Scholar] [CrossRef]
  157. Wu, K.; Chen, T.; Xu, Y.; Song, D.; Li, H. A Novel Change Detection Approach Based on Spectral Unmixing from Stacked Multitemporal Remote Sensing Images with a Variability of Endmembers. Remote Sens. 2021, 13, 2550. [Google Scholar] [CrossRef]
  158. Amorós-López, J.; Gómez-Chova, L.; Alonso, L.; Guanter, L.; Zurita-Milla, R.; Moreno, J.; Camps-Valls, G. Multitemporal fusion of Landsat/TM and ENVISAT/MERIS for crop monitoring. Int. J. App. Earth Obs. Geoinf. 2013, 23, 132–141. [Google Scholar] [CrossRef]
  159. Zurita-Milla, R.; Clevers, J.; Schaepman, M. Unmixing-based Landsat TM and MERIS FR data fusion. IEEE Geosci. Remote Sens. Lett. 2008, 5, 453–457. [Google Scholar] [CrossRef] [Green Version]
  160. Zurita-Milla, R.; Kaiser, G.; Clevers, J.; Schneider, W.; Schaepman, M. Downscaling time series of MERIS full resolution data to monitor vegetation seasonal dynamics. Remote Sens. Environ. 2009, 113, 1874–1885. [Google Scholar] [CrossRef]
  161. Wang, S.; Azzari, G.; Lobell, D.B. Crop type mapping without field-level labels: Random forest transfer and unsupervised clustering techniques. Remote Sens. Environ. 2019, 222, 303–317. [Google Scholar] [CrossRef]
  162. Amorós-López, J.; Gómez-Chova, L.; Alonso, L.; Guanter, L.; Moreno, J.; Camps-Valls, G. Regularized multiresolution spatial unmixing for ENVISAT/MERIS and Landsat/TM image fusion. IEEE Geosci. Remote Sens. Lett. 2011, 8, 844–848. [Google Scholar] [CrossRef] [Green Version]
  163. Sidike, P.; Sagan, V.; Qumsiyeh, M.; Maimaitijiang, M.; Essa, A.; Asari, V. Adaptive trigonometric transformation function with image contrast and color enhancement: Application to unmanned aerial system imagery. IEEE Geosci. Remote Sens. Lett. 2018, 15, 404–408. [Google Scholar] [CrossRef]
  164. Maimaitijiang, M.; Sagan, V.; Sidike, P.; Maimaitiyiming, M.; Hartling, S.; Peterson, K.T.; Maw, M.J.; Shakoor, N.; Mockler, T.; Fritschi, F.B. Vegetation index weighted canopy volume model (CVMVI) for soybean biomass estimation from unmanned aerial system-based RGB imagery. ISPRS J. Photogramm. Remote Sens. 2019, 151, 27–41. [Google Scholar] [CrossRef]
  165. Kachamba, D.J.; Orka, H.O.; Naesset, E.; Eid, T.; Gobakken, T. Influence of Plot Size on Efficiency of Biomass Estimates in Inventories of Dry Tropical Forests Assisted by Photogrammetric Data from an Unmanned Aircraft System. Remote Sens. 2017, 9, 610. [Google Scholar] [CrossRef] [Green Version]
  166. White, J.C.; Wulder, M.A.; Vastaranta, M.; Coops, N.C.; Pitt, D.; Woods, M. The Utility of Image-Based Point Clouds for Forest Inventory: A Comparison with Airborne Laser Scanning. Forests 2013, 4, 518–536. [Google Scholar] [CrossRef] [Green Version]
  167. Sagan, V.; Maimaitijiang, M.; Sidike, P.; Maimaitiyiming, M.; Erkbol, H.; Hartling, S.; Peterson, K.T.; Peterson, J.; Burken, J.; Fritschi, F. UAV/Satellite Multiscale Data Fusion for Crop Monitoring and Early Stress Detection. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2019, 2, 715–722. [Google Scholar] [CrossRef] [Green Version]
  168. Emilien, A.V.; Thomas, C.; Thomas, H. UAV & satellite synergies for optical remote sensing applications: A literature review. Sci. Remote Sens. 2021, 3, 100019. [Google Scholar] [CrossRef]
  169. Belgiu, M.; Csillik, O. Sentinel-2 cropland mapping using pixel-based and object-based time-weighted dynamic time warping analysis. Remote Sens. Environ. 2018, 204, 509–523. [Google Scholar] [CrossRef]
  170. Del Pozo, S.; Rodríguez-Gonzálvez, P.; Hernández-López, D.; Felipe-García, B. Vicarious Radiometric Calibration of a Mul-tispectral Camera on Board an Unmanned Aerial System. Remote Sens. 2014, 6, 1918–1937. [Google Scholar] [CrossRef] [Green Version]
  171. Pechanec, V.; Vávra, A.; Machar, I. Využití UAV technologie pro získávání dat v precizním zemědělství na příkladu ploch s cukrovou řepou [Use of UAV technology for data acquisition in precision agriculture, illustrated on sugar beet plots]. Listy Cukrov. Řepař. 2014, 130, 162–165. [Google Scholar]
  172. Candiago, S.; Remondino, F.; Giglio, M.D.; Dubbini, M.; Gattelli, M. Evaluating Multispectral Images and Vegetation Indices for Precision Farming Applications from UAV Images. Remote Sens. 2015, 7, 4026–4047. [Google Scholar] [CrossRef] [Green Version]
  173. Zhao, L.; Shi, Y.; Liu, B.; Hovis, C.; Duan, Y.; Shi, Z. Finer Classification of Crops by Fusing UAV Images and Sentinel-2A Data. Remote Sens. 2019, 11, 3012. [Google Scholar] [CrossRef] [Green Version]
  174. Hong, G.; Zhang, Y.; Mercer, B. A wavelet and I integration method to fuse high resolution SAR with moderate resolution multispectral images. Photogramm. Eng. Remote Sens. 2009, 75, 1213–1223. [Google Scholar] [CrossRef]
  175. Geman, S.; Geman, D. Stochastic Relaxation, Gibbs Distributions, and the Bayesian Restoration of images. J. Appl. Stat. 1984, 20, 25–62. [Google Scholar] [CrossRef]
  176. Wei, L.; Yu, M.; Zhong, Y.; Zhao, J.; Liang, Y.; Hu, X. Spatial–Spectral Fusion Based on Conditional Random Fields for the Fine Classification of Crops in UAV-Borne Hyperspectral Remote Sensing Imagery. Remote Sens. 2019, 11, 780. [Google Scholar] [CrossRef] [Green Version]
  177. Wei, L.; Yu, M.; Liang, Y.; Yuan, Z.; Huang, C.; Li, R.; Yu, Y. Precise Crop Classification Using Spectral-Spatial-Location Fusion Based on Conditional Random Fields for UAV-Borne Hyperspectral Remote Sensing Imagery. Remote Sens. 2019, 11, 2011. [Google Scholar] [CrossRef] [Green Version]
  178. Chen, Y.; Wu, Z.; Zhao, B.; Fan, C.; Shi, S. Weed and Corn Seedling Detection in Field Based on Multi Feature Fusion and Support Vector Machine. Sensors 2021, 21, 212. [Google Scholar] [CrossRef] [PubMed]
  179. Hengbiao, Z.; Tao, C.; Dong, L.; Xia, Y.; Yongchao, T.; Weixing, C.; Yan, Z. Combining Unmanned Aerial Vehicle (UAV)-Based Multispectral Imagery and Ground-Based Hyperspectral Data for Plant Nitrogen Concentration Estimation in Rice. Front. Plant Sci. 2018, 9, 936. [Google Scholar] [CrossRef]
  180. Fountas, S.; Mylonas, N.; Malounas, I.; Rodias, E.; Hellmann Santos, C.; Pekkeriet, E. Agricultural Robotics for Field Operations. Sensors 2020, 20, 2672. [Google Scholar] [CrossRef]
  181. Yang, M.; Khan, F.A.; Tian, H.; Liu, Q. Analysis of the Monthly and Spring-Neap Tidal Variability of Satellite Chlorophyll-a and Total Suspended Matter in a Turbid Coastal Ocean Using the DINEOF Method. Remote Sens. 2021, 13, 632. [Google Scholar] [CrossRef]
  182. Guo, Y.; Yin, G.; Sun, H.; Wang, H.; Chen, S.; Senthilnath, J.; Wang, J.; Fu, Y. Scaling Effects on Chlorophyll Content Estimations with RGB Camera Mounted on a UAV Platform Using Machine-Learning Methods. Sensors 2020, 20, 5130. [Google Scholar] [CrossRef] [PubMed]
  183. Kimm, H.; Guan, K.; Jiang, C.; Peng, B.; Gentry, L.F.; Wilkin, S.C.; Wang, S.; Cai, Y.; Bernacchi, C.J.; Peng, J.; et al. Deriving high-spatiotemporal-resolution leaf area index for agroecosystems in the U.S. Corn Belt using Planet Labs CubeSat and STAIR fusion data. Remote Sens. Environ. 2020, 239, 111615. [Google Scholar] [CrossRef]
  184. Potgieter, A.B.; George-Jaeggli, B.; Chapman, S.C.; Laws, K.; Suárez, C.L.A.; Wixted, J.; Watson, J.; Eldridge, M.; Jordan, D.R.; Hammer, G.L. Multi-Spectral Imaging from an Unmanned Aerial Vehicle Enables the Assessment of Seasonal Leaf Area Dynamics of Sorghum Breeding Lines. Front. Plant Sci. 2017, 8, 1532. Available online: https://www.frontiersin.org/article/10.3389/fpls.2017.01532 (accessed on 15 December 2021). [CrossRef] [PubMed]
  185. Velasquez, A.E.B.; Gasparino, M.V.; Becker, M.; Higuti, V.A.H.; Sivakumar, A.N.; Chowdhary, G. Multi-Sensor Fusion based Robust Row Following for Compact Agricultural Robots. Computer Science and Robotics, Cornell University. arXiv 2021, arXiv:2106.15029. Available online: https://arxiv.org/pdf/2106.15029.pdf (accessed on 15 December 2021).
  186. Tilly, N.; Aasen, H.; Bareth, G. Fusion of Plant Height and Vegetation Indices for the Estimation of Barley Biomass. Remote Sens. 2015, 7, 11449–11480. [Google Scholar] [CrossRef] [Green Version]
  187. Kolar, P.; Benavidez, P.; Jamshidi, M. Survey of Datafusion Techniques for Laser and Vision Based Sensor Integration for Autonomous Navigation. Sensors 2020, 20, 2180. [Google Scholar] [CrossRef] [Green Version]
  188. Guerrero, A.; De Neve, S.; Mouazen, A.M. Data fusion approach for map-based variable-rate nitrogen fertilization in barley and wheat. Soil Till. Res. 2021, 205, 104789. [Google Scholar] [CrossRef]
  189. Leslie, J.E.; Weersink, A.; Yang, W.; Fox, G. Actual versus environmentally recommended fertilizer application rates: Im-plications for water quality and policy. Agric. Ecosyst. Environ. 2017, 240, 109–120. [Google Scholar] [CrossRef]
  190. Puntel, L.A.; Sawyer, J.E.; Barker, D.W.; Thorburn, P.J.; Castellano, M.J.; Moore, K.J. A systems modeling approach to forecast corn economic optimum nitrogen rate. Front. Plant Sci. 2018, 9, 436. [Google Scholar] [CrossRef] [Green Version]
  191. Chen, S.; Jiang, T.; Ma, H.; He, C.; Xu, F.; Malone, R.W.; Feng, H.; Yu, Q.; Siddique, K.H.M.; Dong, Q.G.; et al. Dynamic within-season irrigation scheduling for maize production in Northwest China: A method based on weather data fusion and yield prediction by DSSAT. Agric. For. Meteorol. 2020, 285–286, 107928. [Google Scholar] [CrossRef]
  192. Wang, X.; Miao, Y.; Batchelor, W.D.; Dong, R.; Kusnierek, K. Evaluating model-based strategies for in-season nitrogen management of maize using weather data fusion. Agric. Forest Meteorol. 2021, 308–309, 108564. [Google Scholar] [CrossRef]
  193. Hammad, H.M.; Abbas, F.; Ahmad, A.; Farhad, W.; Anothai, J.; Hoogenboomm, G. Simulating water and nitrogen require-ments for maize under semi-arid conditions using the CSM-CERES-maize model. Eur. J. Agron. 2018, 100, 56–66. [Google Scholar] [CrossRef]
  194. Qi, Z.; Bartling, P.N.S.; Jabro, J.D.; Lenssen, A.W.; Iversen, W.M.; Ahuja, L.R.; Ma, L.; Allen, B.L.; Evans, R.G. Simulating dryland water availability and spring wheat production in the northern Great Plains. Agron. J. 2013, 105, 37–50. [Google Scholar] [CrossRef] [Green Version]
  195. Wang, B.; Jia, K.; Wei, X.; Xia, M.U.; Yao, Y.; Zhang, X.; Liu, D.; Tao, G. Generating spatiotemporally consistent fractional vegetation cover at different scales using spatiotemporal fusion and multiresolution tree methods. ISPRS J. Photogramm. Remote Sens. 2020, 167, 214–229. [Google Scholar] [CrossRef]
  196. Zhang, J.; Hu, K.; Li, K.; Zheng, C.; Li, B. Simulating the effects of long-term discontinuous and continuous fertilization with straw return on crop yields and soil organic carbon dynamics using the DNDC model. Soil Tillage Res. 2017, 165, 302–314. [Google Scholar] [CrossRef]
  197. Hoogenboom, G.; Porter, C.H.; Shelia, V.; Boote, K.J.; Singh, U.; White, J.W.; Hunt, L.A.; Ogoshi, R.; Lizaso, J.I.; Koo, J.; et al. Decision Support System for Agrotechnology Transfer (DSSAT); DSSAT Foundation: Gainesville, FL, USA, 2019; Version 4.7.5; Available online: https://DSSAT.net (accessed on 15 December 2021).
  198. Porcar-Castell, A.; Malenovský, Z.; Magney, T.; Wittenberghe, S.V.; Fernández-Marín, B.; Maignan, F.; Zhang, Y.; Maseyk, K.; Atherton, J.; Albert, L.P.; et al. Chlorophyll a fluorescence illuminates a path connecting plant molecular biology to Earth-system science. Nat. Plants 2021, 7, 998–1009. [Google Scholar] [CrossRef] [PubMed]
  199. Song, D.; Qiao, L.; Gao, D.; Li, S.; Li, M.; Sun, H.; Ma, J. Development of crop chlorophyll detector based on a type of interference filter optical sensor. Comp. Electron. Agric. 2021, 187, 106260. [Google Scholar] [CrossRef]
  200. Xiaoyan, W.; Zhiwei, L.; Wenjun, W.; Jiawei, W. Chlorophyll content for millet leaf using hyperspectral imaging and an at-tentionconvolutional neural network. Ciência Rural. Crop Prod. 2020, 50, e20190731. [Google Scholar] [CrossRef]
  201. Mao, Z.H.; Deng, L.; Sun, J.; Zhang, A.W.; Chen, X.Y.; Zhao, Y. Research on the Application of UAV Multispectral Remote Sensing in the Maize Chlorophyll Prediction. Spectrosc. Spectr. Anal. 2018, 38, 2923–2931. [Google Scholar] [CrossRef]
  202. Zheng, T.; Liu, N.; Wu, L.; Li, M.; Sun, H.; Zhang, Q.; Wu, J. Estimation of Chlorophyll Content in Potato Leaves Based on Spectral Red Edge Position. IFAC Pap. OnLine 2018, 51, 602–606. [Google Scholar] [CrossRef]
  203. Chen, C.L.; Jin, Y.; Cao, Y.L.; Yu, F.H.; Feng, S.; Zhou, C.X. Analysis of Chlorophyll Contents in Maize Leaf based on GA-BP Neural Network Hyperspectral Inversion Model. J. Shenyang Agric. Univ. 2018, 49, 626–632. Available online: https://caod.oriprobe.com/journals/synydxxb/Journal_of_Shenyang_Agricultural_University.htm (accessed on 15 December 2021).
  204. Srivastava, P.K.; Gupta, M.; Singh, U.; Prasad, R.; Pandey, P.C.; Raghubanshi, A.S.; Petropoulos, G.P. Sensitivity analysis of artificial neural network for chlorophyll prediction using hyperspectral data. Environ. Dev. Sustain 2021, 23, 5504–5519. [Google Scholar] [CrossRef]
  205. Sun, J.; Shi, S.; Wang, L.; Li, H.; Wang, S.; Gong, W.; Tagesson, T. Optimizing LUT-based inversion of leaf chlorophyll from hyperspectral lidar data: Role of cost functions and regulation strategies. Int. J. App. Earth Obs. Geoinf. 2021, 105, 102602. [Google Scholar] [CrossRef]
  206. Karaca, C.; Büyüktaş, D. Variation of The Leaf Area Index of Some Vegetables Commonly Grown in Greenhouse Conditions with Cultural Practices. Hort. Stud. 2021, 38, 56–61. Available online: https://dergipark.org.tr/en/pub/hortis/issue/61153/902525 (accessed on 15 December 2021). [CrossRef]
  207. Sadeh, Y.; Zhu, X.; Dunkerley, D.; Walker, J.P.; Zhang, Y.; Rozenstein, O.; Manivasagam, V.S.; Chenu, K. Fusion of Sentinel-2 and PlanetScope time-series data into daily 3 m surface reflectance and wheat LAI monitoring. Int. J. App. Earth Obs. Geoinf. 2021, 96, 102260. [Google Scholar] [CrossRef]
  208. Sadeh, Y.; Zhu, X.; Chenu, K.; Dunkerley, D. Sowing date detection at the field scale using CubeSats remote sensing. Comput. Electron. Agric. 2019, 157, 568–580. [Google Scholar] [CrossRef]
  209. Raj, R.; Walker, J.P.; Pingale, R.; Nandan, R.; Naik, B.; Jagarlapudi, A. Leaf area index estimation using top-of-canopy airborne RGB images. Int. J. Appl. Earth Obs. Geoinf. 2021, 96, 102282. [Google Scholar] [CrossRef]
  210. Pasqualotto, N.; Delegido, J.; Van Wittenberghe, S.; Rinaldi, M.; Moreno, J. Multi-crop green LAI estimation with a new simple Sentinel-2 LAI Index (SeLI). Sensors 2019, 19, 904. [Google Scholar] [CrossRef] [Green Version]
  211. Djamai, N.; Fernandes, R.; Weiss, M.; McNairn, H.; Goïta, K. Validation of the Sentinel Simplified Level 2 Product Prototype Processor (SL2P) for mapping cropland biophysical variables using Sentinel-2/MSI and Landsat-8/OLI data. Remote Sens. En-Viron. 2019, 225, 416–430. [Google Scholar] [CrossRef]
  212. Leach, N.; Coops, N.C.; Obrknezev, N. Normalization method for multi-sensor high spatial and temporal resolution satellite imagery with radiometric inconsistencies. Comput. Electron. Agric. 2019, 164, 104893. [Google Scholar] [CrossRef]
  213. da Costa, V.A.M.; de Oliveira, A.D.F.; dos Santos, J.G.; Bovo, A.A.A.; de Almeida, D.R.; Gorgens, E.B. Assessing the utility of airborne laser scanning derived indicators for tropical forest management. South. For. J. For. Sci. 2020, 82, 352–358. [Google Scholar] [CrossRef]
  214. Valbuena, R.; O’Connor, B.; Zellweger, F.; Simonson, W.; Vihervaara, P.; Maltamo, M.; Silva, C.A.; Almeida, D.R.A.; Danks, F.; Morsdorf, F.; et al. Standardizing ecosystem morphological traits from 3D in-formation sources. Trends Ecol. Evol. 2020, 35, 656–667. [Google Scholar] [CrossRef] [PubMed]
  215. Dalagnol, R.; Wagner, F.H.; Galvão, L.S.; Streher, A.S.; Phillips, O.L.; Gloor, E.; Pugh, T.A.M.; Ometto, J.P.H.B.; Aragão, L.E.O.C. Large-scale variations in the dynamics of Amazon forest canopy gaps from airborne lidar data and opportunities for tree mortality estimates. Sci. Rep. 2021, 11, 1388. [Google Scholar] [CrossRef] [PubMed]
  216. Adhikari, H.; Valbuena, R.; Pellikka, P.K.E.; Heiskanen, J. Mapping Forest structural heterogeneity of tropical montane forest remnants from airborne laser scanning and Landsat time series. Ecol. Indic. 2020, 108, 105739. [Google Scholar] [CrossRef]
  217. Jia, W.; Pang, Y.; Tortini, R.; Schläpfer, D.; Li, Z.; Roujean, J.L. A kernel-driven BRDF approach to correct airborne hyperspectral imagery over forested areas with rugged topography. Remote Sens. 2020, 12, 432. [Google Scholar] [CrossRef] [Green Version]
  218. Han, L.; Ding, J.; Zhang, J.; Chen, P.; Wang, J.; Wang, Y.; Wang, J.; Ge, X.; Zhang, Z. Precipitation events determine the spati-otemporal distribution of playa surface salinity in arid regions: Evidence from satellite data fused via the enhanced spatial and temporal adaptive reflectance fusion model. CATENA 2021, 206, 105546. [Google Scholar] [CrossRef]
  219. Bei, X.; Yao, Y.; Zhang, L.; Lin, Y.; Liu, S.; Jia, K.; Zhang, X.; Shang, K.; Yang, J.; Chen, X.; et al. Estimation of Daily Terrestrial Latent Heat Flux with High Spatial Resolution from MODIS and Chinese GF-1 Data. Sensors 2020, 20, 2811. [Google Scholar] [CrossRef]
  220. Tao, G.; Jia, K.; Wei, X.; Xia, M.; Wang, B.; Xie, X.; Jiang, B.; Yao, Y.; Zhang, X. Improving the spatiotemporal fusion accuracy of fractional vegetation cover in agricultural regions by combining vegetation growth models. Int. J. App. Earth Obs. Geoinf. 2021, 101, 102362. [Google Scholar] [CrossRef]
  221. Zeng, N.A.; He, H.; Ren, X.; Zhang, L.I.; Zeng, Y.; Fan, J.; Li, Y.; Niu, Z.; Zhu, X.; Chang, Q. The utility of fusing multi-sensor data spatio-temporally in estimating grassland aboveground biomass in the three-river headwaters region of China. Int. J. Remote Sens. 2020, 41, 7068–7089. [Google Scholar] [CrossRef]
  222. Miura, Y.; Qureshi, H.; Ryoo, C.; Dinenis, P.C.; Li, J.; Mandli, K.T.; Deodatis, G.; Bienstock, D.; Lazrus, H.; Morss, R. A meth-odological framework for determining an optimal coastal protection strategy against storm surges and sea level rise. Nat. Hazards 2021, 107, 1821–1843. [Google Scholar] [CrossRef]
  223. Mao, P.; Qin, L.; Hao, M.; Zhao, W.; Luo, J.; Qiu, X.; Xu, L.; Xiong, Y.; Ran, Y.; Yan, C.; et al. An improved approach to estimate above-ground volume and biomass of desert shrub communities based on UAV RGB images. Ecol. Indic. 2021, 125, 107494. [Google Scholar] [CrossRef]
  224. Yue, J.; Guijun, Y.; Tian, Q.; Feng, H.; Zhou, C. Estimate of winter-wheat above-ground biomass based on UAV ultrahigh- ground-resolution image textures and vegetation indices. ISPRS-J. Photogramm. Remote Sens. 2019, 150, 226–244. [Google Scholar] [CrossRef]
  225. Zheng, H.; Cheng, T.; Zhou, M.; Li, D.; Yao, X.; Tian, Y.; Cao, W.; Zhu, Y. Improved estimation of rice above-ground biomass combining textural and spectral analysis of UAV imagery. Precis. Agric. 2019, 20, 611–629. [Google Scholar] [CrossRef]
  226. Malambo, L.; Popescu, S.C.; Murray, S.C.; Putman, E.; Pugh, N.A.; Horne, D.W.; Richardson, G.; Sheridan, R.; Rooney, W.L.; Avant, R.; et al. Multitemporal field-based plant height estimation using 3D point clouds generated from small unmanned aerial systems high-resolution imagery. Int. J. Appl. Earth Obs. Geoinf. 2018, 64, 31–42. [Google Scholar] [CrossRef]
  227. Lu, N.; Zhou, J.; Han, Z.; Li, D.; Cao, Q.; Yao, X.; Tian, Y.; Zhu, Y.; Cao, W.; Cheng, T. Improved estimation of aboveground biomass in wheat from RGB imagery and point cloud data acquired with a low-cost unmanned aerial vehicle system. Plant Methods 2019, 15, 17. [Google Scholar] [CrossRef] [Green Version]
  228. Colorado, J.D.; Calderon, F.; Mendez, D.; Petro, E.; Rojas, J.P.; Correa, E.S.; Mondragon, I.F.; Rebolledo, M.C.; Jaramillo-Botero, A. A novel NIR-image segmentation method for the precise estimation of above-ground biomass in rice crops. PLoS ONE 2020, 15, e0239591. [Google Scholar] [CrossRef] [PubMed]
  229. Jimenez-Sierra, D.A.; Benítez-Restrepo, H.D.; Vargas-Cardona, H.D.; Chanussot, J. Graph-Based Data Fusion Applied to: Change Detection and Biomass Estimation in Rice Crops. Remote Sens. 2020, 12, 2683. [Google Scholar] [CrossRef]
  230. Devia, C.A.; Rojas, J.P.; Petro, E.; Martinez, C.; Mondragon, I.F.; Patino, D.; Rebolledo, M.C.; Colorado, J. High-throughput biomass estimation in rice crops using UAV multispectral imagery. J. Intell. Robot. Syst. 2019, 96, 573–589. [Google Scholar] [CrossRef]
  231. Yue, J.; Feng, H.; Jin, X.; Yuan, H.; Li, Z.; Zhou, C.; Yang, G.; Tian, Q. A comparison of crop parameters estimation using images from UAV-mounted snapshot hyperspectral sensor and high-definition digital camera. Remote Sens. 2018, 10, 1138. [Google Scholar] [CrossRef] [Green Version]
  232. Xiong, J.; Po, L.M.; Cheung, K.W.; Xian, P.; Zhao, Y.; Rehman, Y.A.U.; Zhang, Y. Edge-Sensitive Left Ventricle Segmentation Using Deep Reinforcement Learning. Sensors 2021, 21, 2375. [Google Scholar] [CrossRef]
  233. Jimenez-Sierra, D.A.; Correa, E.S.; Benítez-Restrepo, H.D.; Calderon, F.C.; Mondragon, I.F.; Colorado, J.D. Novel Fea-ture-Extraction Methods for the Estimation of Above-Ground Biomass in Rice Crops. Sensors 2021, 21, 4369. [Google Scholar] [CrossRef] [PubMed]
  234. Marino, S.; Alvino, A. Agronomic Traits Analysis of Ten Winter Wheat Cultivars Clustered by UAV-Derived Vegetation Indices. Remote Sens. 2020, 12, 249. [Google Scholar] [CrossRef] [Green Version]
  235. Marino, S.; Alvino, A. Vegetation Indices Data Clustering for Dynamic Monitoring and Classification of Wheat Yield Crop Traits. Remote Sens. 2021, 13, 541. [Google Scholar] [CrossRef]
  236. Carbone, A.; Ayllon, N.; Cipriani, E.; Farhat, L.; Fonseca, N.J.G.; Gomanne, S.A.; Jankovic, P.; Martin-Iglesias, P.; Sedehi, M.; Heliere, F.; et al. Biomass SAR Instrument: Architectural overview and hardware development status. In Proceedings of the 13th European Conference on Synthetic Aperture Radar, EUSAR, Online, 29 March–1 April 2021; pp. 1–6. [Google Scholar]
  237. Ha, N.T.; Manley-Harris, M.; Pham, T.D.; Hawes, I. The use of radar and optical satellite imagery combined with advanced machine learning and metaheuristic optimization techniques to detect and quantify above ground biomass of intertidal seagrass in a New Zealand estuary. Int. J. Remote Sens. 2021, 42, 4712–4738. [Google Scholar] [CrossRef]
  238. Theofanous, N.; Chrysafis, I.; Mallinis, G.; Domakinis, C.; Verde, N.; Siahalou, S. Aboveground Biomass Estimation in Short Rotation Forest Plantations in Northern Greece Using ESA’s Sentinel Medium-High Resolution Multispectral and Radar Imaging Missions. Forests 2021, 12, 902. [Google Scholar] [CrossRef]
  239. Pande, C.B.; Moharir, K.N.; Singh, S.K.; Varade, A.M.; Elbeltagi, A.; Khadri, S.F.R.; Choudhari, P. Estimation of crop and forest biomass resources in a semi-arid region using satellite data and GIS. J. Saudi Soc. Agric. Sci. 2021, 20, 302–311. [Google Scholar] [CrossRef]
  240. Santoro, M.; Cartus, O.; Carvalhais, N.; Rozendaal, D.M.A.; Avitabile, V.; Araza, A.; de Bruin, S.; Herold, M.; Quegan, S.; Rodríguez-Veiga, P.; et al. The global forest above-ground biomass pool for 2010 estimated from high-resolution satellite observations. Earth Syst. Sci. Data 2021, 13, 3927–3950. [Google Scholar] [CrossRef]
  241. Li, X.; Zhang, M.; Long, J.; Lin, H. A Novel Method for Estimating Spatial Distribution of Forest Above-Ground Biomass Based on Multispectral Fusion Data and Ensemble Learning Algorithm. Remote Sens. 2021, 13, 3910. [Google Scholar] [CrossRef]
  242. Chen, Y.; He, X.; Xu, J.; Guo, L.; Lu, Y.; Zhang, R. Decision tree-based classification in coastal area integrating polarimetric SAR and optical data. Data Technol. Appl. 2021, ahead-of-print. [Google Scholar] [CrossRef]
  243. Veerabhadraswamy, N.; Devagiri, G.M.; Khaple, A.K. Fusion of complementary information of SAR and optical data for forest cover mapping using random forest algorithm. Curr. Sci. 2021, 120, 193–197. [Google Scholar] [CrossRef]
  244. Yu, J.; Liang, D.; Han, B.; Gao, H. Study on ground object classification based on the hyperspectral fusion images of ZY-1(02D) satellite. J. Appl. Remote Sens. 2021, 15, 042603. [Google Scholar] [CrossRef]
  245. Choudhury, M.R.; Das, S.; Christopher, J.; Apan, A.; Chapman, S.; Menzies, N.W.; Dang, Y.P. Improving Biomass and Grain Yield Prediction of Wheat Genotypes on Sodic Soil Using Integrated High-Resolution Multispectral, Hyperspectral, 3D Point Cloud, and Machine Learning Techniques. Remote Sens. 2021, 13, 3482. [Google Scholar] [CrossRef]
  246. Che, S.; Du, G.; Wang, N.; He, K.; Mo, Z.; Sun, B.; Chen, Y.; Cao, Y.; Wang, J.; Mao, Y. Biomass estimation of cultivated red algae Pyropia using unmanned aerial platform based multispectral imaging. Plant Methods 2021, 17, 12. [Google Scholar] [CrossRef] [PubMed]
  247. Choudhury, M.R.; Christopher, J.; Apan, A.A.; Chapman, S.C.; Menzies, N.W.; Dang, Y.P. Integrated high-throughput phenotyping with high resolution multispectral, hyperspectral and 3D point cloud techniques for screening wheat genotypes under sodic soils. In Proceedings of the International Tropical Agriculture Conference, (TROPAG), Brisbane, Australia, 11–13 November 2019. [Google Scholar]
  248. Baath, G.S.; Flynn, K.C.; Gowda, P.H.; Kakani, V.G.; Northup, B.K. Detecting Biophysical Characteristics and Nitrogen Status of Finger Millet at Hyperspectral and Multispectral Resolutions. Front. Agron. 2021, 2, 604598. [Google Scholar] [CrossRef]
  249. Cooper, S.; Okujeni, A.; Pflugmacher, D.; van der Linden, S.; Hostert, P. Combining simulated hyperspectral EnMAP and Landsat time series for forest aboveground biomass mapping. Int. J. Appl. Earth Obs. Geoinf. 2021, 98, 102307. [Google Scholar] [CrossRef]
  250. Vahtmäe, E.; Kotta, J.; Lõugas, L.; Kutser, T. Mapping spatial distribution, percent cover and biomass of benthic vegetation in optically complex coastal waters using hyperspectral CASI and multispectral Sentinel-2 sensors. Int. J. Appl. Earth Obs. Geoinf. 2021, 102, 102444. [Google Scholar] [CrossRef]
Table 1. Different sensor fusion methods for crop parameter estimation.

Crop parameter: N (nitrogen)
- Optical: fusion of Multiplex and Dualex with the ALS-2 N
- Thermal infrared: fusion of near infrared with the visible-near infrared
- Multispectral: fusion of multispectral imagery with the hyperspectral data
- Hyperspectral: fusion of hyperspectral LiDAR with the multispectral LiDAR
- LiDAR: fusion of LiDAR with the multispectral imaging sensors
- Radar: fusion of synthetic aperture radar (SAR) imagery with the optical imagery
- References: [10,27,43,48,65,126,127]

Crop parameter: chlorophyll
- Optical: fusion of ClorofiLOG and CropSpec with the SPAD
- Thermal infrared: fusion of near infrared (NIR) with the visible-near infrared (vis-NIR)
- Multispectral: fusion of the multispectral spectrometer with the hyperspectral imagery
- Hyperspectral: fusion of hyperspectral LiDAR with the spectral imagery
- LiDAR: fusion of LiDAR with the 2D multispectral camera
- Radar: fusion of optical imagery with the SAR imagery
- References: [10,48,64,128,129,130,131]

Crop parameter: LAI (leaf area index)
- Optical: fusion of Crop Circle with the OptRx
- Thermal infrared: fusion of vis-NIR with the spectral camera
- Multispectral: fusion of multispectral imagery with the spectral vegetation indices
- Hyperspectral: fusion of hyperspectral imagery with the high spatio-spectral imagery
- LiDAR: fusion of the LiDAR sensor with the microwave sensor
- Radar: fusion of SAR with the soil moisture sensor
- References: [63,79,114,132,133]

Crop parameter: AGB (aboveground biomass)
- Optical: fusion of OptRx with the Multiplex and Crop Circle
- Thermal infrared: fusion of NIR with the vis-NIR and spectral camera
- Multispectral: fusion of multispectral maps with the photogrammetry imagery
- Hyperspectral: fusion of LiDAR with the hyperspectral data
- LiDAR: fusion of 3D global positioning system (GPS) coordinates with the LiDAR
- Radar: fusion of SAR with the optical sensors
- References: [63,65,67,134,135]
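To make the feature-level (early) fusion pattern summarized in Table 1 concrete, the following minimal sketch stacks a multispectral vegetation index with a LiDAR-derived canopy height into a single feature matrix and regresses them against aboveground biomass. It is illustrative only: the per-plot values are synthetic, and the biomass relationship and random forest settings are assumptions rather than results from the cited studies.

```python
# Minimal sketch of feature-level (early) sensor fusion for AGB estimation:
# a multispectral vegetation index (NDVI) and a LiDAR-derived canopy height
# are stacked into one feature matrix and regressed against measured biomass.
# All arrays below are synthetic placeholders standing in for per-plot values.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
n_plots = 200

# Hypothetical per-plot observations from two sensors.
ndvi = rng.uniform(0.2, 0.9, n_plots)             # multispectral sensor
canopy_height_m = rng.uniform(0.3, 1.2, n_plots)  # LiDAR sensor

# Synthetic "ground truth" AGB (t/ha), loosely driven by both inputs plus noise.
agb = 2.0 + 6.0 * ndvi + 3.5 * canopy_height_m + rng.normal(0, 0.5, n_plots)

# Early fusion: concatenate the sensor features into a single design matrix.
X_fused = np.column_stack([ndvi, canopy_height_m])

model = RandomForestRegressor(n_estimators=200, random_state=0)
r2_scores = cross_val_score(model, X_fused, agb, cv=5, scoring="r2")
print(f"Cross-validated R^2 of the fused model: {r2_scores.mean():.2f}")
```

The same stacking step accommodates any of the sensor pairs listed in Table 1; only the feature columns change.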
Table 2. Ground-based, space-borne, and aerial platform fusion methods for crop parameter estimation.

Crop parameter: N (nitrogen)
- Tractor: fusion of the tractor with the harvester, machine vision, Multiplex, Dualex, and ALS-2 N systems
- Robot: fusion of robots with the machine vision system
- Satellite: fusion of multispectral SPOT imagery across its sensor types
- UAV: fusion of unmanned aerial vehicle-based (UAV-based) vegetation indices (VIs) with the hyperspectral texture data
- References: [65,124,179]

Crop parameter: chlorophyll
- Tractor: fusion of the tractor with the autonomous vehicle guidance system, ClorofiLOG, CropSpec, and SPAD systems
- Robot: fusion of robots with the multisensor and spectroscopy sensors
- Satellite: fusion of MODIS-Aqua Chl-a with the MODIS TSM datasets
- UAV: fusion of UAV-based VIs with the red–green–blue (RGB) images
- References: [10,48,64,180,181,182]

Crop parameter: LAI (leaf area index)
- Tractor: fusion of the tractor with the NIR, vis-NIR, spectral, LiDAR, and radar systems
- Robot: fusion of robots with the GPS, vision sensors, LiDAR, and optical sensors
- Satellite: fusion of STAIR with the MODIS-Landsat imagery
- UAV: fusion of UAV-based hyperspectral imagery, 2D RGB, and 3D surface models
- References: [63,79,183,184]

Crop parameter: AGB (aboveground biomass)
- Tractor: fusion of the tractor with the OptRx, Multiplex, Crop Circle, GPS, and SAR
- Robot: fusion of robots with the LiDAR and GPS
- Satellite: fusion of Sentinel-1 with Sentinel-2 imagery at different bands
- UAV: fusion of UAV-based VIs with the spectral and structural imagery data
- References: [63,65,67,185,186]
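The platform combinations in Table 2 can also be fused at the decision level, where each platform trains its own model and the predictions are blended afterwards. The sketch below illustrates this late-fusion idea for LAI with one UAV-based and one satellite-based feature; the synthetic data, the choice of SAR backscatter as the satellite feature, and the 0.7/0.3 blending weights are illustrative assumptions, not recommendations from the reviewed literature.

```python
# Minimal sketch of decision-level (late) platform fusion: separate models are
# trained on UAV-derived and satellite-derived features, and their LAI
# predictions are blended with fixed weights. All values are synthetic.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(7)
n_plots = 300

uav_vi = rng.uniform(0.2, 0.9, n_plots)          # UAV vegetation index
sat_backscatter = rng.uniform(-15, -5, n_plots)  # satellite SAR backscatter (dB)
lai = 0.5 + 5.0 * uav_vi + 0.15 * (sat_backscatter + 15) + rng.normal(0, 0.3, n_plots)

X_uav = uav_vi.reshape(-1, 1)
X_sat = sat_backscatter.reshape(-1, 1)
idx_train, idx_test = train_test_split(np.arange(n_plots), test_size=0.3, random_state=0)

# Train one model per platform.
m_uav = LinearRegression().fit(X_uav[idx_train], lai[idx_train])
m_sat = LinearRegression().fit(X_sat[idx_train], lai[idx_train])

# Late fusion: weighted average of the platform-specific predictions
# (the weights here are an assumption; in practice they would be tuned
# on held-out validation data).
pred = 0.7 * m_uav.predict(X_uav[idx_test]) + 0.3 * m_sat.predict(X_sat[idx_test])
print(f"R^2 of the fused prediction: {r2_score(lai[idx_test], pred):.2f}")
```

Compared with the early fusion shown after Table 1, late fusion tolerates platforms with different revisit times and spatial footprints, because each model can be trained and updated on its own data stream.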
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
