Article

Water Stress Identification of Winter Wheat Crop with State-of-the-Art AI Techniques and High-Resolution Thermal-RGB Imagery

1 Agricultural Mechanization Division, ICAR—Central Institute of Agricultural Engineering, Bhopal 462038, MP, India
2 Irrigation and Drainage Engineering Division, ICAR—Central Institute of Agricultural Engineering, Bhopal 462038, MP, India
3 Department of Biological Systems Engineering, Virginia Tech Tidewater AREC, Suffolk, VA 23437, USA
4 Center for Advanced Innovation in Agriculture (CAIA), Virginia Tech, Blacksburg, VA 24061, USA
5 College of Agricultural Engineering and Technology, Anand Agricultural University, Godhra 389001, GJ, India
* Authors to whom correspondence should be addressed.
Plants 2022, 11(23), 3344; https://doi.org/10.3390/plants11233344
Submission received: 8 October 2022 / Revised: 25 November 2022 / Accepted: 27 November 2022 / Published: 2 December 2022
(This article belongs to the Collection Feature Papers in Plant Physiology and Metabolism)

Abstract

Timely crop water stress detection can help precision irrigation management and minimize yield loss. A two-year study was conducted on non-invasive winter wheat water stress monitoring using state-of-the-art computer vision and thermal-RGB imagery inputs. Field treatment plots were irrigated using two irrigation systems (flood and sprinkler) at four rates (100, 75, 50, and 25% of crop evapotranspiration [ETc]). A total of 3200 images under different treatments were captured at critical growth stages, that is, 20, 35, 70, 95, and 108 days after sowing, using a custom-developed thermal-RGB imaging system. Crop and soil response measurements of canopy temperature (Tc), relative water content (RWC), soil moisture content (SMC), and relative humidity (RH) were significantly affected by the irrigation treatments, showing the lowest Tc (22.5 ± 2 °C) and the highest RWC (90%) and SMC (25.7 ± 2.2%) for 100% ETc, and the highest Tc (28 ± 3 °C) and the lowest RWC (74%) and SMC (20.5 ± 3.1%) for 25% ETc. The RGB and thermal imagery were then used as inputs to feature-extraction-based deep learning models (AlexNet, GoogLeNet, Inception V3, MobileNet V2, ResNet50), while RWC, SMC, Tc, and RH were the inputs to function-approximation-based models (Artificial Neural Network (ANN), K-Nearest Neighbors (KNN), Logistic Regression (LR), Support Vector Machine (SVM), and Long Short-Term Memory (DL-LSTM)) to classify stressed/non-stressed crops. Among the feature-extraction-based models, ResNet50 outperformed the other models, showing a discriminant accuracy of 96.9% with RGB and 98.4% with thermal imagery inputs. Overall, classification accuracy was higher for thermal imagery than for RGB imagery inputs. The DL-LSTM had the highest discriminant accuracy (96.7%) and the lowest error among the function-approximation-based models for classifying stress/non-stress. The study suggests that computer vision coupled with thermal-RGB imagery can be instrumental in high-throughput mitigation and management of crop water stress.

1. Introduction

Water stress forces leaf stomata to close, which reduces transpiration and increases canopy temperature (Tc) [1]. Timely estimation of these stressors may not only help precision irrigation management but also minimize yield losses [2]. The penalty gap between actual and potential yield will widen further as a result of climate change, with projections of declining rainfall frequency and rising ambient temperatures [3]. Water stress is typically assessed using xylem water potentials [4], canopy thermometry [5], and stomatal conductance measurements [6]. However, these methods are often invasive and tend to have limited sampling accuracy due to low throughput or point data acquisitions [7]. Non-invasive proximal or remote sensing techniques have emerged as high-throughput alternatives for monitoring crop water stress through color features, reflectance, and thermal emissivity of vegetable, fruit, and specialty crops [8,9,10]. However, monitoring crop water content using visible-range RGB imaging requires not only a specific leaf orientation relative to the camera but also pre-defined illumination conditions, which limits the applicability of RGB imaging for determining water content in field conditions. Using scanner-type imaging devices could be a cost- and time-effective alternative [11]. Unlike point thermometry, Tc from thermal infrared imagery reflects the emissivity profile of the entire canopy, which is directly proportional to the canopy water content [9]. Thermal imagery (8000–14,000 nm) also outperforms color RGB imagery (400–700 nm) and reflectance characteristics in terms of robustness for characterizing crop water stress [8,9]. Nonetheless, the adoption of thermal imaging in agricultural production management is still at a nascent stage and is consistently evolving to maintain imaging quality against drastic variations in relative humidity and wind speed. Thermal imaging cameras are also relatively more expensive than simple-to-operate RGB cameras. For these reasons, thermal imaging has seen limited adoption as a gold standard for crop stress mapping. Above all, longwave infrared wavelengths (thermal imaging) have a higher penetrating capability than visible-range wavelengths, making them more reliable and sensitive to crop water content variations. Thermal imaging is therefore a better alternative for precision irrigation management than RGB imaging or standard crop coefficients coupled with reference evapotranspiration [8,9].
RGB imagery has been used to assess crop water stress using different deep learning (DL) and machine learning (ML) techniques [10,12,13]. ML techniques derive unique features from input and output datasets, which can be used to discriminate between different object types or classes. ML techniques such as Naïve Bayes, artificial neural networks (ANNs), support vector machines (SVMs), and random forests (RFs) have been widely used with RGB images for weed detection, biotic and abiotic stress identification and/or classification, yield prediction, and other crop phenotyping applications [14,15]. Thermal imagery has also been used with RFs and decision trees for crop water status monitoring in vineyards and automated irrigation scheduling [16]. However, there are several limitations associated with ML techniques. The output quality is highly dependent on input data quality, and the presence of noise, outliers, and other unaccounted biases has been reported to significantly affect model performance. Furthermore, ML techniques also require skilled operators [17] to define input features, which may affect model performance through unintentional subjectivity and bias [10,12].
DL has emerged as an advanced vision-based learning technique that, unlike ML, enables automated feature extraction without human dependencies [18]. Pertinent to agricultural applications, crop phenological stages have been detected using a deep convolutional neural network (DCNN) trained on RGB imagery [19,20]. Similarly, different DL techniques (AlexNet, GoogLeNet, and Inception V3) have also been used to classify non-stressed and water-stressed soybean, maize, and okra crops with digital RGB images [10]. Long short-term memory (LSTM) is a novel DL approach (DL-LSTM) that has been used for different field applications such as time-series forecasting of wheat yield and productivity [21], irrigation requirements [22], predicting agricultural product sale volumes based on seasonal and historical data [23], and identification and classification of weeds [24]. Most image processing studies have used RGB (visible-range) imagery to classify crop water stress [25,26]. Thermal imagery has been reported to be more robust for crop water stress characterization than RGB or multispectral imagery [27,28], mainly because canopy emissivity is highly sensitive to water content [8,9,29,30].
So far, crop water stress characterization has largely been carried out through traditional and destructive methods that often have restricted commercial applicability. Moreover, robust computer-vision techniques (ML or DL models) have rarely been explored for thermal infrared imagery inputs. Small unmanned aerial system (UAS)-based thermal and multispectral remote sensing is also being explored for high-throughput crop water stress phenotyping. However, the frequency of data acquisition is limited to once or a few times a day, and atmospheric interference, including weather conditions, may severely degrade the quality of thermal imaging. Additionally, onboard data processing capacity for complex and robust algorithms is still limited on small UASs. Conversely, proximal thermal imaging is subject to the least atmospheric interference and imaging frequency constraints. These systems can continuously collect data at critical growth stages and also offer flexibility for custom modification to implement onboard edge-processing algorithms for real-time decision support and management actuation. On the cost side, UAS-grade thermal and RGB imaging sensors and the UASs themselves are still far more expensive than proximal imaging systems, which can be custom-assembled using miniature sensing modules. However, such miniature sensing modules offer neither sufficient resolution nor the desired image quality when integrated with UASs.
A robust data handling and analytical pipeline remains the major obstacle to deriving real-time decision support for crop management, but it is achievable using custom-assembled edge devices. This study is a step toward alleviating those obstacles and focuses on the evaluation of non-invasive and cost-effective thermal-RGB imaging with robust ML and DL models for stress characterization in winter wheat. This could be critical from a precision irrigation scheduling and management perspective and may have high grower adoption potential. The specific objectives of this two-year study were to (a) non-invasively assess crop responses to two irrigation systems and four deficit irrigation treatments, and (b) identify water-stressed and non-stressed crops by feature extraction using thermal-RGB imagery and by function approximation using crop physiological parameters and ambient weather inputs.

2. Materials and Methods

2.1. Experiment Design

Winter wheat (Triticum aestivum L., cv. HI 1544) was planted (November to April, 2019–2020 and 2020–2021) in the research farm (77.24° E, 23.18° N) of the Central Institute of Agricultural Engineering (CIAE), ICAR, Bhopal, India (Figure 1). Meteorological data have been recorded at the institute observatory since 1985. According to Köppen's classification (1934), Bhopal lies in a Mediterranean climatic zone with an average annual rainfall of about 1127 mm. The soil type is heavy clay (Vertisols) with clay content over 50% and moderate fertility with negligible salinity. The soil structure is sub-angular blocky with a field capacity of 29.5–32% (db) and a wilting point of 18–19.5% (db). The average infiltration and percolation rates of the soil are 10–12 mm day−1 and 6.3–7.0%, respectively. The plots were irrigated using flood and sprinkler systems at four treatment rates: 100, 75, 50, and 25% of full crop evapotranspiration (ETc). Micro sprinklers of 120 lph discharge capacity (Netafim) were installed at 3.5 m spacing. The reference crop evapotranspiration was calculated from weather data using the FAO56 Penman–Monteith method and the standard non-stressed crop coefficient [31]. The seasonal ETc of wheat during the first and second years of growth was 380 mm and 345 mm, respectively. Application efficiencies of 0.65 for flood irrigation and 0.90 for sprinkler irrigation were used, as determined from experimental trials at the same site using measurements of the water applied and the water retained in the crop root zone. A total of six irrigation cycles were implemented for the 100% ETc treatment (non-stressed) at the sowing, crown root initiation (CRI), tillering, booting, flowering, and grain filling stages. One to three irrigation cycles were implemented for the deficit treatments (75, 50, and 25% of ETc) at the jointing, booting, and flowering stages.
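As an illustration of how the treatment depths follow from the FAO56 workflow described above, the sketch below converts reference evapotranspiration (ET0) into a gross irrigation depth using ETc = Kc × ET0, the deficit fraction, and the measured application efficiencies (0.65 for flood, 0.90 for sprinkler). The ET0, Kc, and interval values in the example are illustrative placeholders, not measurements from this study.

```python
# Hedged sketch: gross irrigation depth per event for each deficit treatment.
# ET0, Kc and the 7-day interval below are illustrative placeholders.

def gross_irrigation_depth(et0_mm, kc, etc_fraction, application_efficiency):
    """Gross depth of water (mm) to apply for one irrigation event."""
    etc = et0_mm * kc                    # crop evapotranspiration, ETc = Kc * ET0
    net_depth = etc * etc_fraction       # treatment share: 1.00, 0.75, 0.50 or 0.25 of ETc
    return net_depth / application_efficiency

et0_interval = 5.0 * 7                   # e.g., 5 mm/day ET0 accumulated over 7 days
kc_mid = 1.15                            # assumed mid-season Kc for wheat (FAO56 tables)
for fraction in (1.00, 0.75, 0.50, 0.25):
    flood = gross_irrigation_depth(et0_interval, kc_mid, fraction, 0.65)
    sprinkler = gross_irrigation_depth(et0_interval, kc_mid, fraction, 0.90)
    print(f"{int(fraction * 100):>3}% ETc: flood {flood:5.1f} mm, sprinkler {sprinkler:5.1f} mm")
```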

2.2. Data Collection

RGB and thermal imagery were synchronously captured using a multifunctional, custom-integrated thermal-RGB imaging system. The system comprised a single-board computer (Model B+, Raspberry Pi Foundation), a thermal imaging module (8000–14,000 nm, HTPA, Heimann; pixel resolution: 80 × 64; horizontal and vertical FOV: 120° × 90°), an RGB imaging module (400–700 nm, Raspberry Pi Camera V2, Sony IMX219, Raspberry Pi Foundation; pixel resolution: 3280 × 2464; HFOV: 62.2°; VFOV: 48.8°), a GPS receiver module (NEO 6M V2, Adafruit) for image geotagging, a capacitive touchscreen (LCD 800 × 400 mm, Robokit), a keypad (Robokits), and a power source (20,000 mAh, 5 V/2 A, MI power bank). The computer used the NOOBS operating system with module-pertinent libraries for the different operations. Imagery data were collected at critical crop growth stages in the 2019–2020 and 2020–2021 growing seasons (five times in each season). Ground truth plant biophysical and soil parameters were also measured synchronously.
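A minimal sketch of one synchronized capture cycle on such a Raspberry Pi-based system is given below. The RGB capture uses the standard picamera library; read_htpa_frame() and read_gps_fix() are hypothetical stand-ins for the Heimann HTPA and NEO-6M drivers, since the actual onboard software of the developed system is not described here.

```python
# Hedged sketch of one synchronized thermal-RGB capture with geotagging.
# read_htpa_frame() and read_gps_fix() are hypothetical placeholders.
import json
import time
from picamera import PiCamera  # Raspberry Pi Camera Module V2 (RGB)

def read_htpa_frame():
    """Hypothetical: return an 80 x 64 list of temperatures (deg C) from the HTPA module."""
    raise NotImplementedError

def read_gps_fix():
    """Hypothetical: return (latitude, longitude) from the NEO-6M receiver."""
    raise NotImplementedError

camera = PiCamera(resolution=(3280, 2464))

def capture_pair(plot_id):
    stamp = time.strftime("%Y%m%d_%H%M%S")
    lat, lon = read_gps_fix()
    camera.capture(f"{plot_id}_{stamp}_rgb.jpg")                 # RGB frame to disk
    thermal = read_htpa_frame()                                  # 80 x 64 thermal frame
    with open(f"{plot_id}_{stamp}_thermal.json", "w") as fh:     # geotagged thermal record
        json.dump({"plot": plot_id, "lat": lat, "lon": lon, "thermal": thermal}, fh)
```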

2.2.1. Imagery Data

The developed imaging system was placed 1 m from the crop and tilted at 45° from the horizontal. A total of 3200 images (400 per treatment) were acquired (1600 RGB and 1600 thermal) over the two seasons at the crown root initiation (20 days after sowing (DAS)), tillering (35 DAS) (Figure 2), jointing (70 DAS), flowering and milking (95 DAS), and dough (108 DAS) stages, between 11 a.m. and 1 p.m. on clear-sky days. Sample masked thermal and RGB canopy images used for model training are shown in Figure 2.
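The exact canopy segmentation rule behind the masked images in Figure 2 is not specified beyond the morphological pre-processing mentioned in Section 2.3.1, so the sketch below shows one common way to produce such a mask: an HSV green threshold followed by morphological opening and closing. The threshold values are assumptions for illustration.

```python
# Illustrative canopy masking for the RGB frames; threshold values are assumptions.
import cv2
import numpy as np

def mask_canopy(rgb_path, out_path):
    bgr = cv2.imread(rgb_path)
    hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array([30, 40, 40]), np.array([90, 255, 255]))  # rough green range
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (7, 7))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)   # remove soil/background speckle
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)  # fill small gaps in the canopy
    cv2.imwrite(out_path, cv2.bitwise_and(bgr, bgr, mask=mask))
    return mask  # binary mask, reusable for a co-registered thermal frame
```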

2.2.2. Weather and Ground Truth Data

Weather data for the imaging days were acquired from a standard station (Indian Meteorological Department, Pune, India) installed 300 m from the study site. The parameters included pan evaporation (mm/day), rainfall (mm), maximum and minimum air temperature (°C), relative humidity (RH, %), and wind velocity (m/s). The ambient and soil ground truth data of air temperature (Ta), RH, and soil moisture content (SMC) were collected for each treatment plot during the imaging campaigns each year. Ta and RH were recorded using a DHT22 module (Adafruit, New York, NY, USA). SMC was monitored in the root zone depth (0–150 mm, typical of wheat grown at the experimental site; soil type: Vertisols) using a soil moisture meter with a 200 mm sensing probe (ICT, MPM-160-B, Armidale, Australia). The probe was inserted at five locations in each replication, giving 15 soil moisture data points per measurement. The relative water content (RWC) of leaves was calculated as the crop ground truth [33]. For this, 10 mature and fully expanded leaves were collected from each sample plot and their fresh weight was recorded on each sampling date, immediately following the imagery acquisition. Collected samples were then oven-dried at 70 °C, the dry weight was recorded, and RWC was calculated. A total of 30 samples were collected per treatment per campaign, amounting to 150 samples per treatment in each year for the RWC calculations. End-of-season yield was also recorded from 2 × 2 m areas in three plots of each replication, giving nine sample points (36 m2) per treatment to characterize the effects of crop water stress.
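For reference, RWC as defined in [33] follows the standard leaf-hydration formula below; the turgid weight (TW, after full rehydration) is part of that standard definition and is assumed here, since the text above reports only the fresh (FW) and dry (DW) weights.

$$\mathrm{RWC}\,(\%) = \frac{FW - DW}{TW - DW} \times 100$$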

2.2.3. Statistical Analysis

The impacts of irrigation type (flood and sprinkler), irrigation rate (100, 75, 50, and 25% ETc), and their interaction on the crop biophysical parameters were statistically evaluated using a one-way analysis of variance at a 5% level of significance [34].
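A minimal sketch of this treatment-effect test, assuming the per-plot observations are grouped by irrigation rate, is shown below using SciPy's one-way ANOVA.

```python
# Hedged sketch of the one-way ANOVA on a crop response variable (e.g., Tc),
# with observations grouped by irrigation rate (%ETc).
from scipy import stats

def irrigation_rate_effect(obs_100, obs_75, obs_50, obs_25, alpha=0.05):
    """Each argument is a list/array of observations for one %ETc treatment."""
    f_stat, p_value = stats.f_oneway(obs_100, obs_75, obs_50, obs_25)
    return f_stat, p_value, p_value < alpha  # True if significant at the 5% level
```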

2.3. Crop Water Stress Classification

Two different approaches were adopted for crop water stress classification: (1) feature extraction-based DL models (AlexNet, GoogLeNet, Inception V3, MobileNet V2, and ResNet50), and (2) function approximation-based models comprising four ML models (Artificial neural network (ANN), K-nearest neighbors (KNN), Support vector machine (SVM), and Logistic regression (LR)) and a DL model (DL-LSTM). The feature extraction-based models were trained on thermal as well as RGB imagery. The function approximation-based models were trained on ambient weather and soil parameters and on Tc inputs derived from the thermal imagery.
Deep CNNs typically have complex architectures and some may require significant computational resources. All CNN model training and validation were performed on a desktop computer (Intel Core i7 processor with 2.60 GHz base frequency, 16 GB RAM, 6 GB NVIDIA GeForce GTX 1660 Ti GPU) running the Windows 10 (64-bit) operating system. The CNN models were developed in MATLAB 2019b using the Deep Learning and Machine Learning toolboxes. All the models are detailed in the following sub-sections.

2.3.1. Feature Extraction-Based Approaches

Five DL models were selected as feature extraction-based approaches: (1) AlexNet; (2) GoogLeNet; (3) Inception V3; (4) MobileNet V2; and (5) ResNet50. These models were selected for their strong capabilities in automated feature extraction, easy and efficient training on raw images with modest computational resources, and their transferability to edge computation devices [10,17]. The selected models range from simpler architectures (AlexNet and MobileNet V2) to more complex ones (GoogLeNet, Inception V3, and ResNet50) so that their robustness and efficiency for crop water stress prediction could be evaluated. Successful application of these models, with accuracies of up to 100% for crop abiotic and biotic stress classification, has been reported in recent studies [35,36,37]. Standardized architectures were used for performance comparisons (Table 1).
The DL-based classification included the steps of pre-trained model selection, data pre-processing using morphological operators, data splitting, setting the training hyper-parameters, model training, model tuning, cross-validation, evaluation, and model testing (Figure 3). The DL models were developed in MATLAB (version 2019a, MathWorks, Natick, MA, USA) using the AlexNet, GoogLeNet, Inception V3, MobileNet V2, and ResNet50 libraries. The convolutional kernels in AlexNet were extracted using a cost function optimized by the stochastic gradient descent with momentum (Sgdm) algorithm. GoogLeNet, in contrast, processes and classifies images by alternately factorizing convolution and regularization layers. To train the GoogLeNet model, its 'loss3-classifier', 'prob', and output layers were replaced by new fully connected, softmax, and classification output layers connected to the remaining layers. Inception V3 extracted (a) local features of the stressed crop using small convolutions and (b) highly abstracted features using large convolutions. The last three prediction layers in Inception V3 were replaced by three new layers (fully connected, softmax, and classification output), which were interconnected with the average pooling and fully connected layers of the pre-trained network. MobileNet V2 and ResNet50 are more recently proposed DL models for classification problems. MobileNet V2 requires less computational power than a conventional CNN. ResNet50 is a deep residual network that uses shortcut connections to bypass convolutional layers and thereby alleviate the vanishing gradient issue typical of deep CNNs. The residual modules in ResNet50 connect different layers of the CNN to improve model performance [38].
Generalization can be poor for feature extraction-based models when the number of epochs and the batch size exceed their optima [39]. This is because a model can overlearn when trained on a specific dataset with too many epochs or too large batch sizes, and may then lose performance and generalization capability on new datasets. Conversely, too few epochs or too small batch sizes may lead to insufficient learning and underfitting, so the model may not perform as expected on new datasets [40]. Therefore, to maximize model performance and minimize overfitting, optimum hyperparameter tuning is required. In this study, all the selected feature extraction-based models were extensively tuned over the learning rates, solvers, epochs, and batch sizes detailed in Table 2.
The collected thermal and RGB images (1600 each) were labeled into stressed and non-stressed classes by domain experts based on the values of the crop water stress indicators SMC, RWC, and Tc (Table 3). After this, 80% of the labeled dataset (separately for thermal and RGB images) was used for DL model training based on features such as object dimensions, pixel intensity, pixel values (Tc), and edges. The remaining 20% of the labeled dataset was used for model validation and testing.
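The CNNs in this study were trained in MATLAB; as an illustration of the same transfer-learning workflow (pre-trained backbone, replaced classification head, 80/20 split, Adam solver, learning rate 3 × 10−4, 10 epochs, batch size 15), the following is a hedged Keras sketch for ResNet50. The directory layout with stressed/ and non_stressed/ subfolders is an assumption.

```python
# Hedged Keras sketch of ResNet50 transfer learning for stressed/non-stressed
# classification; directory paths and layout are assumptions, not the study's files.
import tensorflow as tf

IMG_SIZE = (224, 224)
train_ds = tf.keras.utils.image_dataset_from_directory(
    "thermal_images/train", image_size=IMG_SIZE, batch_size=15, label_mode="binary")
val_ds = tf.keras.utils.image_dataset_from_directory(
    "thermal_images/val", image_size=IMG_SIZE, batch_size=15, label_mode="binary")

base = tf.keras.applications.ResNet50(weights="imagenet", include_top=False,
                                      input_shape=IMG_SIZE + (3,))
base.trainable = False  # reuse pre-trained convolutional features

inputs = tf.keras.Input(shape=IMG_SIZE + (3,))
x = tf.keras.applications.resnet50.preprocess_input(inputs)
x = base(x, training=False)
x = tf.keras.layers.GlobalAveragePooling2D()(x)
outputs = tf.keras.layers.Dense(1, activation="sigmoid")(x)  # stressed vs non-stressed
model = tf.keras.Model(inputs, outputs)

model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=3e-4),
              loss="binary_crossentropy", metrics=["accuracy"])
model.fit(train_ds, validation_data=val_ds, epochs=10)  # tuned epoch count (Table 2)
```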

2.3.2. Function Approximation-Based Approaches

Four ML models (ANN, KNN, LR, and SVM) and a DL-based LSTM (DL-LSTM) were selected as the function approximation approaches for crop water stress classification. The ML models were selected because, unlike traditional methods, they can analyze numerous features simultaneously. ANN is effective in learning complex nonlinear functions and segmenting data based on the learned weights. Its input layer had four variables to extract features from the 1600 samples, while the output layer had one neuron to calculate the probability of each class [47]. KNN classifies a data point based on its distance from the maximum number of training data points in the neighborhood. Typically, KNN uses Euclidean, Minkowski, Manhattan, or Hamming distances, of which the Minkowski distance has been reported to be more reliable [48] and was therefore selected. LR classifies data points into discrete classes based on probability using a sigmoid (logistic) function [49]. SVM shifts data points to a higher dimension using linear, non-linear, or radial kernels to achieve linear separability [50] and then identifies a hyperplane with the largest possible margin between the data points of the two classes. DL-LSTM uses a chain of repeated modules comprising memory cells with a backpropagation algorithm to solve classification problems. This model addresses premature overfitting and vanishing gradient issues by using the information previously stored in the memory cells, which is then used to generate features during training to predict the output class [51]. The hyperparameters of the ML models were tuned automatically using Bayesian optimization, which minimizes the model loss over hyperparameter combinations and yields the best possible set of parameters; the models were then trained and validated with these tuned hyperparameters. All function approximation models were deployed on crop environment (Ta, RH, and SMC) and canopy temperature (Tc, from thermal images) inputs to classify crops as stressed or non-stressed through binary outputs (0 or 1, Table 3). The models (operating parameters in Table 4) were developed in Python 3.7 with the Keras and TensorFlow libraries.
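A minimal sketch of such a DL-LSTM classifier on the four scalar inputs (Ta, RH, SMC, Tc) is given below in Keras. Treating each observation as a single-timestep sequence and using a batch size of 32 are assumptions, since the exact input arrangement is not stated above; the 40-epoch budget follows the convergence behaviour reported in Section 3.2.2.

```python
# Hedged DL-LSTM sketch for binary stress classification from [Ta, RH, SMC, Tc].
# The one-timestep sequence arrangement is an assumption.
import numpy as np
import tensorflow as tf

def build_lstm_classifier(n_features=4, units=64):
    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(1, n_features)),    # (timesteps, features)
        tf.keras.layers.LSTM(units),
        tf.keras.layers.Dense(1, activation="sigmoid"),  # stressed (1) / non-stressed (0)
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    return model

def train(X, y, epochs=40, batch_size=32):
    """X: (n_samples, 4) array of [Ta, RH, SMC, Tc]; y: binary stress labels."""
    X_seq = np.asarray(X, dtype="float32").reshape((len(X), 1, -1))
    model = build_lstm_classifier(n_features=X_seq.shape[-1])
    model.fit(X_seq, np.asarray(y, dtype="float32"),
              validation_split=0.2, epochs=epochs, batch_size=batch_size)
    return model
```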

2.4. Model Performance Evaluation

The performance of both the feature extraction and function approximation-based models was evaluated through accuracy (A), sensitivity (Se), specificity (Sp), precision (P), and F1 score (Equations (1)–(5)). Accuracy is the rate of correct prediction of non-stressed and stressed crops; precision is the fraction of true positives (TS), i.e., correctly predicted stressed crops, out of all crops predicted as stressed (PS); specificity is the fraction of true negatives (TN), i.e., correctly predicted non-stressed crops, out of the actual non-stressed crops (AN); and sensitivity is the fraction of correctly predicted stressed crops (TS) out of the actual stressed crops (AS). The F1 score is the harmonic mean of precision and sensitivity and evaluates the accuracy of a binary classification problem such as this study, which aims to classify the crops into two classes (stressed and non-stressed). The accuracy estimate is often inflated by true negatives; therefore, the F1 score is preferred over accuracy to balance precision and recall (sensitivity), particularly when the class distribution is uneven (a large number of actual negatives).
$$A = \frac{\mathrm{TS} + \mathrm{TN}}{\mathrm{TT}} \quad (1)$$
$$\mathrm{Se} = \frac{\mathrm{TS}}{\mathrm{AS}} \quad (2)$$
$$\mathrm{Sp} = \frac{\mathrm{TN}}{\mathrm{AN}} \quad (3)$$
$$P = \frac{\mathrm{TS}}{\mathrm{PS}} \quad (4)$$
$$F_1 = \frac{2\,\mathrm{TS}}{\mathrm{AS} + \mathrm{PS}} \quad (5)$$
where TT is the total number of predictions. Stress/non-stress misclassifications were represented by type 1 (TE1) and type 2 (TE2) errors. TE1 is the number of actual stressed crops misclassified as non-stressed (row 1, column 2 of the confusion matrix), while TE2 is the number of actual non-stressed crops misclassified as stressed (row 2, column 1 of the confusion matrix).
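For reference, the sketch below computes Equations (1)–(5) from the confusion-matrix counts defined above, taking the stressed class as positive; the example counts are hypothetical.

```python
# Sketch of the evaluation metrics (Equations (1)-(5)); variable names follow the text.
def stress_metrics(ts, tn, te1, te2):
    """ts: correctly predicted stressed; tn: correctly predicted non-stressed;
    te1: actual stressed predicted as non-stressed; te2: actual non-stressed predicted as stressed."""
    tt = ts + tn + te1 + te2            # total predictions (TT)
    as_count = ts + te1                 # actual stressed (AS)
    an_count = tn + te2                 # actual non-stressed (AN)
    ps_count = ts + te2                 # predicted stressed (PS)
    return {
        "A":  (ts + tn) / tt,                   # Eq. (1) accuracy
        "Se": ts / as_count,                    # Eq. (2) sensitivity
        "Sp": tn / an_count,                    # Eq. (3) specificity
        "P":  ts / ps_count,                    # Eq. (4) precision
        "F1": 2 * ts / (as_count + ps_count),   # Eq. (5) F1 score
    }

print(stress_metrics(ts=155, tn=152, te1=5, te2=8))  # hypothetical validation counts
```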

3. Results

3.1. Plant Water Stress Indicators

The thermal imagery-derived canopy temperatures (Tc, °C) under sprinkler irrigation at the 100, 75, 50, and 25% ETc irrigation levels were 22.1 (2.0) (mean, standard deviation (SD)), 25.6 (1.6), 26.4 (2.2), and 27.9 (3.0), respectively (Figure 4), while the Tc values for flood irrigation at the same irrigation levels were 23.2 (2.0), 25.9 (1.5), 26.8 (2.4), and 28.1 (3.1), respectively. Similarly, the mean RWC values (%) at the selected sprinkler irrigation rates were 90.4 (2.7), 87.7 (4.2), 75.8 (9.4), and 74.2 (8.2), while those at the corresponding flood irrigation levels were 89.8 (2.7), 87.2 (4.3), 75.0 (9.4), and 73.9 (8.3), respectively. The mean SMCs (%) for the respective sprinkler irrigation rates were 26.6 (2.3), 26.2 (2.7), 22.5 (3.3), and 21.1 (3.0), while those for the respective flood irrigation rates were 24.9 (2.0), 24.4 (2.5), 21.4 (3.0), and 20.4 (3.1). When analyzed statistically, Tc, RWC, and SMC were significantly affected by the irrigation method (flood and sprinkler), the irrigation rate (100, 75, 50, and 25% ETc), and their interaction (one-way ANOVA, p < 0.001). The RWC and SMC decreased with decreasing irrigation level, while Tc increased. Based on the categories detailed in Table 3, the mean Tc was 26.6 °C (±2.6) for the stressed crop and 21.2 °C (±1.4) for the non-stressed crop. The mean RWC was 92.2% (±1.5) for the non-stressed crop and 78.9% (±9.2) for the stressed crop, while the mean SMC was 27.1% (±1.6) for the non-stressed crop and 21.2% (±2.4) for the stressed crop.

3.2. Water Stress Prediction

3.2.1. Feature Extraction-Based Approaches

The performances of the AlexNet, GoogLeNet, Inception V3, MobileNet V2, and ResNet50 models with RGB and thermal imagery were tested for different combinations of epochs and batch sizes (Table 5). The model training accuracies increased as the number of epochs increased from 5 to 10, and over-fitting was observed for all models when the epochs increased to 20. Across the different epochs, accuracy increased as the batch size increased from 5 to 20. For a batch size of 20 and 250 iterations, overfitting was observed in Inception V3 and ResNet50 with RGB imagery inputs and in AlexNet, GoogLeNet, and ResNet50 with thermal imagery inputs (Table 5). Extensive hyperparameter tuning was performed with the parameters listed in Table 2 to minimize overfitting and maximize the model accuracies. Post-tuning, the maximum training accuracies with RGB imagery inputs were 94.6%, 96.7%, and 95.6% for AlexNet, GoogLeNet, and MobileNet V2, respectively, at 10 epochs and a batch size of 20 (Figure 5), while Inception V3 and ResNet50 with RGB imagery inputs converged at 10 epochs, a batch size of 15, and 300 iterations, with respective accuracies of 92.7% and 97.1% (Figure 5c,e). For the thermal imagery inputs, the optimum hyperparameters were 10 epochs and a batch size of 15, which yielded maximum accuracies of 96.4%, 97.2%, and 98.5% for AlexNet, GoogLeNet, and ResNet50, respectively. Furthermore, 10 epochs and a batch size of 20 were found optimum for the Inception V3 and MobileNet V2 models, with maximum accuracies of 98.0% and 95.3%, respectively (Figure 6). During hyperparameter tuning, model overfitting reduced significantly at 10 epochs without sacrificing accuracy. The training accuracies fell below 50% for learning rates of 1 × 10−4 and 4 × 10−4 and exceeded 50% for a learning rate of 3 × 10−4. Moreover, model overfitting was reduced when the solver was shifted from Sgdm to Adam. All models converged with training accuracies above 90% at a learning rate of 3 × 10−4 with Adam as the solver (Figure 5 and Figure 6).
The training times for AlexNet, GoogLeNet, Inception V3, MobileNet V2, and ResNet50 were 76, 92, 609, 149, and 217 min with RGB imagery inputs, and 42, 88, 287, 134, and 168 min with thermal imagery inputs, respectively, while classification of an independent image into the stressed/non-stressed class using the trained models took less than 5 s. The overall validation accuracies (combined for the stressed and non-stressed classes) of the AlexNet, GoogLeNet, Inception V3, MobileNet V2, and ResNet50 models were 93.4%, 95.9%, 92.5%, 94.4%, and 96.9%, respectively, with RGB imagery inputs (Figure 7). The highest precision (100%) and F1 score (96.6%) were observed for GoogLeNet and ResNet50, respectively, while the maximum sensitivity was achieved by MobileNet V2 (Table 6). With thermal imagery inputs, the overall validation accuracies (combined for the stressed and non-stressed classes) of the AlexNet, GoogLeNet, Inception V3, MobileNet V2, and ResNet50 models were 96.2%, 96.9%, 97.5%, 94.7%, and 98.4%, respectively. As with RGB imagery, ResNet50 with thermal imagery had the highest precision (96.7%), sensitivity (100%), and F1 score (98.3%) (Table 6). Additionally, the accuracies were higher for the models with thermal imagery inputs than for those with RGB imagery inputs. The individual accuracies and errors for all the feature extraction-based models on the validation datasets are shown in Figure 7. The mean errors were higher for the RGB imagery than for the thermal imagery, irrespective of the selected model.

3.2.2. Function Approximation-Based Approaches

Among the function approximation approaches, the highest prediction accuracy was obtained with the DL-LSTM model (96.7%), followed by the ANN (93.5%), SVM (91.4%), LR (89.2%), and KNN (88.1%) models (Table 6). Moreover, the precision, sensitivity, and F1 score were also highest for the DL-LSTM (96.0, 97.9, and 97.0%, respectively) compared to the other ML models. The training and validation accuracies of the DL-LSTM showed early convergence, with the loss on the validation dataset reaching its minimum at 40 epochs (Figure 8). The TE1 values for the ANN, KNN, LR, SVM, and DL-LSTM were 3.2, 4.3, 2.2, 2.2, and 2.1%, respectively, and the TE2 values were 3.2, 7.5, 8.6, 6.5, and 1.1%, respectively (Figure 9). The DL-LSTM outperformed the ML models with the lowest mean error (Figure 9).

4. Discussion

Sprinkler irrigation applies a predetermined quantity of water and wets the entire canopy, unlike traditional flood irrigation. This cools the microclimate and increases relative air humidity, thereby reducing the microclimate's water demand [52]. This could be the reason for the lower Tc in all the sprinkler irrigation treatments compared to the corresponding flood irrigation treatments (Figure 4). The lowered microclimate water demand could also have resulted in lower soil moisture depletion from the root zone and, therefore, higher SMC under sprinkler irrigation [53]. In addition to sufficient SMC, sprinkler irrigation results in lower deep percolation and nutrient leaching than conventional flood irrigation [54,55,56]. This could explain the higher average yield of the sprinkler irrigation treatment plots (5719 kg/ha) compared to the flood irrigation treatment plots (4898 kg/ha). With the projected future climate change impacts in the form of lower rainfall frequencies and higher ambient temperatures, crop water stress is expected to intensify, further increasing yield penalties [3]. Therefore, stress-tolerant crop cultivars need to be developed and planted to maintain yield goals. As also reported in our prior work based on canopy reflectance [57], water stress started to occur before the CRI stage under both irrigation methods. RWC was lower at the late jointing and flowering stages in the case of flood irrigation. Water stress at the flowering stage can result in significant yield and biomass reductions [58], suggesting that the stress response is also influenced by the phenological growth stage. For a robust analysis of this aspect, large datasets are being collected at each phenological growth stage. Water stress lowers CO2 availability due to stomatal closure, thereby affecting photosynthesis and ultimately growth, yield, and biomass [59,60].
CNNs have been increasingly used for plant phenotyping applications over the past decade because of their capability to model complicated plant processes by distinguishing and extracting regularized data patterns [61,62]. For this reason, the CNN models were highly accurate in predicting stressed and non-stressed crops using thermal and RGB imagery. Chlorophyll is vital for photosynthesis, while carotenoids are critical non-enzymatic antioxidants. Water stress reduces chlorophyll and carotenoid contents, as well as the ratio of chlorophyll 'a' to 'b', leading to leaf coloration changes. This is why RGB images also yielded satisfactory accuracy of up to 94.6% by tracing leaf color changes [63]. Compared to RGB imagery, thermal imagery is a more detailed indicator of crop stress, which proportionally alters the canopy emissivity patterns [64,65]. The canopy temperature is affected by the microclimate conditions and the available soil moisture [53]. This explains the relatively lower accuracy of water stress detection with RGB images (94.6%) than with thermal images (96.7%), irrespective of the selected DL model (Table 5). A similar observation was reported in a prior study [64], where higher accuracy was obtained with thermal imagery (89%) than with RGB imagery (82%) for wheat ear counting using DCNN models. Since thermal imaging is often affected by wind or RH, data quality is critical when training DL models, especially when the imagery is acquired from aerial platforms [30,64]. Therefore, to maintain thermal data quality, imaging campaigns were launched only when wind velocities were below 5 km/h. ResNet50 had the highest accuracy among the feature extraction models. Although it is a basic architecture relative to GoogLeNet and Inception V3, its performance is strongly influenced by the quality of input imagery and the size and robustness of the dataset, especially in agricultural environments. ResNet50 addresses the network degradation problem by introducing identity mapping, which mitigates the vanishing gradient and poor convergence effects in deeper networks [66,67]. This feature contributed to the enhanced performance of ResNet50 compared to the other models, suggesting its suitability for various crop biotic and abiotic stress characterizations in agricultural applications. CNN models have also been applied to thermal imagery for water stress classification in maize under well-irrigated, moderately irrigated, and water-stressed treatments, obtaining an overall accuracy of 89% [68]. Color and grey images of maize have also been used as inputs to a DCNN model for water stress identification, where the stress identification and classification accuracies were 98% and 95%, respectively [26]. An Inception-ResNet V2 framework utilized for water stress identification in sugarcane yielded an accuracy of 83% with available soil water capacity as input [65]. Thus far, most computer vision models have utilized single-dimensional data inputs, unlike this study, which advances water stress identification in wheat using multidimensional data inputs. Multidimensional data modeling enhances the robustness and applicability of the developed approaches across various agroclimatic conditions.
Crop growth and its water stress response are not necessarily linear with respect to the weather or soil conditions [69]. Therefore, linear (LR) and non-linear function approximation approaches (ANN, KNN, SVM, and DL-LSTM) were evaluated to predict the stress class of the crop. ANN and SVM had better stress prediction accuracy (Table 6) than KNN and LR, possibly for two reasons: (1) KNN and LR use either locally linear segments or a generalized linear approach to make predictions [66,69], and (2) KNN and LR have limited capacity to capture complex feature interactions compared to ANN and SVM, which learn more flexible decision boundaries [18]. ANN and SVM had comparable accuracies for crop stress prediction; however, SVM is better suited to small datasets, while ANN can process relatively larger datasets and would therefore offer more confidence in the predicted classes.
Crop phenotyping with traditional function approximation approaches (ML models) is often subjective compared to the advanced DL-LSTM approach, as the former require manual selection of features such as Tc, Ta, RH, and SMC. This restricts the robustness and accuracy of the ML models. The DL-LSTM therefore outperformed the traditional ML models (Figure 9) owing to its automated and stabilized feature selection [12,70], which was supported by its minimum model loss compared to the other function approximation-based ML models. The DL-LSTM not only integrates the thermal imagery-derived Tc input but also the auxiliary soil and weather data inputs of the function approximation models. This eventually led to its superior performance over the other ML models evaluated in this study, as well as in prior studies on crop stress and yield phenotyping [51] and irrigation forecasting [22]. Nonetheless, GoogLeNet, Inception V3, and ResNet50 provided comparable or higher stress prediction accuracy than the DL-LSTM model (Table 6). Stress/non-stress misclassification could be minimized through improved data sampling, larger training datasets, further hyper-parameter optimization, or by merging different ML and DL models for the crop's thermal emissivity and environmental data inputs. Between the two classes of approaches, the feature extraction-based models outperformed all the function approximation-based models for water stress classification.
The CNN models evaluated in this study can be adopted for water stress identification in other wheat cultivars, while for other crops and their cultivars, sufficient data acquisition, model training, and validation would be required. Along similar lines, gathering sufficient data at the different crop phenological stages will enable growth stage-wise accuracy evaluation of the ML and DL models in future studies. The developed algorithms required less than 5 s to classify an independent image into the stressed/non-stressed classes, which is critical from a real-time stress diagnosis and management perspective. The trained algorithms are therefore transferable to handheld or edge devices for real-time stress detection by breeders, researchers, farmers, and students, among others. For commercial adoption of the developed and tested approaches, an initial capital investment would be required, after which high returns may be expected through improvements in crop stress mitigation and management at reduced costs [11].

5. Conclusions

The canopy temperature, relative water content, soil moisture content, and grain yield of the wheat crop were significantly affected by the irrigation type and rate. Lower Tc and higher RWC, SMC, and yield were observed for irrigation at 100% of ETc compared to the deficit irrigation treatments (75, 50, and 25% of ETc). Moreover, a comparable or higher yield was observed for sprinkler irrigation compared to conventional flood irrigation, while achieving about 20% water savings.
Thermal images resulted in higher crop water stress classification accuracy (94.7–98.4%) compared to RGB imagery (92.5–96.9%). Moreover, the DL models (including DL-LSTM) performed better than the ML models for stressed and non-stressed crop classification. Among the function approximation-based approaches, DL-LSTM had the highest accuracy (96.7%). Among the feature extraction-based methods, ResNet50 had the highest accuracy of 96.9% and 98.4% with RGB and thermal imagery inputs, respectively.
Overall, DL models with thermal imagery inputs could be highly efficient for crop water stress phenotyping. As a future scope, feature extraction-based DL models could be implemented on edge-computing devices for real-time water stress monitoring and actuation of irrigation systems through the internet of things.

Author Contributions

Conceptualization, Y.A.R. and N.S.C.; methodology, N.S.C. and M.K.T.; software, K.D. and A.S.; validation, A.S., K.D., N.S.C. and A.K.C.; resources, Y.A.R.; data curation, K.D., M.K.T. and A.K.C.; writing—original draft preparation, N.S.C., Y.A.R., and A.K.C.; writing—review and editing, A.K.C. and M.K.T. All authors have read and agreed to the published version of the manuscript.

Funding

This research was supported by the Indian Council of Agricultural Research-Central Institute of Agricultural Engineering Bhopal, India, project# 824.

Data Availability Statement

Data will be made available on personalized requests due to restrictions from the parent organization.

Acknowledgments

The authors would like to thank C.R. Mehta and P.S. Tiwari from ICAR-CIAE Bhopal, for their technical support in the conduct of this study.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

Abbreviation: Expanded Form
Adam: Adaptive Moment Estimation
AN: Actual Non-Stressed Crop
ANN: Artificial Neural Network
ANOVA: Analysis of Variance
AS: Actual Stressed Crop
CIAE: Central Institute of Agricultural Engineering
CNN: Convolutional Neural Network
DAS: Days After Sowing
DCNN: Deep Convolutional Neural Network
DL: Deep Learning
DL-LSTM: Deep Learning-Long Short-Term Memory
ETc: Crop Evapotranspiration
F1: F1 Score
ICAR: Indian Council of Agricultural Research
KNN: K-Nearest Neighbors
LR: Logistic Regression
LSTM: Long Short-Term Memory
MAE: Mean Absolute Error
ML: Machine Learning
P: Precision
PS: Total Crops Predicted as Stressed
RBF: Radial Basis Function
RF: Random Forest
RGB: Red Green Blue
RH: Relative Humidity
RWC: Relative Water Content
SD: Standard Deviation
Se: Sensitivity
Sgdm: Stochastic Gradient Descent with Momentum
SMC: Soil Moisture Content
Sp: Specificity
SVM: Support Vector Machine
Ta: Air Temperature
Tc: Canopy Temperature
TE1: Type 1 Error
TE2: Type 2 Error
TN: True Negative
TS: True Positive
TT: Total Number of Predictions
UAS: Unmanned Aerial System
UASUnmanned Aerial System

References

  1. Mega, R.; Abe, F.; Kim, J.-S.; Tsuboi, Y.; Tanaka, K.; Kobayashi, H.; Sakata, Y.; Hanada, K.; Tsujimoto, H.; Kikuchi, J. Tuning Water-Use Efficiency and Drought Tolerance in Wheat Using Abscisic Acid Receptors. Nat. Plants 2019, 5, 153–159. [Google Scholar] [CrossRef]
  2. Ihuoma, S.O.; Madramootoo, C.A. Recent Advances in Crop Water Stress Detection. Comput. Electron. Agric. 2017, 141, 267–275. [Google Scholar] [CrossRef]
  3. Seifikalhor, M.; Niknam, V.; Aliniaeifard, S.; Didaran, F.; Tsaniklidis, G.; Fanourakis, D.; Teymoorzadeh, M.; Mousavi, S.H.; Bosacchi, M.; Li, T. The Regulatory Role of γ-Aminobutyric Acid in Chickpea Plants Depends on Drought Tolerance and Water Scarcity Level. Sci. Rep. 2022, 12, 7034. [Google Scholar] [CrossRef]
  4. Oletic, D.; Bilas, V. How Thirsty the Crops Are: Emerging Instrumentation for Plant-Based Field Measurement of Water Stress. IEEE Instrum. Meas. Mag. 2020, 23, 37–46. [Google Scholar] [CrossRef]
  5. Zhang, L.; Niu, Y.; Zhang, H.; Han, W.; Li, G.; Tang, J.; Peng, X. Maize Canopy Temperature Extracted from UAV Thermal and RGB Imagery and Its Application in Water Stress Monitoring. Front. Plant Sci. 2019, 10, 1270. [Google Scholar] [CrossRef]
  6. Agam, N.; Cohen, Y.; Berni, J.A.J.; Alchanatis, V.; Kool, D.; Dag, A.; Yermiyahu, U.; Ben-Gal, A. An Insight to the Performance of Crop Water Stress Index for Olive Trees. Agric. Water Manag. 2013, 118, 79–86. [Google Scholar] [CrossRef]
  7. Elsayed, S.; Elhoweity, M.; Ibrahim, H.H.; Dewir, Y.H.; Migdadi, H.M.; Schmidhalter, U. Thermal Imaging and Passive Reflectance Sensing to Estimate the Water Status and Grain Yield of Wheat under Different Irrigation Regimes. Agric. Water Manag. 2017, 189, 98–110. [Google Scholar] [CrossRef]
  8. Chandel, A.K.; Khot, L.R.; Osroosh, Y.; Peters, T.R. Thermal-RGB Imager Derived in-Field Apple Surface Temperature Estimates for Sunburn Management. Agric. For. Meteorol. 2018, 253, 132–140. [Google Scholar] [CrossRef]
  9. Chandel, A.K.; Khot, L.R.; Molaei, B.; Peters, R.T.; Stöckle, C.O.; Jacoby, P.W. High-Resolution Spatiotemporal Water Use Mapping of Surface and Direct-Root-Zone Drip-Irrigated Grapevines Using Uas-Based Thermal and Multispectral Remote Sensing. Remote Sens. 2021, 13, 954. [Google Scholar] [CrossRef]
  10. Chandel, N.S.; Chakraborty, S.K.; Rajwade, Y.A.; Dubey, K.; Tiwari, M.K.; Jat, D. Identifying Crop Water Stress Using Deep Learning Models. Neural Comput. Appl. 2021, 33, 5353–5367. [Google Scholar] [CrossRef]
  11. Taheri-Garavand, A.; Mumivand, H.; Fanourakis, D.; Fatahi, S.; Taghipour, S. An Artificial Neural Network Approach for Non-Invasive Estimation of Essential Oil Content and Composition through Considering Drying Processing Factors: A Case Study in Mentha Aquatica. Ind. Crops Prod. 2021, 171, 113985. [Google Scholar] [CrossRef]
  12. Singh, A.K.; Ganapathysubramanian, B.; Sarkar, S.; Singh, A. Deep Learning for Plant Stress Phenotyping: Trends and Future Perspectives. Trends Plant Sci. 2018, 23, 883–898. [Google Scholar] [CrossRef] [Green Version]
  13. Goldstein, A.; Fink, L.; Meitin, A.; Bohadana, S.; Lutenberg, O.; Ravid, G. Applying Machine Learning on Sensor Data for Irrigation Recommendations: Revealing the Agronomist’s Tacit Knowledge. Precis. Agric. 2018, 19, 421–444. [Google Scholar] [CrossRef]
  14. Petrie, P.R.; Wang, Y.; Liu, S.; Lam, S.; Whitty, M.A.; Skewes, M.A. The Accuracy and Utility of a Low Cost Thermal Camera and Smartphone-Based System to Assess Grapevine Water Status. Biosyst. Eng. 2019, 179, 126–139. [Google Scholar] [CrossRef]
  15. Subeesh, A.; Bhole, S.; Singh, K.; Chandel, N.S.; Rajwade, Y.A.; Rao, K.V.R.; Kumar, S.P.; Jat, D. Deep Convolutional Neural Network Models for Weed Detection in Polyhouse Grown Bell Peppers. Artif. Intell. Agric. 2022, 6, 47–54. [Google Scholar] [CrossRef]
  16. Gutiérrez, S.; Diago, M.P.; Fernández-Novales, J.; Tardaguila, J. Vineyard Water Status Assessment Using On-the-Go Thermal Imaging and Machine Learning. PLoS ONE 2018, 13, e0192037. [Google Scholar] [CrossRef]
  17. Ghosal, S.; Blystone, D.; Singh, A.K.; Ganapathysubramanian, B.; Singh, A.; Sarkar, S. An Explainable Deep Machine Vision Framework for Plant Stress Phenotyping. Proc. Natl. Acad. Sci. USA 2018, 115, 4613–4618. [Google Scholar] [CrossRef] [Green Version]
  18. Schmidhuber, J. Deep Learning in Neural Networks: An Overview. Neural Netw. 2015, 61, 85–117. [Google Scholar] [CrossRef] [Green Version]
  19. Chakraborty, S.K.; Chandel, N.S.; Jat, D.; Tiwari, M.K.; Rajwade, Y.A.; Subeesh, A. Deep Learning Approaches and Interventions for Futuristic Engineering in Agriculture. Neural Comput. Appl. 2022. [Google Scholar] [CrossRef]
  20. Yalcin, H. Plant Phenology Recognition Using Deep Learning: Deep-Pheno. In Proceedings of the 2017 6th International Conference on Agro-Geoinformatics, Fairfax VA, USA, 7–10 August 2017; pp. 1–5. [Google Scholar]
  21. Haider, S.A.; Naqvi, S.R.; Akram, T.; Umar, G.A.; Shahzad, A.; Sial, M.R.; Khaliq, S.; Kamran, M. LSTM Neural Network Based Forecasting Model for Wheat Production in Pakistan. Agronomy 2019, 9, 72. [Google Scholar] [CrossRef]
  22. Mouatadid, S.; Adamowski, J.F.; Tiwari, M.K.; Quilty, J.M. Coupling the Maximum Overlap Discrete Wavelet Transform and Long Short-Term Memory Networks for Irrigation Flow Forecasting. Agric. Water Manag. 2019, 219, 72–85. [Google Scholar] [CrossRef]
  23. Yoo, T.-W.; Oh, I.-S. Time Series Forecasting of Agricultural Products’ Sales Volumes Based on Seasonal Long Short-Term Memory. Appl. Sci. 2020, 10, 8169. [Google Scholar] [CrossRef]
  24. Arif, S.; Kumar, R.; Abbasi, S.; Mohammadani, K.; Dev, K. Weeds Detection and Classification Using Convolutional Long-Short-Term Memory; Research Square: Durham, NC, USA, 2021. [Google Scholar]
  25. Zhuang, S.; Wang, P.; Jiang, B.; Li, M.; Gong, Z. Early Detection of Water Stress in Maize Based on Digital Images. Comput. Electron. Agric. 2017, 140, 461–468. [Google Scholar] [CrossRef]
  26. An, J.; Li, W.; Li, M.; Cui, S.; Yue, H. Identification and Classification of Maize Drought Stress Using Deep Convolutional Neural Network. Symmetry 2019, 11, 256. [Google Scholar] [CrossRef] [Green Version]
  27. Niu, Y.; Zhang, H.; Han, W.; Zhang, L.; Chen, H. A Fixed-Threshold Method for Estimating Fractional Vegetation Cover of Maize under Different Levels of Water Stress. Remote Sens. 2021, 13, 1009. [Google Scholar] [CrossRef]
  28. Biju, S.; Fuentes, S.; Gupta, D. The Use of Infrared Thermal Imaging as a Non-Destructive Screening Tool for Identifying Drought-Tolerant Lentil Genotypes. Plant Physiol. Biochem. 2018, 127, 11–24. [Google Scholar] [CrossRef]
  29. Chandel, A.K.; Khot, L.R.; Yu, L.-X. Alfalfa (Medicago Sativa L.) Crop Vigor and Yield Characterization Using High-Resolution Aerial Multispectral and Thermal Infrared Imaging Technique. Comput. Electron. Agric. 2021, 182, 105999. [Google Scholar] [CrossRef]
  30. Prashar, A.; Jones, H.G. Infra-Red Thermography as a High-Throughput Tool for Field Phenotyping. Agronomy 2014, 4, 397–417. [Google Scholar] [CrossRef]
  31. Allen, R.G.; Pereira, L.S.; Raes, D.; Smith, M. Crop Evapotranspiration-Guidelines for Computing Crop Water Requirements-FAO Irrigation and Drainage Paper 56. FAO: Rome, Italy, 1998; Volume 300, p. D05109. [Google Scholar]
  32. Google Experimental Layout of Winter Wheat Crop at Different Rates Using Sprinkler and Flood Irrigation. 2022. Available online: https://www.google.com/maps/ (accessed on 8 October 2022).
  33. Panigrahi, N.; Das, B.S. Canopy Spectral Reflectance as a Predictor of Soil Water Potential in Rice. Water Resour. Res. 2018, 54, 2544–2560. [Google Scholar] [CrossRef]
  34. Gomez, K.A.; Gomez, A.A. Statistical Procedures for Agricultural Research; John Wiley & Sons: Hoboken, NJ, USA, 1984; ISBN 978-0-471-87092-0. [Google Scholar]
  35. Türkoğlu, M.; Hanbay, D. Plant Disease and Pest Detection Using Deep Learning-Based Features. Turk. J. Electr. Eng. Comput. Sci. 2019, 27, 1636–1651. [Google Scholar] [CrossRef]
  36. Hendrawan, Y.; Damayanti, R.; Al Riza, D.F.; Hermanto, M.B. Classification of Water Stress in Cultured Sunagoke Moss Using Deep Learning. TELKOMNIKA (Telecommun. Comput. Electron. Control) 2021, 19, 1594–1604. [Google Scholar] [CrossRef]
  37. Esgario, J.G.; Krohling, R.A.; Ventura, J.A. Deep Learning for Classification and Severity Estimation of Coffee Leaf Biotic Stress. Comput. Electron. Agric. 2020, 169, 105162. [Google Scholar] [CrossRef] [Green Version]
  38. Fulton, L.V.; Dolezel, D.; Harrop, J.; Yan, Y.; Fulton, C.P. Classification of Alzheimer’s Disease with and without Imagery Using Gradient Boosted Machines and ResNet-50. Brain Sci. 2019, 9, 212. [Google Scholar] [CrossRef] [Green Version]
  39. Turkoglu, M.; Hanbay, D.; Sengur, A. Multi-Model LSTM-Based Convolutional Neural Networks for Detection of Apple Diseases and Pests. J Ambient Intell Hum. Comput 2022, 13, 3335–3345. [Google Scholar] [CrossRef]
  40. Kandel, I.; Castelli, M. The Effect of Batch Size on the Generalizability of the Convolutional Neural Networks on a Histopathology Dataset. ICT Express 2020, 6, 312–315. [Google Scholar] [CrossRef]
  41. Blum, A.; Shpiler, L.; Golan, G.; Mayer, J. Yield Stability and Canopy Temperature of Wheat Genotypes under Drought-Stress. Field Crops Res. 1989, 22, 289–296. [Google Scholar] [CrossRef]
  42. Rashid, A.; Stark, J.C.; Tanveer, A.; Mustafa, T. Use of Canopy Temperature Measurements as a Screening Tool for Drought Tolerance in Spring Wheat. J. Agron. Crop Sci. 1999, 182, 231–238. [Google Scholar] [CrossRef]
  43. DeJonge, K.C.; Taghvaeian, S.; Trout, T.J.; Comas, L.H. Comparison of Canopy Temperature-Based Water Stress Indices for Maize. Agric. Water Manag. 2015, 156, 51–62. [Google Scholar] [CrossRef]
  44. Olsovska, K.; Kovar, M.; Brestic, M.; Zivcak, M.; Slamka, P.; Shao, H.B. Genotypically Identifying Wheat Mesophyll Conductance Regulation under Progressive Drought Stress. Front. Plant Sci. 2016, 7, 1111. [Google Scholar] [CrossRef] [Green Version]
  45. Laxa, M.; Liebthal, M.; Telman, W.; Chibani, K.; Dietz, K.-J. The Role of the Plant Antioxidant System in Drought Tolerance. Antioxidants 2019, 8, 94. [Google Scholar] [CrossRef]
  46. Wang, X.; Vignjevic, M.; Jiang, D.; Jacobsen, S.; Wollenweber, B. Improved Tolerance to Drought Stress after Anthesis Due to Priming before Anthesis in Wheat (Triticum Aestivum L.) Var. Vinjett. J. Exp. Bot. 2014, 65, 6441–6456. [Google Scholar] [CrossRef] [Green Version]
  47. Kukanov, I.; Hautamäki, V.; Lee, K.A. Recurrent Neural Network and Maximal Figure of Merit for Acoustic Event Detection. In Proceedings of the Proceedings of the Workshop on Detection and Classification of Acoustic Scenes and Events, Munich, Germany, 16–17 November 2017. [Google Scholar]
  48. Castro-Zunti, R.; Park, E.H.; Choi, Y.; Jin, G.Y.; Ko, S. Early Detection of Ankylosing Spondylitis Using Texture Features and Statistical Machine Learning, and Deep Learning, with Some Patient Age Analysis. Comput. Med. Imaging Graph. 2020, 82, 101718.
  49. Fan, Y.; Bai, J.; Lei, X.; Zhang, Y.; Zhang, B.; Li, K.-C.; Tan, G. Privacy Preserving Based Logistic Regression on Big Data. J. Netw. Comput. Appl. 2020, 171, 102769.
  50. Rehman, T.U.; Mahmud, M.S.; Chang, Y.K.; Jin, J.; Shin, J. Current and Future Applications of Statistical Machine Learning Algorithms for Agricultural Machine Vision Systems. Comput. Electron. Agric. 2019, 156, 585–605.
  51. Zhang, J.; Zhu, Y.; Zhang, X.; Ye, M.; Yang, J. Developing a Long Short-Term Memory (LSTM) Based Model for Predicting Water Table Depth in Agricultural Areas. J. Hydrol. 2018, 561, 918–929.
  52. Fanourakis, D.; Aliniaeifard, S.; Sellin, A.; Giday, H.; Körner, O.; Nejad, A.R.; Delis, C.; Bouranis, D.; Koubouris, G.; Kambourakis, E. Stomatal Behavior Following Mid- or Long-Term Exposure to High Relative Air Humidity: A Review. Plant Physiol. Biochem. 2020, 153, 92–105.
  53. Zhang, W.-Z.; Han, Y.-D.; Du, H.-J. Relationship between Canopy Temperature at Flowering Stage and Soil Water Content, Yield Components in Rice. Rice Sci. 2007, 14, 67–70.
  54. Cavero, J.; Medina, E.T.; Puig, M.; Martínez-Cob, A. Sprinkler Irrigation Changes Maize Canopy Microclimate and Crop Water Status, Transpiration, and Temperature. Agron. J. 2009, 101, 854–864.
  55. Home, P.G.; Panda, R.K.; Kar, S. Effect of Method and Scheduling of Irrigation on Water and Nitrogen Use Efficiencies of Okra (Abelmoschus esculentus). Agric. Water Manag. 2002, 55, 159–170.
  56. Wang, P.; Song, X.; Han, D.; Zhang, Y.; Zhang, B. Determination of Evaporation, Transpiration and Deep Percolation of Summer Corn and Winter Wheat after Irrigation. Agric. Water Manag. 2012, 105, 32–37.
  57. Chandel, N.S.; Rajwade, Y.A.; Golhani, K.; Tiwari, P.S.; Dubey, K.; Jat, D. Canopy Spectral Reflectance for Crop Water Stress Assessment in Wheat (Triticum aestivum L.). Irrig. Drain. 2021, 70, 321–331.
  58. Gupta, N.K.; Gupta, S.; Kumar, A. Effect of Water Stress on Physiological Attributes and Their Relationship with Growth and Yield of Wheat Cultivars at Different Stages. J. Agron. Crop Sci. 2001, 186, 55–62.
  59. Yousefzadeh, K.; Houshmand, S.; Shiran, B.; Mousavi-Fard, S.; Zeinali, H.; Nikoloudakis, N.; Gheisari, M.M.; Fanourakis, D. Joint Effects of Developmental Stage and Water Deficit on Essential Oil Traits (Content, Yield, Composition) and Related Gene Expression: A Case Study in Two Thymus Species. Agronomy 2022, 12, 1008.
  60. Osakabe, Y.; Osakabe, K.; Shinozaki, K.; Tran, L.-S. Response of Plants to Water Stress. Front. Plant Sci. 2014, 5, 86.
  61. Nasiri, A.; Taheri-Garavand, A.; Fanourakis, D.; Zhang, Y.-D.; Nikoloudakis, N. Automated Grapevine Cultivar Identification via Leaf Imaging and Deep Convolutional Neural Networks: A Proof-of-Concept Study Employing Primary Iranian Varieties. Plants 2021, 10, 1628.
  62. Taheri-Garavand, A.; Rezaei Nejad, A.; Fanourakis, D.; Fatahi, S.; Ahmadi Majd, M. Employment of Artificial Neural Networks for Non-Invasive Estimation of Leaf Water Status Using Color Features: A Case Study in Spathiphyllum wallisii. Acta Physiol. Plant. 2021, 43, 78.
  63. Zomorrodi, N.; Rezaei Nejad, A.; Mousavi-Fard, S.; Feizi, H.; Tsaniklidis, G.; Fanourakis, D. Potency of Titanium Dioxide Nanoparticles, Sodium Hydrogen Sulfide and Salicylic Acid in Ameliorating the Depressive Effects of Water Deficit on Periwinkle Ornamental Quality. Horticulturae 2022, 8, 675.
  64. Grbovic, Z.; Panic, M.; Marko, O.; Brdar, S.; Crnojevic, V. Wheat Ear Detection in RGB and Thermal Images Using Deep Neural Networks. Environments 2019, 11, 13.
  65. de Melo, L.L.; de Melo, V.G.M.L.; Marques, P.A.A.; Frizzone, J.A.; Coelho, R.D.; Romero, R.A.F.; Barros, T.H. da S. Deep Learning for Identification of Water Deficits in Sugarcane Based on Thermal Images. Agric. Water Manag. 2022, 272, 107820.
  66. Mhapsekar, M.; Mhapsekar, P.; Mhatre, A.; Sawant, V. Implementation of Residual Network (ResNet) for Devanagari Handwritten Character Recognition. In Advanced Computing Technologies and Applications; Springer: Berlin/Heidelberg, Germany, 2020; pp. 137–148.
  67. Wang, F.; Qiu, J.; Wang, Z.; Li, W. Intelligent Recognition of Surface Defects of Parts by Resnet. J. Phys. Conf. Ser. 2021, 1883, 012178.
  68. Zhuang, S.; Wang, P.; Jiang, B.; Li, M. Learned Features of Leaf Phenotype to Monitor Maize Water Status in the Fields. Comput. Electron. Agric. 2020, 172, 105347.
  69. Archontoulis, S.V.; Miguez, F.E. Nonlinear Regression Models and Applications in Agricultural Research. Agron. J. 2015, 107, 786–798.
  70. Wijaya, D.R.; Sarno, R.; Zulaika, E. DWTLSTM for Electronic Nose Signal Processing in Beef Quality Monitoring. Sens. Actuators B Chem. 2021, 326, 128931.
Figure 1. Experimental layout of the winter wheat crop irrigated at different rates using sprinkler and flood irrigation systems [32]. R—Replicates. The layout is drawn over a Google Maps image.
Figure 2. Sample raw and canopy-masked RGB images ((a) non-stressed; (b) stressed) and thermal images ((c) non-stressed; (d) stressed) captured 35 days after sowing. Pseudo-color thermal images are shown for presentation only and were scaled between 10 °C (RGB: [25, 25, 113]) and 80 °C (RGB: [235, 246, 255]).
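For readers who wish to reproduce the pseudo-color rendering described in the Figure 2 caption, the minimal Python sketch below linearly maps per-pixel temperatures to the stated RGB endpoints. It is not part of the original study; the function name and array handling are illustrative assumptions.

```python
import numpy as np

# Sketch: map a per-pixel temperature array (deg C) to the pseudo-color range in
# Figure 2 by linear interpolation between 10 deg C (RGB [25, 25, 113]) and
# 80 deg C (RGB [235, 246, 255]). Illustrative only, not the authors' code.
def temperature_to_pseudocolor(temp_c, t_min=10.0, t_max=80.0,
                               rgb_min=(25, 25, 113), rgb_max=(235, 246, 255)):
    frac = np.clip((temp_c - t_min) / (t_max - t_min), 0.0, 1.0)  # 0..1 per pixel
    rgb_min = np.asarray(rgb_min, dtype=float)
    rgb_max = np.asarray(rgb_max, dtype=float)
    # Broadcast the per-pixel fraction across the three RGB channels
    rgb = rgb_min + frac[..., None] * (rgb_max - rgb_min)
    return rgb.astype(np.uint8)

# Example: a tiny "thermal image" with canopy (~22 deg C) and soil (~45 deg C) pixels
demo = np.array([[22.0, 45.0], [28.0, 60.0]])
print(temperature_to_pseudocolor(demo).shape)  # (2, 2, 3)
```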
Figure 3. Data processing pipeline for stress prediction using selected deep learning and machine learning models.
Figure 4. Variations in (a) canopy temperature; (b) soil moisture content; (c) relative water content; and (d) grain yield from wheat plots irrigated at different rates. S and F represent sprinkler and flood irrigation, respectively, and the numbers following these letters denote irrigation rates as a percentage of full crop evapotranspiration (ETc).
Figure 5. Accuracy and loss curves for (a) AlexNet; (b) GoogLeNet; (c) Inception V3; (d) MobileNet V2; and (e) ResNet50 models with RGB imagery inputs for crop water stress identification.
Figure 6. Accuracy and loss curves for (a) AlexNet; (b) GoogLeNet; (c) Inception V3; (d) MobileNet V2; and (e) ResNet50 models with thermal imagery inputs for crop water stress identification.
Figure 7. Confusion matrices for the AlexNet, GoogLeNet, Inception V3, MobileNet V2, and ResNet50 models on the validation datasets of RGB imagery (a–e) and thermal imagery (f–j). Cell values (%) in row 1, column 2 represent type 1 error (TE1), while those in row 2, column 1 represent type 2 error (TE2) (details in Section 2.4). Numbers (% and actual counts) in green indicate prediction accuracy and those in red indicate prediction errors for the stressed and non-stressed classes; green boxes mark correct predictions and red boxes mark misclassifications of the non-stressed/stressed classes.
Figure 8. Accuracy and loss curves for the Long Short-Term Memory (LSTM)-based deep learning model for crop water stress identification.
Figure 9. Confusion matrices for the function approximation-based (a) ANN; (b) KNN; (c) LR; (d) SVM; and (e) DL-LSTM models. Cell values (%) in row 1, column 2 represent type 1 error (TE1), while those in row 2, column 1 represent type 2 error (TE2) (details in Section 2.4). Numbers (% and actual counts) in green indicate prediction accuracy and those in red indicate prediction errors for the stressed and non-stressed classes; green boxes mark correct predictions and red boxes mark misclassifications of the non-stressed/stressed classes.
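As a reading aid for the confusion matrices in Figures 7 and 9 and the metrics reported in Table 6, the Python sketch below derives accuracy, precision, sensitivity, F1 score, and the TE1/TE2 cells from a 2 × 2 confusion matrix. The counts are invented for illustration, and expressing TE1/TE2 as a share of all validation samples is an assumption on our part (the exact definition is given in Section 2.4).

```python
# Illustrative only: relating TE1/TE2 (Figures 7 and 9) and the Table 6 metrics
# to a 2 x 2 confusion matrix. The counts below are made-up, not study data.
def binary_metrics(tp, fn, fp, tn):
    total = tp + fn + fp + tn
    accuracy    = (tp + tn) / total
    precision   = tp / (tp + fp)
    sensitivity = tp / (tp + fn)            # also called recall
    f1          = 2 * precision * sensitivity / (precision + sensitivity)
    te1 = fp / total * 100                  # type 1 error cell, % of all samples (assumed convention)
    te2 = fn / total * 100                  # type 2 error cell, % of all samples (assumed convention)
    return accuracy, precision, sensitivity, f1, te1, te2

print(binary_metrics(tp=145, fn=4, fp=6, tn=165))
```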
Table 1. Architecture parameters in the feature extraction-based models for crop water stress classification.
Architecture Parameters | AlexNet | GoogLeNet | Inception V3 | MobileNet V2 | ResNet50
Input image size | 227 × 227 × 3 | 224 × 224 × 3 | 299 × 299 × 3 | 224 × 224 × 3 | 224 × 224 × 3
No. of layers | 25 | 141 | 316 | 154 | 177
ReLU layers | 7 | 57 | 95 | 35 | 49
Max pooling layers | 3 | 13 | 4 | 0 | 1
Convolutional layers | 5 | 57 | 94 | 35 | 53
Dropout layers | 2 | 1 | – | – | –
Fully connected layers | 3 | 1 | 1 | 1 | 1
Fully connected layer function | FC8 | Loss3 classifier | Predictions | Logits | FC1000
Depth | 8 | 22 | 48 | 53 | 50
Parameters | 61 × 10^6 | 7 × 10^6 | 23.9 × 10^6 | 3.5 × 10^6 | 25.6 × 10^6
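To illustrate how a pretrained backbone from Table 1 can be adapted to the two-class stressed/non-stressed task, the Python/Keras sketch below replaces the final FC1000 head of ResNet50 with a two-unit classification head. This is not the authors' implementation; the frozen backbone, softmax head, and learning rate are illustrative choices.

```python
import tensorflow as tf
from tensorflow.keras import layers, models
from tensorflow.keras.applications import ResNet50

# Sketch (not the authors' code): ImageNet-pretrained ResNet50 backbone with a
# new two-class head for stressed / non-stressed classification.
# The 224 x 224 x 3 input size follows Table 1.
base = ResNet50(weights="imagenet", include_top=False, input_shape=(224, 224, 3))
base.trainable = False                       # freeze backbone for feature extraction

model = models.Sequential([
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dense(2, activation="softmax"),   # stressed vs. non-stressed
])
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=1e-4),
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```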
Table 2. Hyperparameter tuning considerations to reduce overfitting and performance enhancement of the feature extraction-based models.
Parameters | Values
Epochs | 5, 10, and 20
Batch size | 5, 10, 15, and 20
Iterations | 250 and 300
Solver | Sgdm and Adam
Learning rate | 1 × 10^−4, 2 × 10^−4, and 3 × 10^−4
Sgdm: stochastic gradient descent with momentum; Adam: adaptive moment estimation.
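The Table 2 combinations can be screened with a simple grid loop, as sketched below in Python. The sketch only illustrates the search structure; train_and_validate() is a hypothetical placeholder (not a function from the paper) that would fit the chosen network with the given hyperparameters and return its validation accuracy.

```python
from itertools import product

def train_and_validate(epochs, batch_size, solver, learning_rate):
    # Hypothetical placeholder: fit the selected CNN with these hyperparameters
    # and return validation accuracy. Returns a dummy value here.
    return 0.0

# Search space taken from Table 2
epochs_grid        = [5, 10, 20]
batch_size_grid    = [5, 10, 15, 20]
solver_grid        = ["sgdm", "adam"]
learning_rate_grid = [1e-4, 2e-4, 3e-4]

best = {"accuracy": -1.0}
for epochs, batch, solver, lr in product(epochs_grid, batch_size_grid,
                                         solver_grid, learning_rate_grid):
    acc = train_and_validate(epochs=epochs, batch_size=batch,
                             solver=solver, learning_rate=lr)
    if acc > best["accuracy"]:
        best = {"epochs": epochs, "batch_size": batch, "solver": solver,
                "learning_rate": lr, "accuracy": acc}
print(best)
```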
Table 3. Crop and auxiliary data ranges for stressed and non-stressed labeling.
Crop Label | Parameter Thresholds | Output | References
Stressed | Canopy temperature (Tc) > 23 °C & relative water content (RWC) < 90% & soil moisture content (SMC) < 25% | 0 | [41,42,43,44,45,46]
Non-stressed | None of the "stressed" conditions met | 1 | –
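The Table 3 thresholds translate directly into a labeling rule. The minimal Python sketch below applies the conjunction as written in the table (all three conditions must hold for the "stressed" label); the variable names are illustrative, and the example inputs use the treatment extremes reported for the 25% and 100% ETc plots.

```python
# Sketch of the Table 3 labeling rule: label 0 (stressed) when canopy temperature
# exceeds 23 deg C AND relative water content is below 90% AND soil moisture
# content is below 25%; otherwise label 1 (non-stressed). Illustrative names only.
def label_water_stress(tc_celsius, rwc_percent, smc_percent):
    stressed = (tc_celsius > 23.0) and (rwc_percent < 90.0) and (smc_percent < 25.0)
    return 0 if stressed else 1

print(label_water_stress(28.0, 74.0, 20.5))  # 0 -> stressed (deficit-irrigated plot values)
print(label_water_stress(22.5, 90.0, 25.7))  # 1 -> non-stressed (fully irrigated plot values)
```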
Table 4. Training parameters of function approximation-based classification models.
Function Approximation Model | Parameters
Artificial Neural Network (ANN) | Hidden layers: 2; neurons: 64, 32; learning rate (alpha): 0.01; activation function: sigmoid; batch size: 8; number of epochs: 300; optimizer: Adam; loss function: binary cross-entropy
Kernel Nearest Neighbor (KNN) | Number of neighbors (K): 8; distance metric: Minkowski; weights: uniform; algorithm: ball tree
Logistic Regression (LR) | Penalty: L1; inverse of regularization parameter (C): 5; maximum iterations: 100; tolerance: 0.0001
Support Vector Machine (SVM) | Kernel type: RBF (radial basis function); penalty parameter (C): 100; bandwidth parameter (gamma): 0.001; degree of the polynomial kernel: 3
Deep Learning–Long Short-Term Memory (DL-LSTM) | Number of neurons: 180; epochs: 200; batch size: 10; optimizer: Adam; number of hidden layers: 2; loss function: MAE (mean absolute error)
Adam: adaptive moment estimation.
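For orientation, the Table 4 settings map roughly onto common Python implementations as sketched below (scikit-learn and Keras). This is an illustrative configuration rather than the authors' code; in particular, the liblinear solver for the L1-penalized logistic regression, the sigmoid output layers, and the two stacked LSTM layers of 180 units are assumptions.

```python
import tensorflow as tf
from tensorflow.keras import Sequential, layers
from sklearn.neighbors import KNeighborsClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC

# Sketch of the Table 4 classifiers; inputs would be the four features (RWC, SMC, Tc, RH).
knn = KNeighborsClassifier(n_neighbors=8, metric="minkowski",
                           weights="uniform", algorithm="ball_tree")
lr  = LogisticRegression(penalty="l1", C=5, max_iter=100, tol=1e-4,
                         solver="liblinear")        # solver choice assumed for L1 penalty
svm = SVC(kernel="rbf", C=100, gamma=0.001, degree=3)  # degree unused by the RBF kernel

# Two-layer ANN (64 and 32 neurons, sigmoid activations, binary cross-entropy)
ann = Sequential([
    layers.Dense(64, activation="sigmoid", input_shape=(4,)),
    layers.Dense(32, activation="sigmoid"),
    layers.Dense(1, activation="sigmoid"),
])
ann.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=0.01),
            loss="binary_crossentropy", metrics=["accuracy"])

# DL-LSTM: 180 units, two hidden layers, MAE loss per Table 4; sequence shape assumed
lstm = Sequential([
    layers.LSTM(180, return_sequences=True, input_shape=(None, 4)),
    layers.LSTM(180),
    layers.Dense(1, activation="sigmoid"),
])
lstm.compile(optimizer="adam", loss="mae", metrics=["accuracy"])
```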
Table 5. Training accuracies of feature extraction-based models to characterize wheat water stress using RGB and thermal imagery inputs under different epoch and batch size combinations.
Accuracy (%)
Epochs | Batch Size | AlexNet | GoogLeNet | Inception V3 | MobileNet V2 | ResNet50
Feature extraction-based approaches with RGB imagery inputs
5 | 5 | 90.4 | 95.2 | 91.9 | 95.1 | 96.3
5 | 10 | 89.4 | 94.3 | 90.5 | 93.0 | 92.7
5 | 15 | 92.3 | 94.6 | 90.4 | 92.3 | 93.5
5 | 20 | 92.6 | 95.0 | 90.8 | 92.7 | 94.6
10 | 5 | 93.8 | 95.5 | 92.2 | 93.1 | 95.8
10 | 10 | 92.7 | 95.7 | 92.4 | 94.2 | 95.1
10 | 15 | 93.4 | 95.9 | 92.7 | 94.4 | 97.1
10 | 20 | 94.6 | 96.7 | 93.6 | 95.6 | 97.2
20 | 5 | 95.3 | 97.2 | 93.8 | 94.4 | 92.3
20 | 10 | 95.6 | 97.5 | 94.2 | 95.5 | 97.2
20 | 15 | 96.6 * | 98.0 | 94.5 | 96.7 * | 97.9 *
20 | 20 | 96.2 | 98.2 * | 95.0 * | 96.1 | 95.8
Feature extraction-based approaches with thermal imagery inputs
5 | 5 | 94.4 | 96.2 | 97.0 | 94.8 | 95.9
5 | 10 | 95.9 | 95.8 | 96.8 | 94.7 | 95.9
5 | 15 | 96.4 | 95.4 | 95.9 | 93.1 | 98.4
5 | 20 | 92.7 | 96.0 | 96.2 | 92.7 | 97.6
10 | 5 | 94.5 | 96.5 | 97.2 | 93.5 | 96.7
10 | 10 | 96.0 | 96.7 | 97.4 | 94.2 | 97.3
10 | 15 | 96.4 | 97.2 | 97.5 | 94.7 | 98.5
10 | 20 | 96.5 | 97.2 | 98.0 | 95.3 | 98.7
20 | 5 | 97.2 | 97.6 | 98.2 | 97.2 | 99.0
20 | 10 | 97.4 | 98.1 | 98.5 | 97.5 | 99.2
20 | 15 | 98.2 * | 98.5 * | 98.7 * | 98.1 * | 99.5 *
20 | 20 | 98.0 | 98.0 | 98.5 | 97.9 | 99.0
* Highest accuracy for the epoch and batch size combinations.
Table 6. Validation performance of feature extraction and function approximation models to characterize wheat water stress.
Models | Accuracy (%) | Precision (%) | Sensitivity (%) | F1 Score (%)
Feature extraction-based approaches with only RGB imagery inputs
AlexNet | 93.4 | 91.4 | 94.5 | 92.2
GoogLeNet | 95.9 | 100.0 | 91.1 | 95.3
Inception V3 | 92.5 | 94.4 | 89.4 | 91.8
MobileNet V2 | 94.4 | 89.0 | 100.0 | 94.1
ResNet50 | 96.9 | 95.9 | 97.3 | 96.6
Feature extraction-based approaches with only thermal imagery inputs
AlexNet | 96.2 | 95.9 | 95.9 | 95.9
GoogLeNet | 96.9 | 96.6 | 96.6 | 96.6
Inception V3 | 97.5 | 96.6 | 98.0 | 97.3
MobileNet V2 | 94.7 | 94.0 | 94.7 | 94.3
ResNet50 | 98.4 | 96.7 | 100.0 | 98.3
Function approximation-based approaches (with RWC, SMC, Tc, and RH inputs)
ANN | 93.5 | 92.7 | 92.7 | 93.0
KNN | 88.1 | 90.2 | 84.1 | 86.9
LR | 89.2 | 95.1 | 82.9 | 88.6
SVM | 91.4 | 95.1 | 86.7 | 90.8
DL-LSTM | 96.7 | 96.0 | 97.9 | 97.0