Review

Proximal Methods for Plant Stress Detection Using Optical Sensors and Machine Learning

Department of Biosystems Engineering, The University of Arizona, Tucson, AZ 85721, USA
*
Author to whom correspondence should be addressed.
Biosensors 2020, 10(12), 193; https://doi.org/10.3390/bios10120193
Submission received: 25 September 2020 / Revised: 10 November 2020 / Accepted: 26 November 2020 / Published: 29 November 2020
(This article belongs to the Special Issue Biosensors for Food and Agricultural Research)

Abstract

Plant stresses have been monitored using the imaging or spectrometry of plant leaves in the visible (red-green-blue or RGB), near-infrared (NIR), infrared (IR), and ultraviolet (UV) wavebands, often augmented by fluorescence imaging or fluorescence spectrometry. Imaging at multiple specific wavelengths (multi-spectral imaging) or across a wide range of wavelengths (hyperspectral imaging) can provide exceptional information on plant stress and subsequent diseases. Digital cameras, thermal cameras, and optical filters have become available at a low cost in recent years, while hyperspectral cameras have become increasingly more compact and portable. Furthermore, smartphone cameras have dramatically improved in quality, making them a viable option for rapid, on-site stress detection. Due to these developments in imaging technology, plant stresses can be monitored more easily using handheld and field-deployable methods. Recent advances in machine learning algorithms have allowed for images and spectra to be analyzed and classified in a fully automated and reproducible manner, without the need for complicated image or spectrum analysis methods. This review will highlight recent advances in portable (including smartphone-based) detection methods for biotic and abiotic stresses, discuss data processing and machine learning techniques that can produce results for stress identification and classification, and suggest future directions towards the successful translation of these methods into practical use.

1. Introduction

New and innovative management techniques are needed to ensure a sustainable future for the agricultural industry as the world population continues to increase. There will be over 9 billion people inhabiting the earth by 2050 [1]. Feeding such a large population is a complex problem that will require the utilization of a variety of ideas and techniques across disciplines. Among these is ensuring maximum crop yields with minimal losses from plant stresses such as drought, lack of nutrients, and disease. If proper attention is not given to the mitigation of these yield losses, several components of food security such as availability and economic access may be affected [2].
Pathogens and biotic stresses have received considerable attention in plant stress studies. About 20–30% of crops are lost to pests and pathogens globally, with many of these losses occurring nearly every growing season [3]. Furthermore, many species of plant pathogens can travel over long distances, whether by wind, water, or human activities such as trade and travel [4]. The distribution of pathogens may also shift due to the effects of climate change [5]. The detection of plant diseases is therefore essential not only to implement appropriate disease management strategies and mitigate potential losses but also to monitor changes in pathogen distribution. Most diseases can be detected relatively easily once symptoms are fully developed, due to noticeable changes in the plant’s appearance; however, early detection is essential to prevent large yield losses. It is also necessary to detect abiotic stresses such as water and nutrient deficiencies at an early stage, before damage significantly affects crop yields; furthermore, these stresses will become more prevalent in agriculture due to climate change effects such as drought and increased salinity [6], creating a need for increased environmental monitoring and more refined management practices.
Bioreceptor-based direct detection methods such as polymerase chain reaction (PCR) [7], enzyme-linked immunosorbent assay (ELISA) [8], and flow cytometry (FC) [9] are widely available for the detection of plant diseases; however, these methods require specialized training and can be time-consuming and labor-intensive. An alternative detection method that can be used both for biotic and abiotic stresses is a simple visual observation by an expert [10], but this technique can be prone to bias, with varying results based on the experience of the evaluator.
Optical techniques hold considerable advantages over the previously mentioned techniques, such as a greater potential for rapid disease detection (with some methods producing results in near-real-time [11]), standardized results that are not subject to individual biases, and the ability to detect both biotic and abiotic stresses. Proximal (near-target) sensing techniques utilize optical sensors that are becoming increasingly smaller and more portable. Although optical sensors simplify the data collection process, the data themselves can be complex and large in size (especially in the case of hyperspectral imaging), requiring sophisticated data processing and statistical methods. Furthermore, images and spectroscopic data are not very specific to particular stresses, as opposed to bioreceptor-based (e.g., chemical ligands, antibodies, nucleic acids) direct detection methods. Despite these limitations, stress specificity and complex data analysis can still be achieved using machine learning techniques, which can analyze the data to find patterns that are specific to the plant stress in question. Many studies have successfully utilized machine learning to interpret optical sensor data for the detection of specific stresses. This review aims to provide an outline of current optical sensor types and machine learning methods used to proximally detect plant stresses.

2. Spectral Properties of Plant Tissues

Many physiological and chemical properties of plants influence the way their tissues reflect and absorb light. These properties can change when a plant is subjected to stress and alter the reflectance spectrum of its leaves (Figure 1).
Chlorophyll is a pigment that is involved in the photosynthesis process. Due to its important role in absorbing light, changes in chlorophyll content resulting from stress will alter the way the plant interacts with light energy. A decrease in chlorophyll content may occur when the plant is subjected to stress, which can be characterized in various ways including an increase of reflectance near 700 nm [12] and decreased reflectance in the 530–630 nm range [13]. Other pigments besides chlorophyll, such as carotenes [14] and xanthophylls [15], can also alter a plant’s reflectance properties.
In addition to pigmentation, leaf anatomical properties (Figure 2) such as the convexity of epidermal cells [16], surface texture and thickness of the leaf cuticle [17], and high trichome density [18] can be altered under stress and consequently affect a leaf’s spectral properties. For example, exposure to UV radiation can result in changes to chlorophyll content and increased leaf thickness, which can alter chlorophyll fluorescence levels [19]. Reflectance in the 950–970 nm range was found by Peñuelas et al. (1993) to be influenced by cell wall elasticity, which decreases in response to drought stress [20].
Small openings on plant leaves (stomata) can also affect leaf properties under stress [21]. These pores are important to regulate moisture and control gas exchange in the leaves; however, microorganisms such as bacteria and fungi can use them to enter and infect a plant. Plants can recognize these pathogens using pathogen- or microbe-associated molecular patterns (PAMPs or MAMPs), which can then trigger stomatal closure to prevent entry [22]. Stomatal closure can lead to an increase in leaf temperature, which can be detected in the infrared region of the electromagnetic spectrum.
The biochemical properties of leaves, such as cellulose, hemicellulose, lignin, protein, sugar, and starch can also change under various stresses and affect the reflectance properties of leaves [23]. For example, salt stress can result in spectral changes by damaging leaf mesophyll cells and altering polysaccharide and lignin composition in the cell wall [24]. Leaf water content can also influence reflectance spectra as light absorption in the infrared region (>1300 nm) is primarily due to water absorption [25].

3. Sensors and Data Collection

A variety of optical sensors have been used to evaluate plant health, including hyperspectral, multispectral, thermal, and fluorescence sensors (Table 1). The reflectance data collected by these devices can be represented using images acquired by imaging techniques or spectral graphs produced using spectroscopic methods. An important element that can affect a device’s success in stress detection is the sensor’s sensitivity to areas in the plant’s reflectance spectrum that are altered by biotic and abiotic stresses. Generally, the most sensitive region in the electromagnetic spectrum for evaluating plant health is the visible region [26], but other regions can also be influenced by stress. A diagram displaying various wavelength regions in the electromagnetic spectrum is presented in Figure 3 [27].

3.1. Hyperspectral Imaging

Hyperspectral imaging utilizes both imaging and spectroscopy methods to produce multi-dimensional data. Spectral information for a wide range of individual wavelengths is assigned to every pixel in an image [28]. Rather than collecting spectra from an entire image or an entire plant leaf, where spectra from the stressed and unaffected areas are mixed together, hyperspectral imaging can provide more sophisticated data that can isolate spectra only from the affected area and identify specific imaging patterns and characteristics. This method has become increasingly popular for plant phenotyping and stress detection in agriculture [29,30,31] and has been used to identify plant responses to both abiotic and biotic stresses, such as drought stress in maize [32] and barley [33], yellow rust [34] and powdery mildew [35] in wheat, salt stress in okra [36], and Black Sigatoka disease in banana plants [37].
Hyperspectral imaging for plant status evaluation typically uses a wavelength range of about 250–2500 nm, i.e., UV (ultraviolet), visible, and NIR (near-infrared), with the most important areas in the visible and NIR ranges [38]. Other areas of the spectrum are still being explored in terms of their capability for plant stress detection. For example, Brugger et al. (2019) used hyperspectral imaging in the UV range to detect salt stress in barley [39]. Due to the sensors’ ability to detect a wide range of wavelengths in the electromagnetic spectrum, many possibilities remain for evaluating new combinations of wavelengths for plant stress detection.
The data acquired using hyperspectral techniques are often used to compute and create vegetation indices (VIs). VIs are computed using ratios and combinations of reflectance measurements at a few specific wavelengths and have been used extensively for plant stress monitoring [40,41,42]. In addition to VIs, hyperspectral data can be used to develop spectral disease indices (SDIs) with the purpose of discriminating between specific plant diseases [43] (Table 2). Some examples include indices for detecting powdery mildew in wheat [44] and sugar beet [45], cercospora leaf spot in sugar beet [45], leaf rust in wheat [46], and myrtle rust [47]. Notable vegetation indices include the normalized difference vegetation index (NDVI) [48], water index (WI) [49], and photochemical reflectance index (PRI) [50]. The vast amount of spectral data that is collected using hyperspectral imaging provides great potential in developing new VIs and SDIs for the detection of highly specific plant stresses.
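To make the index calculations concrete, below is a minimal Python sketch of NDVI and a simple ratio-based water index. The reflectance values passed in are hypothetical, and the formulas follow the commonly cited definitions (NDVI = (NIR − Red)/(NIR + Red); WI = R900/R970), not any particular study in this review.

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    return (nir - red) / (nir + red)

def water_index(r900, r970):
    """Ratio-based water index: reflectance at 900 nm over reflectance at 970 nm."""
    return r900 / r970

# Hypothetical reflectance values for a healthy leaf
print(ndvi(0.45, 0.05))         # -> 0.8; high NDVI indicates dense, healthy vegetation
print(water_index(0.50, 0.45))  # values near or above 1 suggest adequate water content
```

NDVI is bounded between −1 and 1, which makes it convenient for thresholding and comparison across sensors.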
The main advantages of hyperspectral imaging include its robustness and ability to provide a large amount of data for analysis; however, this can result in instruments being relatively expensive. In addition, traditional hyperspectral imaging sensors can be bulky and large, which limits their portability and range of applications; however, the development of handheld spectroradiometers and small hyperspectral cameras (Figure 4) has largely addressed this problem. While these instruments typically have a more limited spectral range than a standard hyperspectral sensor, they have the capacity to be used with real-time detection applications [51,52]. Spectroradiometers are unable to capture hyperspectral images; however, they have been used in many studies to detect plant stresses, such as peanut leaf spot disease [53] and powdery mildew in barley [52].
Hyperspectral imaging sensors have become increasingly smaller and less expensive; however, considerable progress is still needed before a device costing less than a few hundred U.S. dollars becomes available. Currently, these cameras cost thousands of U.S. dollars, which can make them cost-prohibitive for many users. Future advances in imaging technology should produce hyperspectral cameras and spectrophotometers that are cheaper and more accessible.

3.2. Multispectral Imaging and Spectroscopy

Multispectral techniques utilize data from ranges of wavelengths, rather than hundreds of individual wavelengths or narrow wavebands as in hyperspectral techniques. A few wavelengths or wavebands of interest can be chosen for incorporation into a device that uses either imaging or spectroscopic techniques. Multispectral imaging involves data collection using a camera or other sensing device to produce image data in specified wavelength or waveband regions, while multispectral spectroscopy produces spectral data for specified wavebands. Both multispectral imaging and multispectral spectroscopy have been successfully used to identify plant stresses; for example, multispectral imaging was used to detect leaf spot disease in oilseed rape [54], gray mold in tomato leaves [55], and nutrient deficiencies in tomato plants [56], while multispectral spectroscopy was used to detect nitrogen deficiency stress in maize [57], drought stress in tomato plants [58], and nitrogen deficiency in canola plants [59]. Multispectral sensors are more affordable than their hyperspectral counterparts; however, they do not provide as much information about the plant and its environment due to the broader wavebands. Nevertheless, multispectral methods offer other advantages, such as portability and flexibility, which aid in the creation of customized devices. Band-pass filters can be used in conjunction with a camera or other imaging device to acquire data in desired spectral ranges at a low cost. Recent modifications in smartphone cameras now permit the capture of NIR wavelengths; Chung et al. (2018) attached an 800 nm high-pass filter to a smartphone to acquire both NIR and red images for plant stress detection [60].

3.3. RGB Imaging

RGB (visible or red-green-blue) imaging employs sensors that utilize the red, green, and blue regions of the spectrum to produce image data (the standard working principle of digital cameras). The wavelengths captured are approximately 400–499 nm for blue light (maximum at 475 nm), 500–549 nm for green light (maximum at 520 nm), and 550–750 nm for red light (maximum at 650 nm) [38]. In this sense, RGB imaging may be considered a special case of multispectral imaging. However, as RGB imaging data are typically acquired using a digital camera or smartphone, while multispectral imaging requires more specialized equipment or instrumentation, the two are usually treated separately.
The main advantages of RGB imaging are its affordability and small, portable sensor size. RGB image sensors are already present on smartphones and have been used to successfully evaluate plant stresses (Figure 5), such as iron deficiency chlorosis in soybean [11], various nutrient deficiencies in black gram [61], early and late blight in potato plants [62], and biotic stresses in wheat [63]. Furthermore, RGB imaging does not require much technical expertise on the user’s side, since it relies on familiar devices such as digital cameras and smartphones. Smartphones also have enough computing power to process the captured data, which enables rapid assessments of plant stresses. However, many factors can complicate RGB data, such as lighting, environmental conditions, time of day, and spectral resolution [64,65]. Illumination is a particularly important concern for field applications since it can vary greatly depending on the season and weather conditions. Diseases with varied symptoms and complex image backgrounds can create further complications in processing the data; however, many of these difficulties can be overcome using image processing and machine learning techniques [11].

3.4. Thermal Imaging/Thermography

The main difference between thermography and other methods is that it measures emitted radiation from an object, rather than reflected radiation [66]. Thermal cameras detect radiation in the infrared wavelength range, with the resulting measurements displayed as false-color images (Figure 6) in which the pixels contain temperature values. Thermographic methods for plant stress detection primarily exploit changes in surface temperature, a notable stress symptom. Stomata, the small openings on plant leaves that control water loss through transpiration, may close under stress, causing the temperature of the plant to increase [67]. Thermography has been used to detect a variety of biotic and abiotic stresses, such as Aspergillus carbonarius infection in grapes [66], drought stress in maize [68], apple scab disease [69], and drought stress in sesame plants [70].
Thermography is a relatively simple method that can be incorporated into systems designed for the rapid detection of plant stress. Thermal cameras are often very portable, and attachments have been developed that can be used with smartphones. Among these is the FLIR One, which was used by Petrie et al. (2019) to assess the water status of grapevines [71]. However, thermographic methods are highly affected by varying environmental conditions [70], which may make them more applicable in controlled environment applications rather than an open field. Furthermore, thermography lacks specificity and therefore provides a more general solution to plant stress detection. It is recommended to combine thermography with other methods when specific diseases need to be identified since this method is not able to distinguish between different stresses and diseases on its own [69].

3.5. Fluorescence Spectroscopy

The above-mentioned imaging methods (hyperspectral, multispectral, and RGB imaging) quantify the attenuation of incident light by the samples (plant leaves in this case) over a range of wavelengths, i.e., spectrophotometric detection. Since many components in plant leaves exhibit coloration and thus spectrophotometric responses, the resulting spectrophotometric images tend to be quite complex. Fluorescence-based methods can mitigate this issue, as only a small number of components in plant leaves exhibit fluorescence. Fluorescent molecules (e.g., chlorophyll, fluorescent dyes, etc.) absorb light at a specific wavelength (excitation) and emit at a specific, longer wavelength (emission), so incident and emitted light can be separated. The two main types of fluorescence emitted by vegetation are blue-green fluorescence (400–600 nm) [72] and chlorophyll fluorescence (650–800 nm) [73]. The latter can be useful in evaluating photosynthetic activity, which can decrease under pathogenic stresses [74].
Although several techniques are available, two major methods for acquiring fluorescence data in plants are pulse-amplitude modulation (PAM) of the measuring light and continuous illumination [75]. Pulse-amplitude modulation devices use a pulsed measuring light source, an actinic light source, and a saturating light to obtain fluorescence signals [76]. In contrast, light is not pulsed when continuous illumination is utilized.
Fluorescence can be measured as a spectrum from a single point in time [77], or the change in fluorescence over time can be monitored (chlorophyll fluorescence kinetics). The basic principle behind chlorophyll fluorescence techniques is that stress lowers the rate of photosynthesis, which alters how absorbed light energy is dissipated as chlorophyll fluorescence [78]. Fluorescence kinetics measurements require the use of dark adaptation, which consists of placing a plant (or the part of the plant to be measured) in the dark for a certain period of time before fluorescence measurements are taken. Dark adaptation allows for the measurement of the minimum level of fluorescence [79], which is a fundamental value in kinetics analysis since it provides a baseline for the other fluorescence measurements taken after the excitation light has been introduced. Plants are usually dark-adapted for a period of 30 min [80,81,82]. Regardless of whether dark adaptation is utilized, it is essential to give plants the necessary time to adapt to light conditions before measurements (for kinetics applications or standard spectra) are taken.
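The dark-adapted minimal fluorescence is commonly combined with the maximal fluorescence under a saturating pulse into the widely used Fv/Fm parameter (not discussed by name in the studies cited above, but standard in kinetics analysis). A minimal sketch of the calculation, with hypothetical readings:

```python
def fv_fm(f0, fm):
    """Maximum quantum yield of PSII: Fv/Fm = (Fm - F0) / Fm,
    where F0 is the dark-adapted minimal and Fm the maximal fluorescence."""
    return (fm - f0) / fm

# Hypothetical readings; unstressed leaves typically yield Fv/Fm around 0.8
print(fv_fm(0.2, 1.0))  # -> 0.8
```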
Fluorescence ratios are often used to analyze fluorescence data (both images and spectra) for evaluating plant stresses. Common ratios involving UV-induced (320–400 nm) fluorescence include F440/F520, F440/F690, F440/F740, and F690/F740; F440/F690 and F440/F740 are particularly useful for early stress detection applications (F represents fluorescence and the numbers represent emission wavelengths) [83]. Bürling et al. (2011) used red/far-red and blue/green amplitude ratios acquired from spectral signatures to differentiate between nitrogen deficiency, leaf rust, and powdery mildew stresses [84]. Although the ratios mentioned above are relatively well-established in fluorescence research, there is still room for exploration in determining other ratios that could be used to process data.
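To make the ratio notation concrete, below is a small sketch computing the four common UV-induced ratios from a wavelength-to-intensity mapping; the spectrum values used here are hypothetical.

```python
def fluorescence_ratios(spectrum):
    """Common UV-induced fluorescence ratios; keys of `spectrum` are
    emission wavelengths in nm, values are measured intensities."""
    f = spectrum
    return {
        "F440/F520": f[440] / f[520],
        "F440/F690": f[440] / f[690],
        "F440/F740": f[440] / f[740],
        "F690/F740": f[690] / f[740],
    }

# Hypothetical emission intensities at the four bands
ratios = fluorescence_ratios({440: 2.0, 520: 1.0, 690: 4.0, 740: 2.0})
print(ratios["F690/F740"])  # -> 2.0
```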
Fluorescence spectroscopy can identify the location and amount of a specific component in a sample by applying a narrow-range excitation light and detecting a narrow-range emission from that component. Figure 7 is an example of fluorescence spectroscopy, where the plant leaves are excited at 488 nm (blue) and a spectrum at wavelengths of >500 nm (green and red) is collected. There is a clear difference between the healthy and virus-infected plant leaves. Fluorescence spectroscopy has been used in many other studies to detect both biotic and abiotic stresses, including drought stress in passion fruit [80]; nutrient stresses in maize [81], tomato [81], and rapeseed [82] crops; and citrus canker on grapefruit plants [85].
Fluorescence spectroscopy has advantages such as simplicity of use, low cost, and an ability to be incorporated into hand-held devices for screening applications [79]. In addition, the use of laser light as an excitation light source can be more reliable than other optical methods, as excitation exactly at the sample’s peak excitation wavelength can generate stronger and more specific fluorescent emission (as opposed to passive measurements) [86]. Fluorescence data can be collected across multiple wavelengths, which can provide more information than fluorescence captured at a single targeted wavelength. However, fluorescence spectroscopy alone still lacks specificity [85] because changes in fluorescence can be indicative of a wide variety of stresses. Therefore, it is necessary to combine this method with others if discrimination between specific stresses is to be achieved. Another challenge related to chlorophyll fluorescence kinetics is the reduction of fluorescence intensity over time (photoquenching or photobleaching); however, Saleem et al. (2020) were able to mitigate its effects by measuring fluorescence spectra quickly (about 15 s) after the excitation light was introduced [85].

3.6. Fluorescence Imaging

Fluorescence imaging utilizes a camera to obtain images of fluorescence (Figure 8). It is considered an improvement over spectroscopy since it obtains fluorescence data with higher dimensions, which can provide more information than single spectra. Rather than collecting a spectrum from an area of interest (i.e., fluorescence spectroscopy), fluorescence imaging can isolate the area of interest from that of non-interest. For example, Su et al. (2019) used fluorescence imaging to successfully discriminate crops from weeds [87]. One category of continuous fluorescence imaging is multicolor fluorescence imaging, which typically uses UV excitation light and collects fluorescence data from multiple bands, such as red (F680), far-red (F740), green (F520), and blue (F440) [83]. Multicolor fluorescence imaging is conceptually similar to multispectral imaging since only certain fluorescence wavebands are collected and combined to produce the image. Fluorescence imaging can also be used with dark adaptation and chlorophyll fluorescence kinetics applications.
Fluorescence imaging has been used in many studies to detect both biotic and abiotic stresses, such as herbicide stress in soybeans [88], cold stress in tomato seedlings [89], and biotic and abiotic stresses in barley, grapevine, and sugar beet [90]. A relatively simple and portable option for fluorescence image acquisition could consist of a smartphone and band-pass filters (as demonstrated in [91]); however, it is currently difficult to find methods with this type of setup for plant stress applications.
One advantage of fluorescence-based techniques is their relatively low equipment cost [92]; however, they do not always produce a clear distinction between healthy and diseased plant tissues at the early stage of a disease, so additional methods may be necessary to complement fluorescence for early disease detection [93]. Fluorescence-related methods could benefit from increased sensitivity, which would allow them to be used for stress discrimination applications rather than simple stress identification.

3.7. Combination of Sensors

Combining two or more of the methods mentioned above can provide more information on plant health as opposed to using just one method. The merging of data from multiple sensors has been successful in plant stress detection; for example, Moshou et al. (2011) used a combination of multispectral and hyperspectral imaging to detect yellow rust in wheat [94]. Many advantages are offered by using multiple sensors, including higher accuracy and decreased sensitivity to changes in the environment [94]; however, a major challenge is the merging of different data types. One possible solution is a discriminant analysis, which was used by Berdugo et al. (2014) to combine thermographic, hyperspectral, and chlorophyll fluorescence data to differentiate between cucumber mosaic virus, green mottle mosaic virus, and powdery mildew in cucumber plants [95]. Sensor combination shows great potential in producing accurate, highly specific data; however, more research is needed in methods to combine data from multiple sources with different properties and work with larger amounts of data [95]. Machine learning could be a pivotal tool in analyzing such combinatory sensor data.
A variety of sensors have been used to identify stresses in agricultural crops [96,97,98,99,100,101]; however, their detection capabilities could be greatly enhanced by incorporating machine learning techniques, which are discussed in the following sections.

4. Machine Learning for Data Processing

Machine learning has opened possibilities for new data analysis methods in a myriad of fields, including medicine, environmental science, and economics. Fundamentally, machine learning employs techniques to learn from the given data without providing explicit programming commands [102], which can result in the detection of new patterns that may otherwise be overlooked using traditional analytical methods. Major processes in a machine learning procedure include data acquisition and storage, preprocessing, classification, and trait extraction [103]. Figure 9 [104] outlines a simplified pathway for machine learning data analysis methods.
Machine learning is advantageous in agriculture-related fields because it can detect patterns using simultaneous combinations of multiple factors instead of examining traits individually [102]. The use of multiple factors is important due to the frequently high complexity of the environment surrounding plants, where variables such as changing light intensity, direction, and leaf angle can alter results. Machine learning can be used not only for classification purposes but also for pre-processing steps such as feature extraction and dimensionality reduction.
The assessment of plant health includes stress identification, discrimination, and quantification. Identification involves looking for symptoms (early or late) of a specific stress, discrimination consists of both identifying a specific stress and separating the symptoms from those of other stresses, and quantification is a measurement of the severity of the stress. Machine learning has been utilized for all these applications, as outlined in Table 3.
The selection of a machine learning method or pathway depends on the specific problem being addressed; as such, there is currently no specific approach that can be recommended for all applications. The following sections will provide an overview of machine learning data processing techniques that have been used for various agricultural applications.

4.1. Preprocessing

Data preprocessing is essential to ensure the accuracy and reproducibility of classification results [105]. Preprocessing consists of one or more operations that aim to improve the performance of the classification algorithms by providing data in a more accessible and normalized format. Image preprocessing techniques may include image cropping, background removal, contrast enhancement, image thresholding, noise removal with filters, clustering, and principal component analysis (PCA) [102]. Although this section deals mostly with imaging techniques, spectral data may also be processed using some of the listed methods, such as PCA. Outlined below are some preprocessing steps that are commonly applied to imaging data.
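Two of the steps listed above (contrast normalization and background removal by thresholding) can be sketched as follows for a grayscale image array; the threshold value is an arbitrary assumption for illustration, not one taken from the cited studies.

```python
import numpy as np

def preprocess(image, threshold=0.1):
    """Rescale a grayscale image to [0, 1], then zero out dark background pixels."""
    img = image.astype(float)
    img = (img - img.min()) / (img.max() - img.min())  # contrast normalization
    return img * (img > threshold)                      # simple background removal

out = preprocess(np.array([[0, 128, 255]]))
print(out)  # background pixel is masked to 0; brightest pixel maps to 1.0
```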

4.1.1. Color Space Conversion

Color space conversion is a data processing technique that can be used with RGB images as another way to represent color. Color spaces can be used to acquire additional color features from images to aid in feature extraction and image classification. Several studies have used features obtained from color space conversion to process RGB data for plant stress detection, including L*a*b* (L* = lightness from black to white, a* = from green to red, and b* = from blue to yellow) to detect bacterial blight, fruit spot, fruit rot, and leaf spot in pomegranate plants [106]; HSI (hue, saturation, intensity) to detect early scorch, late scorch, cottony mold, ashen mold, and tiny whiteness in plants [107]; and YCbCr (Y = luma component; Cb and Cr = blue- and red-differences of chroma components) to detect diseases in soybean [108]. A few alternative color spaces are outlined in Figure 10 [109].
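As a concrete example, Python's standard library can convert RGB pixels to HSV, a close relative of the HSI space mentioned above (HSV and HSI differ in how the brightness channel is defined); a minimal sketch:

```python
import colorsys

def rgb_pixel_to_hsv(r, g, b):
    """Convert an 8-bit RGB pixel to HSV, each channel returned in [0, 1]."""
    return colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)

print(rgb_pixel_to_hsv(255, 0, 0))  # pure red: hue 0, full saturation and value
print(rgb_pixel_to_hsv(0, 128, 0))  # mid-green: hue ~1/3
```

Separating hue from brightness in this way can make color features more robust to the illumination variations discussed in Section 3.3.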

4.1.2. Dimensionality Reduction

Dimensionality reduction aims to provide a more compact representation of data while preserving as much information as possible. A common method is principal component analysis (PCA), which geometrically projects data onto lower-dimensional axes (principal components) that act as feature summaries [110]. PCA can combine dependent (or highly correlated) variables into a common variable while minimizing the loss of information, thereby reducing the dimensionality of the data. The first principal component (PC1) captures the largest variance in the data set; PC2 is then computed from the remaining variance, and the process is repeated for PC3, PC4, and so on. The principal components (PCs) thus represent successively smaller shares of the data variance, and they can be plotted in 2D or 3D plots (in the case of two or three PCs) known as PCA score plots.
All PCs can also be fed into various machine learning models as a dimensionality-reduction preprocessing step. PCA has been used in many studies as an important preprocessing step to manage both imaging and spectral data. For example, PCA was used in an image preprocessing pipeline by Lu et al. (2017) to aid in acquiring feature maps [111]. While supervised alternatives such as linear discriminant analysis (LDA) can maximize class separation, PCA is often preferred because it does not use class labels and therefore remains unbiased. PCA can be a valuable tool to aid in data interpretation, but one disadvantage of the method is its sensitivity to outliers in the data [112].
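The projection itself reduces to a few lines of linear algebra. The following sketch (an illustrative implementation using NumPy's singular value decomposition, not code from any cited study) compresses two highly correlated synthetic "spectral bands" into a single principal component:

```python
import numpy as np

def pca(X, n_components):
    """Project X (samples x features) onto its first n principal components."""
    Xc = X - X.mean(axis=0)                # center each feature
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = Xc @ Vt[:n_components].T      # PC scores (projected data)
    explained = (S ** 2) / np.sum(S ** 2)  # variance ratio captured by each PC
    return scores, explained[:n_components]

rng = np.random.default_rng(0)
# 100 synthetic samples with two highly correlated bands plus a little noise:
# the data are effectively one-dimensional, so PC1 should capture nearly
# all of the variance.
t = rng.normal(size=(100, 1))
X = np.hstack([t, 2 * t + 0.01 * rng.normal(size=(100, 1))])
scores, ratio = pca(X, 1)
```

Here the two correlated variables collapse into one PC with almost no loss of information, which is exactly the behavior exploited when hyperspectral bands are reduced before classification.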

4.1.3. Segmentation

Image segmentation partitions an image into meaningful regions, such as the object of interest and its background. This technique is useful in agricultural applications because it reduces errors or misclassifications caused by background noise. Notable methods include clustering-based approaches such as k-means, which can be useful in identifying stressed areas of a plant in an image [107]. Disease detection applications may require other techniques such as pixel removal and masking [113]. For example, Ma et al. (2018) used the excess red index (ExR), H from the HSV (hue, saturation, value) color space, and b* from the L*a*b* color space to discriminate between disease spots and background in images [114]. An example of segmentation being used to separate plants from the background of an image is shown in Figure 11 [115].
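A clustering-based segmentation can be sketched as follows (a toy k-means on grayscale pixel values, written for illustration only; real applications typically cluster color or spectral features rather than a single intensity channel):

```python
import numpy as np

def kmeans_1d(values, k=2, iters=20, seed=0):
    """Tiny k-means on a 1-D array of pixel values; returns labels and centers."""
    rng = np.random.default_rng(seed)
    centers = rng.choice(values, size=k, replace=False).astype(float)
    for _ in range(iters):
        # Assign each pixel to its nearest cluster center, then update centers.
        labels = np.argmin(np.abs(values[:, None] - centers[None, :]), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = values[labels == j].mean()
    return labels, centers

# Synthetic grayscale "image": dark soil background (~40) and bright leaf (~200).
pixels = np.concatenate([np.full(500, 40.0), np.full(300, 200.0)])
pixels += np.random.default_rng(1).normal(0, 5, pixels.size)
labels, centers = kmeans_1d(pixels, k=2)
leaf_mask = labels == np.argmax(centers)   # pixels in the brighter cluster
```

The resulting mask isolates the plant pixels so that later feature extraction and classification steps are not contaminated by the background.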

4.1.4. Feature Extraction

Feature extraction can be used to express data in a format that is more accessible to machine learning algorithms [105]. It consists of reducing redundant data and collecting a set of extracted features; for images, available techniques include the Global Color Histogram [116], Local Binary Patterns [117], and the Color Coherence Vector [118]. Features can include color-related characteristics, such as the variance of color channels, and texture features, such as contrast and channel homogeneity [114]. These acquired features are then analyzed using the classification algorithms.
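For instance, a global color histogram can be computed in a few lines (an illustrative sketch, assuming an 8-bit RGB image stored as a NumPy array):

```python
import numpy as np

def global_color_histogram(image, bins=8):
    """Concatenate per-channel histograms of an RGB image into one feature vector."""
    feats = []
    for c in range(3):
        hist, _ = np.histogram(image[..., c], bins=bins, range=(0, 256))
        feats.append(hist / image[..., c].size)   # normalize each channel to sum to 1
    return np.concatenate(feats)

rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(32, 32, 3))      # stand-in for a leaf image
features = global_color_histogram(img, bins=8)    # 3 channels x 8 bins = 24 features
```

The 24-element vector summarizes the image's color distribution regardless of image size, giving the classifier a fixed-length input.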

4.2. Machine Learning Algorithms for Classification

Once the necessary preprocessing steps are complete, the data can be fed into a machine learning algorithm for classification. These algorithms attempt to find patterns in data to use in assigning classes (e.g., stressed vs. healthy) to unlabeled data [29]. Machine learning algorithms can be divided into supervised, weakly-supervised, and unsupervised categories, all of which can be used for classification [119,120]. The major difference among these categories is that supervised learning uses labeled training data to predict the labels of testing data; weakly-supervised learning can train on smaller datasets, coarse labels, or partially mislabeled data; and unsupervised learning uses only unlabeled data [120]. Among the most prominent examples of unsupervised learning are clustering algorithms, which group samples with similar traits into clusters [121].
Many machine learning algorithms have been used in agriculture to classify data; however, the most common methods include artificial neural networks (ANNs) [122] and support vector machines (SVMs) [29]. This review will primarily focus on SVM, ANN, and deep learning methods; however, other algorithms such as random forest [123] have been successfully used for plant stress identification applications.
Machine learning techniques can be very robust classifiers, yet one drawback is their tendency to overfit the data (especially when the data set is small), which degrades performance on unseen data. In addition, training can be time-consuming, especially when large image files are involved. Both issues, however, can be mitigated. One method used to reduce overfitting in image classification is data augmentation, which consists of slightly distorting the images using techniques such as rotation [124], mirroring [125], and color variation [126]. If data augmentation and image manipulation are deemed necessary in the data processing pathway, they must be performed before the data are run through the classification algorithm.
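Basic geometric augmentation of this kind can be sketched as follows (an illustrative example using NumPy's rot90 and fliplr; color variation would additionally perturb the channel values):

```python
import numpy as np

def augment(image):
    """Yield simple label-preserving variants of an image: rotations and a mirror."""
    variants = [image]
    for k in (1, 2, 3):                      # 90, 180, and 270 degree rotations
        variants.append(np.rot90(image, k))
    variants.append(np.fliplr(image))        # horizontal mirror
    return variants

img = np.arange(16).reshape(4, 4)            # stand-in for a small image
augmented = augment(img)                     # one original + four variants
```

Because a rotated or mirrored leaf is still the same leaf, each variant keeps the original label, multiplying the effective size of the training set without new field data.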

4.2.1. Support Vector Machine (SVM)

SVM is a supervised learning method, i.e., it requires a labeled training data set to identify the classes of unknown data. Consider a simple case in which most (e.g., >90%) of the variance in the training data set can be captured in two dimensions through a dimensionality reduction method such as PCA. These data can be plotted on a 2D coordinate system (i.e., a PCA score plot). With the classes (e.g., stressed vs. healthy) of the data known, it is possible to draw a line that best separates the data into two classes; this line is called a decision boundary (demonstrated in Figure 12 [127]). The procedure extends to three or more dimensions, where the boundary becomes a plane for three dimensions or a hyperplane for higher dimensions. It may be necessary to use about 10 principal components from PCA, but this dimensionality is still substantially smaller than that of the raw data, which could range from hundreds (for spectra) to millions (for images). Testing data are fed into the same data processing pathway as the training data, and the decision boundary formed during training determines the class of the testing data. While SVM is inherently a linear method, non-linear separation is also possible using non-linear kernels. Classification into multiple classes is likewise possible using multiple decision boundaries.
SVMs are one of the most common machine learning algorithms used in agriculture applications. They have been successfully used in many studies relating to plant stress detection, such as identifying Huanglongbing (HLB; also known as citrus greening disease) and nutrient stresses in citrus leaves [100], as well as rating the severity of iron deficiency chlorosis in soybeans [11]. A similar method, relevance vector machine (RVM), was used to identify stripe rust and powdery mildew in wheat [63].
While SVM is simple in principle and works quite well with very high-dimensional data (such as spectra and images), it does not quantify how far misclassified samples lie from the true class. This is particularly problematic when the data set is noisy and a distinct decision boundary cannot be determined clearly.
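To make the idea concrete, the sketch below trains a minimal linear SVM by hinge-loss (sub)gradient descent on toy 2D "PC scores" (an illustrative implementation, not the algorithm of any cited study; practical work typically uses an optimized library solver such as scikit-learn's SVC):

```python
import numpy as np

def train_linear_svm(X, y, lam=0.01, lr=0.1, epochs=200):
    """Minimal linear SVM trained by hinge-loss (sub)gradient descent.

    y must be in {-1, +1}; returns the weight vector w and bias b that
    define the decision boundary w.x + b = 0.
    """
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            margin = yi * (xi @ w + b)
            if margin < 1:                       # point inside the margin: push boundary away
                w += lr * (yi * xi - lam * w)
                b += lr * yi
            else:                                # correctly classified: only regularize
                w -= lr * lam * w
    return w, b

# Toy 2-D "PC scores": healthy (+1) clustered near (2, 2), stressed (-1) near (-2, -2).
rng = np.random.default_rng(0)
healthy = rng.normal(loc=2.0, scale=0.5, size=(20, 2))
stressed = rng.normal(loc=-2.0, scale=0.5, size=(20, 2))
X = np.vstack([healthy, stressed])
y = np.array([1] * 20 + [-1] * 20)
w, b = train_linear_svm(X, y)
pred = np.sign(X @ w + b)                        # class = side of the decision boundary
```

On this well-separated toy set, the learned boundary classifies every sample correctly; with noisy, overlapping classes the same procedure yields a boundary, but without any measure of how confident each classification is, which is the limitation noted above.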

4.2.2. Artificial Neural Network (ANN)

An artificial neural network (ANN) is a machine learning model that mimics the function of a biological neural network [128]. The basic architecture consists of artificial neurons that process several inputs weighted according to their importance and produce a corresponding output [124].
ANNs have been used successfully in many studies for the identification and classification of various plant stresses. These include detecting powdery mildew and soft rot in zucchini [129], classifying biotic stresses in pomegranate [106], detecting orange spotting disease in oil palm [130], and identifying crown rot in wheat [131]. A major advantage of ANNs is that they can be used without specialized domain knowledge of the data and its interpretation; however, disadvantages include being prone to overfitting and requiring greater computational resources [132]. Several types of ANNs exist, some of which are outlined in Figure 13 [133].
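The weighted-sum-plus-activation computation performed by each neuron can be illustrated with a tiny fixed-weight network (the weights below are hand-set so that the two hidden neurons act as OR and AND detectors, making the network compute XOR; a real ANN would learn such weights from training data):

```python
import numpy as np

def step(z):
    """Threshold activation: a neuron fires (1) if its weighted sum exceeds 0."""
    return (z > 0).astype(int)

def forward(x, W1, b1, W2, b2):
    """One hidden layer of artificial neurons: weighted sums plus activations."""
    h = step(W1 @ x + b1)       # hidden-layer outputs
    return step(W2 @ h + b2)    # output neuron

# Hand-set weights: hidden neuron 1 fires for "at least one input on" (OR),
# hidden neuron 2 fires for "both inputs on" (AND); the output neuron
# subtracts the two, yielding XOR, which no single neuron could compute.
W1 = np.array([[1.0, 1.0], [1.0, 1.0]]); b1 = np.array([-0.5, -1.5])
W2 = np.array([[1.0, -1.0]]);            b2 = np.array([-0.5])
outputs = [int(forward(np.array(x), W1, b1, W2, b2)[0])
           for x in [(0, 0), (0, 1), (1, 0), (1, 1)]]   # -> [0, 1, 1, 0]
```

The example shows why layered neurons matter: the hidden layer builds intermediate features (OR, AND) that the output neuron combines into a decision no single linear unit could make.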

4.2.3. Deep Learning

Deep learning is a subcategory of machine learning that utilizes ANNs and consists of more advanced models with multiple layers (“deep” indicates the depth of layers). A common model used in agriculture is the convolutional neural network (CNN), which performs convolutions on data for image classification [134]. CNNs and their variations have been frequently used in plant stress studies that utilize machine learning, such as detecting tulip breaking virus in tulips [135], identifying potato virus Y [136], gauging the severity of apple black rot [119], classifying biotic stresses on cucumber leaves [114], and rating the severity of biotic stresses on coffee leaves [126]. Pretrained CNN models such as GoogLeNet [137], AlexNet [114], ResNet [138], and VGG [139] have also been used. For instances where an extensive array of training data is required, many studies utilize databases such as PlantVillage [140] and the Wheat Disease Database [141], both of which have been used in conjunction with deep learning models.
One advantage of deep learning techniques is that they work well with raw data [142], cutting down on time spent in data preprocessing (color space conversion, dimensionality reduction, segmentation, and feature extraction). In addition, feature extraction is sometimes performed within the deep learning model itself, without the need for a separate processing step [143]. However, a major disadvantage is the need for large datasets (often numbering in the thousands [139,144]) to produce accurate results [111].
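The convolution operation at the heart of a CNN can be sketched directly (an illustrative "valid"-mode 2D convolution in NumPy; deep learning frameworks implement the same operation far more efficiently and stack many such layers):

```python
import numpy as np

def conv2d(image, kernel):
    """'Valid' 2-D convolution (cross-correlation, as used in most CNN layers)."""
    kh, kw = kernel.shape
    H, W = image.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            # Each output value is the weighted sum of one image patch.
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# A vertical-edge kernel responds strongly where brightness changes left to right,
# e.g., at the boundary between a lesion and healthy leaf tissue.
image = np.zeros((6, 6)); image[:, 3:] = 1.0        # dark left half, bright right half
edge_kernel = np.array([[-1.0, 1.0], [-1.0, 1.0]])
feature_map = conv2d(image, edge_kernel)            # peaks along the vertical edge
```

A CNN learns many such kernels from data and stacks their feature maps through successive layers, which is how it builds disease-specific features directly from raw images.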
Table 3. Machine Learning Algorithms Used for Plant Stress Detection.

| Purpose | Data Type | Plant | Stress | Algorithm | Accuracy | References |
|---|---|---|---|---|---|---|
| Identification | Fluorescence imaging | Zucchini | Soft rot | ANN | 100% | [129] |
| | | | | SVM | 90% | |
| | | | | Logistic regression analysis | 60% | |
| | | | Powdery mildew | ANN | 71.2% | |
| | | | | SVM | 48.1% | |
| | | | | Logistic regression analysis | 73.1% | |
| Identification | Hyperspectral | Oil palm | Orange spotting disease | Multilayer perceptron neural network | – | [130] |
| Identification | Hyperspectral | Wheat | Crown rot | ANN | 74.14% | [131] |
| | | | | Logistic regression | 53.45% | |
| | | | | K-nearest neighbors | 58.62% | |
| | | | | Decision trees | 56.90% | |
| | | | | Extreme random forest | 58.62% | |
| | | | | SVM | 50% | |
| Identification | RGB images | Tulip | Tulip breaking virus | Faster R-CNN | 86% * | [135] |
| Identification | Hyperspectral | Potato | Potato virus Y | Fully convolutional neural network | 92% * | [136] |
| Classification | RGB images from smartphone | Wheat | Powdery mildew, stripe rust | RVM | 88.89% | [63] |
| | | | | SVM | 77.78% | |
| Classification | RGB images from database | Pomegranate | Fruit spot, bacterial blight, fruit rot, leaf spot | Multilayer perceptron | 90% | [106] |
| Classification | RGB images | Cucumber | Anthracnose, downy mildew, powdery mildew, target leaf spots | Deep CNN | 92.2% | [114] |
| | | | | SVM | 81.9% | |
| | | | | AlexNet | 92.6% | |
| | | | | Random forest | 84.8% | |
| Classification | Hyperspectral | Sugar beet | Cercospora leaf spot, sugar beet rust, powdery mildew | SVM | 86.42% | [29] |
| Classification | RGB images from database | Wheat | Powdery mildew, smut, black chaff, stripe rust, leaf blotch, leaf rust | VGG-CNN-S | 73% | [141] |
| | | | | VGG-FCN-S | 95.12% | |
| | | | | VGG-CNN-VD16 | 93.27% | |
| | | | | VGG-FCN-VD16 | 97.95% | |
| Quantification | Hyperspectral | Barley | Drought stress | Ordinal SVM | 67.9% | [33] |
| Quantification | RGB images from digital camera | Soybean | Iron deficiency chlorosis | Hierarchical SVM-SVM | 99.2% | [11] |
| | | | | Hierarchical LDA-SVM | 98.3% | |
| | | | | Decision tree | 99.7% | |
| | | | | Quadratic discriminant analysis | 98.5% | |
| | | | | Naïve Bayes | 98.4% | |
| | | | | K-nearest neighbors | 99.5% | |
| | | | | Random forest | 99.1% | |
| | | | | Gaussian mixture model | 99.4% | |
| | | | | Linear discriminant analysis (LDA) | 98.5% | |
| | | | | SVM | 97.3% | |
| Quantification | RGB images from database | Apple | Black rot | VGG16 | 90.4% | [119] |
| | | | | ResNet50 | 80% | |
| Quantification | RGB images from smartphone | Coffee | Leaf miner, rust, brown leaf spot, cercospora leaf spot | AlexNet | 84.13% | [126] |
| | | | | GoogLeNet | 82.94% | |
| | | | | VGG16 | 86.51% | |
| | | | | ResNet50 | 84.13% | |
| | | | | MobileNetV2 | 84.52% | |

* Indicates a recall value, not an accuracy value.

5. Concluding Remarks

A variety of optical sensing methods and machine learning techniques have been used to recognize both biotic and abiotic stresses, especially plant diseases. One observation is that machine learning is commonly used to process imaging data (especially RGB images), but spectroscopic methods more frequently utilize traditional statistical methods. In the future, machine learning methods could be further incorporated into spectroscopic data analysis pathways.
Currently, many of the studies mentioned produce detection results that are specific to just a few plant species. Leaf reflectance properties can differ greatly between species, so it is difficult to produce results that generalize to several plants under different circumstances. The development of more generalized (rather than species-specific) methods is a likely future direction in plant stress detection; however, more research is needed to find features and parameters that can lead to such results. Methods such as smartphone imaging, thermography, and fluorescence imaging have the potential to be scaled up to larger systems that analyze plant canopies in open fields or controlled environments.
Imaging devices (especially multispectral/RGB sensors) have improved in quality and become more compact over recent years. The optical resolutions of recent smartphone cameras are comparable to those of most standalone digital cameras, a development that has eliminated the bulk of the consumer digital camera market and left only the high-end segment. Sensitivity has also improved dramatically; the white LED flash is rarely necessary with recent smartphones. Computing power and memory have also improved significantly, making on-board image processing a reality. Cloud storage and computing for remote file management and execution further complement the smartphone’s computing power and memory capacity, allowing more advanced data processing operations to be performed. Optical zoom (which magnifies images mechanically using optical lenses) is possible with recent smartphones, although limited to about 2×–4× at the time of writing. Furthermore, smartphones have the data processing power needed to run machine learning algorithms and thus can provide rapid, on-site assessment of plant stresses.
The discrimination of specific stresses (especially stresses from specific nutrients) remains a challenge. Discrimination may become more feasible with improvements in the sensitivity of optical devices; however, this increased sensitivity may result in data being more prone to noise from the surrounding environment. Environmental noise could be overcome by the use of image segmentation and machine learning models to help distinguish between noise and the targeted characteristic.
Many improvements are being made with imaging technology and data processing techniques that will enable the development of robust, portable devices for plant stress detection. Although research is still needed in many areas such as the fusion of data from multiple sensors and discrimination between specific biotic and abiotic stresses, current developments have great potential to be deployed as useful tools for the agriculture industry.

Author Contributions

Conceptualization, A.V.Z. and J.-Y.Y.; literature survey and data collection, A.V.Z.; literature analysis, A.V.Z.; writing—original draft preparation, A.V.Z.; writing—review and editing, J.-Y.Y.; supervision and project administration, J.-Y.Y. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Acknowledgments

We would like to thank the reviewers for their detailed comments that helped us improve the manuscript.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. The Future of Food and Agriculture: Trends and Challenges; Food and Agriculture Organization of the United Nations: Rome, Italy, 2017; ISBN 978-92-5-109551-5.
  2. Savary, S.; Bregaglio, S.; Willocquet, L.; Gustafson, D.; Mason D’Croz, D.; Sparks, A.; Castilla, N.; Djurle, A.; Allinne, C.; Sharma, M.; et al. Crop health and its global impacts on the components of food security. Food Secur. 2017, 9, 311–327.
  3. Savary, S.; Willocquet, L.; Pethybridge, S.J.; Esker, P.; McRoberts, N.; Nelson, A. The global burden of pathogens and pests on major food crops. Nat. Ecol. Evol. 2019, 3, 430–439.
  4. McDonald, B.A.; Stukenbrock, E.H. Rapid emergence of pathogens in agro-ecosystems: Global threats to agricultural sustainability and food security. Philos. Trans. R. Soc. B Biol. Sci. 2016, 371, 20160026.
  5. Shaw, M.W.; Osborne, T.M. Geographic distribution of plant pathogens in response to climate change: Pathogen distributions and climate. Plant Pathol. 2011, 60, 31–43.
  6. Yeo, A. Predicting the interaction between the effects of salinity and climate change on crop plants. Sci. Hortic. 1998, 78, 159–174.
  7. Sui, X.; Zheng, Y.; Li, R.; Padmanabhan, C.; Tian, T.; Groth-Helms, D.; Keinath, A.P.; Fei, Z.; Wu, Z.; Ling, K.-S. Molecular and Biological Characterization of Tomato mottle mosaic virus and Development of RT-PCR Detection. Plant Dis. 2017, 101, 704–711.
  8. Cimmino, A.; Iannaccone, M.; Petriccione, M.; Masi, M.; Evidente, M.; Capparelli, R.; Scortichini, M.; Evidente, A. An ELISA method to identify the phytotoxic Pseudomonas syringae pv. actinidiae exopolysaccharides: A tool for rapid immunochemical detection of kiwifruit bacterial canker. Phytochem. Lett. 2017, 19, 136–140.
  9. Andolfi, A.; Cimmino, A.; Evidente, A.; Iannaccone, M.; Capparelli, R.; Mugnai, L.; Surico, G. A New Flow Cytometry Technique to Identify Phaeomoniella chlamydospora Exopolysaccharides and Study Mechanisms of Esca Grapevine Foliar Symptoms. Plant Dis. 2009, 93, 680–684.
  10. McKenzie, D.B.; Hossner, L.R.; Newton, R.J. Sorghum cultivar evaluation for iron chlorosis resistance by visual scores. J. Plant Nutr. 1984, 7, 677–685.
  11. Naik, H.S.; Zhang, J.; Lofquist, A.; Assefa, T.; Sarkar, S.; Ackerman, D.; Singh, A.; Singh, A.K.; Ganapathysubramanian, B. A real-time phenotyping framework using machine learning for plant stress severity rating in soybean. Plant Methods 2017, 13, 23.
  12. Zhu, J.; He, W.; Yao, J.; Yu, Q.; Xu, C.; Huang, H.; Mhae, B.; Jandug, C. Spectral Reflectance Characteristics and Chlorophyll Content Estimation Model of Quercus aquifolioides Leaves at Different Altitudes in Sejila Mountain. Appl. Sci. 2020, 10, 3636.
  13. Lichtenthaler, H.K.; Gitelson, A.; Lang, M. Non-Destructive Determination of Chlorophyll Content of Leaves of a Green and an Aurea Mutant of Tobacco by Reflectance Measurements. J. Plant Physiol. 1996, 148, 483–493.
  14. Gitelson, A.A.; Zur, Y.; Chivkunova, O.B.; Merzlyak, M.N. Assessing Carotenoid Content in Plant Leaves with Reflectance Spectroscopy. Photochem. Photobiol. 2002, 75, 272–281.
  15. Vilfan, N.; Van der Tol, C.; Yang, P.; Wyber, R.; Malenovský, Z.; Robinson, S.A.; Verhoef, W. Extending Fluspect to simulate xanthophyll driven leaf reflectance dynamics. Remote Sens. Environ. 2018, 211, 345–356.
  16. Bone, R.A.; Lee, D.W.; Norman, J.M. Epidermal cells functioning as lenses in leaves of tropical rain-forest shade plants. Appl. Opt. 1985, 24, 1408.
  17. Grant, L.; Daughtry, C.S.T.; Vanderbilt, V.C. Polarized and specular reflectance variation with leaf surface features. Physiol. Plant. 1993, 88, 1–9.
  18. Ehleringer, J.; Bjorkman, O.; Mooney, H.A. Leaf Pubescence: Effects on Absorptance and Photosynthesis in a Desert Shrub. Science 1976, 192, 376–377.
  19. Bornman, J.F.; Vogelmann, T.C. Effect of UV-B Radiation on Leaf Optical Properties Measured with Fibre Optics. J. Exp. Bot. 1991, 42, 547–554.
  20. Peñuelas, J.; Filella, I.; Biel, C.; Serrano, L.; Savé, R. The reflectance at the 950–970 nm region as an indicator of plant water status. Int. J. Remote Sens. 1993, 14, 1887–1905.
  21. Liew, O.; Chong, P.; Li, B.; Asundi, A. Signature Optical Cues: Emerging Technologies for Monitoring Plant Health. Sensors 2008, 8, 3205–3239.
  22. Sawinski, K.; Mersmann, S.; Robatzek, S.; Böhmer, M. Guarding the Green: Pathways to Stomatal Immunity. Mol. Plant-Microbe Interact. 2013, 26, 626–632.
  23. Fourty, T.; Baret, F.; Jacquemoud, S.; Schmuck, G.; Verdebout, J. Leaf optical properties with explicit description of its biochemical composition: Direct and inverse problems. Remote Sens. Environ. 1996, 56, 104–117.
  24. de Lima, R.B.; dos Santos, T.B.; Vieira, L.G.E.; de Lourdes Lúcio Ferrarese, M.; Ferrarese-Filho, O.; Donatti, L.; Boeger, M.R.T.; de Oliveira Petkowicz, C.L. Salt stress alters the cell wall polysaccharides and anatomy of coffee (Coffea arabica L.) leaf cells. Carbohydr. Polym. 2014, 112, 686–694.
  25. Allen, W.A.; Richardson, A.J. Interaction of Light with a Plant Canopy. J. Opt. Soc. Am. 1968, 58, 1023.
  26. Carter, G.A. Responses of Leaf Spectral Reflectance to Plant Stress. Am. J. Bot. 1993, 80, 239–243.
  27. Rosique, F.; Navarro, P.J.; Fernández, C.; Padilla, A. A Systematic Review of Perception System and Simulators for Autonomous Vehicles Research. Sensors 2019, 19, 648.
  28. Pandey, P.; Ge, Y.; Stoerger, V.; Schnable, J.C. High Throughput In vivo Analysis of Plant Leaf Chemical Properties Using Hyperspectral Imaging. Front. Plant Sci. 2017, 8, 1348.
  29. Rumpf, T.; Mahlein, A.-K.; Steiner, U.; Oerke, E.-C.; Dehne, H.-W.; Plümer, L. Early detection and classification of plant diseases with Support Vector Machines based on hyperspectral reflectance. Comput. Electron. Agric. 2010, 74, 91–99.
  30. Yang, W.; Yang, C.; Hao, Z.; Xie, C.; Li, M. Diagnosis of Plant Cold Damage Based on Hyperspectral Imaging and Convolutional Neural Network. IEEE Access 2019, 7, 118239–118248.
  31. Zovko, M.; Žibrat, U.; Knapič, M.; Kovačić, M.B.; Romić, D. Hyperspectral remote sensing of grapevine drought stress. Precis. Agric. 2019, 20, 335–347.
  32. Asaari, M.S.M.; Mertens, S.; Dhondt, S.; Inzé, D.; Wuyts, N.; Scheunders, P. Analysis of hyperspectral images for detection of drought stress and recovery in maize plants in a high-throughput phenotyping platform. Comput. Electron. Agric. 2019, 162, 749–758.
  33. Behmann, J.; Steinrücken, J.; Plümer, L. Detection of early plant stress responses in hyperspectral images. ISPRS J. Photogramm. Remote Sens. 2014, 93, 98–111.
  34. Zhang, J.; Yuan, L.; Pu, R.; Loraamm, R.W.; Yang, G.; Wang, J. Comparison between wavelet spectral features and conventional spectral features in detecting yellow rust for winter wheat. Comput. Electron. Agric. 2014, 100, 79–87.
  35. Cao, X.; Luo, Y.; Zhou, Y.; Duan, X.; Cheng, D. Detection of powdery mildew in two winter wheat cultivars using canopy hyperspectral reflectance. Crop Prot. 2013, 45, 124–131.
  36. Feng, X.; Zhan, Y.; Wang, Q.; Yang, X.; Yu, C.; Wang, H.; Tang, Z.; Jiang, D.; Peng, C.; He, Y. Hyperspectral imaging combined with machine learning as a tool to obtain high-throughput plant salt-stress phenotyping. Plant J. 2020, 101, 1448–1461.
  37. Ochoa, D.; Cevallos, J.; Vargas, G.; Criollo, R.; Romero, D.; Castro, R.; Bayona, O. Hyperspectral Imaging System for Disease Scanning on Banana Plants. In Proceedings of the Sensing for Agriculture and Food Quality and Safety VIII, Baltimore, MD, USA, 20–21 April 2016.
  38. Lowe, A.; Harrison, N.; French, A.P. Hyperspectral image analysis techniques for the detection and classification of the early onset of plant disease and stress. Plant Methods 2017, 13, 80.
  39. Brugger, A.; Behmann, J.; Paulus, S.; Luigs, H.-G.; Kuska, M.T.; Schramowski, P.; Kersting, K.; Steiner, U.; Mahlein, A.-K. Extending Hyperspectral Imaging for Plant Phenotyping to the UV-Range. Remote Sens. 2019, 11, 1401.
  40. Ryu, J.-H.; Jeong, H.; Cho, J. Performances of Vegetation Indices on Paddy Rice at Elevated Air Temperature, Heat Stress, and Herbicide Damage. Remote Sens. 2020, 12, 2654.
  41. Liu, Q.; Zhang, F.; Chen, J.; Li, Y. Water stress altered photosynthesis-vegetation index relationships for winter wheat. Agron. J. 2020, 112, 2944–2955.
  42. Ihuoma, S.O.; Madramootoo, C.A. Sensitivity of spectral vegetation indices for monitoring water stress in tomato plants. Comput. Electron. Agric. 2019, 163, 104860.
  43. Meng, R.; Lv, Z.; Yan, J.; Chen, G.; Zhao, F.; Zeng, L.; Xu, B. Development of Spectral Disease Indices for Southern Corn Rust Detection and Severity Classification. Remote Sens. 2020, 12, 3233.
  44. Huang, W.; Guan, Q.; Luo, J.; Zhang, J.; Zhao, J.; Liang, D.; Huang, L.; Zhang, D. New Optimized Spectral Indices for Identifying and Monitoring Winter Wheat Diseases. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2014, 7, 2516–2524.
  45. Mahlein, A.-K.; Rumpf, T.; Welke, P.; Dehne, H.-W.; Plümer, L.; Steiner, U.; Oerke, E.-C. Development of spectral indices for detecting and identifying plant diseases. Remote Sens. Environ. 2013, 128, 21–30.
  46. Ashourloo, D.; Mobasheri, M.; Huete, A. Developing Two Spectral Disease Indices for Detection of Wheat Leaf Rust (Pucciniatriticina). Remote Sens. 2014, 6, 4723–4740.
  47. Heim, R.H.J.; Wright, I.J.; Allen, A.P.; Geedicke, I.; Oldeland, J. Developing a spectral disease index for myrtle rust (Austropuccinia psidii). Plant Pathol. 2019, 68, 738–745.
  48. Rouse, J.W.; Haas, R.H.; Schell, J.A.; Deering, D.W. Monitoring Vegetation Systems in the Great Plains with ERTS. NASA Goddard Space Flight Cent. 3d ERTS-1 Symp. 1974, 1, 9.
  49. Penuelas, J.; Pinol, J.; Ogaya, R.; Filella, I. Estimation of plant water concentration by the reflectance Water Index WI (R900/R970). Int. J. Remote Sens. 1997, 18, 2869–2875.
  50. Gamon, J.A.; Peñuelas, J.; Field, C.B. A narrow-waveband spectral index that tracks diurnal changes in photosynthetic efficiency. Remote Sens. Environ. 1992, 41, 35–44.
  51. Balasundram, S.K.; Golhani, K.; Shamshiri, R.R.; Vadamalai, G. Precision Agriculture Technologies for Management of Plant Diseases. In Plant Disease Management Strategies for Sustainable Agriculture through Traditional and Modern Approaches; Ul Haq, I., Ijaz, S., Eds.; Sustainability in Plant and Crop Protection; Springer: Cham, Switzerland, 2020; Volume 13, pp. 259–278. ISBN 978-3-030-35954-6.
  52. Behmann, J.; Acebron, K.; Emin, D.; Bennertz, S.; Matsubara, S.; Thomas, S.; Bohnenkamp, D.; Kuska, M.; Jussila, J.; Salo, H.; et al. Specim IQ: Evaluation of a New, Miniaturized Handheld Hyperspectral Camera and Its Application for Plant Phenotyping and Disease Detection. Sensors 2018, 18, 441.
  53. Chen, T.; Zhang, J.; Chen, Y.; Wan, S.; Zhang, L. Detection of peanut leaf spots disease using canopy hyperspectral reflectance. Comput. Electron. Agric. 2019, 156, 677–683.
  54. Veys, C.; Chatziavgerinos, F.; AlSuwaidi, A.; Hibbert, J.; Hansen, M.; Bernotas, G.; Smith, M.; Yin, H.; Rolfe, S.; Grieve, B. Multispectral imaging for presymptomatic analysis of light leaf spot in oilseed rape. Plant Methods 2019, 15, 4.
  55. Fahrentrapp, J. Detection of Gray Mold Leaf Infections Prior to Visual Symptom Appearance Using a Five-Band Multispectral Sensor. Front. Plant Sci. 2019, 10, 628.
  56. Cardim Ferreira Lima, M.; Krus, A.; Valero, C.; Barrientos, A.; del Cerro, J.; Roldán-Gómez, J.J. Monitoring Plant Status and Fertilization Strategy through Multispectral Images. Sensors 2020, 20, 435.
  57. Kitić, G.; Tagarakis, A.; Cselyuszka, N.; Panić, M.; Birgermajer, S.; Sakulski, D.; Matović, J. A new low-cost portable multispectral optical device for precise plant status assessment. Comput. Electron. Agric. 2019, 162, 300–308.
  58. Veys, C.; Hibbert, J.; Davis, P.; Grieve, B. An ultra-low-cost active multispectral crop diagnostics device. In Proceedings of the 2017 IEEE Sensors, Glasgow, UK, 29 October–1 November 2017; pp. 1–3.
  59. Habibullah, M.; Mohebian, M.R.; Soolanayakanahally, R.; Bahar, A.N.; Vail, S.; Wahid, K.A.; Dinh, A. Low-Cost Multispectral Sensor Array for Determining Leaf Nitrogen Status. Nitrogen 2020, 1, 67–80.
  60. Chung, S.; Breshears, L.E.; Yoon, J.-Y. Smartphone near infrared monitoring of plant stress. Comput. Electron. Agric. 2018, 154, 93–98.
  61. Watchareeruetai, U.; Noinongyao, P.; Wattanapaiboonsuk, C.; Khantiviriya, P.; Duangsrisai, S. Identification of Plant Nutrient Deficiencies Using Convolutional Neural Networks; IEEE: Krabi, Thailand, 2018; pp. 1–4.
  62. Islam, M.; Anh, D.; Wahid, K.; Bhowmik, P. Detection of potato diseases using image segmentation and multiclass support vector machine. In Proceedings of the 2017 IEEE 30th Canadian Conference on Electrical and Computer Engineering (CCECE), Windsor, ON, Canada, 30 April–3 May 2017; pp. 1–4.
  63. Xie, X.; Zhang, X.; He, B.; Liang, D.; Zhang, D.; Huang, L. A System for Diagnosis of Wheat Leaf Diseases Based on Android Smartphone. In Proceedings of the International Symposium on Optical Measurement Technology and Instrumentation, Beijing, China, 9–11 May 2016.
  64. Mattupalli, C.; Moffet, C.; Shah, K.; Young, C. Supervised Classification of RGB Aerial Imagery to Evaluate the Impact of a Root Rot Disease. Remote Sens. 2018, 10, 917.
  65. Tao, M.; Ma, X.; Huang, X.; Liu, C.; Deng, R.; Liang, K.; Qi, L. Smartphone-based detection of leaf color levels in rice plants. Comput. Electron. Agric. 2020, 173, 105431.
  66. Mastrodimos, N.; Lentzou, D.; Templalexis, C.; Tsitsigiannis, D.I.; Xanthopoulos, G. Development of thermography methodology for early diagnosis of fungal infection in table grapes: The case of Aspergillus carbonarius. Comput. Electron. Agric. 2019, 165, 104972.
  67. Jones, H.G. Use of infrared thermometry for estimation of stomatal conductance as a possible aid to irrigation scheduling. Agric. For. Meteorol. 1999, 95, 139–149.
  68. Casari, R.; Paiva, D.; Silva, V.; Ferreira, T.; Souza, J.M.; Oliveira, N.; Kobayashi, A.; Molinari, H.; Santos, T.; Gomide, R.; et al. Using Thermography to Confirm Genotypic Variation for Drought Response in Maize. Int. J. Mol. Sci. 2019, 20, 2273.
  69. Oerke, E.-C.; Fröhling, P.; Steiner, U. Thermographic assessment of scab disease on apple leaves. Precis. Agric. 2011, 12, 699–715.
  70. Khorsandi, A.; Hemmat, A.; Mireei, S.A.; Amirfattahi, R.; Ehsanzadeh, P. Plant temperature-based indices using infrared thermography for detecting water status in sesame under greenhouse conditions. Agric. Water Manag. 2018, 204, 222–233.
  71. Petrie, P.R.; Wang, Y.; Liu, S.; Lam, S.; Whitty, M.A.; Skewes, M.A. The accuracy and utility of a low cost thermal camera and smartphone-based system to assess grapevine water status. Biosyst. Eng. 2019, 179, 126–139.
  72. Lang, M.; Siffel, P.; Braunová, Z.; Lichtenthaler, H.K. Investigations of the Blue-green Fluorescence Emission of Plant Leaves. Bot. Acta 1992, 105, 435–440.
  73. Krause, G.H.; Weis, E. Chlorophyll fluorescence as a tool in plant physiology: II. Interpretation of fluorescence signals. Photosynth. Res. 1984, 5, 139–157.
  74. Swarbrick, P.J.; Schulze-Lefert, P.; Scholes, J.D. Metabolic consequences of susceptibility and resistance (race-specific and broad-spectrum) in barley leaves challenged with powdery mildew. Plant Cell Environ. 2006, 29, 1061–1076.
  75. Lawson, T.; Vialet-Chabrand, S. Chlorophyll Fluorescence Imaging. In Photosynthesis; Covshoff, S., Ed.; Methods in Molecular Biology; Springer: New York, NY, USA, 2018; Volume 1770, pp. 121–140. ISBN 978-1-4939-7785-7.
  76. Brooks, M.D.; Niyogi, K.K. Use of a Pulse-Amplitude Modulated Chlorophyll Fluorometer to Study the Efficiency of Photosynthesis in Arabidopsis Plants. In Chloroplast Research in Arabidopsis; Jarvis, R.P., Ed.; Methods in Molecular Biology; Humana Press: Totowa, NJ, USA, 2011; Volume 775, pp. 299–310. ISBN 978-1-61779-236-6.
  77. Lei, R.; Du, Z.; Qiu, Y.; Zhu, S. The detection of hydrogen peroxide involved in plant virus infection by fluorescence spectroscopy: Detection of hydrogen peroxide in plant by fluorescence spectroscopy. Luminescence 2016, 31, 1158–1165.
  75. Lawson, T.; Vialet-Chabrand, S. Chlorophyll Fluorescence Imaging. In Photosynthesis; Covshoff, S., Ed.; Methods in Molecular Biology; Springer: New York, NY, USA, 2018; Volume 1770, pp. 121–140. ISBN 978-1-4939-7785-7. [Google Scholar]
  76. Brooks, M.D.; Niyogi, K.K. Use of a Pulse-Amplitude Modulated Chlorophyll Fluorometer to Study the Efficiency of Photosynthesis in Arabidopsis Plants. In Chloroplast Research in Arabidopsis; Jarvis, R.P., Ed.; Methods in Molecular Biology; Humana Press: Totowa, NJ, USA, 2011; Volume 775, pp. 299–310. ISBN 978-1-61779-236-6. [Google Scholar]
  77. Lei, R.; Du, Z.; Qiu, Y.; Zhu, S. The detection of hydrogen peroxide involved in plant virus infection by fluorescence spectroscopy: Detection of hydrogen peroxide in plant by fluorescence spectroscopy. Luminescence 2016, 31, 1158–1165. [Google Scholar] [CrossRef]
  78. Lichtenthaler, H.K.; Rinderle, U. The Role of Chlorophyll Fluorescence in the Detection of Stress Conditions in Plants. CRC Crit. Rev. Anal. Chem. 1988, 19, S29–S85. [Google Scholar] [CrossRef]
  79. Murchie, E.H.; Lawson, T. Chlorophyll fluorescence analysis: A guide to good practice and understanding some new applications. J. Exp. Bot. 2013, 64, 3983–3998. [Google Scholar] [CrossRef] [Green Version]
  80. Gomes, M.T.G.; da Luz, A.C.; dos Santos, M.R.; do Carmo Pimentel Batitucci, M.; Silva, D.M.; Falqueto, A.R. Drought tolerance of passion fruit plants assessed by the OJIP chlorophyll a fluorescence transient. Sci. Hortic. 2012, 142, 49–56. [Google Scholar] [CrossRef]
  81. Kalaji, H.M.; Oukarroum, A.; Alexandrov, V.; Kouzmanova, M.; Brestic, M.; Zivcak, M.; Samborska, I.A.; Cetner, M.D.; Allakhverdiev, S.I.; Goltsev, V. Identification of nutrient deficiency in maize and tomato plants by in vivo chlorophyll a fluorescence measurements. Plant Physiol. Biochem. 2014, 81, 16–25. [Google Scholar] [CrossRef] [PubMed]
  82. Kalaji, H.M.; Bąba, W.; Gediga, K.; Goltsev, V.; Samborska, I.A.; Cetner, M.D.; Dimitrova, S.; Piszcz, U.; Bielecki, K.; Karmowska, K.; et al. Chlorophyll fluorescence as a tool for nutrient status identification in rapeseed plants. Photosynth. Res. 2018, 136, 329–343. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  83. Buschmann, C.; Lichtenthaler, H.K. Principles and characteristics of multi-colour fluorescence imaging of plants. J. Plant Physiol. 1998, 152, 297–314. [Google Scholar] [CrossRef]
  84. Bürling, K.; Hunsche, M.; Noga, G. Use of blue–green and chlorophyll fluorescence measurements for differentiation between nitrogen deficiency and pathogen infection in winter wheat. J. Plant Physiol. 2011, 168, 1641–1648. [Google Scholar] [CrossRef]
  85. Saleem, M.; Atta, B.M.; Ali, Z.; Bilal, M. Laser-induced fluorescence spectroscopy for early disease detection in grapefruit plants. Photochem. Photobiol. Sci. 2020, 19, 713–721. [Google Scholar] [CrossRef]
  86. Lins, E.C.; Belasque, J.; Marcassa, L.G. Detection of citrus canker in citrus plants using laser induced fluorescence spectroscopy. Precis. Agric. 2009, 10, 319–330. [Google Scholar] [CrossRef]
  87. Su, W.H.; Fennimore, S.A.; Slaughter, D.C. Fluorescence imaging for rapid monitoring of translocation behaviour of systemic markers in snap beans for automated crop/weed discrimination. Biosyst. Eng. 2019, 186, 156–167. [Google Scholar] [CrossRef]
  88. Li, H.; Wang, P.; Weber, J.; Gerhards, R. Early Identification of Herbicide Stress in Soybean (Glycine max (L.) Merr.) Using Chlorophyll Fluorescence Imaging Technology. Sensors 2017, 18, 21. [Google Scholar] [CrossRef] [Green Version]
  89. Dong, Z.; Men, Y.; Li, Z.; Zou, Q.; Ji, J. Chlorophyll fluorescence imaging as a tool for analyzing the effects of chilling injury on tomato seedlings. Sci. Hortic. 2019, 246, 490–497. [Google Scholar] [CrossRef]
  90. Konanz, S.; Kocsányi, L.; Buschmann, C. Advanced Multi-Color Fluorescence Imaging System for Detection of Biotic and Abiotic Stresses in Leaves. Agriculture 2014, 4, 79–95. [Google Scholar] [CrossRef] [Green Version]
  91. Chung, S.; Breshears, L.E.; Perea, S.; Morrison, C.M.; Betancourt, W.Q.; Reynolds, K.A.; Yoon, J.-Y. Smartphone-Based Paper Microfluidic Particulometry of Norovirus from Environmental Water Samples at the Single Copy Level. ACS Omega 2019, 4, 11180–11188. [Google Scholar] [CrossRef] [PubMed]
  92. Takayama, K.; Nishina, H. Chlorophyll fluorescence imaging of the chlorophyll fluorescence induction phenomenon for plant health monitoring. Environ. Control Biol. 2009, 47, 101–109. [Google Scholar] [CrossRef]
  93. Pérez-Bueno, M.L.; Pineda, M.; Barón, M. Phenotyping Plant Responses to Biotic Stress by Chlorophyll Fluorescence Imaging. Front. Plant Sci. 2019, 10, 1135. [Google Scholar] [CrossRef] [PubMed]
  94. Moshou, D.; Bravo, C.; Oberti, R.; West, J.S.; Ramon, H.; Vougioukas, S.; Bochtis, D. Intelligent multi-sensor system for the detection and treatment of fungal diseases in arable crops. Biosyst. Eng. 2011, 108, 311–321. [Google Scholar] [CrossRef]
  95. Berdugo, C.A.; Zito, R.; Paulus, S.; Mahlein, A.-K. Fusion of sensor data for the detection and differentiation of plant diseases in cucumber. Plant Pathol. 2014, 63, 1344–1356. [Google Scholar] [CrossRef]
  96. Adhikari, R.; Li, C.; Kalbaugh, K.; Nemali, K. A low-cost smartphone controlled sensor based on image analysis for estimating whole-plant tissue nitrogen (N) content in floriculture crops. Comput. Electron. Agric. 2020, 169, 105173. [Google Scholar] [CrossRef]
  97. Bebronne, R.; Carlier, A.; Meurs, R.; Leemans, V.; Vermeulen, P.; Dumont, B.; Mercatoris, B. In-field proximal sensing of septoria tritici blotch, stripe rust and brown rust in winter wheat by means of reflectance and textural features from multispectral imagery. Biosyst. Eng. 2020, 197, 257–269. [Google Scholar] [CrossRef]
  98. Brambilla, M. Application of a low-cost RGB sensor to detect basil (Ocimum basilicum L.) nutritional status at pilot scale level. Precis. Agric. 2020, 20. [Google Scholar] [CrossRef]
  99. Banerjee, K.; Krishnan, P.; Das, B. Thermal imaging and multivariate techniques for characterizing and screening wheat genotypes under water stress condition. Ecol. Indic. 2020, 119, 106829. [Google Scholar] [CrossRef]
  100. Cen, H.; Weng, H.; Yao, J.; He, M.; Lv, J.; Hua, S.; Li, H.; He, Y. Chlorophyll Fluorescence Imaging Uncovers Photosynthetic Fingerprint of Citrus Huanglongbing. Front. Plant Sci. 2017, 8, 1509. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  101. Raji, S.N.; Subhash, N.; Ravi, V.; Saravanan, R.; Mohanan, C.N.; Nita, S.; Kumar, T.M. Detection of mosaic virus disease in cassava plants by sunlight-induced fluorescence imaging: A pilot study for proximal sensing. Int. J. Remote Sens. 2015, 36, 2880–2897. [Google Scholar] [CrossRef]
  102. Singh, A.; Ganapathysubramanian, B.; Singh, A.K.; Sarkar, S. Machine Learning for High-Throughput Stress Phenotyping in Plants. Trends Plant Sci. 2016, 21, 110–124. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  103. Ramos-Giraldo, P.; Reberg-Horton, C.; Locke, A.M.; Mirsky, S.; Lobaton, E. Drought Stress Detection Using Low-Cost Computer Vision Systems and Machine Learning Techniques. IT Prof. 2020, 22, 27–29. [Google Scholar] [CrossRef]
  104. Liakos, K.; Busato, P.; Moshou, D.; Pearson, S.; Bochtis, D. Machine Learning in Agriculture: A Review. Sensors 2018, 18, 2674. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  105. Tsaftaris, S.A.; Minervini, M.; Scharr, H. Machine Learning for Plant Phenotyping Needs Image Processing. Trends Plant Sci. 2016, 21, 989–991. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  106. Dhakate, M.; Ingole, A.B. Diagnosis of pomegranate plant diseases using neural network. In Proceedings of the 2015 Fifth National Conference on Computer Vision, Pattern Recognition, Image Processing and Graphics (NCVPRIPG), Patna, India, 16–19 December 2015; pp. 1–4. [Google Scholar]
  107. Al Bashish, D.; Braik, M.; Bani-Ahmad, S. A framework for detection and classification of plant leaf and stem diseases. In Proceedings of the 2010 International Conference on Signal and Image Processing, Chennai, India, 15–17 December 2010; pp. 113–118. [Google Scholar]
  108. Shrivastava, S.; Singh, S.K.; Hooda, D.S. Color sensing and image processing-based automatic soybean plant foliar disease severity detection and estimation. Multimed. Tools Appl. 2015, 74, 11467–11484. [Google Scholar] [CrossRef]
  109. Kahu, S.Y.; Raut, R.B.; Bhurchandi, K.M. Review and evaluation of color spaces for image/video compression. Color Res. Appl. 2018, 44, 8–33. [Google Scholar] [CrossRef] [Green Version]
  110. Lever, J.; Krzywinski, M.; Altman, N. Principal component analysis. Nat. Methods 2017, 14, 641–642. [Google Scholar] [CrossRef] [Green Version]
  111. Lu, Y.; Yi, S.; Zeng, N.; Liu, Y.; Zhang, Y. Identification of rice diseases using deep convolutional neural networks. Neurocomputing 2017, 267, 378–384. [Google Scholar] [CrossRef]
  112. Wold, S.; Esbensen, K.; Geladi, P. Principal Component Analysis. Chemom. Intell. Lab. Syst. 1987, 2, 37–52. [Google Scholar] [CrossRef]
  113. Singh, V.; Misra, A.K. Detection of plant leaf diseases using image segmentation and soft computing techniques. Inf. Process. Agric. 2016, 4, 41–49. [Google Scholar] [CrossRef] [Green Version]
  114. Ma, J.; Du, K.; Zheng, F.; Zhang, L.; Gong, Z.; Sun, Z. A recognition method for cucumber diseases using leaf symptom images based on deep convolutional neural network. Comput. Electron. Agric. 2018, 154, 18–24. [Google Scholar] [CrossRef]
  115. Zhuang, S.; Wang, P.; Jiang, B.; Li, M.; Gong, Z. Early detection of water stress in maize based on digital images. Comput. Electron. Agric. 2017, 140, 461–468. [Google Scholar] [CrossRef]
  116. Yue, J.; Li, Z.; Liu, L.; Fu, Z. Content-based image retrieval using color and texture fused features. Math. Comput. Model. 2011, 54, 1121–1127. [Google Scholar] [CrossRef]
  117. Ojala, T.; Pietikainen, M.; Maenpaa, T. Multiresolution gray-scale and rotation invariant texture classification with local binary patterns. IEEE Trans. Pattern Anal. Mach. Intell. 2002, 24, 971–987. [Google Scholar] [CrossRef]
  118. Vatamanu, O.A.; Frandes, M.; Ionescu, M.; Apostol, S. Content-Based Image Retrieval using Local Binary Pattern, Intensity Histogram and Color Coherence Vector. In Proceedings of the 2013 E-Health and Bioengineering Conference (EHB), Iasi, Romania, 21–23 November 2013; pp. 1–6. [Google Scholar]
  119. Wang, G.; Sun, Y.; Wang, J. Automatic Image-Based Plant Disease Severity Estimation Using Deep Learning. Comput. Intell. Neurosci. 2017, 2017, 1–8. [Google Scholar] [CrossRef] [Green Version]
  120. Zhou, Z.-H. A brief introduction to weakly supervised learning. Natl. Sci. Rev. 2018, 5, 44–53. [Google Scholar] [CrossRef] [Green Version]
  121. Rodriguez, M.Z.; Comin, C.H.; Casanova, D.; Bruno, O.M.; Amancio, D.R.; da Costa, L.F.; Rodrigues, F.A. Clustering algorithms: A comparative approach. PLoS ONE 2019, 14, e0210236. [Google Scholar] [CrossRef]
  122. Bindushree, H.B.; Sivasankari, G.G. Application of Image Processing Techniques for Plant Leaf Disease Detection. Int. J. Eng. Res. Technol. (IJERT) 2015, 3, 19. [Google Scholar]
  123. Johannes, A.; Picon, A.; Alvarez-Gila, A.; Echazarra, J.; Rodriguez-Vaamonde, S.; Navajas, A.D.; Ortiz-Barredo, A. Automatic plant disease diagnosis using mobile capture devices, applied on a wheat use case. Comput. Electron. Agric. 2017, 138, 200–209. [Google Scholar] [CrossRef]
  124. Sladojevic, S.; Arsenovic, M.; Anderla, A.; Culibrk, D.; Stefanovic, D. Deep Neural Networks Based Recognition of Plant Diseases by Leaf Image Classification. Comput. Intell. Neurosci. 2016, 2016, 1–11. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  125. Ghosal, S.; Blystone, D.; Singh, A.K.; Ganapathysubramanian, B.; Singh, A.; Sarkar, S. An explainable deep machine vision framework for plant stress phenotyping. Proc. Natl. Acad. Sci. USA 2018, 115, 4613–4618. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  126. Esgario, J.G.M.; Krohling, R.A.; Ventura, J.A. Deep learning for classification and severity estimation of coffee leaf biotic stress. Comput. Electron. Agric. 2020, 169, 105162. [Google Scholar] [CrossRef] [Green Version]
  127. Cervantes, J.; Garcia-Lamont, F.; Rodríguez-Mazahua, L.; Lopez, A. A comprehensive survey on support vector machine classification: Applications, challenges and trends. Neurocomputing 2020, 408, 189–215. [Google Scholar] [CrossRef]
  128. Krenker, A.; Bester, J.; Kos, A. Introduction to the Artificial Neural Networks. In Artificial Neural Networks—Methodological Advances and Biomedical Applications; Suzuki, K., Ed.; InTech: Rijeka, Croatia, 2011; ISBN 978-953-307-243-2. [Google Scholar]
  129. Pineda, M.; Pérez-Bueno, M.L.; Paredes, V.; Barón, M. Use of multicolour fluorescence imaging for diagnosis of bacterial and fungal infection on zucchini by implementing machine learning. Funct. Plant Biol. 2017, 44, 563. [Google Scholar] [CrossRef] [Green Version]
  130. Golhani, K.; Balasundram, S.K.; Vadamalai, G.; Pradhan, B. Selection of a Spectral Index for Detection of Orange Spotting Disease in Oil Palm (Elaeis guineensis Jacq.) Using Red Edge and Neural Network Techniques. J. Indian Soc. Remote Sens. 2019, 47, 639–646. [Google Scholar] [CrossRef]
  131. Humpal, J.; McCarthy, C.; Percy, C.; Thomasson, J.A. Detection of crown rot in wheat utilising near-infrared spectroscopy: Towards remote and robotic sensing. In Proceedings of the Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping V, Online, 27 April–8 May 2020. [Google Scholar] [CrossRef]
  132. Tu, J.V. Advantages and disadvantages of using artificial neural networks versus logistic regression for predicting medical outcomes. J. Clin. Epidemiol. 1996, 49, 1225–1231. [Google Scholar] [CrossRef]
  133. Elsheikh, A.H.; Sharshir, S.W.; Abd Elaziz, M.; Kabeel, A.E.; Guilan, W.; Haiou, Z. Modeling of solar energy systems using artificial neural network: A comprehensive review. Sol. Energy 2019, 180, 622–639. [Google Scholar] [CrossRef]
  134. Jin, K.H.; McCann, M.T.; Froustey, E.; Unser, M. Deep Convolutional Neural Network for Inverse Problems in Imaging. IEEE Trans. Image Process. 2017, 26, 4509–4522. [Google Scholar] [CrossRef] [Green Version]
  135. Polder, G.; van de Westeringh, N.; Kool, J.; Khan, H.A.; Kootstra, G.; Nieuwenhuizen, A. Automatic Detection of Tulip Breaking Virus (TBV) Using a Deep Convolutional Neural Network. IFAC-PapersOnLine 2019, 52, 12–17. [Google Scholar] [CrossRef]
  136. Polder, G.; Blok, P.M.; de Villiers, H.A.C.; van der Wolf, J.M.; Kamp, J. Potato Virus Y Detection in Seed Potatoes Using Deep Learning on Hyperspectral Images. Front. Plant Sci. 2019, 10, 209. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  137. Mohanty, S.P.; Hughes, D.P.; Salathé, M. Using Deep Learning for Image-Based Plant Disease Detection. Front. Plant Sci. 2016, 7, 1419. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  138. Zhang, K.; Wu, Q.; Liu, A.; Meng, X. Can Deep Learning Identify Tomato Leaf Disease? Adv. Multimed. 2018, 2018, 1–10. [Google Scholar] [CrossRef] [Green Version]
  139. Ferentinos, K.P. Deep learning models for plant disease detection and diagnosis. Comput. Electron. Agric. 2018, 145, 311–318. [Google Scholar] [CrossRef]
  140. Saleem, M.H.; Potgieter, J.; Mahmood Arif, K. Plant Disease Detection and Classification by Deep Learning. Plants 2019, 8, 468. [Google Scholar] [CrossRef] [Green Version]
  141. Lu, J.; Hu, J.; Zhao, G.; Mei, F.; Zhang, C. An in-field automatic wheat disease diagnosis system. Comput. Electron. Agric. 2017, 142, 369–379. [Google Scholar] [CrossRef] [Green Version]
  142. Brahimi, M.; Boukhalfa, K.; Moussaoui, A. Deep Learning for Tomato Diseases: Classification and Symptoms Visualization. Appl. Artif. Intell. 2017, 31, 299–315. [Google Scholar] [CrossRef]
  143. LeCun, Y.; Bengio, Y.; Hinton, G. Deep learning. Nature 2015, 521, 436–444. [Google Scholar] [CrossRef]
  144. Dyrmann, M.; Karstoft, H.; Midtiby, H.S. Plant species classification using deep convolutional neural network. Biosyst. Eng. 2016, 151, 72–80. [Google Scholar] [CrossRef]
Figure 1. Reflectance spectra of Quercus aquifolioides leaves at different altitudes. Vegetation reflectance curves typically display this pattern: low reflectance in the visible region (influenced by leaf pigments), a “red edge” connecting the visible and near-infrared (NIR) regions, and high reflectance in the NIR region (influenced by cell structure). Beyond 1300 nm, reflectance is mostly influenced by leaf water content. Reprinted from [12]. ©2020 Zhu et al.
Figure 2. Drawing of a cross-section of a typical leaf with labeled cell types and layers. Basic light interactions with leaf layers are annotated. Reprinted from [21]. ©2008 Liew et al.
Figure 3. Ranges of the electromagnetic spectrum utilized by various sensor types. Useful wavelengths for plant stress detection tend to fall in the ultraviolet (UV), visible, and NIR ranges. Reprinted from [27]. ©2019 Rosique et al.
Figure 4. SpecimIQ miniature hyperspectral camera (Specim Ltd., Oulu, Finland). Reprinted from [52]. ©2018 Behmann et al.
Figure 5. A smartphone being used to evaluate the leaf color of rice plants, which is applicable to detecting nitrogen deficiency. Reprinted with permission from [65]. ©2020 Elsevier.
Figure 6. Thermographic image data used to evaluate drought stress in maize plants. In (A,C), the top row consists of well-watered plants and the bottom row of drought-stressed plants; in (B,D), well-watered plants are on the left and drought-stressed plants on the right. Reprinted with permission from [68]. ©2019 Casari et al.
Figure 7. Fluorescence emission spectra of leaves excited at 488 nm. (a) chlorotic part of tobacco leaf infected by cucumber mosaic virus; (b) green part of tobacco leaf infected with cucumber mosaic virus; (c) a healthy tobacco leaf. Reprinted with permission from [77]. ©2016 John Wiley and Sons.
Figure 8. Fluorescence ratios of barley leaves with nitrogen (N) deficiencies of varying severity. Reprinted from [90]. ©2014 Konanz et al.
Figure 9. A simplified machine learning pathway. Reprinted from [104]. ©2018 Liakos et al.
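The pathway in Figure 9 — acquire labeled data, extract features, train a model, then predict on new samples — can be illustrated with a toy classifier. This is a hedged sketch, not a method from any cited study: the nearest-centroid classifier stands in for the "model training" step, and the two features (mean greenness, mean NIR reflectance) and their values are purely hypothetical.

```python
# Toy version of the Figure 9 pathway: features -> training -> prediction.
# Nearest-centroid classification stands in for the model-training step.

def fit_centroids(features, labels):
    """Average the feature vectors of each class (training step)."""
    sums, counts = {}, {}
    for x, y in zip(features, labels):
        sums[y] = [a + b for a, b in zip(sums.get(y, [0.0] * len(x)), x)]
        counts[y] = counts.get(y, 0) + 1
    return {y: [v / counts[y] for v in s] for y, s in sums.items()}

def predict(centroids, x):
    """Assign the class whose centroid is closest (inference step)."""
    dist = lambda c: sum((a - b) ** 2 for a, b in zip(x, c))
    return min(centroids, key=lambda y: dist(centroids[y]))

# Hypothetical 2-feature samples: (mean greenness, mean NIR reflectance)
X = [[0.8, 0.5], [0.7, 0.6], [0.3, 0.2], [0.2, 0.3]]
y = ["healthy", "healthy", "stressed", "stressed"]
model = fit_centroids(X, y)
print(predict(model, [0.75, 0.55]))  # -> "healthy"
```

Real systems replace each stage with something stronger (texture features, SVMs, or CNNs, as discussed below), but the data-flow stays the same.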
Figure 10. An RGB (visible or red-green-blue) image represented using other color spaces. Adapted with permission from [109]. ©2018 John Wiley and Sons.
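Converting between the color spaces shown in Figure 10 is a standard per-pixel operation; Python's standard-library `colorsys` module handles RGB–HSV conversion directly. A minimal sketch with an illustrative (not measured) leaf-green pixel:

```python
import colorsys

# Convert a leaf-green RGB pixel into HSV. Hue separates green (healthy)
# from yellow (chlorotic) tissue more directly than raw RGB channels do.
# Channel values are normalized to [0, 1]; the pixel is illustrative.
r, g, b = 0.2, 0.6, 0.1
h, s, v = colorsys.rgb_to_hsv(r, g, b)
print(round(h * 360))  # hue in degrees; 108 falls in the green range
```

In practice, the same conversion is applied image-wide (e.g., with vectorized routines) before thresholding or feature extraction in the chosen color space.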
Figure 11. A visualization of the image segmentation process. (a,d) are the original samples of well-watered and drought-stressed maize plants. (b,e) are preliminary segmentation images acquired using RGB pixel values and linear support vector machine (SVM), while (c,f) are the images denoised using the mathematical morphology method. Reprinted with permission from [115]. ©2017 Elsevier.
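The two-stage process in Figure 11 — pixel-level classification followed by denoising — can be sketched in miniature. This is an illustrative stand-in, not the cited method: a thresholded excess-green index replaces the linear SVM, and a 3×3 majority filter crudely replaces mathematical morphology; all pixel values and the threshold are hypothetical.

```python
# Toy pixel-level segmentation: classify each pixel from its RGB values,
# then clean isolated misclassified pixels with a 3x3 majority filter.

def excess_green(r, g, b):
    # Simple greenness measure; stands in for a trained pixel classifier
    return 2 * g - r - b

def segment(image, thresh=0.2):
    # 1 = plant pixel, 0 = background pixel
    return [[1 if excess_green(*px) > thresh else 0 for px in row]
            for row in image]

def majority_filter(mask):
    # Reassign interior pixels to the majority class of their 3x3 window
    h, w = len(mask), len(mask[0])
    out = [row[:] for row in mask]
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            votes = sum(mask[a][b] for a in (i - 1, i, i + 1)
                                   for b in (j - 1, j, j + 1))
            out[i][j] = 1 if votes >= 5 else 0
    return out

leaf, soil = (0.2, 0.6, 0.1), (0.4, 0.35, 0.3)
image = [[leaf] * 3, [leaf, soil, leaf], [leaf] * 3]  # one noisy pixel mid-leaf
mask = majority_filter(segment(image))
print(mask[1][1])  # noisy pixel reassigned to the plant class -> 1
```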
Figure 12. A decision boundary is established for data reduced to two dimensions, e.g., through PCA (for 2D data, the boundary is a line). Many boundaries can be drawn, but the one giving the best separation must be determined. For 3D data the decision boundary is a plane; for higher dimensions it is a hyperplane. Reprinted with permission from [127]. ©2020 Elsevier.
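The linear boundary in Figure 12 classifies a point by which side of the hyperplane w·x + b = 0 it falls on; in 2D the "hyperplane" is just a line. A minimal sketch with hand-set (not fitted) weights and hypothetical class labels:

```python
# Linear decision rule: sign of w.x + b determines the predicted class.
# The weights below are illustrative; an SVM would fit them so the
# boundary maximizes the margin between the two classes.

def classify(x, w=(1.0, -1.0), b=0.0):
    score = sum(wi * xi for wi, xi in zip(w, x)) + b
    return "stressed" if score > 0 else "healthy"

# Points on opposite sides of the line x1 = x2 get opposite labels:
print(classify((2.0, 0.5)))   # -> "stressed"
print(classify((0.5, 2.0)))   # -> "healthy"
```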
Figure 13. Structures of four types of ANNs: (a) multilayer perceptron, where xi represents inputs, Oi output neurons, hi hidden-layer neurons, and wi the weights between neurons; (b) wavelet, where Ψ represents the wavelet function, t the translation coefficient, and λ the dilation coefficient; (c) radial basis function, where Ri represents the radial basis function; and (d) Elman, where ui represents components in the hidden and undertake (context) layers. Reprinted with permission from [133]. ©2019 Elsevier.
Table 1. Optical Methods Used for Plant Stress Detection.

| Method | Wavelengths | Plant | Stress Type | References |
|---|---|---|---|---|
| Hyperspectral Imaging | 500–850 nm | Maize | Drought stress | [32] |
| | 430–890 nm | Barley | Drought stress | [33] |
| | 350–2500 nm | Wheat | Yellow rust | [34] |
| | 350–1350 nm | Wheat | Powdery mildew | [35] |
| | 380–1030 nm | Okra | Salt stress | [36] |
| | 400–1000 nm | Banana | Black Sigatoka | [37] |
| | 250–430 nm | Barley | Salt stress | [39] |
| | 400–1000 nm | Barley | Powdery mildew | [52] |
| | 325–1075 nm | Peanut | Leaf spot | [53] |
| Multispectral Spectroscopy | 400–1100 nm | Maize | Nutrient deficiency | [57] |
| | 400–980 nm | Tomato | Drought stress | [58] |
| | 430–870 nm | Canola | Nutrient deficiency | [59] |
| Multispectral Imaging | 365–960 nm | Oilseed Rape | Light leaf spot | [54] |
| | 475, 560, 668, 717, 840 nm | Tomato | Gray mold | [55] |
| | 550, 660, 735, 790 nm | Tomato | Nutrient deficiency (multiple) | [56] |
| | 620, 870 nm | Poinsettia | Nitrogen content | [96] |
| | 450–950 nm | Wheat | Stripe rust, brown rust, septoria tritici blotch | [97] |
| RGB Imaging | RGB | Soybean | Iron deficiency | [11] |
| | RGB | Black Gram | Nutrient deficiency (multiple) | [61] |
| | RGB | Potato | Early blight, late blight | [62] |
| | RGB | Basil | Nitrogen stress | [98] |
| Thermography | 7.5–13 μm | Table Grapes | Aspergillus carbonarius | [66] |
| | 7.5–13 μm | Maize | Drought stress | [68] |
| | 8–12 μm | Apple | Apple scab | [69] |
| | 8–14 μm | Sesame | Drought stress | [70] |
| | 8–14 μm | Wheat | Drought stress | [99] |
| Fluorescence Spectroscopy ¹ | 650 nm | Passion Fruit | Drought stress | [80] |
| | 635 nm | Maize, Tomato | Nutrient deficiency (multiple) | [81] |
| | 650 nm | Rapeseed | Nutrient deficiency (multiple) | [82] |
| | 405 nm | Grapefruit | Citrus canker | [85] |
| | 337 nm | Wheat | Nutrient deficiency, leaf rust, powdery mildew | [84] |
| Fluorescence Imaging ¹ | 340, 447, 550 nm | Barley, Grapevine, Sugar Beet | Nutrient deficiency, black rot, leaf spot | [90] |
| | 460 nm | Soybean | Herbicide stress | [88] |
| | 620 nm | Citrus | Huanglongbing | [100] |
| | 684, 687, 757.5, 759.5 nm (emission) | Cassava | Mosaic virus | [101] |

¹ In fluorescence spectroscopy and fluorescence imaging, excitation wavelengths are shown except where noted otherwise.
Table 2. Equations and Applications of Vegetation and Disease Indices.

| Index Name | Equation ¹ | Application | References |
|---|---|---|---|
| Vegetation Indices | | | |
| Enhanced Vegetation Index | EVI = 2.5 × (R_800 − R_670)/(R_800 + 6.0·R_670 − 7.5·R_479 + 1) | Rate of photosynthesis, water stress detection | [41] |
| Normalized Difference Vegetation Index | NDVI = (R_NIR − R_RED)/(R_NIR + R_RED) | Plant growth and development monitoring | [48] |
| Water Index | WI = R_900/R_970 | Plant water content estimation | [49] |
| Photochemical Reflectance Index | PRI = (R_570 − R_531)/(R_570 + R_531) | Photosynthetic efficiency | [50] |
| Disease Indices | | | |
| Powdery Mildew Index (Wheat) | PMI = (R_515 − R_698)/(R_515 + R_698) − 0.5·R_738 | Powdery mildew detection in wheat | [44] |
| Powdery Mildew Index (Sugar Beet) | PMI = (R_520 − R_584)/(R_520 + R_584 + R_724) | Powdery mildew detection in sugar beet | [45] |
| Cercospora Leaf Spot Index | CLS = (R_698 − R_570)/(R_698 + R_570) − R_734 | Cercospora leaf spot detection in sugar beet | [45] |
| Leaf Rust Disease Severity Index 1 | LRDSI_1 = 6.9·(R_605/R_455) − 1.2 | Severity estimation of wheat leaf rust | [46] |
| Leaf Rust Disease Severity Index 2 | LRDSI_2 = 4.2·(R_695/R_455) − 0.38 | Severity estimation of wheat leaf rust | [46] |
| Lemon Myrtle–Myrtle Rust Index | LMMR = (R_545/R_555)^(5/3) × R_1505/R_2195 | Myrtle rust detection in lemon myrtle | [47] |

¹ R represents the measured reflectance at the wavelength or waveband specified by the subscript.
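The indices in Table 2 reduce to simple arithmetic on band reflectances, so once per-band values have been extracted from a spectrum they are trivial to compute. A minimal sketch for three of the vegetation indices; the reflectance values passed in are illustrative, not measured:

```python
def ndvi(r_nir, r_red):
    # Normalized Difference Vegetation Index (Table 2, [48])
    return (r_nir - r_red) / (r_nir + r_red)

def water_index(r_900, r_970):
    # Water Index (Table 2, [49]): ratio of two NIR reflectances
    return r_900 / r_970

def pri(r_570, r_531):
    # Photochemical Reflectance Index (Table 2, [50])
    return (r_570 - r_531) / (r_570 + r_531)

# Illustrative reflectances for a healthy green leaf: strong NIR,
# weak red reflectance (see the curve shape in Figure 1).
print(ndvi(r_nir=0.45, r_red=0.05))  # ~0.8, typical of dense vegetation
```

The same pattern extends to the disease indices, which only add scaling constants and extra bands.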
Zubler, A.V.; Yoon, J.-Y. Proximal Methods for Plant Stress Detection Using Optical Sensors and Machine Learning. Biosensors 2020, 10, 193. https://doi.org/10.3390/bios10120193