Article

Use of CNN for Water Stress Identification in Rice Fields Using Thermal Imagery

1 Department of Computer Science and Engineering, National Chung Hsing University, Taichung 40227, Taiwan
2 Department of Management Information Systems, National Chung Hsing University, Taichung 40227, Taiwan
* Author to whom correspondence should be addressed.
Appl. Sci. 2023, 13(9), 5423; https://doi.org/10.3390/app13095423
Submission received: 6 April 2023 / Revised: 21 April 2023 / Accepted: 25 April 2023 / Published: 26 April 2023
(This article belongs to the Special Issue AI-Based Image Processing)

Abstract
Rice is a staple food in many Asian countries, but its production requires large amounts of water. Moreover, due to global climate change and increasingly frequent droughts, more attention should be paid to water management in rice cultivation. To address this problem, we propose a rice water stress identification system. Since irrigation affects the opening and closing of rice leaf stomata, which directly affects leaf temperature, rice leaf temperature is a suitable index for evaluating rice water stress. The proposed system uses a CNN (convolutional neural network) to identify water stress in thermal images of rice fields and classifies the irrigation situation into three classes: 100%, 90%, and 80% irrigation. The CNN extracts a temperature level score from each thermal image based on the degree of difference between the three irrigation situations, and these scores are then used to classify the water stress situation. In our experiments, we compare the proposed method against CNN classification that does not consider the degree of difference between classes; the proposed method considerably improves water stress identification. Since rice leaf temperature is relative to air temperature rather than an absolute value, the background temperature is also important reference information. We therefore combine two different background processing methods to extract more features and achieve more accurate identification.

1. Introduction

Many studies have shown that water stress in rice reduces its yield and quality [1,2,3]. Wang et al. [4] observed that drought reduced the average rice yield by 6.8% in China. In addition, Zubaer et al. [5] evaluated the effects of three water levels (100%, 70%, and 40%) on three rice varieties, showing that grain yield was reduced by 20.74–32.51% at the 70% water level and by 45.34–53.08% at the 40% water level. Furthermore, Hossain et al. [6] reported that under 75% and 50% water saturation, the number of panicles per rice plant was approximately one, while it was five under saturated conditions. He et al. [7] also showed that reduced irrigation significantly decreased rice yield. These studies illustrate the importance of water to rice yield; unfortunately, many meteorologists forecast that global droughts will become more severe in the future [8,9,10,11], which may cause food supply problems in countries where rice is the staple food. Taking other plants as an example, Rani et al. [12] emphasized that chickpea yields were drastically reduced by drought. In response to this predicament, many studies have proposed various solutions, such as stricter irrigation management [13]. In addition, botanists are attempting to cultivate new drought-tolerant rice varieties [14,15,16] to reduce water usage. However, these proposals require an automatic irrigation monitoring system; therefore, the purpose of this article is to develop a water stress identification system for rice fields.
The proposed system is based on leaf temperature. Plant leaves contain stomata, and the opening and closing of the stomata are affected by irrigation: when there is sufficient water, the stomata are usually open; otherwise, they are mostly closed. When the stomata are open, the leaf temperature is lower, and vice versa [17,18]. A thermal imaging camera is a suitable choice for simple measurement of leaf temperature, as it records leaf temperature by taking pictures and does not require the manual measurement of each leaf. In addition, it is a non-contact measurement, so it causes no damage to the leaf. Therefore, thermal imaging cameras are widely used to measure the temperature of various plants [19].
Recently, CNNs have become the most successful method for image recognition [20] and are commonly used to identify plant phenotypes. For example, the VGG16 model has been used to identify chickpea seed varieties [21] and different varieties of grape leaves [22], while DenseNet has been used on PlantVillage (a well-known plant disease database) [23]. These applications show that CNNs are well suited to plant phenotype identification. Typically, a CNN extracts features by training convolution kernel weights and uses these features for recognition by training fully connected layers. In this article, a CNN is used to identify water stress in rice from thermal images, adopting three well-known CNN architectures as the network backbone [24,25,26]. In addition, Taheri-Garavand et al. [27] proposed a plant leaf water content estimation system based on visible light cameras, in which six texture features were extracted in three color spaces and used to estimate leaf water content. Such an approach usually requires more prior knowledge, and texture features impose stricter shooting conditions; for example, the shooting distance must be sufficient to capture the texture. Since thermal images directly reflect the water content of the leaves, we do not need to consider such prior knowledge or shooting conditions; therefore, thermal imaging is a relatively easy way to estimate the water content of plant leaves. Owing to the rapid development of CNNs, the technology has recently been applied in many studies to identify water stress in plants. Chandel et al. [28] used three common CNN architectures (AlexNet, GoogLeNet, and Inception v3) to identify stress and non-stress in RGB images of corn, okra, and soybean. In addition, Zhuang et al. [29] built a model for water stress identification in corn, using a simple six-layer convolutional network to extract RGB image features and evaluating water stress with an SVM and an extra trees regressor. Both of these methods build their models from RGB images; however, water stress is usually identified in such images by observing shrinkage on the surface of the plant leaf. By the time such shrinkage is visible in rice, the water stress is already very serious and greatly affects production quality, so it is more appropriate to use thermal images to identify rice water stress.
Several previous studies have used CNNs to identify plant drought problems from thermal images. Niu et al. [30] classified the irrigation of pomegranate trees as high (100% and 75%) or low (35% and 50%), then used a shallow (three-layer) CNN to identify the classes. Chandel et al. [31] built a water stress identification system for winter wheat: they classified irrigation conditions as stressed or non-stressed, then trained CNN models based on RGB and thermal images. They tested five state-of-the-art CNN architectures (AlexNet, GoogLeNet, Inception v3, MobileNetV2, and ResNet-50) and five well-known classifiers (logistic, KNN, ANN, SVM, and LSTM), showing that models using thermal images performed better than those using RGB images as input. Melo et al. [32] identified water stress in sugarcane by using transfer learning to train an Inception-ResNet-v2 architecture on thermal images.
To summarize the above literature, Table 1 lists several important factors of these studies [27,28,29,30,31,32], along with some remarks.
For a more detailed study of drought tolerance during the development of drought-tolerant varieties, the degree of rice water stress must be classified precisely. In addition, rice depends heavily on water; if water stress is very high, the rice will not survive. Consequently, in our work, 100% and 90% irrigation are regarded as well-watered and slightly stressed, respectively, while 80% of normal irrigation is considered mildly stressed. Leaf temperature is affected by the amount of irrigation to different degrees. The studies above [30,31,32] either consider only two classes or target plants that do not need such fine classification, so they usually ignore this degree information. To pay more attention to degree information and to make the identification of water stress more accurate, we designed a framework that explicitly considers it. In addition, leaf temperature is not an absolute value and is affected by the surrounding environment, which is also reflected in the crop water stress index (CWSI) used in many studies on plant water stress. The CWSI [33,34] normalizes the difference between leaf temperature and air temperature, whereas this study uses the image background as the temperature reference. If acceptable accuracy can be achieved using the background as a reference, it is not necessary to record the current air temperature.
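For context, the empirical form of the CWSI commonly used in the literature (a general formula, not an equation defined in this paper) can be written as follows, where T_c is the canopy (leaf) temperature, T_a is the air temperature, and the lower and upper limits are the non-stressed and fully stressed baselines of the canopy-to-air temperature difference:

```latex
\mathrm{CWSI} = \frac{(T_c - T_a) - (T_c - T_a)_{\mathrm{LL}}}
                     {(T_c - T_a)_{\mathrm{UL}} - (T_c - T_a)_{\mathrm{LL}}}
```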
With reference to Table 1, the main contributions of this study are as follows:
  • We chose thermal images instead of RGB images as the input data and explain their advantages in the discussion section (Section 4).
  • A CNN was selected as a feature extractor and classifier, which does not require prior knowledge of hand-crafted features. In addition, we used a scoring method to retain degree information.
  • We used two background processing methods, early fusion and late fusion, which complement each other.
  • The proposed framework can also serve as a basis for assessing the leaf water content of other plants or species.
This study comprises five sections: Section 2 describes the data collection, its classification, and the methods used in this study; Section 3 presents several comparative experiments that illustrate the effectiveness of the proposed method; Section 4 discusses the proposed method in depth; and the conclusion and future work are provided in Section 5.

2. Materials and Methods

2.1. Dataset

The experiments were conducted in rice fields in central Taiwan, mainly with three rice varieties: Tainan 11, HVA1 gene transgenic rice, and TNG67. Following the irrigation settings in [7], irrigation was classified as W1–W6, corresponding to 100–50% of normal irrigation, and grain yield decreased significantly after W4 (i.e., 70% irrigation). The goal of this study is to detect water shortage before it becomes serious; therefore, our experiments target the relatively light water stress situations of W1–W3 (i.e., 100–80% irrigation). Well-watered, slight stress, and mild stress are used to represent 100%, 90%, and 80% of normal irrigation, respectively, and the data for each class are shown in Table 2.
Images were captured from March 2020 to October 2020, with a planting distance of approximately 15 to 20 cm between plants. An AVIO G100EX/R300 infrared thermal imaging camera (AVIO, Turin, Italy) was used to capture the data in Table 2. For each shot, the thermal image (8–14 µm) and the RGB image (400–700 nm) were captured at the same time, as shown in Figure 1 (taking HVA1 as an example). The upper and lower rows show the RGB images and the thermal images, respectively, from 100% irrigation (left) to 80% irrigation (right). On the far right, temperature is represented by color, with blue to red representing low to high temperatures.
From Figure 1, it is difficult to distinguish water stress in the RGB images of the rice. However, the thermal images in the lower row show a significant temperature difference (rice fields show relatively low temperatures when watering is sufficient), confirming that the thermal image reflects the irrigation situation.

2.2. An Overview of the Proposed Framework

The proposed method is shown in Figure 2. Since the background is used as a reference temperature, and different background processing methods have their own advantages and disadvantages, our architecture consists of two branches that achieve complementary effects. The two branches process the background by early fusion and late fusion [35], respectively, and each branch calculates a temperature level score (TLS). For convenience, we denote these as TLSef (the temperature level score of the early fusion branch) and TLSlf (the temperature level score of the late fusion branch); an MLP then identifies the final water stress class from these scores. For the ground truth of TLSef and TLSlf, well-watered is scored as 1.0, slight stress as 0.5, and mild stress as 0.0. In addition, in the late fusion branch the foreground must be separated from the background, so the RGB images are used to segment the thermal images.
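The following is a minimal sketch of this two-branch inference flow, assuming PyTorch; the function and model names (segment_leaves, early_fusion_cnn, late_fusion_cnn, fusion_mlp) are placeholders for the components detailed in Sections 2.3, 2.4, and 2.5, not names from our implementation.

```python
import torch

def predict_water_stress(thermal_img, rgb_img, early_fusion_cnn,
                         late_fusion_cnn, fusion_mlp, segment_leaves):
    # Early fusion branch: the whole thermal image (leaves plus background) in one pass.
    tls_ef = early_fusion_cnn(thermal_img.unsqueeze(0))            # shape (1, 1)

    # Late fusion branch: split foreground/background using the RGB image.
    leaf_mask = segment_leaves(rgb_img)                            # boolean mask, H x W
    foreground = thermal_img * leaf_mask
    background = thermal_img * (~leaf_mask)
    tls_lf = late_fusion_cnn(foreground.unsqueeze(0), background.unsqueeze(0))

    # Fuse the two temperature level scores with the MLP of Section 2.5.
    probs = fusion_mlp(torch.cat([tls_ef, tls_lf], dim=1))         # shape (1, 3)
    classes = ["well-watered", "slight stress", "mild stress"]
    return classes[int(probs.argmax(dim=1))]
```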

2.3. The Early Fusion Branch

Early fusion usually refers to the fusion of data before the network input. In this study, the leaf temperature and the background temperature are considered two types of data; in this branch, we directly input each thermal image into the CNN, which can be regarded as a type of early fusion because the leaf temperature and the background temperature are combined at an early stage. This branch plays a role when there are far fewer background pixels than leaf pixels: when there are too few background pixels, the background stream of the late fusion branch cannot obtain sufficient background temperature information, and this branch compensates for that shortcoming.

2.4. The Late Fusion Branch

2.4.1. Foreground and Background Segmentation

To separate the background and the foreground, we use a simple method. First, since rice leaves are green, we use Equation (1) to convert the RGB image to I, which mainly emphasizes strongly green pixels. Then, we apply the Otsu thresholding algorithm [36] to I to obtain a mask M. Finally, we use M to mask the thermal image. The segmentation process is shown in Figure 3, and several examples are shown in Figure 4.
I(x, y) = 2G(x, y) − R(x, y) − B(x, y), (1)
In Equation (1), G(x, y), R(x, y), and B(x, y) are the intensity values of the three channels of the RGB image at image coordinate (x, y), while I(x, y) represents the converted intensity value at (x, y). When G(x, y) ≥ R(x, y) and G(x, y) ≥ B(x, y), I(x, y) has a greater intensity, so the pixel at (x, y) is likely to be a leaf. When 2G(x, y) < R(x, y) + B(x, y), the pixel is probably not a leaf; therefore, we set I(x, y) to 0.
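A minimal sketch of this segmentation step is given below, assuming OpenCV and NumPy; the function names and the clipping of I to the 8-bit range are illustrative choices, not details taken from our implementation.

```python
import cv2
import numpy as np

def leaf_mask_from_rgb(rgb_img: np.ndarray) -> np.ndarray:
    """rgb_img: H x W x 3 uint8 image (R, G, B channel order). Returns a 0/255 mask."""
    r = rgb_img[:, :, 0].astype(np.int32)
    g = rgb_img[:, :, 1].astype(np.int32)
    b = rgb_img[:, :, 2].astype(np.int32)

    # Equation (1): I = 2G - R - B, set to 0 where the pixel is unlikely to be a leaf.
    i = np.clip(2 * g - r - b, 0, 255).astype(np.uint8)

    # Otsu's method picks the threshold automatically from the histogram of I.
    _, mask = cv2.threshold(i, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    return mask

def masked_thermal(thermal_img: np.ndarray, mask: np.ndarray) -> np.ndarray:
    """Keep only foreground (leaf) temperatures; background pixels are zeroed."""
    return np.where(mask > 0, thermal_img, 0)
```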
This method only roughly separates the foreground from the background; although the segmentation is not perfect, we believe it is sufficient for the application in this study. As evidence, Table 3 shows that the classification ability of the late fusion branch is much higher than that of the early fusion branch in Section 2.3. We speculate that, since the objects we identify are thermal images, any errors along the edges may be attenuated by the pooling operations in the CNN. If some leaves are missed, such slight omissions do not affect the result, because the temperature difference between leaves in the same rice field is usually small.

2.4.2. The Architecture of the Late Fusion Branch

Late fusion combines different types of data after feature extraction by the network. In this branch, the background and the foreground (leaves) are input into their respective CNN models, and the two resulting fully connected layers are concatenated. This is a typical late fusion design, as shown in Figure 5.
Because the leaf temperature and the background temperature are independent sources of information, it is more appropriate to extract their features separately, which also explains the results presented in Table 3.
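A sketch of such a late fusion branch is shown below, assuming PyTorch and torchvision; the backbone (VGG16 here), the head sizes, and the use of a sigmoid to bound TLSlf in [0, 1] are illustrative assumptions rather than our exact implementation, and the thermal inputs are assumed to be replicated to three channels.

```python
import torch
import torch.nn as nn
import torchvision.models as models

class LateFusionBranch(nn.Module):
    def __init__(self):
        super().__init__()
        # Two independent feature extractors for foreground (leaves) and background.
        # VGG16 shown; ResNet34 or DenseNet121 backbones would be analogous.
        self.fg_backbone = models.vgg16(weights=None).features
        self.bg_backbone = models.vgg16(weights=None).features
        self.pool = nn.AdaptiveAvgPool2d(1)
        # Late fusion: concatenate the two feature vectors before the final layers.
        self.head = nn.Sequential(
            nn.Linear(512 * 2, 256), nn.ReLU(),
            nn.Linear(256, 1), nn.Sigmoid(),   # TLSlf score in [0, 1]
        )

    def forward(self, foreground, background):
        # Each input: (N, 3, H, W) thermal crop replicated to three channels.
        f = self.pool(self.fg_backbone(foreground)).flatten(1)   # (N, 512)
        b = self.pool(self.bg_backbone(background)).flatten(1)   # (N, 512)
        return self.head(torch.cat([f, b], dim=1))               # (N, 1)
```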

2.5. The Combination of TLSef and TLSlf

After obtaining TLSef and TLSlf from the two CNNs, their scores must be combined to determine the final irrigation classification. A simple MLP is used for this purpose; its architecture is shown in Figure 6. The MLP has three hidden layers with 1024, 512, and 256 neurons, respectively, all using ReLU activation functions. The output layer contains three neurons representing the three water stress conditions and uses softmax as its activation function.
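A minimal PyTorch sketch of this score-fusion MLP is given below; the framework choice is an assumption, and in practice the softmax could instead be folded into the loss during training.

```python
import torch.nn as nn

class ScoreFusionMLP(nn.Module):
    """Fuses TLSef and TLSlf into one of three water stress classes."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2, 1024), nn.ReLU(),    # input: [TLSef, TLSlf]
            nn.Linear(1024, 512), nn.ReLU(),
            nn.Linear(512, 256), nn.ReLU(),
            nn.Linear(256, 3),                # well-watered / slight stress / mild stress
            nn.Softmax(dim=1),
        )

    def forward(self, scores):                # scores: (N, 2)
        return self.net(scores)
```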

2.6. CNN Training Details

For the CNNs above, we experimented with three architectures: VGG16 uses simple layer-to-layer connections, while ResNet and DenseNet use cross-layer connections to alleviate the vanishing gradient problem. Each CNN was trained for 300 epochs, and SGD was used to learn the network parameters with a learning rate of 0.001. The MLP was trained for 1000 epochs, with the remaining settings the same as for the CNNs.
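The following training loop sketch matches the stated settings (SGD, learning rate 0.001, 300 epochs for the scoring CNNs); the loss function and data loader shown here are assumptions for illustration only.

```python
import torch
import torch.nn as nn

def train_scoring_cnn(model, loader, epochs=300, lr=0.001):
    """Train a branch CNN to regress the temperature level score (1.0 / 0.5 / 0.0)."""
    device = "cuda" if torch.cuda.is_available() else "cpu"
    model.to(device)
    optimizer = torch.optim.SGD(model.parameters(), lr=lr)
    criterion = nn.MSELoss()   # assumed regression loss toward the TLS ground truth
    for _ in range(epochs):
        model.train()
        for images, targets in loader:
            images, targets = images.to(device), targets.to(device)
            optimizer.zero_grad()
            loss = criterion(model(images), targets)
            loss.backward()
            optimizer.step()
```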

3. Experimental Results

Table 3 compares the results of the three well-known CNN architectures: VGG16, ResNet34, and DenseNet121. For the second column, the last layer of the early fusion branch model from Section 2.3 was changed to three output neurons and the softmax function was used for classification; this approach is similar to [30,31,32], with minor differences such as the CNN architectures. The third column shows the result of making the same change to the model in Section 2.4. The fourth column is the method proposed in this study, as shown in Figure 2. Under all three CNN architectures, separating the background and the foreground (third column of Table 3) performs significantly better than the early fusion baseline (second column), which shows that late fusion of the background and the foreground significantly improves identification. The results in the fourth column are much higher than those of the previous two columns and are a significant improvement over the late fusion method alone, which shows that the early fusion branch compensates for the deficiencies of the late fusion method.
To further examine these results, we calculated the precision, recall, and F1-score. The calculations of these three evaluation metrics are shown in Equations (2)–(4):
Precision = TP / (TP + FP), (2)
Recall = TP / (TP + FN), (3)
F1-score = 2 × Precision × Recall / (Precision + Recall). (4)
TP is true positive, which means that the rice field is facing water stress and the model correctly predicted water stress. FP is false positive, which means that the rice field is well-watered but the model wrongly identified water stress. FN is false negative, which means that the rice field is facing water stress but the model wrongly predicted that it is well-watered. TN is true negative, which means that the rice field is well-watered and the model correctly predicted that it is well-watered.
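These metrics follow directly from the counts above; a small Python helper illustrating Equations (2)–(4) is shown here (the example counts are arbitrary, not values from our experiments).

```python
def precision_recall_f1(tp: int, fp: int, fn: int):
    """Compute Equations (2)-(4) from true positive, false positive, and false negative counts."""
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    f1 = 2 * precision * recall / (precision + recall) if (precision + recall) else 0.0
    return precision, recall, f1

# Hypothetical example: 12 true positives, 1 false positive, 1 false negative.
print(precision_recall_f1(12, 1, 1))   # -> (0.923..., 0.923..., 0.923...)
```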
The precision in Table 4 is excellent, indicating that the proposed model rarely misjudges a well-watered field as water-stressed. The recall is also relatively good, indicating that the system detects water stress reliably.
The confusion matrix of each of the three CNN architectures is shown in Figure 7 to describe the detailed identification results. For ResNet34 and DenseNet121, most of the misjudgments were between slight stress and mild stress, which requires further improvement.
In addition, VGG16 performed better than the other two architectures. Since VGG16 has no cross-layer connections, the temperature information is less mixed in the convolutional layers, which also suggests that temperature data are more suitable for late fusion processing.

4. Discussion

To explore whether the degree information considered in this study is useful, we also used the softmax features for classification. The softmax features are the model outputs of the second and third columns of Table 3. Using these two outputs, we applied the same MLP as in Section 2.5 with the same settings for the classification task. The resulting accuracy, shown in Table 5, is lower than that of the proposed model (last column of Table 3) under all three CNN architectures.
We also conducted an experiment to illustrate the necessity of the MLP. We directly used the two scores, TLSef and TLSlf, for water stress classification by combining them in a simple statistical way, using their mean, maximum, and minimum. For each of these statistics, the following rules were used to identify the degree of water stress: a score within 1–0.667 was considered well-watered; within 0.667–0.333, slight stress; and within 0.333–0, mild stress. The results are shown in Table 6. The MLP performance in the last column of Table 3 outperforms all the results in Table 6, demonstrating that the MLP can effectively fuse the two scores in this application. Furthermore, the proposed MLP mostly outperformed the CNNs with softmax outputs and also outperformed the softmax-feature-based combination in Table 5, which shows that our method retains more degree information.
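The rule-based baseline described above can be sketched as follows; the function name and the exact handling of boundary values are illustrative assumptions, and the thresholds follow the ground-truth scoring defined in Section 2.2 (1.0 well-watered, 0.5 slight stress, 0.0 mild stress).

```python
def classify_by_statistic(tls_ef: float, tls_lf: float, mode: str = "mean") -> str:
    """Rule-based baseline: combine the two scores with a statistic, then threshold."""
    score = {
        "mean": (tls_ef + tls_lf) / 2,
        "max": max(tls_ef, tls_lf),
        "min": min(tls_ef, tls_lf),
    }[mode]
    if score >= 0.667:
        return "well-watered"
    if score >= 0.333:
        return "slight stress"
    return "mild stress"

# Example using the mean of the two scores.
print(classify_by_statistic(0.82, 0.29))   # -> "slight stress"
```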
To clearly compare Table 5, Table 6, and the results of the proposed method (the MLP with TLSef and TLSlf), we plot them in Figure 8. The figure clearly shows that the proposed combination is superior to the others under all three CNN architectures.
Next, we explore several individual cases to better understand the advantages and disadvantages of the proposed method and to provide directions for future work. First, we discuss a few cases that were identified correctly, in which TLSef and TLSlf complement each other's deficiencies. Although Table 7 only lists correct cases that depended on the addition of TLSef, there are also many cases that were classified clearly by both scores. For convenience of comparison, we restate the ground truth scores here: well-watered is 1.0, slight stress is 0.5, and mild stress is 0.0.
From Table 7, cases 1, 5, 6, and 9 all have unsatisfactory TLSlf values and rely on their TLSef scores, which are closer to well-watered (1.0), to be identified correctly. Similarly, in cases 2, 3, 7, 8, 10, and 11 (ground truth: slight stress) and cases 4 and 12 (ground truth: mild stress), the correct identification also depended on TLSef.
We then list several incorrectly identified cases (Table 8). When a wrong prediction occurred, both scores were far from the correct result, indicating that further improving the accuracy of the system must start with the two scoring branches.
In the MLP, the output layer uses softmax while all other layers use ReLU. We tested whether this setting is reasonable through experiments. In Table 9, we replaced all the activation functions of the MLP, except for the output layer, with tanh and sigmoid. Since the two scores (TLSef and TLSlf) are both positive and ReLU preserves positive values, we expected ReLU to perform better than tanh and sigmoid. However, there was no significant difference among the activation functions in Table 9, which shows that the number of layers and neurons in our MLP provides enough capacity for this application.
Compared with previous research, the method of this study mainly improves on two points. First, we use the two scores, TLSef and TLSlf, to retain degree information, as explained by the comparisons among Table 3, Table 5, and Table 6. Second, we demonstrate through experiments that using thermal images is better than using RGB images. As shown in Table 10, we used the same architecture as the MLP with TLSef and TLSlf in Table 3, but with RGB images as the input. These results show that it is difficult to identify the water content of plant leaves directly from RGB images. A more suitable and easier solution is to exploit the response of leaf temperature to water content and obtain temperature information from thermal imagery.

5. Conclusions

This study proposed a system for identifying water stress in rice fields. Leaf temperature is an important reference index for identifying water stress in rice and can be obtained relatively simply and conveniently using a handheld thermal imaging camera. The proposed automatic rice water stress identification system was developed for thermal images based on a CNN and achieved an accuracy of 85% to 100%, thus meeting the needs of practical applications. In the past, crop-temperature-based water stress identification systems usually ignored degree information, which can provide useful features in thermal images. We proposed a framework that incorporates degree information into the system and demonstrated through experiments that our method utilizes this information effectively. Since leaf temperature is relative, the background temperature was used as a reference, and two background processing methods were used to obtain two temperature level scores that complement each other to achieve better results.
In the future, the proposed system will be improved with better leaf temperature level score extraction. Furthermore, since an MLP was used for the final score combination, other simpler methods could be considered to improve the computing speed. After further improvement, we believe that this framework may also be applied to other crops. Finally, for future field applications, we suggest replacing the handheld thermal imager with an external thermal imager connected to a mobile phone, thereby allowing images to be uploaded to a GPU host for real-time identification. This configuration is low-cost and easy to operate, making it accessible to farmers, scientists, and students.

Author Contributions

Methodology, Software and Writing—original draft, M.-W.L.; Supervision and Writing—review and editing, Y.-K.C. and S.-S.Y. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Ministry of Science and Technology, Taiwan, under grant number MOST 110-2321-B-005-005.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Data supporting this study are available from the corresponding author upon request.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Auffhammer, M.; Ramanathan, V.; Vincent, J.R. Climate change, the monsoon, and rice yield in India. Clim. Chang. 2011, 111, 411–424. [Google Scholar] [CrossRef]
  2. Yang, X.; Wang, B.; Chen, L.; Li, P.; Cao, C. The different influences of drought stress at the flowering stage on rice physiological traits, grain yield, and quality. Sci. Rep. 2019, 9, 3742. [Google Scholar] [CrossRef] [PubMed]
  3. Pathak, H.; Kumar, M.; Molla, K.A.; Chakraborty, K. Abiotic stresses in rice production: Impacts and management. Oryza-An Int. J. Rice 2021, 58, 103–125. [Google Scholar] [CrossRef]
  4. Wang, Y.; Huang, J.; Wang, J.; Findlay, C. Mitigating rice production risks from drought through improving irrigation infrastructure and management in China. Aust. J. Agric. Resour. Econ. 2018, 62, 161–176. [Google Scholar] [CrossRef]
  5. Zubaer, M.; Chowdhury, A.; Islam, M.; Ahmed, T.; Hasan, M. Effects of water stress on growth and yield attributes of aman rice genotypes. Int. J. Sustain. Crop Prod. 2007, 2, 25–30. [Google Scholar]
  6. Hossain, M.; Sikder, S.; Husna, A.; Sultana, S.; Akhter, S.; Alim, A.; Joardar, J. Influence of Water Stress on Morphology, Physiology and Yield Contributing Characteristics of Rice. SAARC J. Agric. 2020, 18, 61–71. [Google Scholar] [CrossRef]
  7. He, J.; Ma, B.; Tian, J. Water production function and optimal irrigation schedule for rice (Oryza sativa L.) cultivation with drip irrigation under plastic film-mulched. Sci. Rep. 2022, 12, 17243. [Google Scholar] [CrossRef]
  8. Wada, Y.; Van Beek, L.; Viviroli, D.; Dürr, H.H.; Weingartner, R.; Bierkens, M.F. Global monthly water stress: 2. water demand and severity of water stress. Water Resour. Res. 2011, 47, 7. [Google Scholar] [CrossRef]
  9. Christian, J.I.; Basara, J.B.; Hunt, E.D.; Otkin, J.A.; Furtado, J.C.; Mishra, V.; Xiao, X.; Randall, R.M. Global distribution, trends, and drivers of flash drought occurrence. Nat. Commun. 2021, 12, 6330. [Google Scholar] [CrossRef]
  10. Tramblay, Y.; Koutroulis, A.; Samaniego, L.; Vicente-Serrano, S.M.; Volaire, F.; Boone, A.; Le Page, M.; Llasat, M.C.; Albergel, C.; Burak, S.; et al. Challenges for drought assessment in the Mediterranean region under future climate scenarios. Earth-Sci. Rev. 2020, 210, 103348. [Google Scholar] [CrossRef]
  11. Zhang, Y.; Hao, Z.; Feng, S.; Zhang, X.; Xu, Y.; Hao, F. Agricultural drought prediction in China based on drought propagation and large-scale drivers. Agric. Water Manag. 2021, 255, 107028. [Google Scholar] [CrossRef]
  12. Rani, A.; Devi, P.; Jha, U.C.; Sharma, K.D.; Siddique, K.; Nayyar, H. Developing Climate-Resilient Chickpea Involving Physiological and Molecular Approaches With a Focus on Temperature and Drought Stresses. Front. Plant Sci. 2020, 10, 1759. [Google Scholar] [CrossRef] [PubMed]
  13. Xu, Y.; Ge, J.; Tian, S.; Li, S.; Nguy-Robertson, A.L.; Zhan, M.; Cao, C. Effects of water-saving irrigation practices and drought resistant rice variety on greenhouse gas emissions from a no-till paddy in the central lowlands of China. Sci. Total. Environ. 2015, 505, 1043–1052. [Google Scholar] [CrossRef]
  14. Manickavelu, A.; Nadarajan, N.; Ganesh, S.K.; Gnanamalar, R.P.; Babu, R.C. Drought tolerance in rice: Morphological and molecular genetic consideration. Plant Growth Regul. 2006, 50, 121–138. [Google Scholar] [CrossRef]
  15. Venuprasad, R.; Lafitte, H.R.; Atlin, G.N. Response to Direct Selection for Grain Yield under Drought Stress in Rice. Crop. Sci. 2007, 47, 285–293. [Google Scholar] [CrossRef]
  16. Wang, W.-S.; Pan, Y.-J.; Zhao, X.-Q.; Dwivedi, D.; Zhu, L.-H.; Ali, J.; Fu, B.-Y.; Li, Z.-K. Drought-induced site-specific DNA methylation and its association with drought tolerance in rice (Oryza sativa L.). J. Exp. Bot. 2010, 62, 1951–1960. [Google Scholar] [CrossRef] [PubMed]
  17. Gonzalez-Dugo, V.; Zarco-Tejada, P.J.; Fereres, E. Applicability and limitations of using the crop water stress index as an indicator of water deficits in citrus orchards. Agric. For. Meteorol. 2014, 198, 94–104. [Google Scholar] [CrossRef]
  18. Crusiol, L.G.T.; Nanni, M.R.; Furlanetto, R.H.; Sibaldelli, R.N.R.; Cezar, E.; Mertz-Henning, L.M.; Nepomuceno, A.L.; Neumaier, N.; Farias, J.R.B. UAV-based thermal imaging in the assessment of water status of soybean plants. Int. J. Remote Sens. 2020, 41, 3243–3265. [Google Scholar] [CrossRef]
  19. Wilson, A.N.; Gupta, K.A.; Koduru, B.H.; Kumar, A.; Jha, A.; Cenkeramaddi, L.R. Recent Advances in Thermal Imaging and its Applications Using Machine Learning: A Review. IEEE Sensors J. 2023, 23, 3395–3407. [Google Scholar] [CrossRef]
  20. Alzubaidi, L.; Zhang, J.; Humaidi, A.J.; Al-Dujaili, A.; Duan, Y.; Al-Shamma, O.; Santamar, J.; Fadhel, M.A.; Al-Amidie, M.; Farhan, L. Review of deep learning: Concepts, cnn architectures, challenges, applications, future directions. J. Big Data 2021, 8, 53. [Google Scholar] [CrossRef]
  21. Taheri-Garavand, A.; Nasiri, A.; Fanourakis, D.; Fatahi, S.; Omid, M.; Nikoloudakis, N. Automated In Situ Seed Variety Identification via Deep Learning: A Case Study in Chickpea. Plants 2021, 10, 1406. [Google Scholar] [CrossRef] [PubMed]
  22. Nasiri, A.; Taheri-Garavand, A.; Fanourakis, D.; Zhang, Y.-D.; Nikoloudakis, N. Automated Grapevine Cultivar Identification via Leaf Imaging and Deep Convolutional Neural Networks: A Proof-of-Concept Study Employing Primary Iranian Varieties. Plants 2021, 10, 1628. [Google Scholar] [CrossRef] [PubMed]
  23. Kaya, Y.; Gürsoy, E. A novel multi-head cnn design to identify plant diseases using the fusion of rgb images. Ecol. Inform. 2023, 75, 101998. [Google Scholar] [CrossRef]
  24. Simonyan, K.; Zisserman, A. Very deep convolutional networks for large-scale image recognition. arXiv 2014, arXiv:1409.1556. [Google Scholar]
  25. He, K.; Zhang, X.; Ren, S.; Sun, J. Deep residual learning for image recognition. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Las Vegas, NV, USA, 27–30 June 2016; pp. 770–778. [Google Scholar]
  26. Huang, G.; Liu, Z.; Van Der Maaten, L.; Weinberger, K.Q. Densely connected convolutional networks. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Honolulu, HI, USA, 21–26 July 2017; pp. 4700–4708. [Google Scholar]
  27. Taheri-Garavand, A.; Nejad, A.R.; Fanourakis, D.; Fatahi, S.; Majd, M.A. Employment of artificial neural networks for non-invasive estimation of leaf water status using color features: A case study in Spathiphyllum wallisii. Acta Physiol. Plant. 2021, 43, 78. [Google Scholar] [CrossRef]
  28. Chandel, N.S.; Chakraborty, S.K.; Rajwade, Y.A.; Dubey, K.; Tiwari, M.K.; Jat, D. Identifying crop water stress using deep learning models. Neural Comput. Appl. 2020, 33, 5353–5367. [Google Scholar] [CrossRef]
  29. Zhuang, S.; Wang, P.; Jiang, B.; Li, M. Learned features of leaf phenotype to monitor maize water status in the fields. Comput. Electron. Agric. 2020, 172, 105347. [Google Scholar] [CrossRef]
  30. Niu, H.; Wang, D.; Chen, Y. Tree-level irrigation inference using UAV thermal imagery and convolutional neural networks. In Proceedings of the 2022 International Conference on Unmanned Aircraft Systems (ICUAS), Dubrovnik, Croatia, 21–24 June 2022; IEEE: Piscataway, NJ, USA, 2022; pp. 1586–1591. [Google Scholar] [CrossRef]
  31. Chandel, N.S.; Rajwade, Y.A.; Dubey, K.; Chandel, A.K.; Subeesh, A.; Tiwari, M.K. Water Stress Identification of Winter Wheat Crop with State-of-the-Art AI Techniques and High-Resolution Thermal-RGB Imagery. Plants 2022, 11, 3344. [Google Scholar] [CrossRef]
  32. de Melo, L.L.; de Melo, V.G.M.L.; Marques, P.A.A.; Frizzone, J.A.; Coelho, R.D.; Romero, R.A.F.; Barros, T.H.D.S. Deep learning for identification of water deficits in sugarcane based on thermal images. Agric. Water Manag. 2022, 272, 107820. [Google Scholar] [CrossRef]
  33. Katimbo, A.; Rudnick, D.R.; De Jonge, K.C.; Lo, T.H.; Qiao, X.; Franz, T.E.; Nakabuye, H.N.; Duan, J. Crop water stress index computation approaches and their sensitivity to soil water dynamics. Agric. Water Manag. 2022, 266, 107575. [Google Scholar] [CrossRef]
  34. Khan, M.I.; Saddique, Q.; Zhu, X.; Ali, S.; Ajaz, A.; Zaman, M.; Saddique, N.; Buttar, N.A.; Arshad, R.H.; Sarwar, A. Establishment of Crop Water Stress Index for Sustainable Wheat Production under Climate Change in a Semi-Arid Region of Pakistan. Atmosphere 2022, 13, 2008. [Google Scholar] [CrossRef]
  35. Gadzicki, K.; Khamsehashari, R.; Zetzsche, C. Early vs late fusion in multimodal convolutional neural networks. In Proceedings of the 2020 IEEE 23rd International Conference on Information Fusion (FUSION), Rustenburg, South Africa, 6–9 July 2020; IEEE: Piscataway, NJ, USA, 2020; pp. 1–6. [Google Scholar]
  36. Otsu, N. A threshold selection method from gray-level histograms. IEEE Trans. Syst. Man Cybern. 1979, 9, 62–66. [Google Scholar] [CrossRef]
Figure 1. RGB (upper row) and thermal (lower row) images in a rice field.
Figure 2. Overview of proposed framework.
Figure 3. Foreground and background segmentation process.
Figure 4. Result of foreground and background segmentation.
Figure 5. The architecture of the late fusion branch.
Figure 6. MLP architecture combining TLSef and TLSlf.
Figure 7. The confusion matrix of the three CNN architectures.
Figure 8. Accuracy of TLSef + TLSlf, with softmax features, and without MLP.
Table 1. Comparison of literature on the water content of plant leaves.

Method | Plants | Data Type of Input | Features and Classifiers | Remark
Taheri-Garavand et al. [27] | Spathiphyllum wallisii | RGB image | Texture features selected by PCA + MLP | Stricter shooting conditions and prior knowledge requirements.
Chandel et al. [28] | Corn, okra, and soybean | RGB image | CNN | RGB image lacks features and ignores degree information.
Zhuang et al. [29] | Corn | RGB image | CNN features + SVM |
Niu et al. [30] | Pomegranate trees | Thermal image | CNN | Classification of water stress too coarse; degree information not considered.
Chandel et al. [31] | Winter wheat | Thermal and RGB image | |
Melo et al. [32] | Sugarcane | Thermal image | |
Table 2. Dataset for our experiments.

Water Stress Situation | Number of Data
Well-watered | 136
Slight stress | 137
Mild stress | 130
Total | 403
Table 3. Comparison of the accuracy of the three CNN architectures.

CNN Architecture | Early Fusion with Softmax | Late Fusion with Softmax | MLP with TLSef + TLSlf
VGG16 | 36.59% | 75.61% | 100.0%
ResNet34 | 51.22% | 73.17% | 87.80%
DenseNet121 | 34.15% | 75.61% | 85.37%
Table 4. Precision, recall, and F1-score under the three CNN architectures.

CNN Architecture | Precision | Recall | F1-Score
VGG16 | 1.0000 | 1.0000 | 1.0000
ResNet34 | 0.9200 | 1.0000 | 0.9019
DenseNet121 | 1.0000 | 0.9231 | 0.8695
Table 5. Accuracy with softmax features under the three CNN architectures.

CNN Architecture | VGG16 | ResNet34 | DenseNet121
Accuracy | 95.12% | 82.93% | 75.61%
Table 6. Accuracy of the three statistical combinations.

CNN Architecture | Mean of TLSef and TLSlf | Maximum of TLSef and TLSlf | Minimum of TLSef and TLSlf
VGG16 | 80.48% | 78.05% | 92.68%
ResNet34 | 65.85% | 82.93% | 43.90%
DenseNet121 | 68.29% | 68.54% | 60.98%
Table 7. Correct case samples of TLSef and TLSlf for the three CNN architectures.

Case Number | CNN Architecture | TLSef | TLSlf | Ground Truth
Case 1 | VGG16 | 0.94414753 | 0.41299862 | Well-watered
Case 2 | VGG16 | 0.49225223 | 0.93656325 | Slight stress
Case 3 | VGG16 | 0.51609260 | 0.99480104 | Slight stress
Case 4 | VGG16 | 0.00144600 | 0.95988095 | Mild stress
Case 5 | ResNet34 | 0.81816900 | 0.28788370 | Well-watered
Case 6 | ResNet34 | 0.74497569 | 0.48903641 | Well-watered
Case 7 | ResNet34 | 0.48903641 | 0.22066300 | Slight stress
Case 8 | ResNet34 | 0.45508286 | 0.28079394 | Slight stress
Case 9 | DenseNet121 | 0.97530377 | 0.14977294 | Well-watered
Case 10 | DenseNet121 | 0.48703855 | 0.29539412 | Slight stress
Case 11 | DenseNet121 | 0.54188645 | 0.18642481 | Slight stress
Case 12 | DenseNet121 | 0.16764741 | 0.44175833 | Mild stress
Table 8. Error case samples of TLSef and TLSlf for the three CNN architectures.

CNN Architecture | TLSef | TLSlf | Ground Truth | Misjudgment
ResNet34 | 0.00104090 | 0.14355020 | Slight stress | Mild stress
ResNet34 | 0.54493946 | 0.48646522 | Mild stress | Slight stress
DenseNet121 | 0.99440467 | 0.99999380 | Slight stress | Well-watered
DenseNet121 | 0.78365338 | 0.99712139 | Mild stress | Slight stress
DenseNet121 | 0.67452341 | 0.99999666 | Mild stress | Slight stress
Table 9. Accuracy after replacing the MLP activation functions.

CNN Architecture | Tanh | Sigmoid
VGG16 | 100.00% | 100.00%
ResNet34 | 87.80% | 80.49%
DenseNet121 | 82.92% | 85.37%
Table 10. Accuracy using RGB images.

CNN Architecture | VGG16 | ResNet34 | DenseNet121
Accuracy | 48.78% | 56.10% | 46.34%