Article

Estimating the Leaf Area Index of Winter Wheat Based on Unmanned Aerial Vehicle RGB-Image Parameters

1 College of Resources and Environmental Science, Xinjiang University, Urumqi 830046, China
2 Key Laboratory of Oasis Ecology of Ministry of Education, Urumqi 830046, China
3 Key Laboratory for Wisdom City and Environmental Modeling, Xinjiang University, Urumqi 830046, China
4 Guangzhou Institute of Geography, Guangzhou 510070, China
* Author to whom correspondence should be addressed.
Sustainability 2019, 11(23), 6829; https://doi.org/10.3390/su11236829
Submission received: 18 September 2019 / Revised: 19 November 2019 / Accepted: 20 November 2019 / Published: 2 December 2019
(This article belongs to the Section Sustainable Agriculture)

Abstract

The leaf area index (LAI) is not only an important parameter for monitoring crop growth, but also an important input to crop yield prediction models and to hydrological and climatic models. Several studies have recently estimated crop LAI using unmanned aerial vehicle (UAV) multispectral and hyperspectral data; however, there are few studies on estimating the LAI of winter wheat from UAV RGB images. In this study, we estimated the LAI of winter wheat at the jointing stage on simple farmland in Xinjiang, China, using parameters derived from UAV RGB images. According to grey correlation analysis, UAV RGB-image parameters such as the Visible Atmospherically Resistant Index (VARI), the Red Green Blue Vegetation Index (RGBVI), the digital number (DN) of the blue channel (B) and the Green Leaf Algorithm (GLA) were selected to develop models for estimating the LAI of winter wheat. The results showed that it is feasible to use UAV RGB images for inverting and mapping the LAI of winter wheat at the jointing stage on the field scale, and the partial least squares regression (PLSR) model based on the VARI, RGBVI, B and GLA had the best prediction accuracy (R2 = 0.776, root mean square error (RMSE) = 0.468, residual prediction deviation (RPD) = 1.838) among all the regression models. To conclude, UAV RGB images not only have great potential for estimating the LAI of winter wheat, but can also provide more reliable and accurate data for precision agriculture management.

1. Introduction

The leaf area index (LAI), defined as the total one-sided leaf area per unit of surface area [1], leaf projection area [2] or half of the total interception area per unit of surface area [3], is an important parameter in controlling the physiological process of the vegetation canopy, which is closely related to biomass and crop yield. Therefore, the accurate and rapid estimation of crop LAI is not only conducive to better crop monitoring, but also conducive to its application in modeling, overall crop management and precision agriculture. Although the traditional LAI measurement method can obtain more accurate data, it is time-consuming and labor-intensive, and it is difficult to achieve large-scale overall monitoring [4].
Remote sensing technology has become an important tool in agriculture, because it has the great advantage of being fast, non-destructive and providing a large area of detection [5]. It has been successfully used to extract crop planting area, monitor crop growth and for parameter inversion [6,7,8,9,10,11,12]. Satellite observations used to obtain vegetation indices (VIs) contribute to the retrieving of crop LAI on earth, which is essential for achieving sustainable agriculture management. Studies conducted by Li et al. [13] showed that the four VIs (the normalized difference vegetation index (NDVI), the soil-adjusted vegetation index (SAVI), the enhanced vegetation index (EVI), and the 2-band enhanced vegetation index (EVI2)) derived from three different sensors (GF-1, HJ-1, Landsat-8) are all highly correlated with the LAI of winter wheat, and the spatial resolution must be considered in practical applications.
Although the scientific community continues to promote relevant research, inconsistencies remain in satellite monitoring. In recent years, unmanned aerial vehicle (UAV) remote sensing has emerged as the backbone of new remote sensing technology and has been favored by agricultural practitioners. Several investigations have examined the potential of UAV remote sensing platforms in precision agriculture research: compared with traditional platforms such as near-ground, airborne and satellite remote sensing, UAVs are more flexible, portable and responsive for farmland-scale research. Many scholars have mounted multispectral and hyperspectral sensors on UAVs to study crop LAI. For example, Xia et al. [14] showed that the modified triangular vegetation index (MTVI2) derived from UAV multispectral images performs best in monitoring wheat LAI. Gao et al. [15] claimed that applying UAV hyperspectral remote sensing in precision agriculture research can ensure the estimation accuracy of crop parameters while maintaining high work efficiency; their studies found that the ratio spectral index (RSI (494,610)) was highly positively correlated with the LAI and performed better in retrieving it. However, UAVs limit the size and weight of the sensors mounted on them, those sensors are expensive, and data processing is complex, which restricts the wide application of UAV remote sensing in agriculture. In this context, the popularity of consumer-grade UAVs and digital cameras has made the agricultural application of digital images possible. Many studies have demonstrated the capability of UAV RGB-image data for retrieving crop LAI and for monitoring growth status, biomass and yield. For example, Li et al.
[16] showed that it is feasible to use UAV digital images to estimate the LAI of soybean breeding materials, obtaining their growth information quickly, effectively and non-destructively, and providing a low-cost, feasible method for screening high-yield soybean varieties. Kim et al. [17] found that UAV RGB images can quantify spatial and temporal variation in the biophysical characteristics of Chinese cabbage and white radish over the growing season. According to Bendig et al. [18], UAV-based RGB images have potential for future biomass estimation by non-professionals, i.e., farmers. Research conducted by Guilherme et al. [19] has shown that the GRVI (Green-Red Vegetation Index) obtained from UAV RGB images can reflect the overall condition of the crop yield. These results show the agricultural application prospects of UAV digital images and promote the quantitative application of UAV remote sensing.
The area of interest for this research is simple farmland in Xinjiang, China, where there have been few attempts to apply UAV digital images directly to estimating crop LAI. The objective of this study is to examine the capability of UAV RGB images to estimate the LAI of winter wheat at the jointing stage, a key growth stage of winter wheat. To achieve this objective, the study was carried out in the following steps: (1) collecting ground sampling data, mosaicing UAV images and constructing 20 digital image feature parameters; (2) determining the estimation parameters and the inversion model for the winter wheat LAI; (3) evaluating the accuracy of the LAI estimates and discussing the feasibility of using UAV digital images to estimate the winter wheat LAI.

2. Materials

2.1. Study Area

The study area was located in Qitai County (89°13′ E–91°22′ E, 42°25′ N–45°29′ N), Changji Hui Autonomous Prefecture, Xinjiang Uygur Autonomous Region, China (Figure 1). The area has the arid, continental semi-desert climate of the middle temperate zone, with an average annual temperature of 5.5 °C, an average annual frost-free period of 153 days and an average annual precipitation of 269.4 mm. The main crop in the study area is wheat.

2.2. Design of Field Sampling

In this study, a farmer's wheat field without any trial design was selected for sampling at the jointing stage. The wheat was machine-planted under drip irrigation and was sown on 25 September 2017. Ground observations were carried out at intervals of 20 m in the east–west direction and 30 m in the north–south direction in a 130 m × 420 m plot, giving 78 sampling points in total. At each point, a small area of four rows × 50 cm was taken, and the five-point sampling method was used. After sampling, the sampling width was measured with a tape measure and the coordinates of the sampling points were recorded with GPS. The spatial distribution is shown in Figure 1.

2.3. Image Dataset

In this study, aerial data acquisition was carried out on 11 May 2018. Before the flight, the UAV equipment was checked, and route planning was carried out with the flight planning software of the DJI Phantom 4 Advanced UAV (Dà-Jiāng Innovations Science and Technology Co., Ltd., Shenzhen, China). The flight route, altitude, overlap and relevant camera parameters (focal length, exposure, etc.) were determined. The flight altitude was set to 120 m, and both the forward overlap and the side overlap were set to 80%. To ensure a safe flight, the landing point avoided ground obstacles such as cables and buildings, the GPS, remote control and map signals were verified to transmit normally, and the wind, wind speed and illumination in the flight area were monitored in real time. The flight was carried out over farmland with few obstacles nearby; the weather was clear and the wind force was below grade 3, satisfying the conditions for stable UAV flight. The aerial photography area of the wheat field was 206,500 m2, and a total of 76 single RGB aerial photographs were obtained. The highest elevation in the area was 833 m and the lowest about 813 m, a difference of 20 m. Pix4Dmapper (Pix4D SA, Prilly, Switzerland) was used to mosaic the aerial photographs, and the mosaic results were exported in TIFF format. The digital orthophoto map (DOM) and the digital surface model (DSM) are shown in Figure 2.

2.4. Field Measured LAI

There are two methods of measuring the LAI: direct and indirect. Direct measurement, such as the specific leaf weight (SLW) method, is destructive to some extent but relatively accurate. Indirect measurement is faster and covers a wider range; it includes optical instrument methods, such as the SUNSCAN canopy analysis system, and photographic methods based on digital image processing.
In this paper, Liu's [19] photographic method was slightly adjusted, and the LAI was calculated by the scanning method. Wheat samples were collected according to the sampling scheme in Section 2.2. The sampling area was denoted Ss and the total number of stems in the sample area n. Five stems were taken from the sample area by the five-point method as the samples for scanning (Figure 3a), and the leaves and stems were separated. A piece of A4 white paper of area Sp (21 cm × 29.7 cm) was placed on a flat writing board, and the leaves were fixed flat on the paper with glue so that no leaves overlapped. An ECOSYS FS-1125MFP multi-functional digital copier was used to scan the paper for all samples at a resolution of 600 × 600 dpi, and the scanned images were saved in TIFF format (Figure 3b; the fourth sampling point is shown as an example).
The TIFF files were opened in the Environment for Visualizing Images (ENVI), and a decision-tree classification was used to separate green leaves from white paper (Figure 3c). The number of pixels occupied by green leaves (PY) and by the remaining white paper (PZ) were then counted, so the total number of paper pixels was PZ + PY. The measured LAI values ranged from 4 to 8.45, with an average of 6.42 and a coefficient of variation of 0.17. The LAI was calculated with Equation (1):
LAI = PY / (PZ + PY) × Sp / Ss × n / 5  (1)
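Equation (1) can be sketched in code; the pixel counts and sampling values below are hypothetical placeholders, not measurements from the study:

```python
# Sketch of Equation (1): LAI from scanned-leaf pixel counts.
# p_y: pixels of green leaves; p_z: pixels of the remaining white paper;
# s_p: paper area; s_s: sampled ground area (same units as s_p);
# n: total stems in the sample area; 5 stems were scanned.
def estimate_lai(p_y, p_z, s_p, s_s, n, stems_scanned=5):
    leaf_fraction = p_y / (p_z + p_y)        # share of the paper covered by leaves
    scanned_leaf_area = leaf_fraction * s_p  # one-sided leaf area of the scanned stems
    return scanned_leaf_area / s_s * n / stems_scanned

# Hypothetical numbers: A4 paper (0.21 m x 0.297 m), 0.5 m^2 sample area, 260 stems
lai = estimate_lai(p_y=1_800_000, p_z=6_600_000,
                   s_p=0.21 * 0.297, s_s=0.5, n=260)
```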

2.5. Methods

2.5.1. RGB-Image Parameters

To discuss the feasibility of using UAV digital images to estimate the LAI of winter wheat, twenty RGB-based image parameters were selected (Table 1).
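Several of the Table 1 parameters can be computed directly from the red, green and blue digital numbers. The sketch below (the function name is ours, not from the paper) shows a few of them:

```python
import numpy as np

def rgb_indices(dn_r, dn_g, dn_b):
    """A few of the Table 1 parameters from RGB digital numbers (scalars or arrays)."""
    r, g, b = (np.asarray(x, dtype=float) for x in (dn_r, dn_g, dn_b))
    return {
        "VARI":  (g - r) / (g + r - b),            # Visible Atmospherically Resistant Index
        "RGBVI": (g**2 - b * r) / (g**2 + b * r),  # Red Green Blue Vegetation Index
        "GLA":   (2*g - r - b) / (2*g + r + b),    # Green Leaf Algorithm
        "EXG":   2*g - r - b,                      # Excess Green Index
        "g":     g / (r + g + b),                  # Normalized Greenness Intensity
    }

# Example on a single vegetated pixel
idx = rgb_indices(100, 150, 50)
```

Applied to a whole orthomosaic, the same function accepts the full R, G and B arrays at once thanks to NumPy broadcasting.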

2.5.2. Grey Correlation Analysis

In a grey correlation analysis system, the correlation degree describes the relative changes of factors during system development [31]. If the relative changes of two factors are basically consistent as the system develops, their correlation degree is considered large; otherwise, it is small [32]. Grey correlation analysis quantitatively describes and compares the development and change of a system, and measures the degree of correlation between a reference sequence and the sequences of influencing factors, reflecting the system's behavior characteristics [33]. In this study, the wheat LAIs were used as the reference arrays and the RGB-based image parameters as the comparison arrays. The specific steps were as follows [34,35]:
(1)
Dimensionless processing of the wheat LAI data.
(2)
Calculation of the correlation coefficient. The grey correlation coefficient between the reference arrays and the comparison arrays was calculated according to Equation (2):
γ_0i(k) = (Δ_min + ρ·Δ_max) / (Δ_0i(k) + ρ·Δ_max)  (2)
In the formula, γ_0i(k) is the correlation coefficient between the reference array and the i-th comparison array at point k; Δ_min and Δ_max are the minimum and maximum absolute differences, respectively; ρ is the identification coefficient, with a value in [0, 1] (generally 0.5); and Δ_0i(k) is the absolute difference array.
(3)
Calculation of the grey correlation degree γ_i according to Equation (3):
γ_i = (1/n) Σ_{k=1}^{n} γ_0i(k)  (3)
(4)
Ranking of the grey correlation degree. The UAV RGB-based image parameters were sorted by their correlation degrees γ_i.

2.5.3. Establishment and Validation of the Model

In this study, the RGB-based image parameters were taken as independent variables and the LAIs as dependent variables. The 78 field-measured LAIs were randomly divided into two parts, half as a training set and half as a validation set. Four univariate regression models (linear, exponential, power function, quadratic polynomial) and a multivariate regression model (partial least squares regression, PLSR) were compared, and the coefficient of determination (R2), the root mean square error (RMSE) and the residual prediction deviation (RPD) were used to evaluate the models.

3. Results

3.1. Grey Correlation Analysis between the LAI and the RGB-Based Image Parameters

The grey correlation degree between the LAI and the UAV RGB-based image parameters was calculated and sorted as the basis for establishing the model in the next step (Table 2). According to the ranking of grey correlation degree, the first four RGB-based image parameters were selected as modeling parameters.
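The grey correlation computation can be sketched as follows; the mean-normalization used for the dimensionless step is one common choice, and the toy arrays are illustrative only:

```python
import numpy as np

def grey_correlation(reference, comparisons, rho=0.5):
    """Grey correlation degree of each comparison array against the reference
    (Equations (2) and (3)); reference: shape (n,), comparisons: shape (m, n)."""
    ref = np.asarray(reference, dtype=float)
    comp = np.asarray(comparisons, dtype=float)
    # Step 1: dimensionless processing (here: divide each series by its mean)
    ref = ref / ref.mean()
    comp = comp / comp.mean(axis=1, keepdims=True)
    # Step 2: correlation coefficients, Equation (2)
    delta = np.abs(comp - ref)            # absolute difference arrays
    d_min, d_max = delta.min(), delta.max()
    gamma = (d_min + rho * d_max) / (delta + rho * d_max)
    # Step 3: grey correlation degree, Equation (3)
    return gamma.mean(axis=1)

# Step 4: ranking -- a larger degree means the parameter tracks the LAI more closely
degrees = grey_correlation([1.0, 2.0, 3.0], [[1.0, 2.0, 3.0], [3.0, 2.0, 1.0]])
order = np.argsort(-degrees)
```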

3.2. Establishment and Validation of the Model

Four univariate regression models were built for the VARI using linear, exponential, power function and quadratic polynomial algorithms and a multivariate regression model (partial least squares regression, PLSR) was built by using the VARI, RGBVI, B, and GLA (Table 3). It is observed from Table 3 that the R2 and RMSE of each univariate model were similar, and the R2 and RMSE of the quadratic polynomial model reached the maximum and minimum, respectively. In the multivariate PLSR model, the combination of the VARI, RGBVI, B and GLA was the best, R2 was 0.767, and RMSE was 0.422. Compared with the univariate model, the modeling effect of the multivariate PLSR model was better, which shows that the LAI of winter wheat can be better estimated based on multiple UAV RGB-based image parameters.
To assess the reliability of the models, the validation data were substituted into the LAI estimation models, and the predicted LAI values were linearly fitted against the measured values (Figure 4). The prediction accuracy of each model was assessed by the R2, RMSE and RPD. The validation points are evenly distributed on both sides of the 1:1 line; R2 is 0.707 and 0.776 (both greater than 0.7), RMSE is 0.533 and 0.468, and RPD is greater than 1.4 for both models, showing that both are feasible for predicting the LAI of winter wheat. The stability and predictive ability of the multivariate PLSR model are better than those of the univariate quadratic polynomial model. Figure 5a,b show the spatial distribution maps of the LAI from the univariate quadratic polynomial model and the multivariate PLSR model, respectively. The spatial distribution trends predicted by the two models are broadly similar and, to some extent, reflect the growth of winter wheat in the field: the larger the LAI, the more vigorous the growth; the smaller the LAI, the poorer the growth. This was basically consistent with the field investigation. The LAI values over most of the farmland ranged from 6.5 to 7.5; the areas above 7.5 were mainly in the northwest corner of the farmland, which may be due to their lower elevation compared with other areas and the accumulation of rainwater. The winter wheat was sparse in the middle and upper parts of the experimental field because of soil salinization.
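The three validation statistics have simple closed forms; a minimal sketch (function and variable names are ours):

```python
import numpy as np

def validation_metrics(measured, predicted):
    """Coefficient of determination (R^2), RMSE, and residual prediction
    deviation (RPD = standard deviation of the measurements / RMSE)."""
    m = np.asarray(measured, dtype=float)
    p = np.asarray(predicted, dtype=float)
    rmse = np.sqrt(np.mean((m - p) ** 2))
    r2 = 1.0 - np.sum((m - p) ** 2) / np.sum((m - m.mean()) ** 2)
    rpd = np.std(m, ddof=1) / rmse
    return r2, rmse, rpd
```

An RPD above roughly 1.4 is commonly read as indicating a usable model, matching the threshold applied in this validation.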

4. Discussion

With the rapid development of modern agriculture, fast and accurate monitoring of crop growth information is the key to precision agricultural management. Progress has been made in remote sensing with consumer-grade UAVs, providing evidence for the application of UAV digital images. However, there are few studies on LAI estimation using UAV digital images directly in Xinjiang, China. In this paper, the LAI of winter wheat was estimated using only UAV RGB-based image parameters, which is basically consistent with existing studies estimating the LAI of soybean breeding materials, winter wheat and rice from UAV RGB-based image parameters alone [13,15,36]. The VARI based on UAV RGB images is a new parameter constructed according to the VARI calculation principle; it retains the advantages of the VARI to a certain extent and shows high sensitivity to the winter wheat LAI. There are still few studies estimating crops' physical and chemical parameters from UAV RGB-image parameters constructed on the calculation principles of vegetation indices, and this study provides some new ideas for doing so in the future. Although the regression models used in this study are relatively simple, they have fewer input parameters and are easier to operate than physical models such as PROSAIL, which increases their universality. In addition, the acquisition of UAV RGB images in this study was simple, cost-saving and labor-saving, and the reliability of the data was ensured by mature flight control and data processing software.
However, there are some shortcomings in this study. Although the data used in this experiment were obtained under stable solar radiation conditions, the problem of radiation calibration still restricts the application of UAV digital images to a certain extent. This study is limited to estimating the LAI from RGB images at the jointing stage of winter wheat. It is necessary to acquire the whole growth period data for the dynamic monitoring of the LAI in winter wheat and to establish a more universal LAI estimation model. Because this study focuses on the feasibility of estimating the LAI of winter wheat using UAV RGB images, there is no comparison between gray value and spectral reflectance in monitoring crop growth.

5. Conclusions and Future Work

In this study, we introduced a simple method for estimating the LAI of winter wheat at the jointing stage using 20 parameters derived from UAV RGB images. First, it was demonstrated that UAV RGB images, with a high spatial resolution of 3 cm, are well suited to deriving LAI-related parameters at the field scale. The grey correlation degree between the VARI and the LAI of winter wheat was the highest among the 20 UAV RGB-image parameters, indicating that the VARI correlates best with the LAI at the jointing stage and that parameters with high grey correlation derived from UAV RGB images can be used for modeling.
Then, in the second step, four univariate and one multivariate model for estimating the LAI of winter wheat were developed and tested. The coefficients of determination, R2 = 0.707 for the quadratic polynomial model based on VARI and R2 = 0.776 for the PLSR model, illustrated that parameters derived from UAV RGB images are suitable indicators for the LAI of winter wheat at the jointing stage. The results presented here need to be evaluated in multiple-growth stage and multiple-year field-scale studies to ensure the model’s robustness and transferability.
In the next step, the results from this study will be combined with more information to explore the application of consumer-grade UAV remote sensing in precision agriculture. Future work includes: (1) acquiring RGB images at different angles and heights for comparison; (2) incorporating geometric information (e.g., plant height), since only canopy information was used in this study; (3) testing more wheat fields, since only one winter wheat field was used here, to improve the accuracy of the prediction model; (4) applying optimization and machine learning algorithms to estimate the LAI of winter wheat more accurately.

Author Contributions

Project administration, M.S.; Supervision, S.C.; Writing—original draft, U.H.

Funding

This work was supported by the National Natural Science Foundation of China [41361016] and “Space-Air-Ground Remote Sensing Big Data Fusion-based Agricultural Typhoon Disaster Prediction Analysis Technology”, Minsheng Science and Technology Special Topic of Basic Research Project of Guangzhou Municipality.

Acknowledgments

First of all, I would like to thank the two anonymous reviewers for their valuable comments and suggestions. Secondly, I would like to thank my wife Xatgul for her support and encouragement for my work.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Watson, D.J. Comparative Physiological Studies on the Growth of Field Crops: I. Variation in Net Assimilation Rate and Leaf Area between Species and Varieties, and within and between Years. Ann. Bot. 1947, 11, 41–76. [Google Scholar] [CrossRef]
  2. Barclay, J.H. Conversion of total leaf area to projected leaf area in lodgepole pine and Douglas-fir. Tree Physiol. 1998, 18, 185–193. [Google Scholar] [CrossRef] [PubMed]
  3. Chen, J.M.; Black, T.A. Defining leaf area index for non-flat leaves. Plant Cell Environ. 1992, 15, 421–429. [Google Scholar] [CrossRef]
  4. Nie, S.; Wang, C.; Dong, P.; Xi, X. Estimating leaf area index of maize using airborne full-waveform lidar data. Remote Sens. Lett. 2016, 7, 111–120. [Google Scholar] [CrossRef]
  5. Mirzaie, M.; Darvishzadeh, R.; Shakiba, A.; Matkan, A.A.; Atzberger, C.; Skidmore, A. Comparative analysis of different uni- and multi-variate methods for estimation of vegetation water content using hyper-spectral measurements. Int. J. Appl. Earth Obs. Geoinf. 2014, 26, 1–11. [Google Scholar] [CrossRef]
  6. Pan, Y.Z.; Li, L.; Zhang, J.S.; Liang, S.L.; Hou, D. Crop area estimation based on MODIS-EVI time series according to distinct characteristics of key phenology phases: A case study of winter wheat area estimation in small scale area. J. Remote Sens. 2011, 15, 578–594. [Google Scholar]
  7. Deng, L.Y.; Shen, Z.F.; Ke, Y.M.; Xu, Z.Y. Winter wheat planting area extraction using multi-temporal remote sensing images based on field parcel. Trans. CSAE 2018, 34, 157–164. [Google Scholar]
  8. Pei, H.J.; Feng, H.K.; Li, C.C.; Jin, X.L.; Li, Z.H.; Yang, G.J. Remote sensing monitoring of winter wheat growth with UAV based on comprehensive index. Trans. CSAE 2017, 33, 74–82. [Google Scholar]
  9. Wang, P.; Zhou, Y.; Huo, Z.; Han, L.; Qiu, J.; Tan, Y.; Liu, D. Monitoring growth condition of spring maize in northeast china using a process-based model. Int. J. Appl. Earth Obs. Geoinf. 2018, 66, 27–36. [Google Scholar] [CrossRef]
  10. Bumsuk, S.; Jihye, L.; Kyung, D.L.; Sukyoung, H.; Sinkyu, K. Improving remotely-sensed crop monitoring by NDVI-based crop phenology estimators for corn and soybeans in Iowa and Illinois, USA. Field Crop. Res. 2019, 238, 113–128. [Google Scholar]
  11. Ritika, S.; Subrata, N.; Patel, N.R. Estimating leaf area index and light extinction coefficient using Random Forest regression algorithm in a tropical moist deciduous forest, India. Ecol. Inform. 2019, 52, 94–102. [Google Scholar]
  12. Pasolli, L.; Asam, S.; Castelli, M.; Bruzzone, L.; Wohlfahrt, G.; Zebisch, M.; Notarnicola, C. Retrieval of leaf area index in mountain grasslands in the alps from modis satellite imagery. Remote Sens. Environ. 2015, 165, 159–174. [Google Scholar] [CrossRef]
  13. Li, H.; Chen, Z.X.; Jiang, Z.W.; Wu, W.B.; Ren, J.Q.; Liu, B.; Tuya, H. Comparative analysis of GF-1, HJ-1, and Landsat-8 data for estimating the leaf area index of winter wheat. J. Integr. Agric. 2017, 16, 266–285. [Google Scholar] [CrossRef]
  14. Xia, Y.; Ni, W.; Yong, L.; Tao, C.; Tian, Y.C.; Qi, C.; Yan, Z. Estimation of Wheat LAI at Middle to High Levels Using Unmanned Aerial Vehicle Narrowband Multispectral Imagery. Remote Sens. 2017, 9, 1304. [Google Scholar]
  15. Gao, L.; Yang, G.J.; Yu, H.Y.; Xu, B.; Zhao, X.Q.; Dong, J.H.; Ma, Y.B. Retrieving winter wheat leaf area index based on unmanned aerial vehicle hyperspectral remote sensing. Trans. CSAE 2016, 32, 113–120. [Google Scholar]
  16. Li, C.C.; Niu, Q.L.; Yang, G.J.; Feng, H.K.; Liu, J.G.; Wang, Y.J. Estimation of Leaf Area Index of Soybean Breeding Materials Based on UAV Digital Images. Trans. Chin. Soc. Agric. Mach. 2017, 48, 147–158. [Google Scholar]
  17. Kim, D.W.; Yun, H.S.; Jeong, S.J.; Kwon, Y.S.; Kim, S.G.; Lee, W.S.; Kim, H.J. Modeling and Testing of Growth Status for Chinese Cabbage and White Radish with UAV-Based RGB Imagery. Remote Sens. 2018, 10, 563. [Google Scholar] [CrossRef]
  18. Bendig, J.; Bolten, A.; Bennertz, S.; Broscheit, J.; Eichfuss, S.; Bareth, G. Estimating Biomass of Barley Using Crop Surface Models (CSMs) Derived from UAV-Based RGB Imaging. Remote Sens. 2014, 6, 10395–10412. [Google Scholar] [CrossRef]
  19. Liu, R.Y.; Wang, J.H.; Yang, G.J.; Huang, W.J.; Li, W.G.; Chang, H.; Li, X.W. Comparison of ground-based LAI measuring methods on winter wheat. Trans. CSAE 2011, 27, 220–224. [Google Scholar]
  20. Niu, Q.L.; Feng, H.K.; Yang, G.J.; Li, C.C.; Yang, H.; Xu, B.; Zhao, Y.X. Monitoring plant height and leaf area index of maize breeding material based on UAV digital images. Trans. CSAE 2018, 34, 73–82. [Google Scholar]
  21. Torres-Sánchez, J.; Peña, J.M.; De Castro, A.I.; López-Granados, F. Multi-temporal mapping of the vegetation fraction in early-season wheat fields using images from UAV. Comput. Electron. Agric. 2014, 103, 104–113. [Google Scholar] [CrossRef]
  22. Gitelson, A.A.; Viña, A.; Arkebauer, T.J.; Rundquist, D.C. Remote estimation of leaf area index and green leaf biomass in maize canopies. Geophys. Res. Lett. 2003, 30, 335–343. [Google Scholar] [CrossRef]
  23. Verrelst, J.; Schaepman, M.E.; Koetz, B.; Kneubühler, M. Angular sensitivity analysis of vegetation indices derived from chris/proba data. Remote Sens. Environ. 2008, 112, 2341–2353. [Google Scholar] [CrossRef]
  24. Sellaro, R.; Crepy, M.; Trupkin, S.A.; Karayekov, E.; Buchovsky, A.S.; Rossi, C.; Casal, J.J. Cryptochrome as a sensor of the blue/green ratio of natural radiation in arabidopsis. Plant. Physiol. 2010, 154, 401–409. [Google Scholar] [CrossRef] [PubMed]
  25. Ahmad, I.S.; Reid, J.F. Evaluation of colour representations for maize images. J. Agric. Eng. Res. 1996, 63, 185–195. [Google Scholar] [CrossRef]
  26. Kawashima, S.; Nakatani, M. An algorithm for estimating chlorophyll content in leaves using a video camera. Ann. Bot. 1998, 81, 49–54. [Google Scholar] [CrossRef]
  27. Saberioon, M.M.; Amin, M.S.M.; Anuar, A.R.; Gholizadeh, A.; Wayayok, A.; Khairunniza-Bejo, S. Assessment of rice leaf chlorophyll content using visible bands at different growth stages at both the leaf and canopy scale. Int. J. Appl. Earth Obs. Geoinf. 2014, 32, 35–45. [Google Scholar] [CrossRef]
  28. Bendig, J.; Yu, K.; Aasen, H.; Bolten, A.; Bennertz, S.; Broscheit, J.; Gnyp, M.J.; Bareth, G. Combining UAV-based plant height from crop surface models, visible, and near infrared vegetation indices for biomass monitoring in barley. Int. J. Appl. Earth Obs. Geoinf. 2015, 39, 79–87. [Google Scholar] [CrossRef]
  29. Tucker, C.J. Red and photographic infrared linear combinations for monitoring vegetation. Remote Sens. Environ. 1979, 8, 127–150. [Google Scholar] [CrossRef]
  30. Louhaichi, M.; Borman, M.M.; Johnson, D.E. Spatially located platform and aerial photography for documentation of grazing impacts on wheat. Geocarto Int. 2001, 16, 65–70. [Google Scholar] [CrossRef]
  31. Liu, S.F.; Dang, Y.G.; Fang, Z.G. Grey System Theory and Its Application; Science Press: Beijing, China, 2004. [Google Scholar]
  32. Deng, J.L. Basic of Grey System; Huazhong University of Science and Technology Press: Wuhan, China, 2002. [Google Scholar]
  33. Xia, X.; Sun, Y.; Wu, K.; Jiang, Q. Optimization of a straw ring-die briquetting process combined analytic hierarchy process and grey correlation analysis method. Fuel Process. Technol. 2016, 152, 303–309. [Google Scholar] [CrossRef]
  34. Umut, H.; Mamat, S.; Nijat, K.; Nigela, T.; Wang, J.Z.; Irxat, A. Hyperspectral Estimation Model of Leaf Water Content in Spring Wheat Based on Grey Correlational Analysis. Spectrosc. Spectr. Anal. 2018, 38, 3905–3911. [Google Scholar]
  35. Fang, S.S.; Yao, X.S.; Zhang, J.Q.; Han, M. Grey correlation analysis on travel modes and their influence factors. Procedia Eng. 2017, 174, 347–352. [Google Scholar] [CrossRef]
  36. Zhou, X.; Zheng, H.B.; Xu, X.Q.; He, J.Y.; Ge, X.K.; Yao, X.; Cheng, T.; Zhu, Y.; Cao, W.X.; Tian, Y.C. Predicting grain yield in rice using multi-temporal vegetation indices from UAV-based multispectral and digital imagery. ISPRS J. Photogramm. Remote Sens. 2017, 130, 246–255. [Google Scholar] [CrossRef]
Figure 1. Sketch map of the study area.
Figure 1. Sketch map of the study area.
Sustainability 11 06829 g001
Figure 2. Image mosaic result of UAV.
Figure 3. Scanning and classification results of wheat leaves. (a) Sampling method; (b) Scanning results; (c) Classification results.
Figure 4. Fitting analysis chart between the measured and the predicted values of the LAI.
Figure 5. Spatial distribution map of the LAI.
Table 1. UAV RGB-based image parameters related to the LAI.

| Abbreviation | Name | Equation | References |
|---|---|---|---|
| R | DN value of Red Channel | R = DN_R | [16] |
| G | DN value of Green Channel | G = DN_G | [16] |
| B | DN value of Blue Channel | B = DN_B | [16] |
| r | Normalized Redness Intensity | DN_R/(DN_R + DN_G + DN_B) | [20] |
| g | Normalized Greenness Intensity | DN_G/(DN_R + DN_G + DN_B) | [20] |
| b | Normalized Blueness Intensity | DN_B/(DN_R + DN_G + DN_B) | [20] |
| EXG | Excess Green Index | 2 × DN_G − DN_R − DN_B | [21] |
| VARI | Visible Atmospherically Resistant Index | (DN_G − DN_R)/(DN_G + DN_R − DN_B) | [22] |
| GRRI | Green Red Ratio Index | DN_G/DN_R | [23] |
| GBRI | Green Blue Ratio Index | DN_G/DN_B | [24] |
| RBRI | Red Blue Ratio Index | DN_R/DN_B | [24] |
| INT | Color Intensity | (DN_R + DN_G + DN_B)/3 | [25] |
| IKAW | Kawashima Index | (DN_R − DN_B)/(DN_R + DN_B) | [26] |
| IPCA | Principal Component Analysis Index | 0.994·abs(DN_R − DN_B) + 0.961·abs(DN_G − DN_B) + 0.914·abs(DN_G − DN_R) | [27] |
| MGRVI | Modified Green Red Vegetation Index | (DN_G² − DN_R²)/(DN_G² + DN_R²) | [28] |
| RGBVI | Red Green Blue Vegetation Index | (DN_G² − DN_B × DN_R)/(DN_G² + DN_B × DN_R) | [28] |
| GRVI | Green Red Vegetation Index | (DN_G − DN_R)/(DN_G + DN_R) | [29] |
| GLA | Green Leaf Algorithm | (2 × DN_G − DN_R − DN_B)/(2 × DN_G + DN_R + DN_B) | [30] |
| CIVE | Color Index of Vegetation | 0.441DN_R − 0.881DN_G + 0.385DN_B + 18.7875 | [20] |
| VDVI | Visible Differential Vegetation Index | (2 × DN_G − DN_R − DN_B)/(2 × DN_G + DN_R + DN_B) | [21] |
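The parameters in Table 1 are computed directly from the digital numbers (DN) of the RGB channels. A minimal NumPy sketch for the four parameters retained by the grey correlation analysis (VARI, RGBVI, B and GLA); the function name and the small epsilon guard against zero denominators are illustrative additions, not part of the paper's workflow:

```python
import numpy as np

def rgb_indices(img):
    """Compute VARI, RGBVI, B and GLA (Table 1) from an RGB array
    of shape (H, W, 3) holding the raw digital numbers DN_R, DN_G, DN_B."""
    dn = np.asarray(img, dtype=np.float64)
    r, g, b = dn[..., 0], dn[..., 1], dn[..., 2]
    eps = 1e-9  # guard against division by zero in dark pixels (assumption)
    vari = (g - r) / (g + r - b + eps)
    rgbvi = (g ** 2 - b * r) / (g ** 2 + b * r + eps)
    gla = (2 * g - r - b) / (2 * g + r + b + eps)
    return {"VARI": vari, "RGBVI": rgbvi, "B": b, "GLA": gla}
```

For example, a pixel with DN values (R, G, B) = (100, 150, 50) gives VARI = (150 − 100)/(150 + 100 − 50) = 0.25.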
Table 2. Grey correlation degree and order between the LAI and the UAV RGB-based image parameters.

| RGB-Based Image Parameter | Grey Correlation Degree (Order) | RGB-Based Image Parameter | Grey Correlation Degree (Order) |
|---|---|---|---|
| VARI | 0.9166 (1) | GBRI | 0.8638 (11) |
| RGBVI | 0.8891 (2) | EXG | 0.8478 (12) |
| B | 0.8879 (3) | R | 0.8462 (13) |
| GLA | 0.8837 (4) | INT | 0.8426 (14) |
| G | 0.8817 (5) | IPCA | 0.8315 (15) |
| VDVI | 0.8804 (6) | r | 0.7919 (16) |
| MGRVI | 0.8801 (7) | CIVE | 0.7877 (17) |
| g | 0.8718 (8) | b | 0.7872 (18) |
| GRVI | 0.8697 (9) | RBRI | 0.7573 (19) |
| GRRI | 0.8647 (10) | IKAW | 0.7543 (20) |
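The degrees in Table 2 follow Deng's grey relational analysis [32]: each parameter series is compared against the measured LAI series, and the mean relational coefficient gives the correlation degree. A sketch under stated assumptions — mean-normalization of each series and the conventional distinguishing coefficient ρ = 0.5; the paper does not report its normalization choice, so exact values will differ:

```python
import numpy as np

def grey_correlation_degree(reference, comparisons, rho=0.5):
    """Deng's grey relational degree between a reference series (measured LAI)
    and each comparison series (an RGB-image parameter across sample plots)."""
    x0 = np.asarray(reference, dtype=float)
    x0 = x0 / x0.mean()  # mean-normalize so different scales are comparable
    xs = np.array([np.asarray(x, dtype=float) / np.mean(x) for x in comparisons])
    diffs = np.abs(xs - x0)              # absolute differences to the reference
    dmin, dmax = diffs.min(), diffs.max()
    coeffs = (dmin + rho * dmax) / (diffs + rho * dmax)  # relational coefficients
    return coeffs.mean(axis=1)           # one degree per comparison series
```

A series identical to the reference receives the maximum degree of 1; the parameters in Table 2 are then ranked by their degrees in descending order.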
Table 3. LAI estimation models based on UAV RGB-based image parameters.

| Type of Model | RGB-Based Image Parameters | Model Equation | R² | RMSE |
|---|---|---|---|---|
| Univariate regression model | VARI | y = 11.1632x + 2.6115 | 0.725 | 0.475 |
| | | y = 11.6809x^0.5512 | 0.722 | 0.474 |
| | | y = 3.4525e^(1.7984x) | 0.714 | 0.485 |
| | | y = −4.0725x² + 13.6643x + 2.2510 | 0.726 | 0.473 |
| Multivariate regression model | VARI, RGBVI, B, GLA | LAI = 3.9941 × VARI + 4.8813 × RGBVI + 0.0122 × B + 6.0529 × GLA + 1.2818 | 0.767 | 0.422 |

In the univariate models, x denotes VARI and y the LAI.
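The multivariate equation in Table 3 can be applied pixel-wise to the parameter layers to map the LAI across the field (as in Figure 5). A minimal sketch; the equation and coefficients are taken from Table 3, while the function and dictionary layout are illustrative:

```python
import numpy as np

# Coefficients of the multivariate model in Table 3:
# LAI = 3.9941*VARI + 4.8813*RGBVI + 0.0122*B + 6.0529*GLA + 1.2818
COEF = {"VARI": 3.9941, "RGBVI": 4.8813, "B": 0.0122, "GLA": 6.0529}
INTERCEPT = 1.2818

def predict_lai(params):
    """Evaluate the fitted multivariate regression element-wise.
    `params` maps parameter names to arrays of identical shape."""
    lai = np.full(np.shape(params["VARI"]), INTERCEPT, dtype=float)
    for name, c in COEF.items():
        lai = lai + c * np.asarray(params[name], dtype=float)
    return lai
```

Feeding in the per-pixel VARI, RGBVI, B and GLA layers from the mosaicked UAV image yields the LAI raster used for the spatial distribution map.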
Hasan, U.; Sawut, M.; Chen, S. Estimating the Leaf Area Index of Winter Wheat Based on Unmanned Aerial Vehicle RGB-Image Parameters. Sustainability 2019, 11, 6829. https://doi.org/10.3390/su11236829