# Estimation of Fv/Fm in Spring Wheat Using UAV-Based Multispectral and RGB Imagery with Multiple Machine Learning Methods


## Abstract

(R² = 0.925, RMSLE = 0.014, MAPE = 0.026). The combined analysis suggests that extracting vegetation indices (SIPI, ExR, and VEG) from UAV-based RGB and MS remote sensing images as model input variables and using the ARD model can significantly improve the accuracy of Fv/Fm estimation at the flowering stage. This approach provides new technical support for rapid and accurate monitoring of Fv/Fm in spring wheat in the Hetao Irrigation District.

## 1. Introduction

## 2. Materials and Methods

#### 2.1. Study Site and Experimental Design

m². The plots were arranged in randomized groups. The sowing rate was set at 300 kg/ha. Phosphorus fertilizer was applied as a basal fertilizer during sowing, and no potassium fertilizer was applied during the entire reproductive phase. Three flood irrigations were performed at the tillering, heading, and grain-filling stages, each with a volume of 900 m³/ha.

#### 2.2. UAV Multispectral Data Acquisition and Processing

#### 2.3. Construction and Selection of Spectral Indices

#### 2.4. Fluorescence Data Acquisition and Processing

#### 2.5. Construction of Regression Model

#### 2.6. Segmentation of Dataset and Accuracy Evaluation

R² (coefficient of determination) is a statistical measure that represents the proportion of the variance in the dependent variable that is predictable from the independent variables. In regression analysis, R² is used to evaluate the goodness of fit of the model. Generally, it ranges from 0 to 1, with a higher value indicating a better fit. An R² of 1 indicates that the model perfectly predicts the target variable, while an R² of 0 indicates that the model does not explain any variance in the target variable;
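As a concrete illustration, the evaluation metrics listed in this section can be computed with scikit-learn; the Fv/Fm values below are hypothetical and serve only to show the calculation, not the study's actual evaluation code.

```python
# Sketch of the evaluation metrics used in this study (R2, MAE, MSE, RMSE,
# RMSLE, MAPE), computed with scikit-learn on hypothetical Fv/Fm values.
import numpy as np
from sklearn.metrics import (
    r2_score, mean_absolute_error, mean_squared_error,
    mean_squared_log_error, mean_absolute_percentage_error,
)

y_true = np.array([0.78, 0.80, 0.75, 0.82, 0.79])  # measured Fv/Fm (hypothetical)
y_pred = np.array([0.77, 0.81, 0.74, 0.80, 0.79])  # model estimates (hypothetical)

r2 = r2_score(y_true, y_pred)
mae = mean_absolute_error(y_true, y_pred)
mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)
# RMSLE is the square root of the mean squared log error (valid for positive values)
rmsle = np.sqrt(mean_squared_log_error(y_true, y_pred))
mape = mean_absolute_percentage_error(y_true, y_pred)
```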

## 3. Results

#### 3.1. Basic Statistical Information of the Fv/Fm Dataset

#### 3.2. Correlation Analysis of Fv/Fm with Multispectral and RGB Vegetation Indices

#### 3.3. Important Features Selected after Data Pre-Processing

#### 3.4. Development and Evaluation of Models Based on RGB VIs

R² accuracy scores, with the top-performing model listed first. Evaluation metrics including mean absolute error (MAE), mean squared error (MSE), root mean squared error (RMSE), root mean squared logarithmic error (RMSLE), mean absolute percentage error (MAPE), and computation time (TT) were used to assess model performance. The gradient boosting regression (GBR) model achieved the highest accuracy, with an R² score of 0.800, followed closely by the random forest (RF) model with an R² score of 0.795. Most of the other models performed relatively poorly, with R² scores ranging from 0.789 to −116.050. Notably, the worst-performing models, including lasso, elastic net, lasso least angle regression (LLAR), dummy, support vector machine (SVM), and kernel ridge regression (KR), had negative R² scores. In addition to the R² scores, the evaluation metrics showed that the best-performing models also had the lowest MAE, MSE, RMSE, RMSLE, and MAPE values, demonstrating their ability to make accurate predictions with low error rates. However, computation time varied considerably among the models, with some taking significantly longer than others. In conclusion, these results suggest that the GBR model is the most accurate and efficient for estimating Fv/Fm from RGB vegetation indices.
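The comparison described above can be sketched as follows, fitting GBR and RF on synthetic stand-ins for the RGB vegetation-index features; the data and the simple linear ground truth are hypothetical, not the study's dataset.

```python
# Minimal sketch of the model comparison: gradient boosting (GBR) vs.
# random forest (RF), scored by R2 on a held-out test set.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor, RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
X = rng.uniform(size=(200, 3))  # hypothetical stand-ins for e.g. SIPI, ExR, VEG
y = 0.6 + 0.2 * X[:, 0] - 0.1 * X[:, 1] + rng.normal(0, 0.01, 200)  # pseudo Fv/Fm

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
scores = {
    name: r2_score(y_te, model.fit(X_tr, y_tr).predict(X_te))
    for name, model in [
        ("GBR", GradientBoostingRegressor(random_state=0)),
        ("RF", RandomForestRegressor(random_state=0)),
    ]
}
```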

#### 3.5. Development and Evaluation of Models Based on MS VIs

R² scores, ranging from 0.860 to 0.849, were Huber, LR, Ridge, LAR, OMP, BR, ARD, and TR. These models had the lowest MAE, MSE, RMSE, and RMSLE values among all the models, indicating that they produced the most accurate estimates of Fv/Fm. Their computation times ranged from 0.004 to 0.316 s, with the LR model taking the longest. The next group of models, with R² scores ranging from 0.794 to 0.684, included KNN, RF, CatBoost, ADA, GBR, ET, and PAR. These models had higher MAE, MSE, RMSE, and RMSLE values than the top group, indicating less accurate estimates, and their computation times ranged from 0.006 to 0.190 s. The last group, with R² scores ranging from 0.668 to −0.567, included XGB, DT, RANSAC, LGBM, SVM, Lasso, EN, LLAR, Dummy, MLP, and KR. These models had the lowest R² scores and the highest MAE, MSE, RMSE, and RMSLE values, indicating that they produced the least accurate estimates of Fv/Fm; their computation times ranged from 0.004 to 0.298 s, with the KR model taking the longest. Overall, the Huber model was the most accurate for estimating Fv/Fm using multispectral vegetation indices.
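The Huber regressor that tops this comparison is a robust linear model that down-weights large residuals. A minimal sketch on synthetic data (the MSAVI2-like feature and the contaminated samples are hypothetical):

```python
# Sketch: Huber regression down-weights outliers, so its fit stays close to
# the underlying linear relation even with a few contaminated samples.
import numpy as np
from sklearn.linear_model import HuberRegressor, LinearRegression

rng = np.random.default_rng(1)
X = rng.uniform(0.2, 0.9, size=(100, 1))        # hypothetical MSAVI2-like feature
y = 0.55 + 0.3 * X[:, 0] + rng.normal(0, 0.005, 100)  # pseudo Fv/Fm
y[:5] += 0.5                                     # a few contaminated samples

huber = HuberRegressor().fit(X, y)               # robust fit
ols = LinearRegression().fit(X, y)               # ordinary least squares, for contrast
```

With 5% gross outliers, the Huber coefficients typically stay much closer to the true slope and intercept than the OLS fit.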

#### 3.6. Development and Evaluation of Models Based on RGB and MS VIs

R² accuracy scores of 0.868, followed closely by the Ridge, LR, LAR, BR, and Huber models, all with R² values of 0.858. The TR model obtained an R² of 0.849, while the RANSAC, KNN, RF, ET, ADA, CatBoost, and XGB models had R² values ranging from 0.830 to 0.723. The GBR and DT models had R² values of 0.721 and 0.690, respectively, and the PAR model had an R² of 0.593, indicating lower accuracy than the previous models. On the other hand, the LGBM, Lasso, EN, LLAR, and Dummy models had negative R² values, indicating poor accuracy, and the SVM and MLP models had low R² values of −0.073 and −0.292, respectively. The KR model performed worst, with high MAE, MSE, RMSE, and RMSLE values and a very low R² value of −118.159. In summary, the ARD and OMP models demonstrated the same highest accuracy for estimating Fv/Fm using the combined MS and RGB data; however, the computation time of OMP was higher than that of ARD, so the optimal model is ARD.
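The winning ARD model is available in scikit-learn as `ARDRegression`: a Bayesian linear regression that learns a separate precision for each weight and effectively prunes uninformative features. A sketch on synthetic stand-ins for the selected indices (all data hypothetical):

```python
# Sketch of Automatic Relevance Determination (ARD) regression, the top model
# for the combined RGB + MS data in this study.
import numpy as np
from sklearn.linear_model import ARDRegression

rng = np.random.default_rng(2)
X = rng.uniform(size=(150, 4))   # hypothetical stand-ins for SIPI, ExR, VEG, MSAVI2
y = 0.6 + 0.25 * X[:, 0] + 0.1 * X[:, 3] + rng.normal(0, 0.01, 150)  # pseudo Fv/Fm

ard = ARDRegression().fit(X, y)
pred, std = ard.predict(X, return_std=True)  # posterior mean and predictive std
```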

#### 3.7. Hyperparameter Optimization of the Highest Accuracy Model in Three Modes

(R² = 0.925, RMSLE = 0.014, and MAPE = 0.026).
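Hyperparameter optimization of such a model can be sketched with a simple cross-validated grid search over the ARD priors; the grid values here are illustrative, not the ones used in the study.

```python
# Sketch of hyperparameter tuning for the ARD model via grid search with
# 5-fold cross-validation, scored by R2. Grid values are illustrative only.
import numpy as np
from sklearn.linear_model import ARDRegression
from sklearn.model_selection import GridSearchCV

rng = np.random.default_rng(3)
X = rng.uniform(size=(120, 3))                       # hypothetical index features
y = 0.6 + 0.2 * X[:, 0] + rng.normal(0, 0.01, 120)   # pseudo Fv/Fm

grid = {
    "alpha_1": [1e-6, 1e-4],   # shape prior of the noise precision
    "lambda_1": [1e-6, 1e-4],  # shape prior of the weight precisions
}
search = GridSearchCV(ARDRegression(), grid, scoring="r2", cv=5).fit(X, y)
best = search.best_estimator_
```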

## 4. Discussion

R² of the test set was 0.920 for the multispectral vegetation indices and 0.838 for the RGB vegetation indices, indicating that the former estimated Fv/Fm more accurately. Multispectral images acquired by UAVs provide information about the spectral reflectance of vegetation, which changes across the canopy when stress occurs [80], and this can be used to estimate Fv/Fm. The key idea of this approach is that Fv/Fm is related to the fluorescence yield of photosystem II, which in turn is related to the chlorophyll content of leaves, and chlorophyll content can be estimated from the reflectance in the red and near-infrared (NIR) bands [81]. The important spectral index MSAVI2 selected for the multispectral estimation in this study is calculated precisely from the red and NIR bands. In addition to multispectral images, RGB images can also be used to estimate Fv/Fm from color information: in general, the color of plants changes when they are under stress, and although some color changes are difficult for the human eye to observe, color values can be quantified by computer [82]. However, because they lack the vegetation-sensitive red-edge and NIR bands, RGB images provide less spectral information, so the estimated Fv/Fm is not as accurate as with the MS images.

R² of 0.807 and 0.822 and RMSE of 0.018 and 0.017, respectively. Yi et al. [8] developed a generalized estimation model of Fv/Fm for poplar and cherry leaves with R² = 0.88. In a study [9] estimating Fv/Fm in winter wheat, the training set had R² = 0.50 and RMSE = 0.012, and the test set had R² = 0.55 and RMSE = 0.014. In these previous hyperspectral-based estimation studies of Fv/Fm, the highest R² was 0.88, while the highest-accuracy combined model without hyperparameter optimization in this study had an R² of 0.868, suggesting that the estimation accuracy of multispectral and RGB data is not as good as that of hyperspectral data. However, Yang et al. [87] showed that hyperparameter optimization can help improve the estimation accuracy of a model. In this study, the test set R² of both the MS estimation and the combined MS and RGB estimation reached 0.92 after hyperparameter optimization, indicating that the accuracy gap caused by the sensors can be narrowed or even eliminated by algorithm optimization. Negative values of R², which are uncommon, were observed in all three estimation modes, indicating that the predictions of those models were worse than simply using the mean: the mean reflects the central tendency of the data, while those models' predictions deviated from the true values even more than the mean did [88]. Notably, the performance of the linear regression (LR), ridge regression (Ridge), least angle regression (LAR), and orthogonal matching pursuit (OMP) models was almost identical. The reason is that only MSAVI2 was screened from the multispectral vegetation indices, and LR, Ridge, LAR, and OMP are all linear models, indicating that the choice among different linear models may not significantly affect estimation accuracy when a single vegetation index is used as the characteristic variable.
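The interpretation of negative R² can be checked directly: a model that always predicts the mean of the observations scores exactly 0, and predictions that are systematically worse than the mean score below 0.

```python
# R2 compares a model against the mean predictor: predicting the mean gives
# R2 = 0, and anything systematically worse goes negative. Values hypothetical.
import numpy as np
from sklearn.metrics import r2_score

y_true = np.array([0.75, 0.78, 0.80, 0.82, 0.85])   # hypothetical Fv/Fm values
mean_pred = np.full_like(y_true, y_true.mean())     # always predict the mean
bad_pred = y_true[::-1]                             # anti-correlated predictions

r2_mean = r2_score(y_true, mean_pred)               # exactly 0
r2_bad = r2_score(y_true, bad_pred)                 # negative
```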

## 5. Conclusions

(R² = 0.925, RMSLE = 0.014, MAPE = 0.026). Based on the results of this study, remote sensing combined with machine learning shows great potential for efficient and sustainable plant health monitoring and management.

## Author Contributions

## Funding

## Data Availability Statement

## Acknowledgments

## Conflicts of Interest

## References

- Zhao, Z.; Li, M.; Wu, Q.; Zhang, Y. Effects of Different Soil Moisture-Holding Strategies on Growth Characteristics, Yield and Quality of Winter-Seeded Spring Wheat. Agronomy
**2022**, 12, 2746. [Google Scholar] [CrossRef] - Baker, N.R. Chlorophyll Fluorescence: A Probe of Photosynthesis In Vivo. Annu. Rev. Plant Biol.
**2008**, 59, 89–113. [Google Scholar] [CrossRef][Green Version] - Maxwell, K.; Johnson, G.N. Chlorophyll fluorescence–A practical guide. J. Exp. Bot.
**2000**, 51, 659–668. [Google Scholar] [CrossRef] - Baker, N.R.; Rosenqvist, E. Applications of chlorophyll fluorescence can improve crop production strategies: An examination of future possibilities. J. Exp. Bot.
**2004**, 55, 1607–1621. [Google Scholar] [CrossRef][Green Version] - Genty, B.; Briantais, J.M.; Baker, N.R. The relationship between the quantum yield of photosynthetic electron transport and quenching of chlorophyll fluorescence. Biochim. Biophys. Acta (BBA)-Gen. Subj.
**1989**, 990, 87–92. [Google Scholar] [CrossRef] - Sharma, D.K.; Andersen, S.B.; Ottosen, C.O.; Rosenqvist, E. Wheat cultivars selected for high Fv/Fm under heat stress maintain high photosynthesis, total chlorophyll, stomatal conductance, transpiration and dry matter. Physiol. Plant.
**2015**, 153, 284–298. [Google Scholar] [CrossRef] - Zhao, R.; An, L.; Song, D.; Li, M.; Qiao, L.; Liu, N.; Sun, H. Detection of chlorophyll fluorescence parameters of potato leaves based on continuous wavelet transform and spectral analysis. Spectrochim. Acta A
**2021**, 259, 119768. [Google Scholar] [CrossRef] - Yi, P.; Aoli, Z.; Tinge, Z.; Sheng, H.F.; Yan, G.; Yan, Q.T.; Ying, Z.; Kan, L. Using remotely sensed spectral reflectance to indicate leaf photosynthetic efficiency derived from active fluorescence measurements. J. Appl. Rem. Sens.
**2017**, 11, 026034. [Google Scholar] [CrossRef][Green Version] - Jia, M.; Li, D.; Colombo, R.; Wang, Y.; Wang, X.; Cheng, T.; Zhu, Y.; Yao, X.; Xu, C.; Ouer, G.; et al. Quantifying Chlorophyll Fluorescence Parameters from Hyperspectral Reflectance at the Leaf Scale under Various Nitrogen Treatment Regimes in Winter Wheat. Remote Sens.
**2019**, 11, 2838. [Google Scholar] [CrossRef][Green Version] - Liu, J.; Zhu, Y.; Tao, X.; Chen, X.; Li, X. Rapid prediction of winter wheat yield and nitrogen use efficiency using consumer-grade unmanned aerial vehicles multispectral imagery. Front. Plant Sci.
**2022**, 13, 1032170. [Google Scholar] [CrossRef] [PubMed] - Torres-Tello, J.W.; Ko, S. A novel approach to identify the spectral bands that predict moisture content in canola and wheat. Biosyst. Eng.
**2021**, 210, 91–103. [Google Scholar] [CrossRef] - Wu, Q.; Zhang, Y.; Zhao, Z.; Xie, M.; Hou, D. Estimation of Relative Chlorophyll Content in Spring Wheat Based on Multi-Temporal UAV Remote Sensing. Agronomy
**2023**, 13, 211. [Google Scholar] [CrossRef] - Wilke, N.; Siegmann, B.; Postma, J.A.; Muller, O.; Krieger, V.; Pude, R.; Rascher, U. Assessment of plant density for barley and wheat using UAV multispectral imagery for high-throughput field phenotyping. Comput. Electron. Agric.
**2021**, 189, 106380. [Google Scholar] [CrossRef] - Du, X.; Wan, L.; Cen, H.; Chen, S.; Zhu, J.; Wang, H.; He, Y. Multi-temporal monitoring of leaf area index of rice under different nitrogen treatments using UAV images. Int. J. Precis. Agric. Aviat.
**2020**, 1, 11–18. [Google Scholar] [CrossRef] - Maimaitijiang, M.; Sagan, V.; Sidike, P.; Daloye, A.M.; Erkbol, H.; Fritschi, F.B. Crop monitoring using satellite/UAV data fusion and machine learning. Remote Sens.
**2020**, 12, 1357. [Google Scholar] [CrossRef] - Meyer, G.E.; Neto, J.C. Verification of color vegetation indices for automated crop imaging applications. Comput. Electron. Agric.
**2008**, 63, 282–293. [Google Scholar] [CrossRef] - Woebbecke, D.M.; Meyer, G.E.; Von Bargen, K.; Mortensen, D.A. Color indices for weed identification under various soil, residue, and lighting conditions. Trans. ASAE
**1995**, 38, 259–269. [Google Scholar] [CrossRef] - Vincent, L. Morphological grayscale reconstruction in image analysis: Applications and efficient algorithms. IEEE Trans. Image Process
**1993**, 2, 176–201. [Google Scholar] [CrossRef][Green Version] - Sellers, P.J.; Hall, F.G.; Nielsen, G.A. Vegetation/atmosphere transfer models. Adv. Space Res.
**1987**, 7, 149–159. [Google Scholar] [CrossRef] - Hague, T.; Tillett, N.D.; Wheeler, H. Automated crop and weed monitoring in widely spaced cereals. Precis. Agric.
**2006**, 7, 21–32. [Google Scholar] [CrossRef] - Guijarro, M.; Pajares, G.; Riomoros, I.; Herrera, P.J.; Burgos-Artizzu, X.P.; Ribeiro, A. Automatic segmentation of relevant textures in agricultural images. Comput. Electron. Agric.
**2011**, 75, 75–83. [Google Scholar] [CrossRef][Green Version] - Daughtry, C.S.T.; Walthall, C.L.; Kim, M.S.; De Colstoun, E.B.; McMurtrey, J.E. Estimating corn leaf chlorophyll concentration from leaf and canopy reflectance. Remote Sens. Environ.
**2000**, 74, 229–239. [Google Scholar] [CrossRef] - Tucker, C.J. Red and photographic infrared linear combinations for monitoring vegetation. Remote Sens. Environ.
**1979**, 8, 127–150. [Google Scholar] [CrossRef][Green Version] - Kawashima, S. A new vegetation index for the monitoring of vegetation phenology and thermal stress. Int. J. Remote Sens.
**2002**, 23, 2003–2017. [Google Scholar] [CrossRef] - Gitelson, A.A.; Kaufman, Y.J.; Stark, R.; Rundquist, D. Novel algorithms for remote estimation of vegetation fraction. Remote Sens. Environ.
**2002**, 80, 76–87. [Google Scholar] [CrossRef][Green Version] - Wang, X.; Wang, M.; Wang, S.; Wu, Y. Extraction of vegetation information from visible unmanned aerial vehicle images. Trans. Chin. Soc. Agric. Eng.
**2015**, 31, 152–159. [Google Scholar] [CrossRef] - Saberioon, M.M.; Amin, M.S.M.; Anuar, A.R.; Gholizadeh, A. Assessment of rice leaf chlorophyll content using visible bands at different growth stages at both the leaf and canopy scale. Int. J. Appl. Earth Obs. Geoinf.
**2014**, 32, 35–45. [Google Scholar] [CrossRef] - Bendig, J.; Yu, K.; Aasen, H.; Bolten, A.; Bennertz, S.; Broscheit, J.; Gnyp, M.L.; Bareth, G. Combining UAV-based plant height from crop surface models, visible, and near infrared vegetation indices for biomass monitoring in barley. Int. J. Appl. Earth Obs. Geoinf.
**2015**, 39, 79–87. [Google Scholar] [CrossRef] - Huete, A.; Justice, C.; Liu, H. Development of vegetation and soil indices for MODIS-EOS. Remote Sens. Environ.
**1994**, 49, 224–234. [Google Scholar] [CrossRef] - Gitelson, A.A.; Merzlyak, M.N. Remote sensing of chlorophyll concentration in higher plant leaves. Adv. Space Res.
**1998**, 22, 689–692. [Google Scholar] [CrossRef] - Gamon, J.A.; Surfus, J.S. Assessing leaf pigment content and activity with a reflectometer. New Phytol.
**1999**, 143, 105–117. [Google Scholar] [CrossRef] - Birth, G.S.; McVey, G.R. Measuring the color of growing turf with a reflectance spectrophotometer. Agron. J.
**1968**, 60, 640–643. [Google Scholar] [CrossRef] - Jasper, J.; Reusch, S.; Link, A. Active sensing of the N status of wheat using optimized wavelength combination: Impact of seed rate, variety and growth stage. Precis. Agric.
**2009**, 9, 23–30. [Google Scholar] [CrossRef] - Filella, I.; Penuelas, J. The red edge position and shape as indicators of plant chlorophyll content, biomass and hydric status. Int. J. Remote Sens.
**1994**, 15, 1459–1470. [Google Scholar] [CrossRef] - Qi, J.; Chehbouni, A.; Huete, A.R.; Kerr, Y.H.; Sorooshian, S. A modified soil adjusted vegetation index. Remote Sens. Environ.
**1994**, 48, 119–126. [Google Scholar] [CrossRef] - Rondeaux, G.; Steven, M.; Baret, F. Optimization of soil-adjusted vegetation indices. Remote Sens. Environ.
**1996**, 55, 95–107. [Google Scholar] [CrossRef] - Xing, N.C.; Huang, W.J.; Xie, Q.Y.; Shi, Y.; Ye, H.C.; Dong, Y.Y.; Wu, M.Q.; Sun, G.; Jiao, Q.J. A Transformed Triangular Vegetation Index for Estimating Winter Wheat Leaf Area Index. Remote Sens.
**2019**, 12, 16. [Google Scholar] [CrossRef][Green Version] - Sims, D.A.; Gamon, J.A. Relationships between leaf pigment content and spectral reflectance across a wide range of species, leaf structures and developmental stages. Remote Sens. Environ.
**2002**, 81, 337–354. [Google Scholar] [CrossRef] - Rouse, J.W.; Haas, R.H.; Schell, J.A.; Deering, D.W. Monitoring vegetation systems in the Great Plains with ERTS. NASA Spec. Publ.
**1973**, 351, 309–317. [Google Scholar] - Chen, J.M. Evaluation of vegetation indices and a modified simple ratio for boreal applications. Can. J. Remote Sens.
**1996**, 22, 229–242. [Google Scholar] [CrossRef] - Huete, A.R. A soil-adjusted vegetation index (SAVI). Remote Sens. Environ.
**1988**, 25, 295–309. [Google Scholar] [CrossRef] - Fitzgerald, G.; Rodriguez, D.; O’Leary, G. Measuring and predicting canopy nitrogen nutrition in wheat using a spectral index—The canopy chlorophyll content index (CCCI). Field Crops Res.
**2010**, 116, 318–324. [Google Scholar] [CrossRef] - Kimura, R.; Okada, S.; Miura, H.; Kamichika, M. Relationships among the leaf area index, moisture availability, and spectral reflectance in an upland rice field. Agric. Water Manag.
**2004**, 69, 83–100. [Google Scholar] [CrossRef] - Verrelst, J.; Schaepman, M.E.; Koetz, B.; Kneubühler, M. Angular sensitivity analysis of vegetation indices derived from CHRIS/PROBA data. Remote Sens. Environ.
**2008**, 112, 2341–2353. [Google Scholar] [CrossRef] - Devadas, R.; Lamb, D.W.; Simpfendorfer, S.; Backhouse, D. Evaluating ten spectral vegetation indices for identifying rust infection in individual wheat leaves. Precis. Agric.
**2009**, 10, 459–470. [Google Scholar] [CrossRef] - Zha, Y.; Gao, J.; Ni, S. Use of normalized difference built-up index in automatically mapping urban areas from TM imagery. Int. J. Remote Sens.
**2003**, 24, 583–594. [Google Scholar] [CrossRef] - Ju, C.H.; Tian, Y.C.; Yao, X.; Cao, W.X.; Zhu, Y.; Hannaway, D. Estimating leaf chlorophyll content using red edge parameters. Pedosphere
**2010**, 20, 633–644. [Google Scholar] [CrossRef] - Clevers, J.G.P.W.; Gitelson, A.A. Remote estimation of crop and grass chlorophyll and nitrogen content using red-edge bands on Sentinel-2 and -3. Int. J. Appl. Earth Obs. Geoinf.
**2013**, 23, 344–351. [Google Scholar] [CrossRef] - Wu, C.Y.; Niu, Z.; Tang, Q.; Huang, W.J. Estimating chlorophyll content from hyperspectral vegetation indices: Modeling and validation. Agric. For. Meteorol.
**2008**, 148, 1230–1241. [Google Scholar] [CrossRef] - Drucker, H. Improving Regressors Using Boosting Techniques. In Proceedings of the Icml; Citeseer: Princeton, NJ, USA, 1997; Volume 97, pp. 107–115. [Google Scholar]
- Mørup, M.; Hansen, L.K. Automatic relevance determination for multi-way models. J. Chemom. A J. Chemom. Soc.
**2009**, 23, 352–363. [Google Scholar] [CrossRef] - MacKay, D.J.C. Bayesian interpolation. Neural Comput.
**1992**, 4, 415–447. [Google Scholar] [CrossRef] - Prokhorenkova, L.; Gusev, G.; Vorobev, A.; Dorogush, A.V.; Gulin, A. CatBoost: Unbiased Boosting with Categorical Features. Adv. Neural Inf. Process. Syst.
**2018**, 31, 6638–6648. [Google Scholar] - Myles, A.J.; Feudale, R.N.; Liu, Y.; Woody, N.A.; Brown, S.D. An Introduction to Decision Tree Modeling. J. Chemom. A J. Chemom. Soc.
**2004**, 18, 275–285. [Google Scholar] [CrossRef] - Wilde, J. Identification of multiple equation probit models with endogenous dummy regressors. Econ. Lett.
**2000**, 69, 309–312. [Google Scholar] [CrossRef] - Zou, H.; Hastie, T. Regularization and variable selection via the elastic net. J. R. Stat. Soc. Ser. B Stat. Methodol.
**2005**, 67, 301–320. [Google Scholar] [CrossRef][Green Version] - Geurts, P.; Ernst, D.; Wehenkel, L. Extremely randomized trees. Mach. Learn.
**2006**, 63, 3–42. [Google Scholar] [CrossRef][Green Version] - Chen, T.; He, T.; Benesty, M.; Khotilovich, V.; Tang, Y.; Cho, H.; Chen, K.; Mitchell, R.; Cano, I.; Zhou, T.; et al. Xgboost: Extreme Gradient Boosting. R Package Version 0.4-2.
**2015**, 1, 1–4. [Google Scholar] - Friedman, J.H. Greedy function approximation: A gradient boosting machine. Ann. Stat.
**2001**, 29, 1189–1232. [Google Scholar] [CrossRef] - Huber, P.J. Robust estimation of a location parameter. Ann. Math. Stat.
**1964**, 35, 73–101. [Google Scholar] [CrossRef] - Zhang, M.-L.; Zhou, Z.-H. ML-KNN: A Lazy Learning Approach to Multi-Label Learning. Pattern Recognit.
**2007**, 40, 2038–2048. [Google Scholar] [CrossRef][Green Version] - Durrant, S.D.; Bissell, J. Performance of kernel-based regression methods in the presence of outliers. J. Process Control
**2010**, 20, 959–967. [Google Scholar] [CrossRef] - Efron, B.; Hastie, T.; Johnstone, I.; Tibshirani, R. Least angle regression. Ann. Stat.
**2004**, 32, 407–499. [Google Scholar] [CrossRef][Green Version] - Tibshirani, R. Regression shrinkage and selection via the lasso. J. R. Stat. Soc. Ser. B Stat. Methodol.
**1996**, 58, 267–288. [Google Scholar] [CrossRef] - Ke, G.; Meng, Q.; Finley, T.; Wang, T.; Chen, W.; Ma, W.; Ye, Q.; Liu, T.-Y. LightGBM: A Highly Efficient Gradient Boosting Decision Tree. Adv. Neural Inf. Process. Syst.
**2017**, 30, 52. [Google Scholar] - Su, X.; Yan, X.; Tsai, C.-L. Linear Regression. Wiley Interdiscip. Rev. Comput. Stat.
**2012**, 4, 275–294. [Google Scholar] [CrossRef] - Rumelhart, D.E.; Hinton, G.E.; Williams, R.J. Learning representations by back-propagating errors. Nature
**1986**, 323, 533–536. [Google Scholar] [CrossRef] - Tropp, J.A.; Gilbert, A.C. Signal Recovery from Random Measurements via Orthogonal Matching Pursuit. IEEE Trans. Inf. Theory
**2007**, 53, 4655–4666. [Google Scholar] [CrossRef][Green Version] - Crammer, K.; Dekel, O.; Keshet, J.; Shalev-Shwartz, S.; Singer, Y. Online passive-aggressive algorithms. J. Mach. Learn. Res.
**2006**, 7, 551–585. [Google Scholar] - Breiman, L. Random forests. Mach. Learn.
**2001**, 45, 5–32. [Google Scholar] [CrossRef][Green Version] - Fischler, M.A.; Bolles, R.C. Random sample consensus: A paradigm for model fitting with applications to image analysis and automated cartography. Commun. ACM
**1981**, 24, 381–395. [Google Scholar] [CrossRef] - Hoerl, A.E.; Kennard, R.W. Ridge regression: Biased estimation for nonorthogonal problems. Technometrics
**1970**, 12, 55–67. [Google Scholar] [CrossRef] - Drucker, H.; Burges, C.J.; Kaufman, L.; Smola, A.; Vapnik, V. Support Vector Regression Machines. Adv. Neural Inf. Process. Syst.
**1996**, 9, 155–161. [Google Scholar] - Theil, H. A Rank-Invariant Method of Linear and Polynomial Regression Analysis. Indag. Math.
**1950**, 12, 173. [Google Scholar] - Bolhar-Nordenkampf, H.R.; Long, S.P.; Baker, N.R.; Oquist, G.; Schreiber, U.L.E.G.; Lechner, E.G. Chlorophyll fluorescence as a probe of the photosynthetic competence of leaves in the field: A review of current instrumentation. Funct. Ecol.
**1989**, 3, 497–514. [Google Scholar] [CrossRef] - Kalaji, H.M.; Schansker, G.; Brestic, M.; Bussotti, F.; Calatayud, A.; Ferroni, L. Frequently asked questions about chlorophyll fluorescence, the sequel. Photosynth. Res.
**2017**, 132, 13–66. [Google Scholar] [CrossRef] [PubMed][Green Version] - Acevedo, E.; Silva, P.; Silva, H. Wheat Growth and Physiology. Bread Wheat Improv. Prod.
**2002**, 30, 39–70. [Google Scholar] - Shewry, P.R. Wheat. J. Exp. Bot.
**2009**, 60, 1537–1553. [Google Scholar] [CrossRef] - Tilling, A.K.; O’Leary, G.J.; Ferwerda, J.G.; Jones, S.D.; Fitzgerald, G.J.; Rodriguez, D.; Belford, R. Remote Sensing of Nitrogen and Water Stress in Wheat. Field Crops Res.
**2007**, 104, 77–85. [Google Scholar] [CrossRef] - Guo, Y.; Chen, S.; Li, X.; Cunha, M.; Jayavelu, S.; Cammarano, D.; Fu, Y. Machine learning-based approaches for predicting SPAD values of maize using multi-spectral images. Remote Sens.
**2022**, 14, 1337. [Google Scholar] [CrossRef] - Zarco-Tejada, P.J.; Berni, J.A.; Suárez, L.; Sepulcre-Cantó, G.; Morales, F.; Miller, J.R. Imaging chlorophyll fluorescence with an airborne narrow-band multispectral camera for vegetation stress detection. Remote Sens. Environ.
**2009**, 113, 1262–1275. [Google Scholar] [CrossRef] - Liu, Y.; Hatou, K.; Aihara, T.; Kurose, S.; Omasa, K. A robust vegetation index based on different UAV RGB images to estimate SPAD values of naked barley leaves. Remote Sens.
**2021**, 13, 686. [Google Scholar] [CrossRef] - Ahmad, S.; Kalra, A.; Stephen, H. Estimating soil moisture using remote sensing data: A machine learning approach. Adv. Water Resour.
**2010**, 33, 69–80. [Google Scholar] [CrossRef] - Korotcov, A.; Tkachenko, V.; Russo, D.P.; Ekins, S. Comparison of deep learning with multiple machine learning methods and metrics using diverse drug discovery data sets. Mol. Pharmaceut.
**2017**, 14, 4462–4475. [Google Scholar] [CrossRef] [PubMed] - Yuan, Y.; Wang, X.; Shi, M.; Wang, P. Performance comparison of RGB and multispectral vegetation indices based on machine learning for estimating Hopea hainanensis SPAD values under different shade conditions. Front. Plant Sci.
**2022**, 13, 928953. [Google Scholar] [CrossRef] [PubMed] - Zhang, J. Multi-source remote sensing data fusion: Status and trends. Int. J. Image Data Fus.
**2010**, 1, 5–24. [Google Scholar] [CrossRef][Green Version] - Yang, L.; Shami, A. On hyperparameter optimization of machine learning algorithms: Theory and practice. Neurocomputing
**2020**, 415, 295–316. [Google Scholar] [CrossRef] - Mittlböck, M. Calculating adjusted R² measures for Poisson regression models. Comput. Meth. Prog. Biomed. **2002**, 68, 205–214. [Google Scholar] [CrossRef] [PubMed]

**Figure 3.** Calibration and validation results of the highest-accuracy models for the MS, RGB, and RGB + MS data after hyperparameter optimization.

| Year | Organic Matter (g/kg) | Alkaline-N (mg/kg) | Available-P (mg/kg) | Available-K (mg/kg) | pH |
|---|---|---|---|---|---|
| 2020 | 14.31 | 59.63 | 21.32 | 117.71 | 7.62 |
| 2021 | 13.94 | 56.27 | 20.83 | 110.47 | 7.59 |

| Band | Center Wavelength/nm | Bandwidth/nm |
|---|---|---|
| Blue (B) | 450 | 16 |
| Green (G) | 560 | 16 |
| Red (R) | 650 | 16 |
| Red Edge (RE) | 730 | 16 |
| Near-Infrared (NIR) | 840 | 26 |

| Index Name | Calculation Formula | References |
|---|---|---|
| Red (R), Green (G), Blue (B) | Raw digital number value of each band | / |
| Normalized Red | $r=R/(R+G+B)$ | / |
| Normalized Green | $g=G/(R+G+B)$ | / |
| Normalized Blue | $b=B/(R+G+B)$ | / |
| Green Red Ratio Index | $GRRI=G/R$ | / |
| Green Blue Ratio Index | $GBRI=G/B$ | / |
| Red Blue Ratio Index | $RBRI=R/B$ | / |
| Excess Red Vegetation Index | $ExR=1.4\times r-g$ | [16] |
| Excess Green Vegetation Index | $ExG=2\times g-r-b$ | [16] |
| Excess Blue Vegetation Index | $ExB=1.4\times b-g$ | [16] |
| Excess Green Minus Excess Red Index | $ExGR=ExG-ExR$ | [16] |
| Woebbecke Index | $WI=(G-B)/(G+R)$ | [17] |
| Normalized Difference Index | $NDI=(r-g)/(r+g+0.01)$ | [17] |
| Color Intensity | $INT=(R+G+B)/3$ | [18] |
| Green Leaf Index 1 | $GLI1=(2\times G-R-B)/(2\times G+R+B)$ | [19] |
| Green Leaf Index 2 | $GLI2=(2\times G-R+B)/(2\times G+R+B)$ | [19] |
| Vegetative Index | $VEG=G/\left(R^{2/3}\times B^{1/3}\right)$ | [20] |
| Combination | $COM=0.25\times ExG+0.3\times ExGR+0.33\times CIVE+0.12\times VEG$ | [21] |
| Color Index of Vegetation | $CIVE=0.441\times r-0.811\times g+0.3856\times b+18.79$ | [22] |
| Normalized Green–Red Vegetation Index | $NGRVI=(G-R)/(G+R)$ | [23] |
| Kawashima Index | $IKAW=(R-B)/(R+B)$ | [24] |
| Visible-Band Difference Vegetation Index | $VDVI=(2\times g-r-b)/(2\times g+r+b)$ | [25] |
| Visible Atmospherically Resistant Index | $VARI=(g-r)/(g+r-b)$ | [26] |
| Principal Component Analysis Index | $IPCA=0.994\times \vert R-B\vert +0.961\times \vert G-B\vert +0.914\times \vert G-R\vert$ | [27] |
| Modified Green Red Vegetation Index | $MGRVI=\left(G^{2}-R^{2}\right)/\left(G^{2}+R^{2}\right)$ | [28] |
| Red Green Blue Vegetation Index | $RGBVI=\left(G^{2}-B\times R\right)/\left(G^{2}+B\times R\right)$ | [28] |
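As a sketch, several of the RGB indices above can be computed from per-plot band values; the digital numbers here are hypothetical. VEG is written with raw bands, which matches the normalized form because the ratio is scale-invariant when all three bands are treated consistently.

```python
# Sketch of selected RGB vegetation-index calculations (normalized bands,
# ExR, ExG, ExGR, VEG) from hypothetical per-plot mean digital numbers.
R, G, B = 92.0, 141.0, 78.0               # hypothetical per-plot mean DN values
r, g, b = (x / (R + G + B) for x in (R, G, B))  # normalized bands

ExR = 1.4 * r - g                          # Excess Red
ExG = 2 * g - r - b                        # Excess Green
ExGR = ExG - ExR                           # Excess Green minus Excess Red
VEG = G / (R ** (2 / 3) * B ** (1 / 3))    # Vegetative Index (a = 2/3)
```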

Index Name | Calculation Formula | References |
---|---|---|

Difference vegetation index | $DVI={R}_{nir}-{R}_{red}$ | [29] |

Enhanced Vegetation Index | $EVI=2.5\times \left({R}_{nir}-{R}_{red}\right)/\left({R}_{nir}+6\times {R}_{red}-7.5\times {R}_{blue}+1\right)$ | [29] |

Leaf chlorophyll index | ${LCI=(R}_{nir}-{R}_{rededge})/({R}_{nir}+{R}_{red})$ | [29] |

Green Normalized Difference Vegetation | $\mathit{GNDVI}=\left({R}_{\mathrm{nir}}-{R}_{\mathrm{green}}\right)/\left({R}_{\mathrm{nir}}+{R}_{\mathrm{green}}\right)$ | [30] |

Ratio Between NIR and Green Bands | $V{I}_{(\mathrm{nir}/\mathrm{green})}={R}_{\mathrm{nir}}/{R}_{\mathrm{green}}$ | [31] |

Ratio Between NIR and Red Bands | $V{I}_{\left(\mathrm{nir}/\mathrm{red}\right)}={R}_{\mathrm{nir}}/{R}_{red}$ | [32] |

Ratio Between NIR and Red Edge Bands | $V{I}_{\left(\mathrm{nir}/\mathrm{rededge}\right)}={R}_{\mathrm{nir}}/{R}_{rededge}$ | [33] |

Napierian Logarithm of The Red Edge | ${ln}_{RE}=100\times \left({ln}_{nir}-{ln}_{red}\right)$ | [34] |

Modified Soil-Adjusted Vegetation Index 1 | $MSAVI1=(1+L)\left(\frac{{R}_{nir}-{R}_{red}}{{R}_{nir}+{R}_{red}+L}\right)(L=0.1)$ | [35] |

Modified Soil-Adjusted Vegetation Index 2 | $MSAVI2={R}_{nir}+0.5-\sqrt{{\left(2\times {R}_{nir}+1\right)}^{2}-8\times \left({R}_{nir}-{R}_{red}\right)}/2$ | [35] |

Optimized Soil-Adjusted Vegetation Index | $OSAVI=(1+0.16)\times \frac{\left({R}_{nir}-{R}_{red}\right)}{\left({R}_{nir}+{R}_{red}+0.16\right)}$ | [36] |

Modified Triangular Vegetation Index 2 | $MTVI2=\frac{1.5\times \left[1.2\times \left({R}_{\mathrm{nir}}-{R}_{\mathrm{green}}\right)-2.5\times \left({R}_{\mathrm{red}}-{R}_{\mathrm{green}}\right)\right]}{\sqrt{{\left(2\times {R}_{nir}+1\right)}^{2}-\left(6\times {R}_{nir}-5\times \sqrt{{R}_{\mathrm{red}}}\right)-0.5}}$ | [37] |

Normalized Difference Red Edge Index | $NDRE=\frac{\left({R}_{\mathrm{nir}}-{R}_{\mathrm{rededge}}\right)}{\left({R}_{\mathrm{nir}}+{R}_{\mathrm{rededge}}\right)}$ | [38] |

Normalized Difference Vegetation Index | $NDVI=\frac{\left({R}_{\mathrm{nir}}-{R}_{\mathrm{red}}\right)}{\left({R}_{\mathrm{nir}}+{R}_{\mathrm{red}}\right)}$ | [39] |

Modified Simple Radio | $MSR=\left({R}_{nir}-{R}_{red}-1\right)/\left(\sqrt{{R}_{\mathrm{nir}}+{R}_{\mathrm{red}}}+1\right)$ | [40] |

Soil-Adjusted Vegetation Index | $SAVI=\frac{\left({R}_{nir}-{R}_{red}\right)}{\left({R}_{nir}+{R}_{red}+0.5\right)}\times (1+0.5)$ | [41] |

Simplified Canopy Chlorophyll Content Index | $SCCCI=\frac{NDRE}{NDVI}$ | [42] |

Modified Chlorophyll Absorption Reflectance Index | $\mathrm{MCARI}=\left({R}_{\mathrm{rededge}}-{R}_{\mathrm{red}}-0.2\times \left({R}_{\mathrm{rededge}}-{R}_{\mathrm{green}}\right)\right)\times \left(\frac{{R}_{\mathrm{rededge}}}{{R}_{\mathrm{red}}}\right)$ | [43] |

Structure-Insensitive Pigment Index | $SIPI=\frac{\left({R}_{\mathrm{nir}}-{R}_{\mathrm{blue}}\right)}{\left({R}_{\mathrm{nir}}-{R}_{\mathrm{red}}\right)}$ | [44] |

Transformed Chlorophyll Absorption Reflectance Index | $\mathrm{TCARI}=3\times \left(\left({R}_{\mathrm{rededge}}-{R}_{\mathrm{red}}\right)-0.2\times \left({R}_{\mathrm{rededge}}-{R}_{\mathrm{green}}\right)\times \left(\frac{{R}_{\mathrm{rededge}}}{{R}_{\mathrm{red}}}\right)\right)$ | [45] |

Normalized Difference Index | $NDI=\frac{\left({R}_{\mathrm{nir}}-{R}_{\mathrm{rededge}}\right)}{\left({R}_{\mathrm{nir}}+{R}_{\mathrm{red}}\right)}$ | [46] |

Red-Edge Chlorophyll Index 1 | $Cl1=\frac{{R}_{\mathrm{nir}}}{{R}_{\mathrm{rededge}}}-1$ | [47] |

Red-Edge Chlorophyll Index 2 | $Cl2=\frac{{R}_{\mathrm{rededge}}}{{R}_{\mathrm{green}}}-1$ | [48] |

Modified Chlorophyll Absorption Reflectance Index 2 | $MCARI2=1.5\times \frac{\left(2.5\times \left({R}_{\mathrm{nir}}-{R}_{\mathrm{rededge}}\right)-1.3\times \left({R}_{\mathrm{nir}}-{R}_{\mathrm{green}}\right)\right)}{\sqrt{{\left(2\times {R}_{nir}+1\right)}^{2}-\left(6\times {R}_{nir}-5\times \sqrt{{R}_{\mathrm{red}}}\right)-0.5}}$ | [49] |

TCARI/OSAVI | $\frac{TCARI}{OSAVI}$ | [49] |

MCARI/OSAVI | $\frac{MCARI}{OSAVI}$ | [49] |
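
Most of the indices above reduce to simple per-pixel band arithmetic. The following sketch computes a few of them with NumPy; the band reflectance values are illustrative, not the study's data:

```python
import numpy as np

def vegetation_indices(nir, red, rededge, green):
    """Compute a few of the listed multispectral indices from per-pixel
    reflectance arrays (scaled to 0-1)."""
    ndvi = (nir - red) / (nir + red)
    ndre = (nir - rededge) / (nir + rededge)
    gndvi = (nir - green) / (nir + green)
    osavi = (1 + 0.16) * (nir - red) / (nir + red + 0.16)
    sccci = ndre / ndvi  # Simplified Canopy Chlorophyll Content Index
    msavi2 = nir + 0.5 - np.sqrt((2 * nir + 1) ** 2 - 8 * (nir - red)) / 2
    return {"NDVI": ndvi, "NDRE": ndre, "GNDVI": gndvi,
            "OSAVI": osavi, "SCCCI": sccci, "MSAVI2": msavi2}

# Example with synthetic canopy reflectance values
vis = vegetation_indices(nir=np.array([0.45, 0.50]),
                         red=np.array([0.06, 0.05]),
                         rededge=np.array([0.20, 0.18]),
                         green=np.array([0.10, 0.09]))
```

In practice these arrays would come from the orthomosaic bands after plot-level masking, with the per-plot mean of each index used as a model feature.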

Model | Abbreviation | References |
---|---|---|

AdaBoost Regressor | ADA | [50] |

Automatic Relevance Determination | ARD | [51] |

Bayesian Ridge | BR | [52] |

CatBoost Regressor | CatBoost | [53] |

Decision Tree Regressor | DT | [54] |

Dummy Regressor | Dummy | [55] |

Elastic Net | EN | [56] |

Extra Trees Regressor | ET | [57] |

Extreme Gradient Boosting | XGB | [58] |

Gradient Boosting Regressor | GBR | [59] |

Huber Regressor | Huber | [60] |

K Neighbors Regressor | KNN | [61] |

Kernel Ridge | KR | [62] |

Lasso Least Angle Regression | LLAR | [63] |

Lasso Regression | Lasso | [64] |

Least Angle Regression | LAR | [63] |

Light Gradient Boosting Machine | LGBM | [65] |

Linear Regression | LR | [66] |

Multilayer Perceptron Regressor | MLP | [67] |

Orthogonal Matching Pursuit | OMP | [68] |

Passive Aggressive Regressor | PAR | [69] |

Random Forest Regressor | RF | [70] |

Random Sample Consensus | RANSC | [71] |

Ridge Regression | Ridge | [72] |

Support Vector Machine Regression | SVM | [73] |

TheilSen Regressor | TR | [74] |

Dataset | Minimum | Maximum | Mean | STDEV | CV (%) |
---|---|---|---|---|---|

Total Dataset | 0.550 | 0.848 | 0.773 | 0.081 | 10.4 |

Training set | 0.551 | 0.846 | 0.775 | 0.077 | 9.9 |

Test set | 0.550 | 0.848 | 0.768 | 0.090 | 11.7 |
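
The summary statistics above (mean, STDEV, and CV computed as 100 × STDEV / mean) can be reproduced as follows; the Fv/Fm sample values and the 80/20 split fraction are illustrative assumptions:

```python
import numpy as np
from sklearn.model_selection import train_test_split

# Hypothetical plot-level Fv/Fm observations
fvfm = np.array([0.55, 0.62, 0.70, 0.74, 0.77, 0.79, 0.80, 0.81, 0.83, 0.848])
train, test = train_test_split(fvfm, test_size=0.2, random_state=42)

def describe(x):
    """Min, max, mean, sample standard deviation, and CV (%)."""
    mean, std = x.mean(), x.std(ddof=1)
    return {"min": x.min(), "max": x.max(), "mean": mean,
            "stdev": std, "cv_pct": 100 * std / mean}

stats = {name: describe(arr) for name, arr in
         [("Total", fvfm), ("Training", train), ("Test", test)]}
```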

Multispectral Vegetation Indices | Correlation Coefficient | RGB Vegetation Indices | Correlation Coefficient |
---|---|---|---|

DVI | 0.857 | b | −0.679 |

EVI | 0.869 | g | 0.827 |

NDVI | 0.899 | r | 0.283 |

GNDVI | 0.888 | GRRI | 0.309 |

NDRE | 0.850 | GBRI | 0.737 |

LCI | 0.797 | RBRI | 0.502 |

OSAVI | 0.892 | INT | −0.816 |

VI(NIR/G) | 0.784 | GRVI | 0.319 |

VI(NIR/R) | 0.724 | NDI | −0.324 |

VI(NIR/RE) | 0.807 | WI | 0.769 |

lnRE | 0.860 | IKAW | 0.501 |

MSAVI1 | 0.895 | GLI | 0.832 |

MSAVI2 | 0.896 | GLI2 | −0.126 |

MTVI2 | 0.863 | VARI | −0.550 |

MSR | 0.849 | ExR | −0.435 |

SAVI | 0.880 | ExG | 0.827 |

SCCCI | 0.756 | ExB | −0.743 |

MCARI | −0.793 | ExGR | 0.829 |

MCARI2 | 0.754 | VEG | 0.767 |

TCARI | −0.710 | IPCA | −0.821 |

NDI | 0.865 | CIVE | −0.831 |

CL1 | 0.807 | COM | 0.827 |

CL2 | 0.835 | RGBVI | 0.843 |

SIPI | 0.898 | MGRVI | 0.324 |

TCARI/OSAVI | −0.784 | VDVI | 0.832 |

MCARI/OSAVI | −0.893 | | |
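
The coefficients in this table are Pearson correlations between each vegetation index and the measured Fv/Fm. A minimal sketch with hypothetical values:

```python
import numpy as np

def pearson_r(x, y):
    """Pearson correlation coefficient between two 1-D samples."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    xm, ym = x - x.mean(), y - y.mean()
    return (xm * ym).sum() / np.sqrt((xm ** 2).sum() * (ym ** 2).sum())

ndvi = np.array([0.60, 0.65, 0.72, 0.78, 0.81])  # hypothetical NDVI values
fvfm = np.array([0.58, 0.66, 0.73, 0.79, 0.82])  # matching Fv/Fm readings
r = pearson_r(ndvi, fvfm)
```

Repeating this for every index against Fv/Fm, over all plots, yields the two correlation columns above.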

VIs Type | Important Features |
---|---|

RGB | RGBVI, ExR |

MS | MSAVI2 |

RGB + MS | SIPI, ExR, VEG |
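
One plausible route to a short list of important features like the one above is recursive feature elimination; the authors' exact selection procedure is not specified in this excerpt, so the following is only an illustrative sketch with synthetic data and index names:

```python
import numpy as np
from sklearn.feature_selection import RFE
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
X = rng.uniform(size=(60, 6))  # six candidate VIs (columns)
# Synthetic target driven by the 1st and 4th candidate indices
y = 2 * X[:, 0] - 1.5 * X[:, 3] + 0.05 * rng.normal(size=60)
vi_names = np.array(["SIPI", "ExR", "VEG", "NDVI", "GLI", "WI"])  # illustrative

# Iteratively drop the feature with the smallest coefficient magnitude
selector = RFE(LinearRegression(), n_features_to_select=2).fit(X, y)
selected = vi_names[selector.support_]
```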

Model | MAE | MSE | RMSE | R^{2} | RMSLE | MAPE | TT (s) |
---|---|---|---|---|---|---|---|

GBR | 0.023 | 0.001 | 0.033 | 0.800 | 0.019 | 0.032 | 0.010 |

RF | 0.024 | 0.001 | 0.033 | 0.795 | 0.019 | 0.032 | 0.058 |

XGB | 0.026 | 0.001 | 0.034 | 0.789 | 0.020 | 0.036 | 0.106 |

CatBoost | 0.024 | 0.001 | 0.034 | 0.785 | 0.020 | 0.033 | 0.192 |

ET | 0.024 | 0.001 | 0.034 | 0.782 | 0.020 | 0.033 | 0.048 |

KNN | 0.026 | 0.001 | 0.035 | 0.771 | 0.020 | 0.035 | 0.008 |

ADA | 0.028 | 0.001 | 0.035 | 0.767 | 0.021 | 0.038 | 0.018 |

Huber | 0.030 | 0.002 | 0.039 | 0.711 | 0.022 | 0.040 | 0.006 |

Ridge | 0.031 | 0.002 | 0.039 | 0.707 | 0.023 | 0.041 | 0.006 |

LR | 0.030 | 0.002 | 0.039 | 0.707 | 0.023 | 0.040 | 0.006 |

LAR | 0.030 | 0.002 | 0.039 | 0.707 | 0.023 | 0.040 | 0.006 |

BR | 0.030 | 0.002 | 0.039 | 0.707 | 0.023 | 0.041 | 0.006 |

OMP | 0.031 | 0.002 | 0.039 | 0.707 | 0.022 | 0.041 | 0.004 |

ARD | 0.031 | 0.002 | 0.039 | 0.707 | 0.022 | 0.041 | 0.006 |

DT | 0.031 | 0.002 | 0.040 | 0.688 | 0.023 | 0.042 | 0.004 |

TR | 0.036 | 0.003 | 0.049 | 0.572 | 0.029 | 0.051 | 0.178 |

PAR | 0.042 | 0.003 | 0.050 | 0.528 | 0.029 | 0.056 | 0.006 |

LGBM | 0.049 | 0.004 | 0.063 | 0.283 | 0.036 | 0.067 | 0.018 |

RANSC | 0.043 | 0.005 | 0.065 | 0.195 | 0.037 | 0.062 | 0.008 |

MLP | 0.055 | 0.005 | 0.067 | 0.158 | 0.038 | 0.072 | 0.014 |

Lasso | 0.058 | 0.006 | 0.076 | −0.059 | 0.045 | 0.082 | 0.006 |

EN | 0.058 | 0.006 | 0.076 | −0.059 | 0.045 | 0.082 | 0.006 |

LLAR | 0.058 | 0.006 | 0.076 | −0.059 | 0.045 | 0.082 | 0.008 |

Dummy | 0.058 | 0.006 | 0.076 | −0.059 | 0.045 | 0.082 | 0.004 |

SVM | 0.072 | 0.006 | 0.077 | −0.104 | 0.044 | 0.093 | 0.006 |

KR | 0.792 | 0.632 | 0.795 | −116.050 | 0.523 | 1.030 | 0.006 |

Model | MAE | MSE | RMSE | R^{2} | RMSLE | MAPE | TT (s) |
---|---|---|---|---|---|---|---|

Huber | 0.021 | 0.001 | 0.027 | 0.860 | 0.015 | 0.028 | 0.006 |

LR | 0.022 | 0.001 | 0.027 | 0.860 | 0.015 | 0.029 | 0.316 |

Ridge | 0.022 | 0.001 | 0.027 | 0.860 | 0.015 | 0.029 | 0.264 |

LAR | 0.022 | 0.001 | 0.027 | 0.860 | 0.015 | 0.029 | 0.004 |

OMP | 0.022 | 0.001 | 0.027 | 0.860 | 0.015 | 0.029 | 0.004 |

BR | 0.022 | 0.001 | 0.027 | 0.860 | 0.015 | 0.029 | 0.004 |

ARD | 0.022 | 0.001 | 0.027 | 0.860 | 0.015 | 0.029 | 0.006 |

TR | 0.022 | 0.001 | 0.028 | 0.849 | 0.016 | 0.029 | 0.020 |

KNN | 0.026 | 0.001 | 0.033 | 0.794 | 0.019 | 0.035 | 0.006 |

RF | 0.032 | 0.001 | 0.038 | 0.737 | 0.022 | 0.043 | 0.054 |

CatBoost | 0.033 | 0.002 | 0.039 | 0.723 | 0.022 | 0.045 | 0.190 |

ADA | 0.031 | 0.002 | 0.039 | 0.716 | 0.023 | 0.042 | 0.012 |

GBR | 0.034 | 0.002 | 0.040 | 0.704 | 0.023 | 0.046 | 0.010 |

ET | 0.034 | 0.002 | 0.041 | 0.695 | 0.023 | 0.046 | 0.044 |

PAR | 0.035 | 0.002 | 0.041 | 0.684 | 0.023 | 0.046 | 0.006 |

XGB | 0.036 | 0.002 | 0.042 | 0.668 | 0.024 | 0.049 | 0.132 |

DT | 0.037 | 0.002 | 0.043 | 0.652 | 0.025 | 0.050 | 0.006 |

RANSC | 0.035 | 0.003 | 0.047 | 0.546 | 0.027 | 0.049 | 0.006 |

LGBM | 0.045 | 0.004 | 0.062 | 0.299 | 0.036 | 0.064 | 0.016 |

SVM | 0.064 | 0.005 | 0.069 | 0.097 | 0.039 | 0.082 | 0.006 |

Lasso | 0.058 | 0.006 | 0.076 | −0.059 | 0.045 | 0.082 | 0.298 |

EN | 0.058 | 0.006 | 0.076 | −0.059 | 0.045 | 0.082 | 0.006 |

LLAR | 0.058 | 0.006 | 0.076 | −0.059 | 0.045 | 0.082 | 0.004 |

Dummy | 0.058 | 0.006 | 0.076 | −0.059 | 0.045 | 0.082 | 0.004 |

MLP | 0.076 | 0.008 | 0.090 | −0.567 | 0.052 | 0.101 | 0.016 |

KR | 0.784 | 0.619 | 0.787 | −113.552 | 0.528 | 1.024 | 0.006 |

Model | MAE | MSE | RMSE | R^{2} | RMSLE | MAPE | TT (s) |
---|---|---|---|---|---|---|---|

ARD | 0.021 | 0.001 | 0.026 | 0.868 | 0.015 | 0.028 | 0.006 |

OMP | 0.021 | 0.001 | 0.026 | 0.868 | 0.015 | 0.028 | 0.007 |

Ridge | 0.022 | 0.001 | 0.027 | 0.858 | 0.015 | 0.029 | 0.006 |

LR | 0.022 | 0.001 | 0.027 | 0.858 | 0.015 | 0.029 | 0.514 |

LAR | 0.022 | 0.001 | 0.027 | 0.858 | 0.015 | 0.029 | 0.006 |

BR | 0.022 | 0.001 | 0.027 | 0.858 | 0.015 | 0.029 | 0.006 |

Huber | 0.022 | 0.001 | 0.027 | 0.857 | 0.016 | 0.029 | 0.006 |

TR | 0.022 | 0.001 | 0.028 | 0.849 | 0.016 | 0.030 | 0.178 |

RANSC | 0.024 | 0.001 | 0.030 | 0.830 | 0.017 | 0.032 | 0.010 |

KNN | 0.023 | 0.001 | 0.030 | 0.826 | 0.017 | 0.031 | 0.006 |

RF | 0.025 | 0.001 | 0.033 | 0.800 | 0.019 | 0.033 | 0.052 |

ET | 0.025 | 0.001 | 0.034 | 0.785 | 0.020 | 0.033 | 0.050 |

ADA | 0.027 | 0.001 | 0.036 | 0.756 | 0.021 | 0.037 | 0.020 |

CatBoost | 0.026 | 0.001 | 0.036 | 0.753 | 0.021 | 0.036 | 0.214 |

XGB | 0.027 | 0.002 | 0.038 | 0.723 | 0.022 | 0.037 | 0.092 |

GBR | 0.028 | 0.002 | 0.039 | 0.721 | 0.022 | 0.037 | 0.012 |

DT | 0.032 | 0.002 | 0.041 | 0.690 | 0.024 | 0.043 | 0.006 |

PAR | 0.039 | 0.002 | 0.047 | 0.593 | 0.027 | 0.051 | 0.006 |

LGBM | 0.043 | 0.003 | 0.055 | 0.429 | 0.032 | 0.060 | 0.018 |

Lasso | 0.058 | 0.006 | 0.076 | −0.059 | 0.045 | 0.082 | 0.006 |

EN | 0.058 | 0.006 | 0.076 | −0.059 | 0.045 | 0.082 | 0.008 |

LLAR | 0.058 | 0.006 | 0.076 | −0.059 | 0.045 | 0.082 | 0.006 |

Dummy | 0.058 | 0.006 | 0.076 | −0.059 | 0.045 | 0.082 | 0.004 |

SVM | 0.071 | 0.006 | 0.076 | −0.073 | 0.043 | 0.092 | 0.006 |

MLP | 0.063 | 0.008 | 0.082 | −0.292 | 0.047 | 0.085 | 0.020 |

KR | 0.799 | 0.646 | 0.803 | −118.159 | 0.512 | 1.042 | 0.006 |

Image Type | Model | Dataset | MAE | MSE | RMSE | R^{2} | RMSLE | MAPE |
---|---|---|---|---|---|---|---|---|

RGB | GBR | Training set | 0.014 | 0.000 | 0.021 | 0.925 | 0.012 | 0.019 |

RGB | GBR | Test set | 0.027 | 0.001 | 0.036 | 0.838 | 0.020 | 0.037 |

MS | Huber | Training set | 0.021 | 0.001 | 0.027 | 0.878 | 0.015 | 0.028 |

MS | Huber | Test set | 0.018 | 0.001 | 0.025 | 0.920 | 0.014 | 0.024 |

MS + RGB | ARD | Training set | 0.021 | 0.001 | 0.026 | 0.882 | 0.015 | 0.028 |

MS + RGB | ARD | Test set | 0.019 | 0.001 | 0.024 | 0.925 | 0.014 | 0.026 |
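
The reported error metrics (MAE, MSE, RMSE, R^{2}, RMSLE, MAPE) can all be computed with scikit-learn; the observed and predicted Fv/Fm values below are illustrative, not the study's results:

```python
import numpy as np
from sklearn.metrics import (mean_absolute_error, mean_squared_error,
                             mean_squared_log_error,
                             mean_absolute_percentage_error, r2_score)

y_true = np.array([0.60, 0.68, 0.74, 0.79, 0.83])  # measured Fv/Fm
y_pred = np.array([0.62, 0.67, 0.75, 0.78, 0.82])  # model predictions

metrics = {
    "MAE":   mean_absolute_error(y_true, y_pred),
    "MSE":   mean_squared_error(y_true, y_pred),
    "RMSE":  np.sqrt(mean_squared_error(y_true, y_pred)),
    "R2":    r2_score(y_true, y_pred),
    "RMSLE": np.sqrt(mean_squared_log_error(y_true, y_pred)),
    "MAPE":  mean_absolute_percentage_error(y_true, y_pred),
}
```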


© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Wu, Q.; Zhang, Y.; Xie, M.; Zhao, Z.; Yang, L.; Liu, J.; Hou, D. Estimation of Fv/Fm in Spring Wheat Using UAV-Based Multispectral and RGB Imagery with Multiple Machine Learning Methods. *Agronomy* **2023**, *13*, 1003.
https://doi.org/10.3390/agronomy13041003
