# Early-Stage Identification of Powdery Mildew Levels for Cucurbit Plants in Open-Field Conditions Based on Texture Descriptors


## Abstract


## 1. Introduction

#### Literature Review

## 2. Materials and Methods

#### 2.1. Acquisition

#### 2.2. Proposed Powdery Mildew Damage Levels

#### 2.3. Preprocessing

#### 2.4. Feature Extraction

#### 2.5. Feature Selection

#### 2.6. Formation of the Feature Vectors

#### 2.7. Proposed Multiclass Classification Framework

#### 2.8. Performance Evaluation

## 3. Results

#### 3.1. Different Color Space Feature Vectors

#### 3.2. Same Color Space Feature Vectors

## 4. Discussion

## 5. Conclusions

## Author Contributions

## Funding

## Data Availability Statement

## Conflicts of Interest

## Abbreviations

| Abbreviation | Meaning |
|---|---|
| PM | Powdery mildew |
| RGB | Red, green, and blue |
| HSV | Hue, saturation, and value |
| L\*a\*b | Luminance, red, and blue chrominance |
| YCbCr | Luma component, Cb and Cr chroma components |
| ANOVA | Analysis of variance |
| ROI | Region of interest |
| GLCM | Gray-level co-occurrence matrix |
| CC | Color component |
| TD | Texture descriptor |

## References


**Figure 1.** Proposed methodology for PM damage level detection: the image collection is used for feature extraction and selection, a multiclass classification is performed on the selected features, and a performance evaluation verifies the optimal classification.

**Figure 2.** A timeline of the sampling days and the phenological growth stages used to identify PM damage levels. The phenological stages (${S}_{1}$ to ${S}_{8}$) and the sampling days (${D}_{1}$ to ${D}_{19}$) are taken as basic information. Four PM damage levels are then defined: ${T}_{1}$ for healthy leaves, ${T}_{2}$ for leaves with spores in germination, ${T}_{3}$ for leaves with the first symptoms, and ${T}_{4}$ for diseased leaves.

**Figure 3.** Visual evaluation of cucurbit leaves where four PM damage levels were defined: (**a**) ${T}_{1}$: healthy leaves, (**b**) ${T}_{2}$: leaves with spores in germination, (**c**) ${T}_{3}$: leaves with the first symptoms, and (**d**) ${T}_{4}$: diseased leaves.

**Figure 4.** Exploration by parts of the leaf for the selection of the region of interest (ROI): (**a**) division of the leaf into the central part (${R}_{1}$), lower right lobe (${R}_{2}$), upper right lobe (${R}_{3}$), upper central lobe (${R}_{4}$), upper left lobe (${R}_{5}$), and lower left lobe (${R}_{6}$); (**b**) first symptoms at ${R}_{4}$.

**Figure 5.** Preprocessing of the ROI images, starting with the color transformation and separation of color components (CCs): the sample image $I(x,y)$ is the original image; the ROI analysis yields a new RGB sample $R(s,t)$; a contrast adjustment $C(p,q)$ is then performed to obtain the transformed image $T(s,t)$ in the different color spaces ($G(i,j)$, $L(i,j)$, $H(i,j)$, $Y(i,j)$), followed by the separation into color components.
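As a sketch of the component-separation step (the paper does not state which transform coefficients it uses; the ITU-R BT.601 weights below are a common assumption), the gray, RGB, and YCbCr components can be obtained as:

```python
import numpy as np

def split_color_components(rgb):
    """Split an RGB image (H x W x 3, floats in [0, 1]) into the gray and
    YCbCr color components (CCs) used as separate texture channels.
    Illustrative sketch: ITU-R BT.601 luma/chroma weights are assumed."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y = 0.299 * r + 0.587 * g + 0.114 * b                 # luma (also gray)
    cb = 0.5 - 0.168736 * r - 0.331264 * g + 0.5 * b      # blue-difference chroma
    cr = 0.5 + 0.5 * r - 0.418688 * g - 0.081312 * b      # red-difference chroma
    return {"G": y, "R": r, "GG": g, "BB": b, "Y": y, "CB": cb, "CR": cr}

# A 2 x 2 synthetic patch: red, green, blue, and white pixels
img = np.array([[[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]],
                [[0.0, 0.0, 1.0], [1.0, 1.0, 1.0]]])
ccs = split_color_components(img)
```

Each returned channel can then be fed to the GLCM stage independently; the HSV and L\*a\*b components would follow from their own (nonlinear) transforms.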

**Figure 6.** Calculation of the GLCM matrix in a gray image. The distance is $d=1$, and the angle is $\theta =0$: (**a**) gray image, (**b**) gray levels $I(x,y)$, and (**c**) GLCM matrix with the paired pixels $g(i,j)$.
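The pair counting in Figure 6 ($d=1$, $\theta =0$, i.e., each pixel paired with its right-hand neighbor) can be sketched in a few lines of NumPy:

```python
import numpy as np

def glcm(image, levels, d=1):
    """Gray-level co-occurrence matrix for horizontal neighbours
    (distance d = 1, angle theta = 0): g[i, j] counts the pixel pairs
    where image[r, c] == i and image[r, c + d] == j."""
    g = np.zeros((levels, levels), dtype=int)
    ref = image[:, :-d].ravel()   # reference pixels
    nbr = image[:, d:].ravel()    # neighbours d columns to the right
    np.add.at(g, (ref, nbr), 1)   # accumulate counts without Python loops
    return g

# Small 4-level example in the spirit of Figure 6
img = np.array([[0, 0, 1, 1],
                [0, 0, 1, 1],
                [0, 2, 2, 2],
                [2, 2, 3, 3]])
G = glcm(img, levels=4)   # e.g., G[2, 2] counts horizontal (2, 2) pairs
```

Other angles follow the same pattern by changing which neighbor is paired (e.g., the pixel one row down for $\theta =90^{\circ}$).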

**Figure 7.** Processed image ($I(x,y)$ and $U(s,t)$); color transformation ($H(s,t)$); components ${H}_{1}(s,t)$, ${H}_{2}(s,t)$, and ${H}_{3}(s,t)$; and their GLCM matrices ${G}_{1}(i,j)$, ${G}_{2}(i,j)$, and ${G}_{3}(i,j)$ with 255 gray levels.

**Figure 9.** The feature selection process consists of a Lilliefors test, followed by an analysis of variance and Tukey's test.

**Figure 10.** Results of the ANOVA and Tukey's test: (**a**) mean values of the damage levels of diss-BB; (**b**) Tukey's test, where the means of the damage levels are significantly different; (**c**) mean values of the damage levels of auto-A; (**d**) Tukey's test, where the means of ${T}_{2}$, ${T}_{3}$, and ${T}_{4}$ are equal but significantly different from ${T}_{1}$.

**Figure 11.** Kernel selection in the multiclassification system with the feature vectors in different color spaces, with the optimal hyperplane: (**a**) linear kernel in 2D with diss${}_{BB}$ versus cont${}_{V}$, (**b**) 3D optimal hyperplane, (**c**) training and validation data with the error in SVM ${T}_{1}$ versus ${T}_{2}$, (**d**) polynomial kernel in 2D with auto${}_{V}$ versus savg${}_{G}$, (**e**) 3D optimal hyperplane, (**f**) training and validation data with the error in SVM ${T}_{3}$ versus ${T}_{4}$, (**g**) sigmoidal kernel in 2D with ener${}_{GG}$ versus dvar${}_{A}$, (**h**) 3D optimal hyperplane, (**i**) training and validation data with the error in SVM ${T}_{2}$ versus ${T}_{3}$, (**j**) radial basis function (RBF) kernel in 2D with diss${}_{Y}$ versus inf${}_{1}$${}_{BB}$, (**k**) 3D optimal hyperplane, and (**l**) training and validation data with the error in SVM ${T}_{2}$ versus ${T}_{4}$.

**Figure 12.** Kernel selection in the multiclassification system with the feature vectors in components of the same color space, with the optimal hyperplane: (**a**) radial basis function (RBF) kernel in 2D with auto${}_{V}$ versus dent${}_{S}$, (**b**) 3D optimal hyperplane, (**c**) training and validation data with the error in the SVM ${T}_{3}$ versus ${T}_{4}$, (**d**) linear kernel in 2D with idmn${}_{G}$ versus diss${}_{G}$, (**e**) 3D optimal hyperplane, (**f**) training and validation data with the error in the SVM ${T}_{1}$ versus ${T}_{2}$, (**g**) polynomial kernel in 2D with dvar${}_{CR}$ versus homo${}_{Y}$, (**h**) 3D optimal hyperplane, (**i**) training and validation data with the error in the SVM ${T}_{2}$ versus ${T}_{4}$, (**j**) RBF kernel in 2D with ener${}_{V}$ versus entr${}_{S}$, (**k**) 3D optimal hyperplane, and (**l**) training and validation data with the error in the SVM ${T}_{1}$ versus ${T}_{2}$.

**Figure 13.** One-versus-one multiclassification method. The main inputs are the support vectors (${s}_{1},\dots ,{s}_{6}$), the validation data for each binary classifier (${M}_{1},\dots ,{M}_{6}$), and $\sigma $. Each block ${V}_{1},\dots ,{V}_{4}$ contains the different support vector machines for multiple classification.
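The voting logic of the one-versus-one scheme can be sketched independently of the underlying binary SVMs: with four damage levels there are six pairwise machines, and each casts one vote. The decision values below are hypothetical, not taken from the paper:

```python
import numpy as np
from itertools import combinations

def ovo_predict(decision_values):
    """One-versus-one voting over the four damage levels T1..T4
    (indices 0..3). `decision_values[(a, b)]` is the signed output of
    the binary SVM separating class a (positive) from class b
    (negative) for one sample; the class with the most votes wins."""
    votes = np.zeros(4, dtype=int)
    for (a, b), d in decision_values.items():
        votes[a if d >= 0 else b] += 1
    return int(np.argmax(votes))  # ties resolve to the lowest class index

# Six pairwise machines M1..M6 cover all class pairs
pairs = list(combinations(range(4), 2))   # (0,1), (0,2), ..., (2,3)
d = dict(zip(pairs, [+1.2, +0.4, -0.3, +0.8, -0.5, -0.9]))
label = ovo_predict(d)                    # index of the winning level
```

In this example, class 3 (${T}_{4}$) wins the pairwise contests against ${T}_{1}$, ${T}_{2}$, and ${T}_{3}$ and therefore collects the most votes.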

**Figure 14.** SVM binary classifiers: (**a**) test data ${F}_{1}$ and SVM-classified data, (**b**) test data ${F}_{2}$ and SVM-classified data, (**c**) test data ${F}_{3}$ and SVM-classified data, (**d**) test data ${F}_{4}$ and SVM-classified data, and (**e**) test data ${F}_{5}$ and SVM-classified data.

**Figure 15.** SVM binary classifiers with components of the same color space: (**a**) test data ${G}_{1}$ and SVM-classified data, (**b**) test data ${G}_{2}$ and SVM-classified data, (**c**) test data ${G}_{3}$ and SVM-classified data, (**d**) test data ${G}_{4}$ and SVM-classified data, and (**e**) test data ${G}_{5}$ and SVM-classified data.

| Equation | TDs | Texture Descriptors |
|---|---|---|
| ${\sum}_{i,j}(i\times j)\,p(i,j)$ | auto | Autocorrelation |
| ${\sum}_{i,j}{(i-j)}^{2}\,p(i,j)$ | cont | Contrast |
| $\frac{{\sum}_{i,j}(i\times j)\,p(i,j)-{\mu}_{x}{\mu}_{y}}{{\sigma}_{x}{\sigma}_{y}}$ | corr | Correlation ^{1} |
| ${\sum}_{i,j}{(i+j-{\mu}_{x}-{\mu}_{y})}^{4}\,p(i,j)$ | cpro | Cluster Prominence ^{1} |
| ${\sum}_{i,j}{(i+j-{\mu}_{x}-{\mu}_{y})}^{3}\,p(i,j)$ | csha | Cluster Shade ^{1} |
| ${\sum}_{i,j}|i-j|\,p(i,j)$ | diss | Dissimilarity |
| ${\sum}_{i,j}p{(i,j)}^{2}$ | ener | Energy |
| $-{\sum}_{i,j}p(i,j)\,{\mathrm{log}}_{2}\,p(i,j)$ | entr | Entropy |
| ${\sum}_{i,j}\frac{p(i,j)}{1+{(i-j)}^{2}}$ | homo | Homogeneity ^{1} |
| ${\mathrm{max}}_{i,j}\,p(i,j)$ | maxp | Maximum Probability ^{1} |
| ${\sum}_{i,j}{(i-\mu )}^{2}\,p(i,j)$ | sosv | Sum of Squares |
| ${\sum}_{i}i\,{p}_{x+y}\left(i\right)$ | savg | Sum Average |
| ${\sum}_{i}{(i-\mathrm{sent})}^{2}\,{p}_{x+y}\left(i\right)$ | svar | Sum Variance |
| $-{\sum}_{i}{p}_{x+y}\left(i\right)\mathrm{log}\left({p}_{x+y}\left(i\right)\right)$ | sent | Sum Entropy |
| ${\sum}_{k}{(k-{\mu}_{x-y})}^{2}\,{p}_{x-y}\left(k\right)$ | dvar | Difference Variance ^{1} |
| $-{\sum}_{k}{p}_{x-y}\left(k\right){\mathrm{log}}_{2}\left({p}_{x-y}\left(k\right)\right)$ | dent | Difference Entropy |
| $\frac{HXY-HX{Y}_{1}}{\mathrm{max}(HX,HY)}$ | inf${}_{1}$ | Information Measure of Correlation 1 ^{2,3} |
| $\sqrt{1-\mathrm{exp}[-2(HX{Y}_{2}-HXY)]}$ | inf${}_{2}$ | Information Measure of Correlation 2 ^{2,3} |
| ${\sum}_{i,j}\frac{p(i,j)}{1+|i-j|/N}$ | indn | Inverse Difference Normalized |
| ${\sum}_{i,j}\frac{p(i,j)}{1+{(i-j)}^{2}/{N}^{2}}$ | idmn | Inverse Difference Moment Normalized |

^{1} ${\mu}_{x}$, ${\mu}_{y}$ and ${\sigma}_{x}$, ${\sigma}_{y}$ are the mean and standard deviation of ${p}_{x}$ and ${p}_{y}$, respectively; $\mu $ is the mean of $p(i,j)$, and $N$ is the number of gray levels.

^{2} $HXY$ = entr, where $HX$ and $HY$ are the entropies of ${p}_{x}$ and ${p}_{y}$, respectively.

^{3} $HX{Y}_{1}=-{\sum}_{i,j}p(i,j)\,\mathrm{log}\left\{{p}_{x}\left(i\right){p}_{y}\left(j\right)\right\}$ and $HX{Y}_{2}=-{\sum}_{i,j}{p}_{x}\left(i\right){p}_{y}\left(j\right)\mathrm{log}\left\{{p}_{x}\left(i\right){p}_{y}\left(j\right)\right\}$.
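Several of the descriptors above can be evaluated directly on a normalized GLCM. The following NumPy sketch implements a subset, with dictionary keys matching the table's abbreviations (it takes the raw co-occurrence counts and normalizes them to a joint probability $p(i,j)$):

```python
import numpy as np

def texture_descriptors(g):
    """A subset of the GLCM texture descriptors (TDs) from Table 1,
    computed from a matrix of co-occurrence counts."""
    p = g / g.sum()                          # normalise to joint probability
    i, j = np.indices(p.shape)               # row/column index grids
    nz = p > 0                               # avoid log2(0) in the entropy
    return {
        "auto": np.sum(i * j * p),                   # autocorrelation
        "cont": np.sum((i - j) ** 2 * p),            # contrast
        "diss": np.sum(np.abs(i - j) * p),           # dissimilarity
        "ener": np.sum(p ** 2),                      # energy
        "entr": -np.sum(p[nz] * np.log2(p[nz])),     # entropy
        "homo": np.sum(p / (1.0 + (i - j) ** 2)),    # homogeneity
        "maxp": p.max(),                             # maximum probability
    }

# Counts from a small illustrative GLCM (12 pixel pairs in total)
g = np.array([[2, 2, 1, 0],
              [0, 2, 0, 0],
              [0, 0, 3, 1],
              [0, 0, 0, 1]])
td = texture_descriptors(g)
```

The remaining descriptors follow the same pattern using the marginal distributions ${p}_{x}$, ${p}_{y}$, ${p}_{x+y}$, and ${p}_{x-y}$.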

**Table 2.** An example of the results of two features submitted to the Lilliefors test, for each CC (G—gray; R—red; GG—green; BB—blue; H—hue; S—saturation; V—value; L—luminance; A—a\*, green–red coordinate; B—b\*, blue–yellow coordinate; Y—luma; CB—blue-difference chrominance; CR—red-difference chrominance). The features are shown with their four damage levels. If an $h$-value of 0 appears at any level, the feature is discarded for not complying with the normality condition.

| Level | TDs | G | R | GG | BB | H | S | V | L | A | B | Y | CB | CR |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| ${T}_{1}$ | diss | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 |
| ${T}_{2}$ | | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 |
| ${T}_{3}$ | | 1 | 1 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 |
| ${T}_{4}$ | | 0 | 1 | 0 | 0 | 1 | 1 | 0 | 1 | 1 | 1 | 0 | 1 | 1 |
| ${T}_{1}$ | homo | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 |
| ${T}_{2}$ | | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 |
| ${T}_{3}$ | | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 |
| ${T}_{4}$ | | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 |
| ${T}_{1}$ | idmn | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 |
| ${T}_{2}$ | | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 |
| ${T}_{3}$ | | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 |
| ${T}_{4}$ | | 1 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 |
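The pass/discard screening in Table 2 can be sketched as follows. The paper's exact Lilliefors implementation is not specified; this version uses a Monte Carlo critical value instead of the usual lookup table, and the two input samples are illustrative:

```python
import numpy as np
from math import erf, sqrt

def lilliefors_h(x, alpha=0.05, n_sim=2000, seed=0):
    """Lilliefors normality test: returns h = 1 when normality is NOT
    rejected (feature kept, as in Table 2) and h = 0 when it is
    rejected. The critical value is simulated rather than tabulated."""
    def ks_stat(sample):
        # KS distance to a normal fitted with the sample's own mean/std
        z = np.sort((sample - sample.mean()) / sample.std(ddof=1))
        cdf = np.array([0.5 * (1.0 + erf(v / sqrt(2.0))) for v in z])
        n = len(z)
        d_plus = np.max(np.arange(1, n + 1) / n - cdf)
        d_minus = np.max(cdf - np.arange(0, n) / n)
        return max(d_plus, d_minus)

    x = np.asarray(x, dtype=float)
    rng = np.random.default_rng(seed)
    d_obs = ks_stat(x)
    # Null distribution: the same statistic on truly normal samples
    d_null = [ks_stat(rng.standard_normal(len(x))) for _ in range(n_sim)]
    return 1 if d_obs <= np.quantile(d_null, 1.0 - alpha) else 0

symmetric = np.linspace(-2.0, 2.0, 60)      # mild symmetric spread: kept
skewed = np.exp(np.linspace(0.0, 5.0, 60))  # strongly right-skewed: discarded
```

A feature/CC combination would run this test once per damage level and be kept only if all four levels return $h=1$.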

**Table 3.** Examples of Tukey's test results by feature, listed in order of their ability to separate the four PM damage levels.

| Feature | F Statistic | ${\mathit{T}}_{1}$ | ${\mathit{T}}_{2}$ | ${\mathit{T}}_{3}$ | ${\mathit{T}}_{4}$ |
|---|---|---|---|---|---|
| ener${}_{B}$ | 184.7 | a | b | c | d |
| corr${}_{G}$ | 174.7 | a | b | c | d |
| homo${}_{V}$ | 171.2 | a | b | c | d |
| corr${}_{G}$ | 158.6 | a | b | c | d |
| ener${}_{GG}$ | 143.2 | a | b | c | d |
| ener${}_{V}$ | 142.6 | a | b | c | d |
| dent${}_{A}$ | 134.5 | a | b | c | d |
| sosv${}_{V}$ | 71.4 | a | b | c | d |
| dvar${}_{A}$ | 125.5 | a | b | c | d |
| idmn${}_{A}$ | 124.4 | a | b | c | d |
| cpro${}_{GG}$ | 122.4 | a | b | c | d |
| homo${}_{G}$ | 119.4 | a | b | c | d |
| entr${}_{S}$ | 112.7 | a | b | c | d |
| homo${}_{Y}$ | 111.3 | a | b | c | d |
| cont${}_{L}$ | 109.7 | a | b | c | d |
| dvar${}_{L}$ | 109.7 | a | b | c | d |
| dvar${}_{GG}$ | 105.9 | a | b | c | d |
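The F statistics in Table 3 come from a one-way ANOVA of each feature across the four damage-level groups. A pure-NumPy version of the F computation, with illustrative (not the paper's) data:

```python
import numpy as np

def anova_f(groups):
    """One-way ANOVA F statistic: between-group mean square divided by
    within-group mean square, over k groups of observations."""
    all_x = np.concatenate([np.asarray(g, dtype=float) for g in groups])
    grand = all_x.mean()
    k, n = len(groups), len(all_x)
    ss_between = sum(len(g) * (np.mean(g) - grand) ** 2 for g in groups)
    ss_within = sum(((np.asarray(g, dtype=float) - np.mean(g)) ** 2).sum()
                    for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical feature values for the four damage levels T1..T4
levels = [[1, 2, 3], [2, 3, 4], [7, 8, 9], [10, 11, 12]]
f_stat = anova_f(levels)
```

A large F (well-separated group means relative to within-group spread) is what ranks the features in the table; Tukey's test then assigns the letter groups (a, b, c, d) by pairwise comparison of the level means.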

**Table 4.** Formation of the feature vectors from combinations of six TD features belonging to different color spaces (${F}_{1}$, …, ${F}_{5}$) and from features belonging to components of the same color space (${G}_{1}$, …, ${G}_{5}$).

| TDs | Vector | Features |
|---|---|---|
| Different color space combinations | ${F}_{1}$ | auto${}_{V}$, dent${}_{S}$, svar${}_{V}$, savg${}_{L}$, sosv${}_{V}$, savg${}_{G}$ |
| | ${F}_{2}$ | entr${}_{R}$, homo${}_{R}$, idmn${}_{R}$, idmn${}_{CR}$, dvar${}_{R}$, cont${}_{R}$ |
| | ${F}_{3}$ | dvar${}_{CR}$, cont${}_{CR}$, idmn${}_{Y}$, idmn${}_{G}$, idmn${}_{GG}$, cont${}_{Y}$ |
| | ${F}_{4}$ | cont${}_{L}$, homo${}_{Y}$, entr${}_{S}$, homo${}_{G}$, cpro${}_{GG}$, idmn${}_{A}$ |
| | ${F}_{5}$ | cont${}_{A}$, dvar${}_{A}$, dent${}_{A}$, ener${}_{V}$, ener${}_{GG}$, corr${}_{L}$ |
| Same color space combinations | ${G}_{1}$ | sent${}_{R}$, entr${}_{R}$, idmn${}_{GG}$, cont${}_{GG}$, diss${}_{BB}$, inf${}_{1}$${}_{BB}$ |
| | ${G}_{2}$ | dent${}_{S}$, entr${}_{S}$, auto${}_{V}$, svar${}_{V}$, sosv${}_{V}$, ener${}_{V}$ |
| | ${G}_{3}$ | diss${}_{L}$, savg${}_{L}$, idmn${}_{A}$, cont${}_{A}$, dvar${}_{A}$, ener${}_{B}$ |
| | ${G}_{4}$ | diss${}_{Y}$, homo${}_{Y}$, corr${}_{Y}$, idmn${}_{CR}$, dvar${}_{CR}$, cont${}_{CR}$ |
| | ${G}_{5}$ | diss${}_{G}$, savg${}_{G}$, idmn${}_{G}$, cont${}_{G}$, dvar${}_{G}$, homo${}_{G}$ |

**Table 5.** Support vector machines ${M}_{1},\dots ,{M}_{6}$ trained on feature vectors from different color spaces (${F}_{1},\dots ,{F}_{5}$) with linear, polynomial, sigmoidal, and radial basis function (RBF) kernels for the selection of the SVM.

| Kernel | SVM | p/$\mathit{\omega}$/$\mathit{\sigma}$ | R${}^{2}$ | h${}_{\mathit{est}}$ | $\mathbf{\Gamma}$ | ${\parallel \mathit{w}\parallel}^{2}$ | % Error |
|---|---|---|---|---|---|---|---|
| Linear | ${M}_{1}$ | - | 433.36 | 1.0 × 10^{15} | 0.0 + 90.74i | 2.4 × 10^{12} | 17.4 |
| Linear | ${M}_{3}$ | - | 448.28 | 1.1 × 10^{15} | 0.0 + 9.53i | 2.6 × 10^{12} | 18.6 |
| Polynomial | ${M}_{4}$ | 4 | 3917 | 1.2 × 10^{16} | 0.0 + 3.0i | 3105.6 | 30 |
| Polynomial | ${M}_{6}$ | 4 | 3989 | 7.1 × 10^{16} | 0.0 + 8.7i | 1801.79 | 20.2 |
| Sigmoidal | ${M}_{1}$ | 1 | 2192.8 | 1096.5 | 0.0 + 1.67i | 500 | 17.6 |
| Sigmoidal | ${M}_{4}$ | 7 | 1614.5 | 8072.3 | 0.0 + 9.1i | 500 | 43.4 |
| RBF | ${M}_{1}$ | 0.5 | 0.9978 | 460.17 | 4.1048 | 461.18 | 0 |
| RBF | ${M}_{2}$ | 0.5 | 0.9977 | 464.95 | 4.1237 | 465.98 | 0 |
| RBF | ${M}_{4}$ | 0.5 | 0.9979 | 493.85 | 4.2359 | 494.87 | 0 |
| RBF | ${M}_{5}$ | 0.5 | 0.9979 | 487.92 | 4.2132 | 488.94 | 0 |
| RBF | ${M}_{3}$ | 1 | 0.9972 | 413.45 | 3.9135 | 414.60 | 0 |
| RBF | ${M}_{6}$ | 1 | 0.9974 | 456.07 | 4.0885 | 457.22 | 0 |

**Table 6.** Support vector machines ${M}_{1},\dots ,{M}_{6}$ trained on components of the same color space with linear, polynomial, sigmoidal, and radial basis function (RBF) kernels for the selection of the SVM.

| Kernel | SVM | p/$\mathit{\omega}$/$\mathit{\sigma}$ | R${}^{2}$ | h${}_{\mathit{est}}$ | $\mathbf{\Gamma}$ | ${\parallel \mathit{w}\parallel}^{2}$ | % Error |
|---|---|---|---|---|---|---|---|
| Linear | ${M}_{1}$ | - | 478.89 | 1.1 × 10^{15} | 0.0 + 9.74i | 2.4 × 10^{12} | 16.8 |
| Linear | ${M}_{3}$ | - | 455.57 | 1.0 × 10^{15} | 0.0 + 8.59i | 2.2 × 10^{12} | 15 |
| Polynomial | ${M}_{1}$ | 6 | 9.95 × 10^{18} | 4.97 × 10^{21} | 0.0 + 26i | 500 | 16 |
| Polynomial | ${M}_{6}$ | 6 | 9.28 × 10^{18} | 1.24 × 10^{15} | 0.0 + 98.2i | 0.0001 | 0 |
| Sigmoidal | ${M}_{1}$ | 3 | 2137.45 | 1065.1 | 0.0 + 11.8i | 500 | 17.2 |
| Sigmoidal | ${M}_{2}$ | 3 | 2104.09 | 1052.5 | 0.0 + 11.1i | 500 | 17.8 |
| RBF | ${M}_{2}$ | 1 | 0.9957 | 458.48 | 4.098 | 460.42 | 0 |
| RBF | ${M}_{3}$ | 0.5 | 0.9979 | 469.96 | 4.1434 | 470.95 | 0 |
| RBF | ${M}_{4}$ | 0.5 | 0.9978 | 491.79 | 4.2280 | 492.79 | 0 |
| RBF | ${M}_{5}$ | 0.5 | 0.9979 | 488.64 | 4.2160 | 489.64 | 0 |
| RBF | ${M}_{1}$ | 2 | 0.9799 | 34,676.99 | 25.8 | 35,385.5 | 0 |
| RBF | ${M}_{6}$ | 1 | 0.9962 | 764.31 | 5.1414 | 767.17 | 0 |

**Table 7.**Computed parameters for the performance evaluation of the classified data ${F}_{1}$, ${F}_{2}$, ${F}_{3}$, ${F}_{4}$, and ${F}_{5}$.

| Vector | $\mathit{ACC}$ (%) | $\mathit{SN}$ | $\mathit{SP}$ | $\mathit{PREC}$ | $\mathit{FPR}$ | ${\mathit{F}}_{\mathit{\beta}}$ (%) |
|---|---|---|---|---|---|---|
| ${F}_{1}$ | 93.1 | 0.832 | 0.965 | 0.887 | 0.035 | 85.8 |
| ${F}_{2}$ | 88.4 | 0.700 | 0.945 | 0.811 | 0.055 | 75.1 |
| ${F}_{3}$ | 88.9 | 0.682 | 0.958 | 0.844 | 0.042 | 75.5 |
| ${F}_{4}$ | 90.0 | 0.728 | 0.957 | 0.850 | 0.043 | 78.4 |
| ${F}_{5}$ | 91.2 | 0.754 | 0.964 | 0.875 | 0.036 | 81.0 |
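These metrics can all be derived from a multiclass confusion matrix whose rows are predicted levels and whose columns are the test (true) levels. A macro-averaged one-vs-rest sketch (the averaging scheme is an assumption; the paper does not state it explicitly):

```python
import numpy as np

def macro_metrics(cm, beta=1.0):
    """Macro-averaged one-vs-rest metrics from a k x k confusion matrix
    with rows = predicted class and columns = true class."""
    cm = np.asarray(cm, dtype=float)
    tp = np.diag(cm)
    fp = cm.sum(axis=1) - tp     # predicted as the class but actually other
    fn = cm.sum(axis=0) - tp     # the class, but predicted as another
    tn = cm.sum() - tp - fp - fn
    sn = tp / (tp + fn)          # sensitivity (recall)
    sp = tn / (tn + fp)          # specificity
    prec = tp / (tp + fp)        # precision
    fpr = fp / (fp + tn)         # false-positive rate
    fb = (1 + beta**2) * prec * sn / (beta**2 * prec + sn)  # F-beta score
    return {"ACC": tp.sum() / cm.sum(), "SN": sn.mean(), "SP": sp.mean(),
            "PREC": prec.mean(), "FPR": fpr.mean(), "F": fb.mean()}

# Illustrative 2-class matrix (not from the paper)
m = macro_metrics([[8, 2], [2, 8]])
```

Classes with zero predicted or zero true samples would need a guard against division by zero, which the illustrative matrix avoids.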

**Table 8.** Computed parameters for the performance evaluation of the classified data ${G}_{1}$, ${G}_{2}$, ${G}_{3}$, ${G}_{4}$, and ${G}_{5}$.

| Vector | $\mathit{ACC}$ (%) | $\mathit{SN}$ | $\mathit{SP}$ | $\mathit{PREC}$ | $\mathit{FPR}$ | ${\mathit{F}}_{\mathit{\beta}}$ |
|---|---|---|---|---|---|---|
| ${G}_{1}$ | 87.3 | 0.625 | 0.956 | 0.824 | 0.044 | 0.711 |
| ${G}_{2}$ | 90.8 | 0.776 | 0.952 | 0.843 | 0.048 | 0.808 |
| ${G}_{3}$ | 94.4 | 0.877 | 0.967 | 0.898 | 0.033 | 0.887 |
| ${G}_{4}$ | 91.4 | 0.752 | 0.968 | 0.887 | 0.032 | 0.814 |
| ${G}_{5}$ | 87.3 | 0.678 | 0.938 | 0.786 | 0.062 | 0.728 |

| b) SVM-${F}_{1}$ | ${T}_{1}$ | ${T}_{2}$ | ${T}_{3}$ | ${T}_{4}$ | Classified | % Correct |
|---|---|---|---|---|---|---|
| ${T}_{1}$ | 71 | 6 | 0 | 2 | 79 | 89.87 |
| ${T}_{2}$ | 2 | 9 | 0 | 0 | 11 | 81.82 |
| ${T}_{3}$ | 0 | 0 | 12 | 0 | 12 | 100.00 |
| ${T}_{4}$ | 2 | 0 | 0 | 2 | 4 | 50.00 |
| Test data | 75 | 15 | 12 | 4 | 106 | |
| % Correct | 94.67 | 60.00 | 100.00 | 50.00 | 88.68 | |

| d) SVM-${F}_{2}$ | ${T}_{1}$ | ${T}_{2}$ | ${T}_{3}$ | ${T}_{4}$ | Classified | % Correct |
|---|---|---|---|---|---|---|
| ${T}_{1}$ | 46 | 5 | 2 | 2 | 55 | 83.64 |
| ${T}_{2}$ | 4 | 7 | 0 | 0 | 11 | 63.64 |
| ${T}_{3}$ | 3 | 0 | 7 | 0 | 10 | 70.00 |
| ${T}_{4}$ | 2 | 0 | 0 | 17 | 19 | 89.47 |
| Test data | 55 | 12 | 9 | 19 | 95 | |
| % Correct | 83.64 | 58.33 | 77.78 | 89.47 | 81.05 | |

| f) SVM-${F}_{3}$ | ${T}_{1}$ | ${T}_{2}$ | ${T}_{3}$ | ${T}_{4}$ | Classified | % Correct |
|---|---|---|---|---|---|---|
| ${T}_{1}$ | 92 | 4 | 1 | 1 | 98 | 93.88 |
| ${T}_{2}$ | 4 | 5 | 3 | 0 | 12 | 41.67 |
| ${T}_{3}$ | 2 | 1 | 0 | 0 | 3 | 0.00 |
| ${T}_{4}$ | 3 | 0 | 0 | 6 | 9 | 66.67 |
| Test data | 101 | 10 | 4 | 7 | 122 | |
| % Correct | 91.09 | 50.00 | 0.00 | 85.71 | 84.43 | |

| b) SVM-${F}_{4}$ | ${T}_{1}$ | ${T}_{2}$ | ${T}_{3}$ | ${T}_{4}$ | Classified | % Correct |
|---|---|---|---|---|---|---|
| ${T}_{1}$ | 74 | 4 | 2 | 1 | 81 | 91.36 |
| ${T}_{2}$ | 3 | 3 | 1 | 0 | 7 | 42.86 |
| ${T}_{3}$ | 1 | 1 | 6 | 0 | 8 | 75.00 |
| ${T}_{4}$ | 1 | 0 | 2 | 8 | 11 | 72.73 |
| Test data | 79 | 8 | 11 | 9 | 107 | |
| % Correct | 93.67 | 37.50 | 54.55 | 88.89 | 85.05 | |

| d) SVM-${F}_{5}$ | ${T}_{1}$ | ${T}_{2}$ | ${T}_{3}$ | ${T}_{4}$ | Classified | % Correct |
|---|---|---|---|---|---|---|
| ${T}_{1}$ | 86 | 3 | 2 | 0 | 91 | 94.51 |
| ${T}_{2}$ | 0 | 0 | 0 | 0 | 0 | 0.00 |
| ${T}_{3}$ | 6 | 0 | 0 | 0 | 6 | 0.00 |
| ${T}_{4}$ | 3 | 0 | 0 | 12 | 15 | 80.00 |
| Test data | 95 | 3 | 2 | 12 | 112 | |
| % Correct | 90.53 | 0.00 | 0.00 | 100.00 | 87.50 | |

| Time | 5–6 ms |

| Vector | Features | $\mathit{ACC}$ | Kappa | % Correct |
|---|---|---|---|---|
| ${F}_{1}$ | auto${}_{V}$, dent${}_{S}$, svar${}_{V}$, savg${}_{L}$, sosv${}_{V}$, savg${}_{G}$ | 0.931 | 0.7874 | 88.68 |
| ${F}_{5}$ | cont${}_{A}$, dvar${}_{A}$, dent${}_{A}$, ener${}_{V}$, ener${}_{GG}$, corr${}_{L}$ | 0.912 | 0.7841 | 87.50 |
| ${G}_{3}$ | diss${}_{L}$, savg${}_{L}$, idmn${}_{A}$, cont${}_{A}$, dvar${}_{A}$, ener${}_{B}$ | 0.944 | 0.7638 | 89.76 |
| ${G}_{4}$ | diss${}_{Y}$, homo${}_{Y}$, corr${}_{Y}$, idmn${}_{CR}$, dvar${}_{CR}$, cont${}_{CR}$ | 0.914 | 0.7835 | 88.68 |
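The kappa values reported alongside accuracy measure agreement beyond chance and can be computed from the same confusion matrices. A short sketch with an illustrative matrix (not the paper's data):

```python
import numpy as np

def cohens_kappa(cm):
    """Cohen's kappa from a confusion matrix: observed agreement
    corrected by the agreement expected from the marginals alone."""
    cm = np.asarray(cm, dtype=float)
    n = cm.sum()
    po = np.trace(cm) / n                            # observed agreement
    pe = (cm.sum(axis=0) @ cm.sum(axis=1)) / n ** 2  # chance agreement
    return (po - pe) / (1.0 - pe)

# Illustrative 2-class matrix: 35/50 correct, but chance agreement is 0.5
kappa = cohens_kappa([[20, 5], [10, 15]])
```

Values in the 0.76–0.79 range, as in the table above, indicate substantial agreement between predicted and true damage levels.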

**Table 11.** Confusion matrices with the classified test data ${G}_{1},\dots ,{G}_{5}$ in components of the same color space.

| b) SVM-${G}_{1}$ | ${T}_{1}$ | ${T}_{2}$ | ${T}_{3}$ | ${T}_{4}$ | Classified | % Correct |
|---|---|---|---|---|---|---|
| ${T}_{1}$ | 56 | 3 | 2 | 1 | 62 | 90.32 |
| ${T}_{2}$ | 0 | 2 | 0 | 0 | 2 | 100.00 |
| ${T}_{3}$ | 6 | 0 | 6 | 0 | 12 | 50.00 |
| ${T}_{4}$ | 1 | 0 | 3 | 11 | 15 | 73.33 |
| Test data | 63 | 5 | 11 | 12 | 91 | |
| % Correct | 88.89 | 40.00 | 54.55 | 91.67 | 82.42 | |

| d) SVM-${G}_{2}$ | ${T}_{1}$ | ${T}_{2}$ | ${T}_{3}$ | ${T}_{4}$ | Classified | % Correct |
|---|---|---|---|---|---|---|
| ${T}_{1}$ | 82 | 5 | 4 | 2 | 93 | 88.17 |
| ${T}_{2}$ | 2 | 9 | 1 | 0 | 12 | 75.00 |
| ${T}_{3}$ | 2 | 0 | 1 | 0 | 3 | 33.33 |
| ${T}_{4}$ | 2 | 0 | 0 | 5 | 7 | 71.43 |
| Test data | 88 | 14 | 6 | 7 | 115 | |
| % Correct | 93.18 | 64.29 | 16.67 | 71.43 | 84.35 | |

| b) SVM-${G}_{3}$ | ${T}_{1}$ | ${T}_{2}$ | ${T}_{3}$ | ${T}_{4}$ | Classified | % Correct |
|---|---|---|---|---|---|---|
| ${T}_{1}$ | 101 | 0 | 2 | 0 | 103 | 98.06 |
| ${T}_{2}$ | 4 | 4 | 4 | 0 | 12 | 33.33 |
| ${T}_{3}$ | 0 | 0 | 0 | 0 | 0 | 0.00 |
| ${T}_{4}$ | 3 | 0 | 0 | 9 | 12 | 75.00 |
| Test data | 108 | 4 | 6 | 9 | 127 | |
| % Correct | 93.52 | 100.00 | 0.00 | 100.00 | 89.76 | |

| d) SVM-${G}_{4}$ | ${T}_{1}$ | ${T}_{2}$ | ${T}_{3}$ | ${T}_{4}$ | Classified | % Correct |
|---|---|---|---|---|---|---|
| ${T}_{1}$ | 90 | 0 | 0 | 0 | 90 | 100.00 |
| ${T}_{2}$ | 6 | 4 | 0 | 0 | 10 | 40.00 |
| ${T}_{3}$ | 4 | 0 | 0 | 0 | 4 | 0.00 |
| ${T}_{4}$ | 2 | 0 | 0 | 0 | 2 | 0.00 |
| Test data | 102 | 4 | 0 | 0 | 106 | |
| % Correct | 88.24 | 100.00 | 0.00 | 0.00 | 88.68 | |

| f) SVM-${G}_{5}$ | ${T}_{1}$ | ${T}_{2}$ | ${T}_{3}$ | ${T}_{4}$ | Classified | % Correct |
|---|---|---|---|---|---|---|
| ${T}_{1}$ | 67 | 3 | 0 | 2 | 72 | 93.06 |
| ${T}_{2}$ | 9 | 7 | 3 | 0 | 19 | 36.84 |
| ${T}_{3}$ | 3 | 2 | 5 | 0 | 10 | 50.00 |
| ${T}_{4}$ | 3 | 2 | 0 | 20 | 25 | 80.00 |
| Test data | 82 | 14 | 8 | 22 | 126 | |
| % Correct | 81.71 | 50.00 | 62.50 | 90.91 | 78.57 | |

| Time | 5–6 ms |

Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Rivera-Romero, C.A.; Palacios-Hernández, E.R.; Vite-Chávez, O.; Reyes-Portillo, I.A.
Early-Stage Identification of Powdery Mildew Levels for Cucurbit Plants in Open-Field Conditions Based on Texture Descriptors. *Inventions* **2024**, *9*, 8.
https://doi.org/10.3390/inventions9010008
