Article

Modified Barnacles Mating Optimization with Deep Learning Based Weed Detection Model for Smart Agriculture

by Amani Abdulrahman Albraikan 1, Mohammed Aljebreen 2, Jaber S. Alzahrani 3, Mahmoud Othman 4, Gouse Pasha Mohammed 5,* and Mohamed Ibrahim Alsaid 5

1 Department of Computer Sciences, College of Computer and Information Sciences, Princess Nourah Bint Abdulrahman University, Riyadh 11671, Saudi Arabia
2 Department of Computer Science, Community College, King Saud University, Riyadh 11437, Saudi Arabia
3 Department of Industrial Engineering, College of Engineering at Alqunfudah, Umm Al-Qura University, Makkah 24382, Saudi Arabia
4 Department of Computer Science, Faculty of Computers and Information Technology, Future University in Egypt, New Cairo, Cairo 11835, Egypt
5 Department of Computer and Self Development, Preparatory Year Deanship, Prince Sattam Bin Abdulaziz University, Al-Kharj 16278, Saudi Arabia
* Author to whom correspondence should be addressed.
Appl. Sci. 2022, 12(24), 12828; https://doi.org/10.3390/app122412828
Submission received: 27 October 2022 / Revised: 14 November 2022 / Accepted: 16 November 2022 / Published: 14 December 2022
(This article belongs to the Special Issue Engineering of Smart Agriculture)

Abstract

Weed control is a significant means to enhance crop production. Weeds are accountable for 45% of the agriculture sector’s crop losses, which primarily occur because of competition with crops. Accurate and rapid weed detection in agricultural fields is a difficult task because of the presence of a wide range of weed species at various densities and growth phases. Presently, several smart agriculture tasks, such as weed detection, plant disease detection, species identification, water and soil conservation, and crop yield prediction, can be realized by using technology. In this article, we propose a Modified Barnacles Mating Optimization with Deep Learning based weed detection (MBMODL-WD) technique. The MBMODL-WD technique aims to automatically identify the weeds in the agricultural field. Primarily, the presented MBMODL-WD technique uses the Gabor filtering (GF) technique for the noise removal process. For automated weed detection, the presented MBMODL-WD technique uses the DenseNet-121 model for feature extraction with the MBMO algorithm for hyperparameter optimization. The design of the MBMO algorithm involves the integration of self-population-based initialization with the standard BMO algorithm. Finally, the Elman Neural Network (ENN) method is applied for the weed classification process. To demonstrate the enhanced performance of the MBMODL-WD approach, a series of simulation analyses were performed. A comprehensive set of simulations highlighted the enhanced performance of the presented MBMODL-WD methodology over other DL models, with a maximum accuracy of 98.99%.

1. Introduction

Agriculture is confronting enormous difficulties, including threats from diseases, weeds, and pests, a changing climate, and severe scarcity of water resources and arable land [1]. Over the years, farmers and researchers have devoted much effort to weed control in order to solve the difficulties brought by weeds [2]. Weeds appear arbitrarily throughout the field and compete with crops for sunlight, water, and nutrients, causing harmful effects on crop quality and yields if not controlled properly. Many studies have illustrated a strong correlation between weed competition and crop yield loss [3]. Different weed varieties are detrimental to crop productivity and should be identified in the initial stage of growth. Weed growth in a crop affects fundamental resources such as minerals, sunlight, water, fresh air, and soil, which are the fundamental necessities of the crop [4]. It has been found that 35% of crops are ruined because of the growth of various weeds in the agricultural field. This study aimed to examine the various techniques and tools employed by researchers to classify and detect weeds, which is essential for assessing the growth of weeds [5]. Numerous computer-related approaches, such as wireless sensor networks (WSN), artificial intelligence (AI), and others, are used to enhance agriculture.
Weed detection in plants remains a great challenge, since relatively little effort has been put into it. Techniques to realize field weed identification through computer vision (CV) technology primarily use conventional image processing and deep learning (DL) [6]. If weed identification is carried out by conventional image-processing technology, then feature extraction, such as the shape, color, and texture of the image, combined with conventional ML classifiers, such as the Support Vector Machine (SVM) or random forest (RF) algorithm, is essential for weed detection [7]. Such techniques must devise features manually and depend highly on the pre-processing methods, the quality of feature extraction, and the image acquisition algorithms. Classical algorithms for detecting agricultural weeds focused on direct identification of the weed, yet there are substantial variances among weed species [8]. With the advances in computational power and growth in data volume, DL methodologies are capable of extracting multidimensional and multiscale spatial semantic feature information of weeds by using the Convolutional Neural Network (CNN), thanks to its enhanced data expression abilities for images, avoiding the drawbacks of conventional feature extraction approaches [9,10]. Hence, CNN models have gained higher attention in the research community.
This manuscript introduces a Modified Barnacles Mating Optimization with Deep Learning based weed detection (MBMODL-WD) technique. The presented MBMODL-WD technique initially pre-processes the input images via the Gabor filtering (GF) technique to eradicate noise. For automated weed detection, the presented MBMODL-WD technique uses the DenseNet-121 model as a feature extractor. Moreover, the MBMO algorithm is used for the hyperparameter optimization process. Finally, the Elman Neural Network (ENN) method is applied for the classification of images into plants and weeds. To demonstrate the enhanced performance of the MBMODL-WD approach, a series of simulation analyses were performed.

2. Literature Review

Sodjinou et al. [11] introduced a segmentation approach based on the combination of K-means and semantic segmentation methods for segmenting weeds and crops in color images. Two distinct databases of agronomic images were employed for the segmentation methods. Everything except the plants was eliminated from the images with the use of a thresholding method. Then, semantic segmentation was implemented utilizing U-net and the K-means subtractive approach for the segmentation of weeds and crops. A U-net for the segmentation of weeds and wheat in images was given in [12]. An image classification task was employed for choosing the backbone network of the encoding part, and the same task on a similar dataset was exploited for pretraining and choosing the decoding network. The training process implemented transfer learning (TL). Sa et al. [13] formulated a weed segmentation and mapping framework that processes multispectral images received from drones by utilizing deep neural networks (DNNs). Many researchers are increasingly making efforts in weed or crop semantic segmentation by considering single images for classification and processing.
A weed species and density evaluation approach relying upon an image semantic segmentation NN was modeled in [14]. To train the network, an amalgamation of pretraining and fine-tuning approaches was employed. The pretraining datasets consist of images that contain only a single weed species per image; these weeds are labeled mechanically by an image segmentation approach related to the minimum error threshold and Excess Green (ExG). The fine-tuning data are real images comprising many crops and weeds and are manually labeled. In [15], an AI-oriented method was modeled to find weeds growing unintentionally on agricultural fields and to raise the control rate, in line with technological advances. This technique trains on the dataset by utilizing pretrained DL techniques and is optimized through metaheuristic optimization approaches by selecting the species-related activations sent to Softmax, which is the final layer of the DL techniques.
In [16], a DL-based technique was modeled for weed segmentation from images. This approach can segment weeds from crops and soil in the images. The semantic segmentation approach was developed using a simplified U-net. An image augmentation method was devised because of the difficulty of image labeling for the semantic segmentation of weeds. The semantic segmentation network is trained by a two-stage training technique made up of pretraining and fine-tuning. Abdalla et al. [17] intended to assess three TL approaches using a VGG16-based encoder net for segmenting oilseed rape images in a field containing high-density weeds. Three TL methods utilizing a VGG16-based encoder were modeled, and their performance was compared with a VGG19-based encoder net.

3. The Proposed Model

In this article, we have introduced a new MBMODL-WD technique for the automated identification of weeds using computer vision techniques in order to improve crop productivity. It encompasses a series of processes: GF-based image pre-processing, a DenseNet-121 feature extractor, an MBMO-based hyperparameter optimizer, and ENN-based classification. Figure 1 demonstrates the block diagram of the MBMODL-WD system.

3.1. Image Pre-Processing

Primarily, the presented MBMODL-WD technique removes noise using the GF technique. The Gabor filter is employed in the presented system to enhance the ridges and relax the valleys by applying a short-term Fourier transform with a Gaussian window in the spatial domain [18]. It helps to capture variations in characteristics and textures in the input images at distinct scales and orientations. These statistically generated image features can be significantly accentuated by utilizing the frequency and orientation information in the images and fine-tuning the Gabor filter. A set of Gabor filters is applied to an image I(x, y) at distinct frequencies and orientations, utilizing the Gabor function g(x, y) defined by Equation (1).
g(x, y) = exp(−(x′^2 + γ^2 y′^2) / (2σ^2)) cos(2π x′/λ + ϕ)
Here, x′ = x cos θ + y sin θ and y′ = y cos θ − x sin θ.
This Gabor transform applies the Gaussian envelope σ along the x′ and y′ directions.
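As a concrete illustration, the Gabor function in Equation (1) can be implemented directly in NumPy. This is a minimal sketch: the kernel size, wavelength λ, aspect ratio γ, phase ϕ, and the set of orientations below are illustrative choices, not parameters reported by the authors.

```python
import numpy as np

def gabor_kernel(ksize=21, sigma=4.0, theta=0.0, lam=10.0, gamma=0.5, phi=0.0):
    """Spatial-domain Gabor kernel g(x, y) as in Equation (1)."""
    half = ksize // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1].astype(float)
    # Rotate coordinates by the filter orientation theta
    x_r = x * np.cos(theta) + y * np.sin(theta)
    y_r = y * np.cos(theta) - x * np.sin(theta)
    # Gaussian envelope modulated by a cosine carrier of wavelength lam
    envelope = np.exp(-(x_r**2 + (gamma * y_r)**2) / (2 * sigma**2))
    carrier = np.cos(2 * np.pi * x_r / lam + phi)
    return envelope * carrier

# A filter bank at several orientations, as described above
bank = [gabor_kernel(theta=t) for t in np.linspace(0, np.pi, 4, endpoint=False)]
print(len(bank), bank[0].shape)  # 4 kernels of size 21 x 21
```

Each kernel would then be convolved with the input image; the responses across orientations accentuate texture at the corresponding scales.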

3.2. Feature Extraction

For automated weed detection, the DenseNet-121 model produces feature vectors. CNNs accomplish high performance in the domain of image classification [19]. However, training a CNN from scratch is not easy, because classifier accuracy depends on hyperparameter tuning (namely the learning rate, initial weights, number of epochs, optimizer, and dropout) and requires a large amount of labeled training data and high computational power. This problem can be alleviated by means of the TL method. In the TL model, the training time can be decreased because the weights attained from a pretrained model are applied as the initial weights for training on the novel problem. This reuse of pretrained weights leads to lower generalization error. The DenseNet framework is popular since the DenseNet module encourages feature reuse, attenuates the gradient vanishing problem, decreases the parameter count, and enhances feature propagation. In a DenseNet, all layers are interconnected in a feed-forward pattern: every layer accepts the feature maps of all previous layers as additional inputs and passes on its own feature map to every succeeding layer. Therefore, the n-th layer receives the feature maps of all preceding layers as input.
In general, a CNN changes the size of the feature maps via down-sampling layers. DenseNet facilitates both feature concatenation and down-sampling by separating the network into multiple densely connected dense blocks. Inside a dense block, the feature maps have the same size, which makes concatenation possible; outside the dense block, pooling and convolution operations are performed to down-sample. At the end of every dense block, a transition layer is added. The transition layer comprises a batch normalization layer, a 1 × 1 convolutional layer, and a 2 × 2 average pooling layer, and it changes the size of the feature map. In total, DenseNet-121 comprises 117 convolutional layers, 3 transition layers, and 1 classification layer, making 121 layers [19].
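The dense connectivity pattern described above can be illustrated with a toy NumPy sketch in which each "layer" stands in for a BN–ReLU–Conv unit and simply consumes the channel-wise concatenation of all earlier feature maps. The layer count and growth rate here are illustrative, not the DenseNet-121 configuration.

```python
import numpy as np

def dense_block(x, num_layers=4, growth_rate=3, seed=0):
    """Toy dense block: each layer receives the concatenation of all
    previous feature maps and contributes `growth_rate` new channels."""
    rng = np.random.default_rng(seed)
    features = [x]  # running list of feature maps, each (H, W, C_i)
    for _ in range(num_layers):
        inp = np.concatenate(features, axis=-1)        # concat along channels
        w = rng.standard_normal((inp.shape[-1], growth_rate))
        out = np.maximum(inp @ w, 0.0)                 # 1x1-conv-like map + ReLU
        features.append(out)                           # feed to all later layers
    return np.concatenate(features, axis=-1)

x = np.ones((8, 8, 6))          # input with 6 channels
y = dense_block(x)
print(y.shape)                  # (8, 8, 6 + 4*3) = (8, 8, 18)
```

The channel count grows linearly (input channels plus num_layers × growth_rate), which is why DenseNet needs transition layers between blocks to keep feature maps compact.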
Here, the MBMO algorithm is applied as a hyperparameter optimizer. Barnacles are arthropods constituting the infraclass Cirripedia, related to crabs and lobsters [20]. A barnacle is a marine animal that lives in shallow and tidal waters. They can be found throughout seawater and grow on hard surfaces. After the eggs hatch, barnacle larvae disperse in the water to find and stick to hard surfaces, where shell plates develop to cover their bodies. To reproduce, barnacles must seek a balance between managing ever-longer penises and reaching more mates in a turbulent flow. A novel optimization technique, named the barnacles mating optimizer (BMO), has been introduced based on these behaviors. The mating behavior can be devised depending upon the Hardy–Weinberg equilibrium as follows:
The initial population of barnacles for the solution can be determined by:
X = [x_1^1 … x_1^N; … ; x_n^1 … x_n^N]
where n defines the number of candidates, and N denotes the number of decision parameters subject to lower and upper bounds:
lb = [lb_1, …, lb_N]
ub = [ub_1, …, ub_N]
In these expressions, ub_i and lb_i indicate the upper and lower bounds of variable i. By assessing the objective function for each candidate, the outcomes are stored and sorted from best to worst at the initial iteration.
The presented method involves exploitation (mating between nearby parents) and exploration (sperm casting). For offspring generation, the mating parents are selected randomly:
b_D = rand(n)
b_M = rand(n)
where b_D and b_M denote the indices of the mated parents (Dad and Mum).
Based on the Hardy–Weinberg theory, the BMO method models reproduction through the parents’ genotype frequencies, i.e., the inheritance features passed to the offspring:
X_i^new = p X_{b_D} + q X_{b_M}
Here, X_{b_D} and X_{b_M} characterize the variables of the Dad and Mum candidates, correspondingly; p is a pseudo-random value uniformly distributed between 0 and 1, and q = (1 − p).
When the distance between the candidates chosen for mating exceeds the control parameter pl, the exploration (sperm-cast) phase takes place instead:
X_i^new = rand × X_{b_M}
In Equation (8), rand is a random number between zero and one; the newly produced offspring in the exploration phase is therefore generated from the Mum candidate alone.
The offspring are appended to the parent population, extending the solution matrix along the candidate dimension. The combined population is then sorted by fitness, the top fifty percent of solutions are retained, and the worst solutions are removed. The MBMO algorithm integrates self-population-based initialization with the standard BMO algorithm. Like other metaheuristic models, BMO is a population-based optimization method that begins with random initialization, which means it needs a control variable for the population size; however, selecting a suitable population size for a given problem is difficult. The self-adaptive population instead regulates the population size at each iteration. The population size at the first iteration is set by:
PopSize = 10 × d
where d signifies the problem dimension. The size is then updated as follows:
PopSize_new = max(d, round(PopSize + r × PopSize))
where r is a random number between −0.5 and 0.5.
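The self-adaptive population rule above can be sketched in a few lines of Python. This is a minimal illustration of the update formula only; the dimension and iteration count are arbitrary.

```python
import random

def next_pop_size(pop_size, dim):
    """PopSize_new = max(d, round(PopSize + r * PopSize)),
    with r uniform in [-0.5, 0.5], so the size drifts randomly
    but never falls below the problem dimension d."""
    r = random.uniform(-0.5, 0.5)
    return max(dim, round(pop_size + r * pop_size))

random.seed(0)
d = 5
pop = 10 * d                       # initial size from PopSize = 10 * d
for _ in range(20):
    pop = next_pop_size(pop, d)
    assert pop >= d                # the floor guarantees a valid population
print(pop)
```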
Algorithm 1: Pseudocode of BMO
Initialize the population of barnacles X_i (i = 1, …, n)
Compute the fitness of each barnacle
Sort the population so that the optimum solution is at the top (T = the optimum solution)
while (t < maximum number of iterations)
    Set the value of pl
    Randomly select the Dad and Mum barnacles
    if the distance between the selected Dad and Mum ≤ pl
        for each variable
            Generate offspring using Equation (7)
        end for
    else if the distance between the selected Dad and Mum > pl
        for each variable
            Generate offspring using Equation (8)
        end for
    end if
    Bring a barnacle back if it moves outside the boundaries
    Compute the fitness of each barnacle
    Sort the population and update T if a better solution is found
    t = t + 1
end while
Return T
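Algorithm 1 can be sketched as a runnable routine. This is a minimal illustration under stated assumptions, not the authors' implementation: the sphere objective, the bounds, and the use of index distance between randomly permuted parents as the pl criterion are all simplifying choices made here for clarity.

```python
import numpy as np

def bmo(obj, lb, ub, n=30, max_iter=100, pl=7, seed=0):
    """Minimal Barnacles Mating Optimizer sketch (minimization)."""
    rng = np.random.default_rng(seed)
    lb, ub = np.asarray(lb, float), np.asarray(ub, float)
    X = lb + rng.random((n, len(lb))) * (ub - lb)      # random initial population
    fit = np.apply_along_axis(obj, 1, X)
    order = np.argsort(fit)
    X, fit = X[order], fit[order]                      # best solution at the top
    for _ in range(max_iter):
        dad, mum = rng.permutation(n), rng.permutation(n)
        off = np.empty_like(X)
        for i in range(n):
            if abs(int(dad[i]) - int(mum[i])) <= pl:   # exploitation, Eq. (7)
                p = rng.random()
                off[i] = p * X[dad[i]] + (1 - p) * X[mum[i]]
            else:                                      # sperm cast, Eq. (8)
                off[i] = rng.random() * X[mum[i]]
        off = np.clip(off, lb, ub)                     # bring back into boundaries
        allX = np.vstack([X, off])                     # parents + offspring
        allF = np.apply_along_axis(obj, 1, allX)
        keep = np.argsort(allF)[:n]                    # keep the top 50%
        X, fit = allX[keep], allF[keep]
    return X[0], fit[0]

best_x, best_f = bmo(lambda x: np.sum(x**2), [-5] * 5, [5] * 5)
print(best_f)   # converges toward 0 on the sphere function
```

Keeping the best n of the combined 2n solutions makes the best fitness monotonically non-increasing, which is the elitism step described in the text.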

3.3. Weed Detection and Classification

For the weed recognition process, the MBMODL-WD technique exploits the ENN model. The ENN model is a common example of a dynamic recurrent network; its architecture comprises an input layer, a hidden layer with associated context nodes, and an output layer [21]. The key advantage of the ENN is that the context nodes can be employed to remember the previous hidden-node activations, which makes it appropriate in the fields of dynamic system identification and prediction control. Consider the external input of the network as u, the output as y, and the output of the hidden layer as x. Hence:
x(k) = f(W1 x_c(k) + W2 u(k − 1))
x_c(k) = x(k − 1)
y(k) = g(W3 x(k))
where W1, W2, and W3 indicate the weight matrices from the context layer to the hidden layer, from the input layer to the hidden layer, and from the hidden layer to the output layer, correspondingly, and f and g represent the transfer functions of the hidden and output layers. Substituting the recursion gives:
x_c(k) = x(k − 1) = f(W1 x_c(k − 1) + W2 u(k − 2))
Then,
x_c(k − 1) = x(k − 2)
Since x_c(k) depends on W1 and W2 at distinct moments, x_c(k) represents a dynamic recursive process. Consequently, the BP algorithm applied to Elman regression NN training is a dynamic BP learning mechanism.
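The forward pass of the Elman recursion above can be sketched in NumPy. This is a hedged illustration: the matrix shapes, random weights, and the identity output transfer g are arbitrary assumptions, and the input timing is simplified to "one input per step."

```python
import numpy as np

def elman_forward(u_seq, W1, W2, W3, f=np.tanh, g=lambda z: z):
    """Run an Elman network over an input sequence:
    x(k) = f(W1 @ x_c(k) + W2 @ u), x_c(k) = x(k-1), y(k) = g(W3 @ x(k))."""
    hidden = W1.shape[0]
    x_prev = np.zeros(hidden)           # context starts at zero
    outputs = []
    for u in u_seq:
        x = f(W1 @ x_prev + W2 @ u)     # hidden state uses the remembered activation
        outputs.append(g(W3 @ x))
        x_prev = x                      # context node stores x(k) for the next step
    return np.array(outputs)

rng = np.random.default_rng(1)
W1 = rng.standard_normal((4, 4))        # context -> hidden
W2 = rng.standard_normal((4, 2))        # input -> hidden
W3 = rng.standard_normal((1, 4))        # hidden -> output
y = elman_forward(rng.standard_normal((10, 2)), W1, W2, W3)
print(y.shape)                          # (10, 1): one output per time step
```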

4. Experimental Validation

The proposed model was simulated using Python 3.6.5 on a PC with an i5-8600K CPU, a GeForce 1050 Ti 4 GB GPU, 16 GB RAM, a 250 GB SSD, and a 1 TB HDD. The parameter settings are as follows: learning rate 0.01, dropout 0.5, batch size 5, epoch count 50, and ReLU activation.
The weed detection results of the MBMODL-WD model are tested using a dataset comprising 3000 images. The dataset holds 287 crop images and 2713 weed images, as defined in Table 1. Figure 2 illustrates some sample images.
The confusion matrices of the MBMODL-WD model on the weed detection process are demonstrated in Figure 3. The outcomes show that the MBMODL-WD model properly recognized the crop and weed images under all aspects.
Table 2 and Figure 4 report the weed detection results of the MBMODL-WD method on the 60% TR and 40% TS databases. The simulation values reveal that the MBMODL-WD method properly classified the crop and weed images. On the 60% TR database, the MBMODL-WD model reached an average accu_bal of 94.73%, prec_n of 93.75%, reca_l of 94.73%, F_score of 94.23%, MCC of 88.48%, and G_mean of 94.64%. Concurrently, on the 40% TS database, the MBMODL-WD approach gained an average accu_bal of 92.73%, prec_n of 97.40%, reca_l of 92.73%, F_score of 94.91%, MCC of 90.01%, and G_mean of 92.47%.
Table 3 and Figure 5 depict the weed detection results of the MBMODL-WD system on the 70% TR and 30% TS databases. The simulation outcomes state that the MBMODL-WD system properly classified the crop and weed images. On the 70% TR database, the MBMODL-WD system reached an average accu_bal of 98.30%, prec_n of 97.87%, reca_l of 98.30%, F_score of 98.08%, MCC of 96.17%, and G_mean of 98.29%. Simultaneously, on the 30% TS database, the MBMODL-WD algorithm attained an average accu_bal of 98.99%, prec_n of 96.13%, reca_l of 98.99%, F_score of 97.51%, MCC of 95.08%, and G_mean of 98.99%.
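The averaged metrics reported above can be reproduced from a binary confusion matrix. The counts in the example below are hypothetical placeholders, since the actual per-split matrices are given in Figure 3; only the formulas are the point here.

```python
import math

def binary_metrics(tp, fp, fn, tn):
    """Accuracy, precision, recall, F-score, MCC, and G-mean
    computed from one class's confusion-matrix counts."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)                  # sensitivity / true positive rate
    specificity = tn / (tn + fp)
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    f_score = 2 * precision * recall / (precision + recall)
    mcc = (tp * tn - fp * fn) / math.sqrt(
        (tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    g_mean = math.sqrt(recall * specificity)  # balances both class accuracies
    return dict(acc=accuracy, prec=precision, rec=recall,
                f1=f_score, mcc=mcc, gmean=g_mean)

# Hypothetical counts for the weed class only (not taken from the paper)
m = binary_metrics(tp=1080, fp=10, fn=12, tn=98)
print({k: round(v, 4) for k, v in m.items()})
```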
The TACC and VACC of the MBMODL-WD approach on the weed detection task are plotted in Figure 6. The figure shows that the MBMODL-WD methodology achieved improved performance with higher values of TACC and VACC; it is evident that the MBMODL-WD system attained superior TACC outcomes.
The TLS and VLS of the MBMODL-WD methodology on the weed detection task are plotted in Figure 7. The figure shows that the MBMODL-WD approach delivered optimum performance with decreased values of TLS and VLS, and in particular resulted in lower VLS outcomes.
A precision–recall study of the MBMODL-WD system on the test database is represented in Figure 8. The figure shows that the MBMODL-WD algorithm achieved enhanced precision–recall values across the various classes.
A comprehensive ROC investigation of the MBMODL-WD methodology on the test database is depicted in Figure 9. The results denote that the MBMODL-WD algorithm demonstrated its capability in categorizing the various classes.
Table 4 reports the detailed weed detection results of the MBMODL-WD model and existing models [22]. Figure 10 examines the prec_n and reca_l results of the MBMODL-WD method and recent methods. The results demonstrate that the MBMODL-WD method reached enhanced values of prec_n and reca_l. For instance, based on prec_n, the MBMODL-WD model showed a higher prec_n value of 96.13%, whereas the RF, KNN, SVM, ResNet-101, VGG-16, SVM-Pixel-based, and HLBODL-WDSA models showed reduced prec_n values of 95.24%, 62.43%, 91.32%, 93.15%, 93.96%, 85.41%, and 95.84%, respectively. In addition, in terms of reca_l, the MBMODL-WD approach exhibited a maximum reca_l value of 98.99%, whereas the RF, KNN, SVM, ResNet-101, VGG-16, SVM-Pixel-based, and HLBODL-WDSA models showed lesser reca_l values of 93.93%, 61.85%, 91.24%, 94.47%, 92.76%, 85.79%, and 95.18%, correspondingly.
Figure 11 shows the MCC and G_mean results of the MBMODL-WD algorithm and recent approaches. The outcomes exhibit that the MBMODL-WD system gained improved values of MCC and G_mean. For instance, with respect to MCC, the MBMODL-WD model attained a maximum MCC value of 95.08%, whereas the RF, KNN, SVM, ResNet-101, VGG-16, SVM-Pixel-based, and HLBODL-WDSA systems recorded lower MCC values of 88.63%, 35.83%, 84%, 92.90%, 92.93%, 86.81%, and 93.34%, correspondingly.
Moreover, in terms of G_mean, the MBMODL-WD system depicted a superior G_mean value of 98.99%, whereas the RF, KNN, SVM, ResNet-101, VGG-16, SVM-Pixel-based, and HLBODL-WDSA algorithms demonstrated lower G_mean values of 89.87%, 93.29%, 90.34%, 92.80%, 89.07%, 93.62%, and 95.15%, correspondingly. These results confirm the superior outcomes of the MBMODL-WD model.

5. Conclusions

In this article, we have modeled a new MBMODL-WD technique for the automated identification of weeds using computer vision techniques in order to improve crop productivity. Primarily, the presented MBMODL-WD technique removes noise using the GF technique. For automated weed detection, the DenseNet-121 model produces feature vectors with the MBMO algorithm as hyperparameter optimizer. Finally, the ENN method is applied for the classification of images into plants and weeds. To demonstrate the enhanced performance of the MBMODL-WD approach, a series of simulation analyses were performed. A comprehensive set of simulations highlighted the enhanced performance of the presented MBMODL-WD methodology over other DL models, with a maximum accuracy of 98.99%. In the future, the performance of the proposed model can be improved using hybrid DL classification models. In addition, the computational complexity of the proposed model needs to be examined in the future.

Author Contributions

Conceptualization, A.A.A.; Data curation, M.I.A.; Funding acquisition, A.A.A.; Methodology, J.S.A. and M.A.; Project administration, G.P.M.; Supervision, A.A.A.; Validation, M.O.; Visualization, M.I.A.; Writing—original draft, A.A.A., J.S.A. and M.A.; Writing—review & editing, M.O. All authors have read and agreed to the published version of the manuscript.

Funding

Princess Nourah bint Abdulrahman University Researchers Supporting Project number (PNURSP2022R191), Princess Nourah bint Abdulrahman University, Riyadh, Saudi Arabia. The authors would like to thank the Deanship of Scientific Research at Umm Al-Qura University for supporting this work by Grant Code: 22UQU4340237DSR55. Research Supporting Project number (RSP2022R459), King Saud University, Riyadh, Saudi Arabia.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Data sharing is not applicable to this article as no datasets were generated during the current study.

Acknowledgments

Princess Nourah bint Abdulrahman University Researchers Supporting Project number (PNURSP2022R191), Princess Nourah bint Abdulrahman University, Riyadh, Saudi Arabia. Research Supporting Project number (RSP2022R459), King Saud University, Riyadh, Saudi Arabia.

Conflicts of Interest

The authors declare that they have no conflict of interest. The manuscript was written through contributions of all authors. All authors have given approval to the final version of the manuscript.

References

  1. Peteinatos, G.G.; Reichel, P.; Karouta, J.; Andújar, D.; Gerhards, R. Weed identification in maize, sunflower, and potatoes with the aid of convolutional neural networks. Remote Sens. 2020, 12, 4185. [Google Scholar] [CrossRef]
  2. Yang, J.; Bagavathiannan, M.; Wang, Y.; Chen, Y.; Yu, J. A comparative evaluation of convolutional neural networks, training image sizes, and deep learning optimizers for weed detection in Alfalfa. Weed Technol. 2022, 36, 1–11. [Google Scholar] [CrossRef]
  3. Sabzi, S.; Abbaspour-Gilandeh, Y.; Garcia-Mateos, G. A fast and accurate expert system for weed identification in potato crops using metaheuristic algorithms. Comput. Ind. 2018, 98, 80–89. [Google Scholar] [CrossRef]
  4. Wang, Q.; Cheng, M.; Xiao, X.; Yuan, H.; Zhu, J.; Fan, C.; Zhang, J. An image segmentation method based on deep learning for damage assessment of the invasive weed Solanum rostratum Dunal. Comput. Electron. Agric. 2021, 188, 106320. [Google Scholar] [CrossRef]
  5. Dadashzadeh, M.; Abbaspour-Gilandeh, Y.; Mesri-Gundoshmian, T.; Sabzi, S.; Hernández-Hernández, J.L.; Hernández-Hernández, M.; Arribas, J.I. Weed classification for site-specific weed management using an automated stereo computer-vision machine-learning system in rice fields. Plants 2020, 9, 559. [Google Scholar] [CrossRef]
  6. Wang, S.; Han, Y.; Chen, J.; Zhang, K.; Zhang, Z.; Liu, X. Weed Density Extraction based on Few-shot Learning through UAV Remote Sensing RGB and Multi-spectral Images in Ecological Irrigation Area. Front. Plant Sci. 2022, 12, 3456. [Google Scholar] [CrossRef]
  7. Wang, Y.; Zhang, X.; Ma, G.; Du, X.; Shaheen, N.; Mao, H. Recognition of weeds at asparagus fields using multi-feature fusion and backpropagation neural network. Int. J. Agric. Biol. Eng. 2021, 14, 190–198. [Google Scholar] [CrossRef]
  8. Roy, A.M.; Bhaduri, J. Real-time growth stage detection model for high degree of occultation using DenseNet-fused YOLOv4. Comput. Electron. Agric. 2022, 193, 106694. [Google Scholar] [CrossRef]
  9. Khan, W.; Raj, K.; Kumar, T.; Roy, A.M.; Luo, B. Introducing Urdu Digits Dataset with Demonstration of an Efficient and Robust Noisy Decoder-Based Pseudo Example Generator. Symmetry 2022, 14, 1976. [Google Scholar] [CrossRef]
  10. Yaacob, M.E.; Nobily, F.B.; Lu, L.; Che Ya, N.N.; Aziz, A.A.; Dupraz, C.; Yahya, M.S.; Hassan, S.N.S.; Mamun, M.A.A. Tropical Weed Identification in Large Scale Solar Photovoltaic Infrastructures: Potential Impacts on Field Operation. Available at SSRN 4075575. [CrossRef]
  11. Sodjinou, S.G.; Mohammadi, V.; Mahama, A.T.S.; Gouton, P. A deep semantic segmentation-based algorithm to segment crops and weeds in agronomic color images. Inf. Process. Agric. 2022, 9, 355–364. [Google Scholar] [CrossRef]
  12. Zou, K.; Liao, Q.; Zhang, F.; Che, X.; Zhang, C. A segmentation network for smart weed management in wheat fields. Comput. Electron. Agric. 2022, 202, 107303. [Google Scholar] [CrossRef]
  13. Sa, I.; Popović, M.; Khanna, R.; Chen, Z.; Lottes, P.; Liebisch, F.; Nieto, J.; Stachniss, C.; Walter, A.; Siegwart, R. WeedMap: A large-scale semantic weed mapping framework using aerial multispectral imaging and deep neural network for precision farming. Remote Sens. 2018, 10, 1423. [Google Scholar] [CrossRef] [Green Version]
  14. Zou, K.; Wang, H.; Yuan, T.; Zhang, C. Multi-species weed density assessment based on semantic segmentation neural network. Precis. Agric. 2022, 1–24. [Google Scholar] [CrossRef]
  15. Toğaçar, M. Using DarkNet models and metaheuristic optimization methods together to detect weeds growing along with seedlings. Ecol. Inform. 2022, 68, 101519. [Google Scholar] [CrossRef]
  16. Zou, K.; Chen, X.; Wang, Y.; Zhang, C.; Zhang, F. A modified U-Net with a specific data argumentation method for semantic segmentation of weed images in the field. Comput. Electron. Agric. 2021, 187, 106242. [Google Scholar] [CrossRef]
  17. Abdalla, A.; Cen, H.; Wan, L.; Rashid, R.; Weng, H.; Zhou, W.; He, Y. Fine-tuning convolutional neural network with transfer learning for semantic segmentation of ground-level oilseed rape images in a field with high weed pressure. Comput. Electron. Agric. 2019, 167, 105091. [Google Scholar] [CrossRef]
  18. Ahsan, M.; Based, M.A.; Haider, J.; Kowalski, M. An intelligent system for automatic fingerprint identification using feature fusion by Gabor filter and deep learning. Comput. Electr. Eng. 2021, 95, 107387. [Google Scholar]
  19. Chhabra, M.; Kumar, R. A Smart Healthcare System Based on Classifier DenseNet 121 Model to Detect Multiple Diseases. In Mobile Radio Communications and 5G Networks; Springer: Singapore, 2022; pp. 297–312. [Google Scholar]
  20. Norouzi, A.; Shayeghi, H.; Olamaei, J. Multi-objective allocation of switching devices in distribution networks using the Modified Barnacles Mating Optimization algorithm. Energy Rep. 2022, 8, 12618–12627. [Google Scholar] [CrossRef]
  21. Fan, Q.; Zhang, Z.; Huang, X. Parameter conjugate gradient with secant equation based elman neural network and its convergence analysis. Adv. Theory Simul. 2022, 5, 2200047. [Google Scholar] [CrossRef]
  22. Alrowais, F.; Asiri, M.M.; Alabdan, R.; Marzouk, R.; Hilal, A.M.; Gupta, D. Hybrid leader based optimization with deep learning driven weed detection on internet of things enabled smart agriculture environment. Comput. Electr. Eng. 2022, 104, 108411. [Google Scholar] [CrossRef]
Figure 1. Block diagram of MBMODL-WD system.
Figure 2. Sample weed images.
Figure 3. Confusion matrices of the MBMODL-WD system: (a,b) TR and TS databases under the 60:40 split; (c,d) TR and TS databases under the 70:30 split.
Figure 4. Average weed classification analysis of MBMODL-WD system under 60:40 of TR/TS databases.
Figure 5. Average weed classification analysis of MBMODL-WD system under 70:30 of TR/TS databases.
Figure 6. TACC and VACC analyses of MBMODL-WD system on weed classification.
Figure 7. TLS and VLS analyses of MBMODL-WD system on weed classification.
Figure 8. Precision–recall analysis of MBMODL-WD system.
Figure 9. ROC curve analysis of MBMODL-WD system.
Figure 10. Precn and Recal analyses of MBMODL-WD system with other weed classification algorithms.
Figure 11. MCC and Gmean analyses of MBMODL-WD system with other weed classification algorithms.
Table 1. Dataset details.
Class                     No. of Images
Crop                      287
Weed                      2713
Total Number of Images    3000
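Table 1 shows a strong class imbalance (287 crop images versus 2713 weed images), so the 60:40 and 70:30 TR/TS partitions used in the experiments are best drawn per class rather than over the pooled data. The sketch below illustrates one way such a stratified split could be produced; the function name, random seed, and rounding rule are illustrative assumptions, not the authors' exact procedure.

```python
import random

def stratified_split(labels, train_frac, seed=42):
    """Split sample indices class by class so the TR/TS ratio
    holds within each class, not just overall."""
    rng = random.Random(seed)
    by_class = {}
    for idx, lab in enumerate(labels):
        by_class.setdefault(lab, []).append(idx)
    train, test = [], []
    for idxs in by_class.values():
        rng.shuffle(idxs)
        cut = round(len(idxs) * train_frac)  # per-class cut point
        train.extend(idxs[:cut])
        test.extend(idxs[cut:])
    return train, test

# Class counts from Table 1: 287 crop and 2713 weed images.
labels = ["crop"] * 287 + ["weed"] * 2713
tr60, ts40 = stratified_split(labels, 0.60)   # 60:40 TR/TS split
tr70, ts30 = stratified_split(labels, 0.70)   # 70:30 TR/TS split
```

Stratification matters here because a plain random 60:40 split of 3000 images could leave the 287-image crop class badly under-represented in one partition.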
Table 2. Weed detection outcome of MBMODL-WD system under 60:40 of TR/TS databases.
Class     Accubal   Precn    Recal    Fscore   MCC      Gmean
Training Phase (60%)
Crop      90.62     88.41    90.62    89.51    88.48    94.64
Weed      98.84     99.08    98.84    98.96    88.48    94.64
Average   94.73     93.75    94.73    94.23    88.48    94.64
Testing Phase (40%)
Crop      85.83     96.46    85.83    90.83    90.01    92.47
Weed      99.63     98.34    99.63    98.98    90.01    92.47
Average   92.73     97.40    92.73    94.91    90.01    92.47
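The per-class figures in Tables 2 and 3 follow from the confusion-matrix counts shown in Figure 3 via the standard binary-classification definitions. A minimal sketch of those definitions is given below; the counts used in the usage example are hypothetical and are not taken from the paper.

```python
import math

def binary_metrics(tp, fp, fn, tn):
    """Precision, recall, F-score, MCC, and G-mean for one positive
    class, from the four confusion-matrix counts (all assumed > 0)."""
    prec = tp / (tp + fp)
    reca = tp / (tp + fn)                     # sensitivity
    fscore = 2 * prec * reca / (prec + reca)  # harmonic mean of prec/reca
    mcc = (tp * tn - fp * fn) / math.sqrt(
        (tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    spec = tn / (tn + fp)                     # specificity
    gmean = math.sqrt(reca * spec)            # geometric mean of sens./spec.
    return prec, reca, fscore, mcc, gmean

# Hypothetical counts, for illustration only:
prec, reca, fscore, mcc, gmean = binary_metrics(tp=9, fp=1, fn=1, tn=9)
```

Note that MCC and G-mean are symmetric in the two classes, which is why Tables 2 and 3 report identical MCC and Gmean values for the Crop and Weed rows of each phase.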
Table 3. Weed detection outcome of MBMODL-WD system under 70:30 of TR/TS databases.
Class     Accubal   Precn    Recal    Fscore   MCC      Gmean
Training Phase (70%)
Crop      97.01     96.06    97.01    96.53    96.17    98.29
Weed      99.58     99.68    99.58    99.63    96.17    98.29
Average   98.30     97.87    98.30    98.08    96.17    98.29
Testing Phase (30%)
Crop      98.84     92.39    98.84    95.51    95.08    98.99
Weed      99.14     99.88    99.14    99.51    95.08    98.99
Average   98.99     96.13    98.99    97.51    95.08    98.99
Table 4. Comparative analysis of MBMODL-WD system with other algorithms.
Methods             Accubal   Precn    Recal    MCC      Gmean
MBMODL-WD           98.99     96.13    98.99    95.08    98.99
RF Model            95.53     95.24    93.93    88.63    89.87
KNN Model           62.81     62.43    61.85    35.83    93.29
SVM Model           94.39     91.32    91.24    84.00    90.34
ResNet-101 Model    93.52     93.15    94.47    92.90    92.80
VGG-16 Model        93.29     93.96    92.76    92.93    89.07
SVM-Pixel-based     85.84     85.41    85.79    86.81    93.62
HLBODL-WDSA         98.96     95.84    95.18    93.34    95.16
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.