Article

Ameliorated Snake Optimizer-Based Approximate Merging of Disk Wang–Ball Curves

1 College of Mathematics and Computer Application, Shangluo University, Shangluo 726000, China
2 Department of Applied Mathematics, Xi’an University of Technology, Xi’an 710054, China
3 Department of Computer and Information Science, Linköping University, 58183 Linköping, Sweden
4 Faculty of Science, Fayoum University, Faiyum 63514, Egypt
* Author to whom correspondence should be addressed.
Biomimetics 2024, 9(3), 134; https://doi.org/10.3390/biomimetics9030134
Submission received: 10 October 2023 / Revised: 17 February 2024 / Accepted: 20 February 2024 / Published: 22 February 2024
(This article belongs to the Special Issue Nature-Inspired Computer Algorithms: 2nd Edition)

Abstract

A method for the approximate merging of disk Wang–Ball (DWB) curves based on the multi-strategy enhanced snake optimizer (BEESO) is proposed in this paper to address the difficulty of merging DWB curves. By extending the approximate merging problem from traditional curves to disk curves and viewing it as an optimization problem, an approximate merging model that minimizes the merging error is established through an error formulation. Considering the complexity of the resulting model, BEESO, which offers better convergence accuracy and convergence speed, is introduced; it combines the snake optimizer (SO) with three strategies: bi-directional search, evolutionary population dynamics, and elite opposition-based learning. The merging results and merging errors of numerical examples demonstrate that BEESO is effective in solving the approximate merging models, and the method provides a new way to compress and transfer product shape data in Computer-Aided Geometric Design.

1. Introduction

Computer-Aided Geometric Design (CAGD for short) [1] takes the representation, drawing, display, analysis, and processing of product geometric shape information as its core research content, and it occupies an important position in manufacturing, medical diagnosis, artificial intelligence, computer vision, and other fields. Product geometry design is the focus of CAGD research, and free-form curves and surfaces are an important tool for describing product geometry. From the Ferguson method [2] to Bézier [3,4], B-spline [5,6], NURBS [7,8,9], and other methods, the representation of free-form curves and surfaces has gone through different stages of development driven by industrial software technology. The Ball method is another commonly used representation. It was proposed in 1974 as the mathematical basis of the CONSURF fuselage surface modeling system of the former British Aircraft Corporation [10,11,12]. Subsequent research by several scholars has led to a variety of generalized forms such as the Said–Ball curve [13,14,15], the Wang–Ball curve [16], the generalized Ball curves of the Wang–Said type, and the generalized Ball curves of the Said–Bézier type [17]. The Wang–Ball curve not only has good properties such as stability, symmetry, endpoint interpolation, and geometric invariance but also significantly outperforms the Said–Ball and Bézier curves in terms of degree elevation, degree reduction, and recursive evaluation [15].
With the rapid development of the geometric modeling industry, the requirements on the accuracy, surface quality, and overall smoothness of free-form curves and surfaces are becoming increasingly stringent. The rounding errors and limited precision of floating-point arithmetic in the modeling of curves and surfaces are major causes of the lack of robustness in solid modeling. Interval analysis [18], a tool that enhances the stability of algorithms by accounting for such errors, has gradually been introduced into geometric modeling, computer graphics, and Computer-Aided Design/Computer-Aided Manufacturing (CAD/CAM) since the 1980s [19,20,21]. Subsequently, the concepts of interval Bézier curves, interval Ball curves, and interval B-spline curves were proposed. Unlike conventional curves, interval curves are constructed with rectangles instead of control vertices represented by real numbers. However, the interval method has the disadvantage that the error domain expands under rotational transformation. To this end, Lin and Rokne [22] put forward disk Bézier curves in combination with disk arithmetic. A disk curve uses disks to represent its control vertices. Compared with interval curves, disk curves have the following advantages: (1) the shape of a disk curve remains unchanged under affine transformation; (2) in terms of data storage, the interval method requires eight data records in contrast to only two for the disk method, which reduces the amount of data. Since then, research related to disk curves has developed rapidly. Chen and Yang [23] studied the degree reduction of disk Bézier curves using both linear programming and optimization methods. In 2018, Ao et al. [24] proposed a high-precision intersection algorithm for disk B-spline curves and made this curve flexible for stroke representation. Seah et al. [25] applied disk B-spline curves to artistic brushstrokes and 2D animation, and Hu et al. [26] constructed the disk Wang–Ball (DWB) curve and investigated its degree reduction problem.
The rapid development of the graphics industry and the manufacturing sector is accompanied by the constant updating of geometric modeling systems, which has increased the exchange, integration, and sharing of geometric descriptions between different systems. Approximate merging [27] is an approximation technique proposed by Hoschek in 1987, in which a curve consisting of multiple lower-degree segments is approximated by a single higher-degree curve. Approximate merging reduces the amount of data transferred during product design and development, thus enabling efficient data transfer and exchange between different systems. In 2001, Hu et al. [28] presented a method to merge two adjacent Bézier curves using control vertex perturbation and least squares and showed that the merging error could be reduced if the original curves were first raised to higher degrees before the approximate merging. This method is simple, intuitive, and easy to implement, so Tai et al. [29] used a similar approach for the approximate merging of B-spline curves and suggested a knot adjustment technique for tuning the end knots of the kth-order curve without changing the shape of the curve. Subsequently, Cheng and Wang [30] gave a unified matrix representation for approximate merging by minimizing the distance between curves. Zhu and Wang [31] used the L2 distance to measure the approximation error of the curves before and after merging and solved the optimal merging problem for Bézier curves with G2 continuity. Lu [32] also minimized the L2 distance and obtained the merged curve of two Bézier curves with G3 continuity in an explicit manner.
It is clear from the above research that most approximate merging work is confined to conventional curves and does not involve interval or disk curves. Therefore, how to extend approximate merging methods to interval and disk curves, how to estimate the merging error of interval and disk curves, and whether the approximate merging methods of traditional curves can be directly extended to interval curves all need to be addressed. The main objective of this paper is to investigate an approximation method for merging disk Wang–Ball curves with improved robustness and accuracy. The merging error is one of the criteria for judging the effectiveness of curve merging, so we establish an approximate merging model for disk curves with the objective of minimizing the merging error. To obtain the optimal control disks of the merged curve, we invoke meta-heuristic optimization algorithms.
Meta-heuristic algorithms have the advantages of fast convergence, strong search ability, and high solution accuracy, and they play an increasingly important role in optimization problems. Particle swarm optimization (PSO) [33] has appeared in various improved versions since its introduction in 1995, such as multi-objective particle swarm optimization [34] and adaptive particle swarm optimization [35], and it has been successfully applied in many fields such as image classification [36], path planning [37], and biomedicine [38]. The marine predators algorithm (MPA) [39] is inspired by the movements of ocean predators and prey, and its combination with problems such as shape optimization [40], image segmentation [41], wind power prediction [42], and the 0–1 knapsack problem [43] has achieved high-quality optimization results. A variety of other algorithms and enhanced variants have been developed from different inspirations, such as the chimp optimization algorithm [44], the nutcracker optimizer [45], enhanced black widow optimization (QIWBWO) [46], the snow ablation optimizer (SAO) [47], the multi-strategy enhanced chameleon swarm algorithm (MCSA) [48], and the multi-objective artificial hummingbird algorithm (MOAHA) [49].
The snake optimizer (SO) [50] is inspired by the unique lifestyle of snakes and has been successfully applied to Hammerstein adaptive filters [51] and power-aware task scheduling for wearable biomedical systems [52]. Because SO is prone to falling into local optima and its optimization capability is inadequate for some problems, enhanced versions have been proposed successively to achieve better results. A multi-strategy fused snake optimizer (MFISO) was developed by Fu et al. for deep-learning prediction models of gas outbursts in underground mines [53]. Rawa [54] investigated a hybrid of SO and the sine cosine algorithm (SCA), called SO-SCA, which runs the two algorithms in both parallel and tandem modes, and used it to solve transmission expansion planning models. Khurma et al. [55] not only generated a binary version called BSO based on an S-shaped transformation function but also integrated a new evolutionary greedy crossover operator with SO to propose the BSO-CV algorithm for medical classification problems. The multi-strategy enhanced snake optimizer (BEESO) [56] is an improved algorithm proposed by Hu et al., which introduces bi-directional search, modified evolutionary population dynamics, and an elite opposition-based learning strategy into SO. The experimental results in the literature demonstrate that it possesses highly competitive search capability and convergence speed compared with a variety of sophisticated algorithms. Therefore, BEESO is used here to solve the approximate merging problem for DWB curves, and the main contributions are summarized as follows:
(1)
We discuss the approximate merging problem of DWB curves and establish an approximate merging optimization model with the merging error as the objective;
(2)
We propose an approximate merging method of DWB curves based on BEESO and demonstrate the optimization capability of BEESO with numerical examples.
The remainder of this paper is structured as follows: Section 2 presents the definition of DWB curves, discusses the approximate merging of adjacent DWB curves, and establishes the specific optimization models; Section 3 introduces BEESO, proposes an approximate merging method of DWB curves based on it, and gives three numerical examples; Section 4 concludes the paper.

2. Approximate Merging of DWB Curves

2.1. Definition of the DWB Curves

Definition 1.
In $\mathbb{R}^2$, given $n+1$ control disks, the DWB curve of degree $n$ is defined as
$(W)(t)=\sum_{i=0}^{n}W_{i,n}(t)(P_i)=\sum_{i=0}^{n}W_{i,n}(t)(p_i,r_i),\quad 0\le t\le 1,$  (1)
where $p_i=(x_i,y_i)$ and $r_i$ denote the center coordinates and radius of the $i$th control disk, and $\{W_{i,n}(t)\}_{i=0}^{n}$ are the Wang–Ball basis functions, given by
$W_{i,n}(t)=\begin{cases}(2t)^{i}(1-t)^{i+2}, & 0\le i\le\lfloor n/2\rfloor-1,\\ (2t)^{\lfloor n/2\rfloor}(1-t)^{\lceil n/2\rceil}, & i=\lfloor n/2\rfloor,\\ \bigl(2(1-t)\bigr)^{\lfloor n/2\rfloor}\,t^{\lceil n/2\rceil}, & i=\lceil n/2\rceil,\\ W_{n-i,n}(1-t), & \lceil n/2\rceil+1\le i\le n.\end{cases}$  (2)
According to Equation (1), the DWB curve can be written in the following form:
$(W)(t)=\bigl(C(t),R(t)\bigr)=\Bigl(\sum_{i=0}^{n}W_{i,n}(t)\,p_i,\ \sum_{i=0}^{n}W_{i,n}(t)\,r_i\Bigr),\quad 0\le t\le 1,$  (3)
where C(t) and R(t) are the center curve and radius function, respectively.
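To make the definition concrete, the following minimal Python sketch (ours, not part of the paper; the helper names wang_ball_basis and dwb_point, and the floor/ceiling reading of the degree-dependent cases in Equation (2), are our assumptions) evaluates the center point and radius of a DWB curve from its control disks:

```python
import numpy as np

def wang_ball_basis(i, n, t):
    """Wang-Ball basis function W_{i,n}(t) of degree n, following Equation (2)."""
    lo, hi = n // 2, (n + 1) // 2              # floor(n/2) and ceil(n/2)
    if i <= lo - 1:
        return (2 * t) ** i * (1 - t) ** (i + 2)
    if i == lo:
        return (2 * t) ** lo * (1 - t) ** hi
    if i == hi:
        return (2 * (1 - t)) ** lo * t ** hi
    return wang_ball_basis(n - i, n, 1 - t)    # symmetry case for ceil(n/2)+1 <= i <= n

def dwb_point(centers, radii, t):
    """Evaluate the center point C(t) and radius R(t) of a DWB curve (Equation (3))."""
    n = len(centers) - 1
    w = np.array([wang_ball_basis(k, n, t) for k in range(n + 1)])
    return w @ np.asarray(centers, dtype=float), float(w @ np.asarray(radii, dtype=float))

# Example: the degree-3 curve (W)_1(t) from Table 1, evaluated at t = 0.5.
centers = [(0.3, 0.3), (0.8, 1.6), (1.6, 1.8), (2.5, 1.7)]
radii = [0.13, 0.15, 0.19, 0.17]
print(dwb_point(centers, radii, 0.5))
```

The same helper is reused in the sketches that follow; it is only meant to show how the basis and the disk-valued control points combine, not to reproduce the authors' implementation.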

2.2. Approximate Merging of Adjacent DWB Curves

2.2.1. Problem Description

It is given that $(W)_1(t)$ and $(W)_2(t)$ are two adjacent DWB curves whose control disks are $(P_{1,i})\ (i=0,1,\dots,n_1)$ and $(P_{2,j})\ (j=0,1,\dots,n_2)$, respectively. Approximately merging these two neighboring DWB curves means seeking an $n$th-degree DWB curve $(D)(t)$ such that the metric distance between $(D)(t)$ and $(\bar D)(t)$ is minimized on the interval [0, 1]. Here,
$(\bar D)(t)=\begin{cases}\displaystyle\sum_{i=0}^{n_1}W_{i,n_1}\!\left(\frac{t}{\lambda}\right)(P_{1,i}), & 0\le t\le\lambda,\\[2mm]\displaystyle\sum_{j=0}^{n_2}W_{j,n_2}\!\left(\frac{t-\lambda}{1-\lambda}\right)(P_{2,j}), & \lambda\le t\le 1,\end{cases}$  (4)
where $\lambda$ is the subdivision parameter, and $W_{i,n_1}(t/\lambda)$ and $W_{j,n_2}\bigl((t-\lambda)/(1-\lambda)\bigr)$ are the Wang–Ball basis functions of degree $n_1$ and $n_2$ defined by Equation (2). The distance measure $d\bigl((D)(t),(\bar D)(t)\bigr)$ can be chosen as appropriate for the application.

2.2.2. Construction of the Approximate Merger Model

1. Approximate merging without endpoint preservation
According to Equation (3), the two adjacent DWB curves $(W)_1(t)$ and $(W)_2(t)$ are denoted as $\bigl(C_1(t),R_1(t)\bigr)$ and $\bigl(C_2(t),R_2(t)\bigr)$, and the DWB curve $(\bar D)(t)$ in Equation (4) is expressed as $\bigl(\bar C(t),\bar R(t)\bigr)$. The $n$th-degree DWB curve $(D)(t)$ to be found is referred to as the third curve, that is,
$(D)(t)=\bigl(C(t),R(t)\bigr)=\sum_{k=0}^{n}W_{k,n}(t)(Q_k),\qquad n\ge\max(n_1,n_2).$  (5)
Take the subdivision parameter $\lambda$ to be any constant in the open interval (0, 1). Then, divide $(D)(t)$ into two DWB curves of degree $n_1$ on the left and $n_2$ on the right, recorded as $(D)_{\mathrm{left}}(t)=\bigl(C_{\mathrm{left}}(t),R_{\mathrm{left}}(t)\bigr)$ and $(D)_{\mathrm{right}}(t)=\bigl(C_{\mathrm{right}}(t),R_{\mathrm{right}}(t)\bigr)$:
$(D)_{\mathrm{left}}(t)=(D)(\lambda t),\qquad (D)_{\mathrm{right}}(t)=(D)\bigl(\lambda+(1-\lambda)t\bigr),\qquad t\in[0,1].$  (6)
For the three curves above, we measure their metric distances in terms of two components: the center curve and the radius function. Let the subdivision parameter be $\lambda=\dfrac{\int_0^1|C_1(t)|\,dt}{\int_0^1|C_1(t)|\,dt+\int_0^1|C_2(t)|\,dt}$. The distance between the center curves is defined as
$d_C\bigl(C(t),\bar C(t)\bigr)=\int_0^1\bigl|C_1(t)-C_{\mathrm{left}}(t)\bigr|^2dt+\int_0^1\bigl|C_2(t)-C_{\mathrm{right}}(t)\bigr|^2dt.$  (7)
Based on the above description of curve merging, there should be
$d_C\bigl(C(t),\bar C(t)\bigr)\to\min.$  (8)
Similarly, for the radius functions $R(t)$ and $\bar R(t)$, we require
$\min d_R\bigl(R(t),\bar R(t)\bigr)=\int_0^1\bigl|R_1(t)-R_{\mathrm{left}}(t)\bigr|^2dt+\int_0^1\bigl|R_2(t)-R_{\mathrm{right}}(t)\bigr|^2dt.$  (9)
The merging error is a common criterion for judging the quality of curve merging. In summary, this paper takes the merging error between the merged curve $(D)(t)$ and the curves to be merged, $(W)_1(t)$ and $(W)_2(t)$, as the objective function of the optimization model, defined as
$\min\varepsilon=\int_0^1\Bigl(\bigl|C_1(t)-C_{\mathrm{left}}(t)\bigr|^2+\bigl|C_2(t)-C_{\mathrm{right}}(t)\bigr|^2\Bigr)dt+\int_0^1\Bigl(\bigl|R_1(t)-R_{\mathrm{left}}(t)\bigr|^2+\bigl|R_2(t)-R_{\mathrm{right}}(t)\bigr|^2\Bigr)dt.$  (10)
2. Approximate merging with endpoint preservation
The merged curve obtained above does not guarantee that $(D)(t)$ interpolates the left endpoint of the curve $(W)_1(t)$ and the right endpoint of the curve $(W)_2(t)$, so this part builds an approximate merging optimization model with endpoint-preserving interpolation between the merged curve and the curves to be merged. Endpoint-preserving interpolation means that the third curve $(D)(t)$ must not only approximately merge the two given DWB curves but also interpolate the left endpoint of $(W)_1(t)$ and the right endpoint of $(W)_2(t)$.
In accordance with the above definition of endpoint-preserving interpolation approximation merging and the endpoint properties of DWB curves, the endpoint-preserving interpolation approximation merging optimization model is expressed as
$\min\varepsilon=\int_0^1\Bigl(\bigl|C_1(t)-C_{\mathrm{left}}(t)\bigr|^2+\bigl|C_2(t)-C_{\mathrm{right}}(t)\bigr|^2\Bigr)dt+\int_0^1\Bigl(\bigl|R_1(t)-R_{\mathrm{left}}(t)\bigr|^2+\bigl|R_2(t)-R_{\mathrm{right}}(t)\bigr|^2\Bigr)dt.$  (11)
The constraints are as follows:
$(Q_0)=(q_0,r_0)=(p_{1,0},r_{1,0})=(P_{1,0}),\qquad (Q_n)=(q_n,r_n)=(p_{2,n_2},r_{2,n_2})=(P_{2,n_2}).$  (12)
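As a rough illustration of how these two models can be evaluated numerically, the sketch below rests entirely on our own assumptions: the curves are passed as callables t -> (center, radius), for instance built from the dwb_point helper sketched in Section 2.1; the integrals are approximated by uniform sampling rather than computed in closed form; and the endpoint constraints are enforced simply by overwriting the first and last control disks before evaluation.

```python
import numpy as np

def merging_error(curve1, curve2, candidate, lam, samples=200):
    """Sampled approximation of the merging error: squared center and radius
    deviations of the left/right parts of the candidate curve from the two originals."""
    ts = np.linspace(0.0, 1.0, samples)
    total = 0.0
    for t in ts:
        c1, r1 = curve1(t)
        cl, rl = candidate(lam * t)                    # left part  (D)(lambda * t)
        c2, r2 = curve2(t)
        cr, rr = candidate(lam + (1.0 - lam) * t)      # right part (D)(lambda + (1 - lambda) t)
        total += (np.sum((np.asarray(c1) - np.asarray(cl)) ** 2) + (r1 - rl) ** 2
                  + np.sum((np.asarray(c2) - np.asarray(cr)) ** 2) + (r2 - rr) ** 2)
    return total / samples                             # crude rectangle-rule estimate of the integrals

def clamp_endpoints(disks, first_disk, last_disk):
    """Endpoint-preserving case: overwrite Q_0 and Q_n with the prescribed control disks,
    so the constraints are satisfied by construction before the objective is evaluated."""
    disks = np.array(disks, dtype=float)
    disks[0], disks[-1] = first_disk, last_disk
    return disks
```

Sampling is only a convenience in this sketch; the paper works with the integral form itself, whose added complexity is noted in Section 4.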

3. DWB Curves Merging Based on BEESO

3.1. BEESO

BEESO [56] is an improved algorithm proposed to address the shortcomings of SO; it integrates three strategies into SO to improve its optimization capability. The update phase of BEESO includes exploration, exploitation, and mutation operations. The initial population is generated using Equation (13), in which $y_i=[y_{i,1},y_{i,2},\dots,y_{i,D}]\ (i=1,2,\dots,N)$ represents the $i$th individual in the population:
$y_i=LB+r\times(UB-LB),$  (13)
where $r$ is a random number in [0, 1], $LB$ is the lower bound, and $UB$ is the upper bound.
The population consists of 50% females and 50% males and is randomly split into two sub-groups before the iterations start, as follows:
$N_m\approx N/2,\qquad N_f=N-N_m,$  (14)
where $N_m$ and $N_f$ are the sizes of the male and female subgroups.
The food mass P and temperature Temp that control the algorithmic process are calculated with the following formula:
$Temp=\exp\!\left(-\dfrac{k}{K}\right),$  (15)
$P=s_1\times\exp\!\left(\dfrac{k-K}{K}\right),$  (16)
where k and K, respectively, represent the current and maximum number of iterations, and s1 = 0.5.
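As a quick numerical illustration (our own sketch; the exploration threshold of 0.25 is taken from the solution steps in Section 3.2), the two control parameters can be evaluated as follows:

```python
import math

def control_parameters(k, K, s1=0.5):
    """Temperature and food quality of SO/BEESO at iteration k of K (Equations (15)-(16))."""
    temp = math.exp(-k / K)
    food = s1 * math.exp((k - K) / K)
    return temp, food

# Early iterations: food quality below 0.25 -> exploration; later it approaches s1 -> exploitation.
for k in (1, 125, 250, 500):
    print(k, control_parameters(k, 500))
```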

3.1.1. Exploration Phase

If the food quality P is below a threshold, the food is of low quality and the population needs to search for better food within the feasible range; this indicates that BEESO enters the exploration phase. The exploration phase consists of a random search and a bi-directional search; BEESO uses these two search methods to update the positions of individuals and selects the better ones for the next stage by comparing their fitness values. For the random search, the position of each individual is updated as follows:
$y_{i,m}(k+1)=y_{r,m}(k)\pm s_2\times A_M\times\bigl((UB-LB)\times r+LB\bigr),$  (17)
$y_{i,f}(k+1)=y_{r,f}(k)\pm s_2\times A_F\times\bigl((UB-LB)\times r+LB\bigr),$  (18)
where $y_{r,m}$ and $y_{r,f}$ represent randomly selected male and female individuals, respectively, $r$ is a random number, $s_2=0.5$, and $A_M$ and $A_F$ denote the snakes' ability to find food:
$A_M=\exp\!\left(-\frac{Fit_{r,m}}{Fit_{i,m}}\right),$  (19)
$A_F=\exp\!\left(-\frac{Fit_{r,f}}{Fit_{i,f}}\right),$  (20)
where $Fit$ represents the fitness value.
The bi-directional search uses the best and worst individuals to guide BEESO toward the optimum while enlarging the searched area, which mitigates the drawbacks of the random search, namely its strong randomness, high uncertainty, and narrow search range. It is formulated as
$y_{i,m}(k+1)=y_{i,m}(k)+r_1\times\bigl(y_{best,m}-y_{i,m}(k)\bigr)-r_2\times\bigl(y_{worst,m}-y_{i,m}(k)\bigr),$  (21)
$y_{i,f}(k+1)=y_{i,f}(k)+r_1\times\bigl(y_{best,f}-y_{i,f}(k)\bigr)-r_2\times\bigl(y_{worst,f}-y_{i,f}(k)\bigr),$  (22)
where $y_{best}$ and $y_{worst}$ are the best and worst individuals, respectively, and $r_1$ and $r_2$ are uniformly distributed random numbers.
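A small sketch of this step (ours; the vectorized form, the array names, and the sign convention follow the reading of Equations (21) and (22) given above, not the BEESO implementation):

```python
import numpy as np

def bidirectional_search(pop, best, worst, rng=None):
    """Move every individual toward the best and away from the worst individual."""
    rng = rng or np.random.default_rng()
    r1 = rng.random(pop.shape)
    r2 = rng.random(pop.shape)
    return pop + r1 * (best - pop) - r2 * (worst - pop)

pop = np.array([[0.2, 0.7], [0.5, 0.1], [0.8, 0.4]])
best, worst = np.array([0.9, 0.9]), np.array([0.05, 0.05])
print(bidirectional_search(pop, best, worst))
```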

3.1.2. Exploitation Phase

Higher food quality indicates that BEESO enters the exploitation phase. When the temperature $Temp>0.6$, the environment is hot and the snakes move toward the food, as described by
$y_{i,j}(k+1)=y_{food}\pm c_3\times Temp\times r\times\bigl(y_{food}-y_{i,j}(k)\bigr).$  (23)
Mating behavior occurs when the temperature is suitable, and it takes either a fighting mode or a mating mode. In fighting mode, each female competes for the best male and each male fights to win the best female, as in the following equations:
$y_{i,m}(k+1)=y_{i,m}(k)\pm s_3\times FM\times r\times\bigl(y_{best,f}-y_{i,m}(k)\bigr),$  (24)
$y_{i,f}(k+1)=y_{i,f}(k)\pm s_3\times FF\times r\times\bigl(y_{best,m}-y_{i,f}(k)\bigr),$  (25)
where FM and FF are the combat abilities and are calculated using the following formula:
$FM=\exp\!\left(-\frac{Fit_{best,f}}{Fit_i}\right),$  (26)
$FF=\exp\!\left(-\frac{Fit_{best,m}}{Fit_i}\right),$  (27)
In mating mode, mating occurs between each pair of individuals and is modeled as
$y_{i,m}(k+1)=y_{i,m}(k)\pm s_3\times MM\times r\times\bigl(P\times y_{i,f}(k)-y_{i,m}(k)\bigr),$  (28)
$y_{i,f}(k+1)=y_{i,f}(k)\pm s_3\times MF\times r\times\bigl(P\times y_{i,m}(k)-y_{i,f}(k)\bigr),$  (29)
where MM and MF stand for the mating ability:
$MM=\exp\!\left(-\frac{Fit_{i,f}}{Fit_{i,m}}\right),$  (30)
$MF=\exp\!\left(-\frac{Fit_{i,m}}{Fit_{i,f}}\right),$  (31)
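Before moving on to the mutation operators, here is a compact sketch of the fight-mode update (ours; the value of s3, the random +/- sign, and the assumption that smaller fitness is better are illustrative choices, not taken from the BEESO implementation):

```python
import numpy as np

rng = np.random.default_rng(0)

def fight_mode(males, females, fit_m, fit_f, s3=2.0):
    """Males move toward the best female and females toward the best male,
    scaled by the fighting abilities FM and FF (Equations (24)-(27))."""
    best_f = females[np.argmin(fit_f)]
    best_m = males[np.argmin(fit_m)]
    FM = np.exp(-np.min(fit_f) / fit_m)[:, None]      # fighting ability of each male, Eq. (26)
    FF = np.exp(-np.min(fit_m) / fit_f)[:, None]      # fighting ability of each female, Eq. (27)
    sign = rng.choice([-1.0, 1.0])
    new_m = males + sign * s3 * FM * rng.random(males.shape) * (best_f - males)
    new_f = females + sign * s3 * FF * rng.random(females.shape) * (best_m - females)
    return new_m, new_f

males, females = rng.random((3, 2)), rng.random((3, 2))
print(fight_mode(males, females, fit_m=np.array([3.0, 1.5, 2.0]), fit_f=np.array([2.5, 1.0, 4.0])))
```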
Eggs may be produced after mating. If the eggs hatch, modified evolutionary population dynamics (MEPD) is applied to the current population to improve its quality by eliminating the poorer individuals and mutating the better ones. New offspring are first generated for the bottom 50% of individuals using Equations (32) and (33):
$Offspring_i(k+1)=\begin{cases}y_{best}(k)+\mathrm{sign}(r-0.5)\times\bigl((UB-LB)\times r+LB\bigr), & \text{if } r<0.5,\\ y_i(k)+\mathrm{sign}(r-0.5)\times\bigl((UB-LB)\times r+LB\bigr), & \text{otherwise},\end{cases}$  (32)
$y_i(k+1)=\begin{cases}Offspring_i(k+1), & \text{if } Fit\bigl(Offspring_i(k+1)\bigr)<Fit\bigl(y_i(k+1)\bigr),\\ (UB-LB)\times r+LB, & \text{otherwise},\end{cases}$  (33)
where i = 1, 2, ..., N/2.
The mutation operation is applied to the top 50% of individuals, as follows:
$My_i(k)=y_{p_1}(k)+F\times\bigl(y_{p_2}(k)-y_{p_3}(k)\bigr),$  (34)
$y_i(k+1)=\begin{cases}My_i(k), & \text{if } Fit\bigl(My_i(k)\bigr)<Fit\bigl(y_i(k+1)\bigr),\\ y_i(k+1), & \text{otherwise},\end{cases}$  (35)
where $p_1$, $p_2$, and $p_3$ are random integers in $[1,N]$ with $p_1\ne p_2\ne p_3\ne i$, and $F$ is the scaling factor:
$F=\frac{1}{2}\left(\sin(2\pi\times freq\times k)\times\frac{k}{K}+1\right),$  (36)
where freq is the frequency of vibration of the sine function.
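The following sketch (ours; the greedy acceptance tests of Equations (33) and (35) are omitted for brevity, a fresh random vector replaces the scalar r of Equation (32), freq is an assumed value, and smaller fitness is taken to be better) puts the MEPD step together:

```python
import numpy as np

rng = np.random.default_rng(1)

def mepd(pop, fit, best, lb, ub, k, K, freq=0.25):
    """Regenerate the worse half of the population around the best solution (Eq. (32))
    and apply a DE-style mutation to the better half (Eqs. (34) and (36))."""
    N, D = pop.shape
    order = np.argsort(fit)                                    # ascending fitness: better first
    new = pop.copy()
    for i in order[N // 2:]:                                   # bottom 50 %
        r = rng.random()
        base = best if r < 0.5 else pop[i]
        new[i] = base + np.sign(r - 0.5) * ((ub - lb) * rng.random(D) + lb)
    F = 0.5 * (np.sin(2 * np.pi * freq * k) * (k / K) + 1)     # scaling factor, Eq. (36)
    for i in order[: N // 2]:                                  # top 50 %
        p1, p2, p3 = rng.choice([j for j in range(N) if j != i], size=3, replace=False)
        new[i] = pop[p1] + F * (pop[p2] - pop[p3])
    return np.clip(new, lb, ub)
```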

3.1.3. Elite Opposition-Based Learning Strategy

The tendency to fall into local optima is a common problem of optimization algorithms. The elite opposition-based learning strategy operates on the better-performing individuals in the population, so that the algorithm approaches the global optimum with a higher probability. Let $Ey_n=[ey_{n,1},ey_{n,2},\dots,ey_{n,D}]\ (n=1,2,\dots,EN)$ denote the elite individuals, i.e., those ranking in the top $EN$ of the population. The elite opposite solution of the current individual is calculated as
$\overline{ey}_{i,j}(k)=S\times\bigl(EA_j(k)+EB_j(k)\bigr)-y_{i,j}(k),$  (37)
$EA_j(k)=\min_n\bigl(ey_{n,j}(k)\bigr),\qquad EB_j(k)=\max_n\bigl(ey_{n,j}(k)\bigr),$  (38)
$\overline{ey}_{i,j}(k)=rand\times\bigl(EB_j(k)-EA_j(k)\bigr)+EA_j(k),\quad\text{if }\ \overline{ey}_{i,j}<LB_j\ \text{or}\ \overline{ey}_{i,j}>UB_j,$  (39)
where EN = 0.1 × N, and E A ( k ) and E B ( k ) are the minimum and maximum values of elite individuals.
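A minimal sketch of this strategy (ours; the text leaves the coefficient S unspecified, so it is assumed here to be a random number in (0, 1), and the opposite solutions are simply returned rather than greedily compared with the originals):

```python
import numpy as np

rng = np.random.default_rng(2)

def elite_opposition(pop, fit, lb, ub, elite_frac=0.1):
    """Elite opposition-based learning (Equations (37)-(39)), assuming minimization."""
    N, D = pop.shape
    EN = max(1, int(elite_frac * N))
    elite = pop[np.argsort(fit)[:EN]]                  # top-EN individuals
    EA, EB = elite.min(axis=0), elite.max(axis=0)      # per-dimension elite bounds, Eq. (38)
    S = rng.random()                                   # assumed random coefficient in (0, 1)
    opp = S * (EA + EB) - pop                          # elite opposite solutions, Eq. (37)
    out = (opp < lb) | (opp > ub)                      # reset out-of-range components, Eq. (39)
    opp[out] = (rng.random(pop.shape) * (EB - EA) + EA)[out]
    return opp

pop, fit = rng.random((10, 3)), rng.random(10)
print(elite_opposition(pop, fit, lb=0.0, ub=1.0))
```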

3.2. Steps for Solving the Approximate Merger Models by BEESO

In Section 2, two optimization models, endpoint-preserving merging and non-endpoint-preserving merging, were established with the merging error of the curves before and after merging as the objective function. In this section, BEESO is used to solve the approximate merging optimization models for DWB curves; the implementation steps are as follows:
Step one: Set the parameters of the algorithm;
Step two: Input the coordinates and radii of the DWB curves to be merged, and calculate the subdivision parameter;
Step three: Initialization. Calculate the initial population according to Equation (13), and divide it into female and male populations. Use the approximate merging model Equation (9) (or Equation (10)) as the objective function;
Step four: Judging food quality. If P < 0.25, execute Equations (17)–(20) and the bi-directional search Equations (21) and (22) of the exploration phase to generate new individuals. Calculate the fitness value of the individuals, and select the better individuals to proceed to the next phase; otherwise, proceed to the exploitation phase;
Step five: The exploitation phase starts by calculating the temperature Temp. If Temp > 0.6, the individual moves closer to the food as in Equation (23). Otherwise, it enters the fight mode or the mating mode. When r > 0.6, the individual is in fight mode and performs Equations (24)–(27); otherwise, it operates in mating mode;
Step six: The mating pattern performs Equations (28)–(31) on individuals, followed by consideration of whether the eggs hatch after mating. If the eggs hatch, MEPD (Equations (32)–(36)) is used to produce individuals that move on to the next stage;
Step seven: Find the elite individuals, and update the population again by Equations (37)–(39);
Step eight: Calculate the fitness value, and determine the optimal individual;
Step nine: Determine the termination condition of the algorithm. Let k = k + 1. If k < K , return to step four; otherwise, output the minimum merging error and the coordinates of the control disks of the merging curve.
A detailed flowchart of BEESO solving the approximate merging model is shown in Figure 1.
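To show how the pieces fit together, the schematic driver below is our own sketch: it reuses the dwb_point and merging_error helpers from the earlier sketches, the value of lam and the variable bounds are arbitrary choices, and SciPy's differential evolution is used purely as a readily available stand-in for BEESO. It minimizes the sampled merging error for the curves of Example 1 without endpoint preservation.

```python
import numpy as np
from scipy.optimize import differential_evolution

# Control disks of the two curves to be merged (Table 1): degree 3 and degree 4.
C1 = np.array([[0.3, 0.3], [0.8, 1.6], [1.6, 1.8], [2.5, 1.7]]); R1 = [0.13, 0.15, 0.19, 0.17]
C2 = np.array([[2.5, 1.7], [3.4, 1.8], [3.8, 1.3], [3.4, 0.7], [2.2, 0.3]]); R2 = [0.17, 0.12, 0.13, 0.10, 0.16]

curve1 = lambda t: dwb_point(C1, R1, t)    # dwb_point: helper from the Section 2.1 sketch
curve2 = lambda t: dwb_point(C2, R2, t)

degree, lam = 6, 0.45                      # lam is a rough guess; the paper derives it from the arc-length ratio

def objective(x):
    disks = x.reshape(degree + 1, 3)       # one (x, y, r) triple per control disk of the merged curve
    cand = lambda t: dwb_point(disks[:, :2], disks[:, 2], t)
    return merging_error(curve1, curve2, cand, lam, samples=50)   # helper from the Section 2.2.2 sketch

bounds = [(-5.0, 10.0), (-5.0, 10.0), (0.0, 1.0)] * (degree + 1)  # arbitrary box for centers and radii
result = differential_evolution(objective, bounds, maxiter=50, seed=1)
print(result.fun)                          # sampled merging error of the best merged curve found
```

With BEESO in place of the stand-in optimizer and the endpoint clamping of Section 2.2.2 added to the objective, the same driver would cover the endpoint-preserving case.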

3.3. Optimization Examples

To verify the feasibility of the algorithm, three numerical examples are given in this section. Each example is optimized both with and without endpoint-preserving interpolation. The population size and the maximum number of iterations in all experiments are set to 50 and 500, respectively. In addition to BEESO, several advanced algorithms are selected for comparison, such as CSA [48], SCA [57], the grey wolf optimizer (GWO) [58], the white shark optimizer (WSO) [59], the Coot optimization algorithm (COOT) [60], the sooty tern optimization algorithm (STOA) [61], and Harris hawks optimization (HHO) [62].
Example 1.
Two adjacent DWB curves are given; their control vertex coordinates and control radii are shown in Table 1, and the resulting DWB curves are shown in Figure 2. In this example, these two curves of degree 3 and degree 4 are merged into a single 6-degree DWB curve according to the model built in Section 2.2.2.
Table 2 and Table 3 show the optimum results obtained by the eight intelligent algorithms including BEESO and SCA for the two cases of no endpoint preservation and endpoint preservation, respectively, which include the control disk coordinates of the merged curves and the merging error. The comparative results of the before and after merging curves are shown in Figure 3 and Figure 4, where the merged DWB curve is shown in green. The change in the shape of the curves before and after the merger can be observed in the figures. For example, the shape of the left segment of the curve obtained by CSA shows a large error. In addition, the convergence in the optimization process is shown in Figure 3i and Figure 4i.
Overall, the optimized curves fit the original "C"-shaped curve well. For the non-endpoint-preserving case, SCA and STOA obtain curves whose widths differ significantly from the original curve, and their errors are accordingly among the largest of all the algorithms. In the endpoint-preserving case, STOA also shows a larger deviation and a slower convergence rate during optimization. Although the remaining algorithms give similar results, BEESO performs better in terms of the fit and width of the two curves before and after merging. BEESO achieves the best results among the eight algorithms, both in terms of the error and the convergence speed.
Example 2.
The purpose of this example is to approximate the merging of two adjacent 4-degree DWB curves into one 5-degree curve. The control disks of the two curves before merging are shown in Table 4, and the shape of the curve obtained by visualizing it is shown in Figure 5.
Table 5 and Table 6, respectively, give the optimization results of the eight algorithms for the two cases of no endpoint preservation and endpoint preservation, which include the optimal control vertices and the merging error. Figure 6 and Figure 7 show the optimization results for two adjacent 4-degree DWB curves merged into one 5-degree DWB curve under different constraints, respectively.
In the case where endpoints are not preserved, there are large differences among the optimization results of the eight algorithms. For example, the merged curves obtained by HHO, GWO, and CSA differ visibly from the original curves, which is also evident from the errors in Table 5. For the endpoint-preserving case, BEESO, CSA, SO, WSO, and COOT obtain similar plots, whereas the optimized curve from GWO differs considerably in width from the curves before merging. Combining the error data in the tables, BEESO obtains the minimum error under both constraints.
Example 3.
A 3-degree curve and a 4-degree curve are given in Table 7, and the two curves are shown in Figure 8. This example discusses the problem of merging the 3- and 4-degree curves based on the intelligent algorithms in both the non-endpoint-preserving and endpoint-preserving cases.
The results obtained for this example using the eight algorithms including BEESO and CSA are shown in Table 8 and Table 9. In addition, the graphs of the merging effect obtained by visualizing the control disks are shown in Figure 9 and Figure 10.
For the case of non-endpoint preservation, the approximate merging results of BEESO, COOT, and SO in Figure 9 are relatively good, with BEESO having the smallest merging error of all the algorithms. The other algorithms all leave considerable room for improvement, especially GWO, HHO, and STOA, which show large errors in the position and width of the curves before merging. Because the optimization variables have low dimensionality in the endpoint-preserving case, the results of all seven algorithms apart from STOA are similar and differ only slightly in the reported merging errors. For BEESO, SO, and WSO, there is no visible gap when the errors are rounded to three decimal places; reported to eight decimal places, the errors are 0.14669880, 0.14669887, and 0.14669885, respectively, so the BEESO error is the smallest of the three.

4. Conclusions

Based on the basic theory of the disk Wang–Ball curve, an approximate merging method based on a meta-heuristic algorithm is proposed to address the difficulty of merging such curves. This paper first discusses the approximate merging of DWB curves and establishes two optimization models, from the perspectives of non-endpoint-preserving merging and endpoint-preserving merging, with the merging error as the objective function. In addition, BEESO is introduced to solve the constructed optimization models, and the steps of the algorithm for solving the approximate merging problem of DWB curves are specified. The method directly yields the control disks of the merged curve while calculating the merging error, which makes the computation simple and practical. Finally, several advanced algorithms are selected for comparison in Section 3.3; BEESO achieves DWB curves with good merging results in all three numerical examples, which also demonstrates the effectiveness of the method in curve design.
The approximate merging model developed in this paper achieves a good merging effect and a small error, but the integral form makes the model more complex, and the algorithm therefore runs slowly during the solution process. How to simplify the model, or construct it in a simpler form so that it can be applied to the approximate merging of disk curves, remains a problem for future work. In addition, BEESO can be used to solve optimization problems in areas such as cryptosystem design [63], path planning, geometry optimization [40,64], engineering design [65,66], and feature selection [67].

Author Contributions

Conceptualization, G.H.; methodology, G.H., R.Y., J.L. and A.G.H.; software, R.Y.; validation, R.Y.; formal analysis, G.H.; investigation, G.H., R.Y., J.L. and A.G.H.; resources, G.H.; data curation, G.H. and R.Y.; writing—original draft preparation, G.H., R.Y., J.L. and A.G.H.; writing—review and editing, G.H., R.Y., J.L. and A.G.H.; visualization, G.H.; supervision, G.H.; project administration, G.H.; funding acquisition, G.H. All authors have read and agreed to the published version of the manuscript.

Funding

This work is supported by the National Natural Science Foundation of China (grant Nos. 52375264 and 62376212).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

All data generated or analyzed during the study are included in this published article.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Barnhill, R.E.; Riesenfeld, R.F. Computer-Aided Geometric Design; Academic Press: New York, NY, USA, 1974. [Google Scholar]
  2. Ferguson, J. Multivariable curve interpolation. J. ACM 1964, 11, 221–228. [Google Scholar] [CrossRef]
  3. Mamar, E. Shape preserving alternatives to the rational Bézier model. Comput. Aided Geom. Des. 2001, 18, 37–60. [Google Scholar] [CrossRef]
  4. Forrest, A.R. Interactive interpolation and approximation by Bézier curve. Comput. J. 1972, 15, 71–79. [Google Scholar] [CrossRef]
  5. De Boor, C. On Calculating with B-splines. J. Approx. Theory 1972, 6, 50–62. [Google Scholar] [CrossRef]
  6. Gordon, W.J.; Riesenfeld, R.F. Bernstein-Bézier methods for the computer-aided design of free-form curves and surfaces. J. ACM 1974, 21, 293–310. [Google Scholar] [CrossRef]
  7. Tiller, W. Rational B-splines for curve and surface representation. IEEE Comput. Graph. Appl. 1983, 3, 61–69. [Google Scholar] [CrossRef]
  8. Farin, G. Curvature continuity and offsets for piecewise conics. ACM Trans. Graph. 1989, 8, 89–99. [Google Scholar] [CrossRef]
  9. Piegl, L. Modifying the shape of rational B-splines, part 2: Surfaces. Comput.-Aided Des. 1989, 21, 538–546. [Google Scholar] [CrossRef]
  10. Ball, A.A. CONSURF. Part one: Introduction of the conic lofting title. Comput.-Aided. Des. 1974, 6, 243–249. [Google Scholar] [CrossRef]
  11. Ball, A.A. CONSURF. Part two: Description of the algorithms. Comput.-Aided. Des. 1975, 7, 237–242. [Google Scholar] [CrossRef]
  12. Ball, A.A. CONSURF. Part three: How the program is used. Comput.-Aided. Des. 1977, 9, 9–12. [Google Scholar] [CrossRef]
  13. Said, H.B. A generalized Ball curve and its recursive algorithm. ACM Trans. Graph. 1989, 8, 360–371. [Google Scholar] [CrossRef]
  14. Goodman, T.N.T.; Said, H.B. Properties of generalized Ball curves and surfaces. Comput.-Aided. Des. 1991, 23, 554–560. [Google Scholar] [CrossRef]
  15. Hu, S.-M.; Wang, G.-Z.; Jin, T.-G. Properties of two types of generalized Ball curves. Comput.-Aided. Des. 1996, 28, 125–133. [Google Scholar] [CrossRef]
  16. Wang, G.J. Ball curve of high degree and its geometric properties. Appl. Math. 1987, 2, 126–140. [Google Scholar]
  17. Wu, H.Y. Two new types of generalised Ball curves. J. Appl. Math. 2000, 23, 196–205. [Google Scholar]
  18. Moore, R.E. Interval Analysis; Prentice-Hall: Englewood Cliffs, NJ, USA, 1966; pp. 8–13. [Google Scholar]
  19. Mudur, S.P.; Koparkar, P.A. Interval methods for processing geometric objects. IEEE Comput. Graph. Appl. 1984, 4, 7–17. [Google Scholar] [CrossRef]
  20. Hu, C.-Y.; Patrikalakis, N.M.; Ye, X. Robust interval solid modelling part I: Representations. Comput.-Aided. Des. 1996, 28, 807–817. [Google Scholar] [CrossRef]
  21. Hu, C.-Y.; Patrikalakis, N.M.; Ye, X. Robust interval solid modelling part II: Boundary evaluation. Comput.-Aided. Des. 1996, 28, 819–830. [Google Scholar] [CrossRef]
  22. Lin, Q.; Rokne, J.G. Disk Bézier curves. Comput. Aided Geom. Des. 1998, 15, 721–737. [Google Scholar] [CrossRef]
  23. Chen, F.L.; Yang, W. Degree reduction of disk Bézier curves. Comput. Aided Geom. Des. 2004, 21, 263–280. [Google Scholar] [CrossRef]
  24. Ao, X.; Fu, Q.; Wu, Z.; Wang, X.; Zhou, M.; Chen, Q.; Seah, H.S. An intersection algorithm for disk B-spline curves. Comput. Graph. 2018, 70, 99–107. [Google Scholar] [CrossRef]
  25. Seah, H.S.; Wu, Z.; Tian, F.; Xiao, X.; Xie, B. Artistic brushstroke representation and animation with disk b-spline curve. In Proceedings of the 2005 ACM SIGCHI International Conference on Advances in Computer Entertainment Technology, Valencia, Spain, 15–17 June 2005. [Google Scholar]
  26. Hu, G.; Yang, R.; Wei, G. Hybrid chameleon swarm algorithm with multi-strategy: A case study of degree reduction for disk Wang–Ball curves. Math. Comput. Simul. 2023, 206, 709–769. [Google Scholar] [CrossRef]
  27. Hoschek, J. Approximate conversion of spline curves. Comput. Aided Geom. Des. 1987, 4, 59–66. [Google Scholar] [CrossRef]
  28. Hu, S.-M.; Tong, R.-F.; Ju, T.; Sun, J.-G. Approximate merging of a pair of Bézier curves. Comput.-Aided. Des. 2001, 33, 125–136. [Google Scholar] [CrossRef]
  29. Tai, C.L.; Hu, S.M.; Huang, Q.X. Approximate merging of B-spline curves via knot adjustment and constrained optimization. Comput.-Aided. Des. 2003, 35, 893–899. [Google Scholar] [CrossRef]
  30. Cheng, M.; Wang, G. Approximate merging of multiple Bézier segments. Prog. Nat. Sci. 2008, 18, 757–762. [Google Scholar] [CrossRef]
  31. Zhu, P.; Wang, G.Z. Optimal approximate merging of a pair of Bézier curves with G2-continuity. J. Zhejiang Univ. Sci. A 2009, 10, 554–561. [Google Scholar] [CrossRef]
  32. Lu, L. An explicit method for G3 merging of two Bézier curves. J. Comput. Appl. Math. 2014, 260, 421–433. [Google Scholar] [CrossRef]
  33. Kennedy, J.; Eberhart, R. Particle swarm optimization. In Proceedings of the ICNN’95-International Conference on Neural Networks, Perth, WA, Australia, 27 November–1 December 1995. [Google Scholar]
  34. Li, A.; Xue, B.; Zhang, M.J. Multi-objective particle swarm optimization for key quality feature selection in complex manufacturing processes. Inf. Sci. 2023, 641, 119062. [Google Scholar] [CrossRef]
  35. Okulewicz, M.; Zaborski, M.; Mańdziuk, J. Self-Adapting Particle Swarm Optimization for continuous black box optimization. Appl. Soft Comput. 2022, 131, 109722. [Google Scholar] [CrossRef]
  36. Elhani, D.; Megherbi, A.C.; Zitouni, A.; Dornaika, F.; Sbaa, S.; Taleb-Ahmed, A. Optimizing convolutional neural networks architecture using a modified particle swarm optimization for image classification. Expert Syst. Appl. 2023, 229, 120411. [Google Scholar] [CrossRef]
  37. Li, X.H.; Yu, S.H. Three-dimensional path planning for AUVs in ocean currents environment based on an improved compression factor particle swarm optimization algorithm. Ocean Eng. 2023, 280, 114610. [Google Scholar] [CrossRef]
  38. Sundaramurthy, S.; Jayavel, P. A hybrid Grey Wolf Optimization and Particle Swarm Optimization with C4.5 approach for prediction of Rheumatoid Arthritis. Appl. Soft Comput. 2020, 94, 106500. [Google Scholar] [CrossRef]
  39. Faramarzi, A.; Heidarinejad, M.; Mirjalili, S.; Gandomi, A.H. Marine Predators Algorithm: A nature-inspired metaheuristic. Expert Syst. Appl. 2020, 152, 113377. [Google Scholar] [CrossRef]
  40. Hu, G.; Zhu, X.; Wei, G.; Chang, C.-T. An improved marine predators algorithm for shape optimization of developable Ball surfaces. Eng. Appl. Artif. Intell. 2021, 105, 104417. [Google Scholar] [CrossRef]
  41. Abd Elaziz, M.; Mohammadi, D.; Oliva, D.; Salimifard, K. Quantum marine predators algorithm for addressing multilevel image segmentation. Appl. Soft Comput. 2021, 110, 107598. [Google Scholar] [CrossRef]
  42. Al-qaness, M.A.A.; Ewees, A.A.; Fan, H.; Abualigah, L.; Abd Elaziz, M. Boosted ANFIS model using augmented marine predator algorithm with mutation operators for wind power forecasting. Appl. Energ. 2022, 314, 118851. [Google Scholar] [CrossRef]
  43. Abdel-Basset, M.; Mohamed, R.; Chakrabortty, R.K.; Ryan, M.; Mirjalili, S. New binary marine predators optimization algorithms for 0–1 knapsack problems. Comput. Ind. Eng. 2021, 151, 106949. [Google Scholar] [CrossRef]
  44. Wang, Y.; Liu, H.; Ding, G.; Tu, L. Adaptive chimp optimization algorithm with chaotic map for global numerical optimization problems. J. Supercomput. 2023, 79, 6507–6537. [Google Scholar] [CrossRef]
  45. Abdel-Basset, M.; Mohamed, R.; Jameel, M.; Abouhawwash, M. Nutcracker optimizer: A novel nature-inspired metaheuristic algorithm for global optimization and engineering design problems. Knowl.-Based Syst. 2023, 262, 110248. [Google Scholar] [CrossRef]
  46. Hu, G.; Du, B.; Li, H.; Wang, X. Quadratic interpolation boosted black widow spider-inspired optimization algorithm with wavelet mutation. Math. Comput. Simul. 2022, 200, 428–467. [Google Scholar] [CrossRef]
  47. Deng, L.; Liu, S. Snow ablation optimizer: A novel metaheuristic technique for numerical optimization and engineering design. Expert Syst. Appl. 2023, 225, 120069. [Google Scholar] [CrossRef]
  48. Hu, G.; Yang, R.; Qin, X.; Wei, G. MCSA: Multi-strategy boosted chameleon-inspired optimization algorithm for engineering applications. Comput. Methods Appl. Mech. Eng. 2023, 403, 115676. [Google Scholar] [CrossRef]
  49. Zhao, W.G.; Zhang, Z.X.; Mirjalili, S.; Wang, L.Y.; Khodadadi, N.; Mirjalili, S.M. An effective multi-objective artificial hummingbird algorithm with dynamic elimination-based crowding distance for solving engineering design problems. Comput. Meth. Appl. Mech. Eng. 2022, 398, 115223. [Google Scholar] [CrossRef]
  50. Hashim, F.A.; Hussien, A.G. Snake Optimizer: A novel meta-heuristic optimization algorithm. Knowl.-Based Syst. 2022, 242, 108320. [Google Scholar] [CrossRef]
  51. Janjanam, L.; Saha, S.K.; Kar, R. Optimal design of Hammerstein cubic spline filter for non-linear system modelling based on snake optimiser. IEEE Trans. Ind. Electron. 2022, 70, 8457–8467. [Google Scholar] [CrossRef]
  52. Yousri, R.; Elbayoumi, M.; Soltan, A.; Darweesh, M.S. A power-aware task scheduler for energy harvesting-based wearable biomedical systems using snake optimizer. Analog. Integr. Circuits Signal Process. 2023, 115, 183–197. [Google Scholar] [CrossRef]
  53. Fu, H.; Shi, H.F.; Xu, Y.S.; Shao, J.Y. Research on Gas Outburst Prediction Model Based on Multiple Strategy Fusion Improved Snake Optimization Algorithm With Temporal Convolutional Network. IEEE Access 2022, 10, 117973–117984. [Google Scholar] [CrossRef]
  54. Rawa, M. Towards Avoiding cascading failures in transmission expansion planning of modern active power systems using hybrid snake-sine cosine optimization algorithm. Mathematics 2022, 10, 1323. [Google Scholar] [CrossRef]
  55. Khurma, R.A.; Albashish, D.; Braik, M.; Alzaqebah, A.; Qasem, A.; Adwan, O. An augmented snake optimizer for diseases and COVID-19 diagnosis. Biomed. Signal Process. Control. 2023, 84, 104718. [Google Scholar] [CrossRef]
  56. Hu, G.; Yang, R.; Abbas, M.; Wei, G. BEESO: Multi-strategy boosted snake-inspired optimizer for engineering applications. J. Bionic Eng. 2023, 20, 1791–1827. [Google Scholar] [CrossRef]
  57. Mirjalili, S. SCA: A Sine Cosine Algorithm for solving optimization problems. Knowl.-Based Syst. 2016, 96, 120–133. [Google Scholar] [CrossRef]
  58. Mirjalili, S.; Mirjalili, S.M.; Lewis, A. Grey Wolf Optimizer. Adv. Eng. Softw. 2014, 69, 46–61. [Google Scholar] [CrossRef]
  59. Braik, M.; Hammouri, A.; Atwan, J.; Al-Betar, M.A.; Mohammed, A.A. White shark optimizer: A novel bio-inspired meta-heuristic algorithm for global optimization problems. Knowl.-Based Syst. 2022, 243, 108457. [Google Scholar] [CrossRef]
  60. Naruei, I.; Keynia, F. A new optimization method based on COOT bird natural life model. Expert Syst. Appl. 2021, 183, 115352. [Google Scholar] [CrossRef]
  61. Dhiman, G.; Kaur, A. STOA: A bio-inspired based optimization algorithm for industrial engineering problems. Eng. Appl. Artif. Intell. 2019, 82, 148–174. [Google Scholar] [CrossRef]
  62. Heidari, A.A.; Mirjalili, S.; Faris, H.; Aljarah, I.; Mafarja, M.; Chen, H.L. Harris hawks optimization: Algorithm and applications. Futur. Gener. Comp. Syst. 2019, 97, 849–872. [Google Scholar] [CrossRef]
  63. Alzubi, O.A.; Alzubi, J.A.; Dorgham, O.; Alsayyed, M. Cryptosystem design based on Hermitian curves for IoT security. J. Supercomput. 2020, 76, 8566–8589. [Google Scholar] [CrossRef]
  64. Zheng, J.; Ji, X.; Ma, Z.; Hu, G. Construction of local-shape-controlled quartic generalized said-ball model. Mathematics 2023, 11, 2369. [Google Scholar] [CrossRef]
  65. Hu, G.; Guo, Y.; Wei, G.; Abualigah, L. Genghis Khan shark optimizer: A novel nature-inspired algorithm for engineering optimization. Adv. Eng. Inform. 2023, 58, 102210. [Google Scholar] [CrossRef]
  66. Hu, G.; Zheng, Y.X.; Abualigah, L.; Hussien, A.G. DETDO: An adaptive hybrid dandelion optimizer for engineering optimization. Adv. Eng. Inform. 2023, 57, 102004. [Google Scholar] [CrossRef]
  67. Hu, G.; Du, B.; Wang, X.; Wei, G. An enhanced black widow optimization algorithm for feature selection. Knowl.-Based Syst. 2022, 235, 107638. [Google Scholar] [CrossRef]
Figure 1. Flowchart of BEESO solving the approximate merging model.
Figure 2. Adjacent 3-degree and 4-degree DWB curves in Example 1.
Figure 3. Graphs and convergence curves for merging into 6-degree curves without endpoint preservation in Example 1.
Figure 4. Graphs and convergence curves for merging into 6-degree curves in the endpoint-preserving case.
Figure 5. Two adjacent 4-degree DWB curves in Example 2.
Figure 6. Graphs and convergence curves for merging into 5-degree curves without endpoint preservation in Example 2.
Figure 7. Graphs and convergence curves for merging into 5-degree curves in the endpoint-preserving case.
Figure 8. Adjacent 3-degree and 4-degree DWB curves in Example 3.
Figure 9. Graphs and convergence curves for merging into 4-degree curves without endpoint preservation.
Figure 10. Graphs and convergence curves for merging into 4-degree curves in the endpoint-preserving case.
Table 1. Coordinates of the two adjacent DWB curves in Example 1.

Curves | Variables | i = 0 | i = 1 | i = 2 | i = 3 | i = 4
(W)_1(t) | p1,i | (0.3, 0.3) | (0.8, 1.6) | (1.6, 1.8) | (2.5, 1.7) | /
(W)_1(t) | r1,i | 0.13 | 0.15 | 0.19 | 0.17 | /
(W)_2(t) | p2,i | (2.5, 1.7) | (3.4, 1.8) | (3.8, 1.3) | (3.4, 0.7) | (2.2, 0.3)
(W)_2(t) | r2,i | 0.17 | 0.12 | 0.13 | 0.1 | 0.16
Table 2. Optimization results of merging into 6-degree DWB curves without endpoint preservation.

Methods | Variables | j = 0 | j = 1 | j = 2 | j = 3 | j = 4 | j = 5 | j = 6 | Errors
BEESOxj0.320610.642405.770041.147085.714275.461502.161067.256 × 10−4
yj0.263963.212201.234373.874707.908340.934250.30057
rj0.122860.146620.599510.063830.194170.195420.13245
SCAxj0.100001.954874.712650.500006.165135.4586222.884 × 10−2
yj0.322892.796500.179825.161304.698271.697570.40675
rj0.050000.019200.076880.481840.001540.600000.06559
GWOxj0.248011.676572.406462.888122.410486.346492.122321.450 × 10−3
yj0.252493.385140.737133.988258.350520.679430.32403
rj0.171590.013290.012170.585500.028330.011540.17778
CSAxj0.193452.333051.023043.015951.853926.618692.096722.713 × 10−3
yj0.364942.078443.391084.048756.322161.551620.26369
rj0.075950.498820.021860.157040.037530.237450.13200
SOxj0.292011.106594.268851.847964.385025.843742.138231.009 × 10−3
yj0.290322.951081.448514.319036.392911.455850.26852
rj0.132740.086490.397280.392190.049700.090400.15708
COOTxj0.261031.514273.242202.071294.648995.610182.166349.550 × 10−4
yj0.306212.664492.579763.602197.885571.010020.29692
rj0.132400.216870.116640.273940.017460.225240.13620
WSOxj0.303020.995894.349252.040264.086245.858882.149058.701 × 10−4
yj0.339932.156574.132522.943509.022350.705680.31241
rj0.123360.149940.567900.023660.552720.029050.15067
STOAxj0.296160.970485.288890.871308.452774.235812.271275.721 × 10−3
yj0.319092.910110.609835.069045.365421.739800.25750
rj0.109370.421150.012060.007420.575210.003190.06925
Table 3. Optimization results of merging into 6-degree DWB curves in the endpoint-preserving case.

Methods | Variables | j = 1 | j = 2 | j = 3 | j = 4 | j = 5 | Errors
BEESOxj0.713646.031250.740956.905575.003015.777 × 10−4
yj2.499723.727072.649809.649660.6
rj0.193030.061000.484970.149030.01804
SCAxj1.239931.427265.463322.167875.658291.682 × 10−2
yj3.530950.365033.702418.472491.11223
rj0.362420.494460.001000.002080.00223
GWOxj1.733371.133563.806882.038846.013182.569 × 10−3
yj3.278330.181164.626126.985561.09384
rj0.348680.061890.109140.224760.06069
CSAxj1.202663.689442.189054.667295.465181.108 × 10−3
yj3.162170.665124.416747.228101.04698
rj0.183450.229540.308630.321530.00197
SOxj0.979654.859551.392695.931925.204246.454 × 10−4
yj2.447413.739462.836599.209220.70607
rj0.145200.330540.281440.349530.00432
COOTxj1.281073.446742.229994.733175.434978.813 × 10−4
yj2.727682.568113.396438.631090.78280
rj0.165090.194560.414580.004020.09005
WSOxj0.854595.268861.276136.156395.143366.644 × 10−4
yj2.811002.127233.724007.777020.99677
rj0.200480.068640.473510.001160.08057
STOAxj1.016423.682552.930572.856105.887476.704 × 10−3
yj3.315140.145794.505596.965901.15675
rj0.064860.186530.004510.011240.00424
Table 4. Coordinates of the two adjacent DWB curves in Example 2.

Curves | Variables | i = 0 | i = 1 | i = 2 | i = 3 | i = 4
(W)_1(t) | p1,i | (0.3, 0.3) | (0.8, 1.6) | (1.6, 1.8) | (2.5, 1.7) | /
(W)_1(t) | r1,i | 0.13 | 0.15 | 0.19 | 0.17 | /
(W)_2(t) | p2,i | (2.5, 1.7) | (3.4, 1.8) | (3.8, 1.3) | (3.4, 0.7) | (2.2, 0.3)
(W)_2(t) | r2,i | 0.17 | 0.12 | 0.13 | 0.1 | 0.16
Table 5. Optimization results of merging into 5-degree DWB curves without endpoint preservation.

Methods | Variables | j = 0 | j = 1 | j = 2 | j = 3 | j = 4 | j = 5 | Errors
BEESOxj4.14407−13.6356949.35647−27.2867527.2991713.756347.936 × 10−2
yj11.015042.49772−28.0968250.4125915.346646.33120
rj0.311820.254100.584690.073990.002080.38995
HHOxj1.958003.5806232.85162−33.1921534.0085213.214031.896 × 100
yj13.22078−32.0885348.85257−15.3260336.520326.02862
rj0.282220.496381.095750.091790.671920.03881
GWOxj2.96220−0.1455524.40576−8.1908820.2881114.187036.706 × 10−1
yj12.35794−16.2148316.68298−0.3802939.693584.48880
rj0.021990.178901.873520.096940.445970.01420
CSAxj2.86664−2.3550834.38615−25.5213530.7078213.217709.830× 10−1
yj13.43952−27.9569735.67889−10.0382341.665214.46278
rj0.222630.936220.148210.323360.582200.12940
SOxj3.95336−10.7358642.78311−21.3300525.5083313.726221.259 × 10−1
yj11.51607−2.07501−19.3335741.5471919.302096.09531
rj0.010030.374211.916340.001170.001310.33275
COOTxj2.780673.5802413.059926.8967812.8729214.746292.991 × 10−1
yj11.71382−4.26839−16.2806641.7982618.292456.19255
rj0.287020.342580.679240.333860.051450.35011
WSOxj4.20564−13.2444346.90530−23.0346324.7820914.002211.081 × 10−1
yj11.29270−1.16410−19.5945540.8342720.107845.89122
rj0.314420.012750.717210.083650.664880.15256
STOAxj3.83290−11.9814546.50427−23.5106424.9422113.947467.952 × 10−1
yj12.24440−15.5771918.53611−8.3922045.532164.00000
rj0.016790.897261.590330.001010.915950.42818
Table 6. Optimization results of merging into 5-degree DWB curves in the endpoint-preserving case.

Methods | Variables | j = 1 | j = 2 | j = 3 | j = 4 | Errors
BEESOxj−8.2013139.56450−19.6593124.343629.035 × 10−2
yj3.24733−29.2914549.9098516.90465
rj0.537270.001000.001000.56893
HHOxj0.8773213.142804.7547216.711685.343 × 10−1
yj−3.53821−7.5893325.5211225.73455
rj0.315370.001010.846760.31721
GWOxj0.4729416.115632.1323417.188043.160 × 10−1
yj2.97535−28.2411948.4637717.45618
rj0.026520.249200.173850.00476
CSAxj−4.6890129.32213−9.4999620.926111.310 × 10−1
yj3.30309−29.2461249.4255217.16853
rj0.394490.227910.025200.56299
SOxj−7.3937537.00421−16.7128523.271509.260 × 10−2
yj2.65403−27.5917648.1830317.49296
rj0.426630.165350.001000.53631
COOTxj−8.5555340.40914−20.0436524.390211.656 × 10−1
yj−1.89241−14.7830836.1427421.37543
rj0.010051.276170.479710.41079
WSOxj−8.3526240.14078−20.3566424.580889.188 × 10−2
yj2.59905−27.3867248.1160517.48977
rj0.132780.656010.433030.06355
STOAxj0.4056016.824360.3667917.992423.627 × 10−1
yj−1.91517−15.7474238.2449920.50645
rj0.188710.001330.045710.00142
Table 7. Coordinates of the two adjacent DWB curves in Example 3.

Curves | Variables | i = 0 | i = 1 | i = 2 | i = 3 | i = 4
(W)_1(t) | p1,i | (2, 7.2) | (−1, 1) | (2.5, 0.1) | (5, 1.5) | /
(W)_1(t) | r1,i | 0.1 | 0.3 | 0.4 | 0.2 | /
(W)_2(t) | p2,i | (5, 1.5) | (7.5, 3) | (7.3, 7.8) | (3, 7) | (2.5, 2.5)
(W)_2(t) | r2,i | 0.2 | 0.4 | 0.3 | 0.2 | 0.2
Table 8. Optimization results of merging into 4-degree DWB curves without endpoint preservation.

Methods | Variables | j = 0 | j = 1 | j = 2 | j = 3 | j = 4 | Errors
BEESOxj2.33358−7.1630611.9436811.572881.923188.957 × 10−2
yj6.876980.49849−13.0668718.743532.21263
rj0.085170.705710.123700.466690.18610
HHOxj0.410957.43913−0.7297821.808871.336471.242 × 100
yj6.81754−4.44108−2.789394.608704.11749
rj0.001000.902870.001001.214840.14700
GWOxj1.239820.876436.2385916.367871.479552.491 × 10−1
yj7.17040−1.71070−11.5385717.467362.32996
rj0.003200.207390.142150.834200.05256
CSAxj1.83074−2.980938.2776015.375621.490601.754 × 10−1
yj7.34904−3.57450−9.4082514.955962.64042
rj0.009950.855010.228830.544320.15462
SOxj2.23359−6.6828711.8196111.630431.851291.082 × 10−1
yj7.17724−1.95063−10.9838416.470342.50440
rj0.162090.154100.457960.271530.21107
COOTxj1.88280−3.703098.9299314.758411.566621.214 × 10−1
yj6.900140.24457−12.8877318.625152.23042
rj0.203320.038490.511550.211450.19090
WSOxj2.38130−6.9833211.4493912.273881.815999.278 × 10−2
yj6.804090.83965−13.1758118.787262.21304
rj0.113980.218750.547590.090840.21260
STOAxj1.59611−2.290759.3687213.106931.819321.821 × 10−1
yj6.762071.85716−14.3996620.786481.78510
rj0.001991.042600.022120.001390.12128
Table 9. Optimization results of merging into 4-degree DWB curves in the endpoint-preserving case.

Methods | Variables | j = 1 | j = 2 | j = 3 | Errors
BEESOxj−6.3914212.631849.158821.467 × 10−1
yj−1.40489−11.6834216.97125
rj0.619660.189280.38233
HHOxj−6.9698913.261278.724261.588 × 10−1
yj−1.97471−10.7330516.21491
rj0.001640.643590.35071
GWOxj−6.4125212.650439.156501.574 × 10−1
yj−1.52899−11.5579616.89344
rj0.038530.086820.67139
CSAxj−6.4128012.659059.139501.310 × 10−1
yj−1.42554−11.6571316.95251
rj0.672480.116430.43568
SOxj−6.3917912.632409.158421.467 × 10−1
yj−1.40506−11.6832716.97116
rj0.619480.189480.38217
COOTxj−6.4035012.647549.146121.468 × 10−1
yj−1.42201−11.6609916.95623
rj0.541930.284130.31917
WSOxj−6.3895512.628359.161851.467 × 10−1
yj−1.40627−11.6813116.96950
rj0.620910.188200.38252
STOAxj−6.3421612.535659.259981.618 × 10−1
yj−1.69692−11.5058416.83058
rj0.005440.001630.80658

Share and Cite

MDPI and ACS Style

Lu, J.; Yang, R.; Hu, G.; Hussien, A.G. Ameliorated Snake Optimizer-Based Approximate Merging of Disk Wang–Ball Curves. Biomimetics 2024, 9, 134. https://doi.org/10.3390/biomimetics9030134
