Article

A Hybrid Crow Search Algorithm for Solving Permutation Flow Shop Scheduling Problems

1 Department of Electrical Engineering, National Kaohsiung University of Science and Technology, Kaohsiung City 807, Taiwan
2 Computer Science Department, BINUS Graduate Program-Master of Computer Science, Bina Nusantara University, Jakarta 11480, Indonesia
3 Department of Computer Science and Information Management, Providence University, Taichung City 433, Taiwan
* Author to whom correspondence should be addressed.
Appl. Sci. 2019, 9(7), 1353; https://doi.org/10.3390/app9071353
Submission received: 7 March 2019 / Revised: 25 March 2019 / Accepted: 27 March 2019 / Published: 30 March 2019
(This article belongs to the Special Issue Actionable Pattern-Driven Analytics and Prediction)

Abstract

The permutation flow shop scheduling problem (PFSP) is a renowned problem in the scheduling research community. It is an NP-hard combinatorial optimization problem with useful real-world applications. For this problem, it is important to find an algorithm that can handle massive numbers of jobs and retrieve an actionable permutation order in a reasonable amount of time. The recently developed crow search algorithm (CSA) is a novel swarm-based metaheuristic algorithm originally proposed to solve mathematical optimization problems. In this paper, a hybrid CSA (HCSA) is proposed to minimize the makespans of PFSPs. First, to make the CSA suitable for solving the PFSP, the smallest position value rule is applied to convert continuous numbers into job sequences. Then, the HCSA uses a Nawaz–Enscore–Ham (NEH) technique to create a population with the required levels of quality and diversity. We apply a local search to enhance the quality of the solutions and avoid premature convergence; the local search is a variable neighborhood search enhanced with simulated annealing. Computational tests evaluate the algorithm on PFSP benchmarks with job sizes between 20 and 500. The tests indicate that the performance of the proposed HCSA is significantly superior to that of other algorithms.

1. Introduction

Several optimization methods have been proposed in artificial intelligence to find interesting patterns or optimization results for larger NP-hard optimization problems within a reasonable amount of time. To date, several real-world problems have been investigated to address this issue, such as data mining [1,2,3,4], DNA fragment assembly [5,6], DNA compression [7], Internet of Things [8,9], knapsack problem [10], networks of evolutionary processors [11], scheduling [12], and traveling salesman [13].
Among these, the permutation flow shop scheduling problem (PFSP) has attracted significant attention and plays an important role in scheduling research. The PFSP was first proposed by Johnson [14] in 1954 and has been studied extensively since then in theoretical, algorithmic, and computational research. The basic concept of the PFSP is that n jobs need to be processed on a sequence of m machines, such that each job is processed on all machines in the same order. Johnson addressed it as a two-machine problem; Rinnooy Kan proved that makespan minimization for the PFSP is NP-hard [15]. For large problem instances, both the traditional dynamic programming approach and heuristic approaches require excessive computational time. To date, the Nawaz–Enscore–Ham (NEH) heuristic [16] is considered one of the most effective constructive heuristics for solving the PFSP, and numerous variants of the NEH algorithm have been proposed, such as those in References [17,18,19,20,21,22]. Metaheuristic approaches are often applied to solve NP-hard combinatorial optimization problems, and various metaheuristic algorithms [23] continue to attract increasing interest in optimization research. Such algorithms are typically inspired by natural behavior; examples include the artificial bee colony (ABC) [24], ant colony optimization (ACO) [25], cuckoo search (CS) [26], differential evolution (DE) [27], the firefly algorithm (FA) [28], the gravitational search algorithm (GSA) [29], particle swarm optimization (PSO) [30], and whale optimization (WOA) [31]. Recently, the combinatorial optimization community has attempted to solve the PFSP using metaheuristic approaches, such as those in References [32,33,34,35,36,37]. One of the recently proposed swarm-based intelligence algorithms is the crow search algorithm (CSA) [38].
The CSA is based on the intelligent behavior of crows, who tend to store excess food in hiding places and retrieve it when required. Crows may try to follow other crows to steal their hidden food. The CSA simulates crow strategies with awareness probability and flight length parameters. The CSA provides an efficient search strategy for solving optimization problems. Furthermore, it is considered to be quicker than and superior to the PSO algorithm.
The performance of a simple CSA is controlled by two parameters: the flight length ($fl$) and the awareness probability ($AP$). Although the CSA outperforms PSO, it suffers from similar issues; namely, premature convergence must be avoided and population diversity must be maintained [39,40]. In addition, the original CSA is designed for continuous optimization problems, and little research has been done on transferring the approach to combinatorial optimization problems; mapping the CSA to the PFSP is therefore a challenge. Thus, in this paper, we propose a hybrid CSA (HCSA) approach to minimize the makespan of PFSP instances. First, to make the CSA suitable for solving the PFSP, the smallest position value rule is applied to convert continuous numbers into job sequences. Our HCSA then applies an NEH technique to create a population with excellent quality and diversity. Finally, to enhance the quality of the solutions and avoid premature convergence, simulated annealing (SA) [41] is combined with the variable neighborhood search (VNS) [23] to produce SA-VNS, a local search that enhances the exploitation and exploration of the HCSA. Performance evaluations show that our HCSA outperforms existing algorithms. In summary, the HCSA applies the evolutionary search of the CSA for population generation, individual updates, and competition, and effectively finds better solutions by combining a carefully initialized population with an adaptive local search that creates diversity.
The remainder of this paper is organized as follows. Background and related work are described in Section 2. The problem definition is stated in Section 3. Section 4 presents the proposed HCSA. The performance evaluation is outlined in Section 5. Finally, conclusions and suggestions for future research are provided in Section 6.

2. Background Knowledge and Related Works

This section reviews relevant background and related studies. The following topics are discussed in detail:
  • Nawaz–Enscore–Ham heuristic (NEH);
  • Crow search algorithm (CSA);
  • Simulated annealing (SA);
  • Variable neighborhood search (VNS).

2.1. NEH Heuristic

The NEH [16] algorithm is widely regarded as the best constructive heuristic for solving the PFSP, as discussed in References [17,18,19,20,21,22,42]. Two main concepts underpin the NEH algorithm: (1) the initial ordering of the jobs; and (2) the prioritization of jobs based on priority rules and tie-breaking strategies, such as those in NEHD [17], NEHKK1 [18], NEHKK2 [19], CL [20], and NEHLJP1 [22]. The procedure employed by the NEH algorithm is presented in Algorithm 1.
Algorithm 1 Standard Nawaz–Enscore–Ham (NEH) algorithm.
  Input: n jobs, s machines, and the processing time of each job on each machine
  Output: A solution with a minimized makespan
  1. Obtain a permutation π of the n jobs by sorting them in descending order of their total processing times on the machines.
  2. Choose the first two jobs from job sequence π and keep the superior of the two orders according to the makespan values.
  3. For q = 3, …, n, generate new job permutations by inserting the q-th job of π into every possible slot of the current sequence, and choose the order with the minimum makespan.
  4. Finally, output the current best solution, that is, the one with the minimum makespan value.
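The insertion procedure of Algorithm 1 can be sketched in Python. This is an illustrative implementation, not the authors' code; the `makespan` helper evaluates a permutation under the standard flow shop recursion.

```python
def makespan(perm, p):
    """Makespan of job permutation `perm`, where p[job][k] is the
    processing time of `job` on machine k."""
    m = len(p[0])
    c = [0] * m  # completion time of the current job on each machine
    for job in perm:
        c[0] += p[job][0]
        for k in range(1, m):
            c[k] = max(c[k], c[k - 1]) + p[job][k]
    return c[-1]

def neh(p):
    """NEH: sort jobs by descending total processing time (Step 1),
    then greedily insert each job into its best slot (Steps 2-3)."""
    order = sorted(range(len(p)), key=lambda j: -sum(p[j]))
    seq = [order[0]]
    for job in order[1:]:
        candidates = (seq[:i] + [job] + seq[i:] for i in range(len(seq) + 1))
        seq = min(candidates, key=lambda s: makespan(s, p))
    return seq, makespan(seq, p)
```

For a toy instance with three jobs on two machines, p = [[3, 2], [1, 4], [2, 1]], neh(p) returns a sequence with makespan 8, which matches the optimum found by exhaustive search of all six permutations.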

2.2. Crow Search Algorithm (CSA)

Crows are among the most intelligent birds, and their behavior indicates substantial cognitive ability: although less intelligent than humans, crows make tools and can recognize themselves in mirrors [43]. In a typical community of crows, each crow has its own cache of food, and each crow hides its cache from potential burglars. Askarzadeh [38] proposed the CSA, a stochastic swarm-based optimization method. The main points of the CSA are as follows:
  • Crows live in flocks;
  • Crows recall where their caches are;
  • Crows pursue each other and opportunistically burglarize food caches;
  • Crows protect their hiding spaces from attackers with a probability in the interval [0, 1].
The CSA represents crows as software entities. Each crow has a flight length ($fl$) and an awareness probability ($AP$). If the value of $fl$ is small, the CSA conducts a local search; if the value of $fl$ is large, the CSA conducts a global search. The value of $AP$ controls the balance between intensification and diversification. The CSA generates crow positions randomly. For $N$ crows in a solution space of dimension $d$, at iteration $t$, Equation (1) gives each crow's position in the solution space:
$$X_i^t = \{x_{i,1}^t, x_{i,2}^t, \ldots, x_{i,j}^t, \ldots, x_{i,d}^t\} \quad \text{for } i = 1, 2, \ldots, N, \quad (1)$$

where $x_{i,j}^t$ is the $j$-th position component of crow $i$.
Our model considers that crow $c$ may be followed by crow $i$, and crow $c$ may or may not be aware of this pursuer. Equation (2) gives the updated position of crow $i$ at iteration $t+1$:

$$x_{i,j}^{t+1} = \begin{cases} x_{i,j}^t + rand_i \times fl \times (m_{c,j}^t - x_{i,j}^t) & \text{if } rand_c \geq AP, \\ \text{a random position} & \text{otherwise,} \end{cases} \quad (2)$$

where $x_{i,j}^t$ is the location of crow $i$ in dimension $j$ at iteration $t$, $m_{c,j}^t$ is the location of the hiding place of crow $c$ at iteration $t$, $fl$ is the flight length of crow $i$, $AP$ is the awareness probability of crow $c$, and $rand_i$ and $rand_c$ are random variables in the range [0, 1].
The CSA algorithm is outlined in Algorithm 2.
Algorithm 2 Standard crow search algorithm (CSA).
  Input: Number of crows N, flight length fl, awareness probability AP
  Output: The best solution
  1. Randomly initialize the positions of all crows.
  2. Initialize the memory of all crows.
  3. Evaluate the fitness value of all crows.
  4. Update the memory of all crows.
  5. Update the positions of all crows according to Equation (2).
  6. Repeat Steps 3 to 5 until the termination criterion is reached.
  7. Output the best solution.
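Step 5 of Algorithm 2, the position update of Equation (2), can be sketched as follows. This is an illustrative implementation; the default parameter values and the clamping to the search bounds are our assumptions, not settings from the paper.

```python
import random

def csa_step(positions, memory, fl=2.0, ap=0.1, lo=-5.0, hi=5.0):
    """One CSA iteration: each crow i picks a random crow c to follow.
    If c is unaware (rand >= AP), i moves toward c's memorized cache;
    otherwise i is sent to a random position (Equation (2))."""
    n, d = len(positions), len(positions[0])
    new_pos = []
    for i in range(n):
        c = random.randrange(n)                # crow to follow
        if random.random() >= ap:              # crow c did not notice the pursuer
            row = [positions[i][j] + random.random() * fl *
                   (memory[c][j] - positions[i][j]) for j in range(d)]
            row = [min(max(x, lo), hi) for x in row]  # keep inside the bounds
        else:                                  # c noticed: fly somewhere random
            row = [random.uniform(lo, hi) for _ in range(d)]
        new_pos.append(row)
    return new_pos
```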

2.3. Simulated Annealing (SA)

The SA algorithm is a metaheuristic method proposed by Kirkpatrick et al. [41]. The concept underlying the SA algorithm is derived from the annealing of solids. SA is extensively applied to solve complex combinatorial problems, as it employs a perturbation search strategy. Some studies have applied SA to solve the PFSP [21,44,45,46,47]. The detailed procedure employed by the SA algorithm is shown in Algorithm 3.
Algorithm 3 A standard simulated annealing (SA) algorithm.
  Input: Initial temperature T, cooling rate β, and frozen temperature T_end
  Output: The best solution
  1. Initialize with a random solution.
  2. Generate a new solution using the perturbation method and compute its fitness.
  3. Calculate the energy change ΔE = f_new − f_old.
  4. If ΔE < 0, accept the new solution and adopt it as the best solution.
  5. If ΔE > 0, accept the new solution with probability e^(−ΔE/T), i.e., if r_p < e^(−ΔE/T).
  6. Adjust temperature T using T = T·β.
  7. Repeat Steps 2 to 6 until temperature T reaches the freezing point T_end.
  8. Finally, output the best solution found.
  In the above, f_new is the objective value of the new solution, f_old is the objective value of the current best solution, and r_p is a uniform random variable in the interval [0, 1].
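A compact sketch of Algorithm 3 for a generic minimization problem follows; the default temperature schedule values are placeholders, not the paper's settings.

```python
import math
import random

def simulated_annealing(init, neighbor, cost, t=1000.0, beta=0.95, t_end=1e-3):
    """Steps 1-8 of Algorithm 3: perturb, apply the Metropolis
    acceptance rule, and cool geometrically until T reaches T_end."""
    cur = best = init
    while t > t_end:
        cand = neighbor(cur)                      # Step 2: perturbation
        delta = cost(cand) - cost(cur)            # Step 3: energy change
        if delta < 0 or random.random() < math.exp(-delta / t):
            cur = cand                            # Steps 4-5: acceptance
            if cost(cur) < cost(best):
                best = cur
        t *= beta                                 # Step 6: geometric cooling
    return best
```

For example, minimizing cost(x) = (x − 3)² with neighbor(x) = x + uniform(−1, 1) from an initial point of 0 returns a solution no worse than the starting point, and typically close to 3.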

2.4. Variable Neighborhood Search

VNS [23] is a local search strategy used to improve metaheuristic methods; it helps maintain the diversity of solutions throughout the metaheuristic process. Four common neighborhood perturbation operators are adopted in VNS: pair-swap, inversion, insertion, and displacement. These are described as follows:
  • Pair-swap: Randomly select two different positions from the permutation sequence and swap these positions;
  • Inversion: Invert the subsequence between two random positions from the permutation sequence;
  • Insertion: Randomly select two different positions from the permutation sequence and insert the front position before the back position;
  • Displacement: Randomly choose a position and a subsequence; then, insert the subsequence before the chosen position.
Figure 1 shows a simple example of each of these VNS local search operations. The VNS algorithm continues to run until a stopping condition, here a maximum number of iterations, is reached.
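The four perturbation operators can be sketched as follows. This is an illustrative Python version; each operator draws its positions at random, returns a new permutation, and leaves the input untouched.

```python
import random

def pair_swap(seq):
    """Swap the elements at two random positions."""
    i, j = random.sample(range(len(seq)), 2)
    s = seq[:]
    s[i], s[j] = s[j], s[i]
    return s

def inversion(seq):
    """Reverse the subsequence between two random positions."""
    i, j = sorted(random.sample(range(len(seq)), 2))
    return seq[:i] + seq[i:j + 1][::-1] + seq[j + 1:]

def insertion(seq):
    """Move the element at the front position to just before the back one."""
    i, j = sorted(random.sample(range(len(seq)), 2))
    s = seq[:]
    s.insert(j - 1, s.pop(i))  # indices shift left after the pop
    return s

def displacement(seq):
    """Cut out a random subsequence and reinsert it at a random slot."""
    i, j = sorted(random.sample(range(len(seq)), 2))
    rest, sub = seq[:i] + seq[j + 1:], seq[i:j + 1]
    k = random.randrange(len(rest) + 1)
    return rest[:k] + sub + rest[k:]
```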

3. Problem Definition

The PFSP requires running $n$ jobs on a sequence of $s$ machines, such that every job must be processed on all machines in the same order; the goal is to minimize the makespan. The detailed description is as follows. Given $n$ jobs $\{1, 2, \ldots, n\}$, each job is processed on a series of $s$ machines $\{1, 2, \ldots, s\}$ in the order defined by the job permutation $\pi = (\pi_1, \pi_2, \ldots, \pi_n)$. The processing time of job $\pi_i$ on machine $k$ is denoted by $p_{\pi_i, k}$; each job may be processed on at most one machine at a time, and each machine can process only one job at a time. The completion time of job $\pi_i$ on machine $k$ is denoted by $C(\pi_i, k)$ and is computed by the recursion in Equation (3).
$$\begin{aligned}
C(\pi_1, 1) &= p_{\pi_1, 1}, \\
C(\pi_i, 1) &= C(\pi_{i-1}, 1) + p_{\pi_i, 1}, && i = 2, 3, \ldots, n, \\
C(\pi_1, k) &= C(\pi_1, k-1) + p_{\pi_1, k}, && k = 2, 3, \ldots, s, \\
C(\pi_i, k) &= \max\{C(\pi_i, k-1), C(\pi_{i-1}, k)\} + p_{\pi_i, k}, && i = 2, \ldots, n;\ k = 2, \ldots, s. \quad (3)
\end{aligned}$$
The optimal permutation $\pi^*$ from the set $F$ of all job permutations (i.e., the permutation with the minimum makespan) can be found by the optimization in Equation (4):

$$\pi^* = \arg\min_{\pi \in F} C_{max}(\pi), \quad \text{where } C_{max}(\pi) = C(\pi_n, s). \quad (4)$$
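The recursion of Equation (3) translates directly into code. The sketch below (an illustration, not the authors' implementation) fills the full completion-time table, so the last entry is the makespan C_max of Equation (4).

```python
def completion_times(perm, p):
    """Fill C(π_i, k) per Equation (3); C[-1][-1] is the makespan C_max.
    p[job][k] is the processing time of `job` on machine k (0-indexed)."""
    n, s = len(perm), len(p[0])
    C = [[0] * s for _ in range(n)]
    C[0][0] = p[perm[0]][0]
    for i in range(1, n):                  # first machine: jobs queue up
        C[i][0] = C[i - 1][0] + p[perm[i]][0]
    for k in range(1, s):                  # first job: machines in series
        C[0][k] = C[0][k - 1] + p[perm[0]][k]
    for i in range(1, n):
        for k in range(1, s):
            C[i][k] = max(C[i][k - 1], C[i - 1][k]) + p[perm[i]][k]
    return C
```

For instance, with p = [[3, 2], [1, 4], [2, 1]] and the permutation [1, 2, 0], the table yields a makespan of 8.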

4. Proposed Algorithm

4.1. Solution Representation

The CSA uses continuous number encoding for its swarm-based metaheuristic representation. Whereas typical metaheuristics for NP-hard scheduling problems operate on permutations, the CSA does not directly generate permutations. In this study, we use the smallest position value (SPV) rule, inspired by the random keys approach [48], to convert continuous numbers into a job sequence [49]. With the SPV rule, the position values of a crow are first sorted in ascending order, and the job permutation is then determined from the resulting ranks. Table 1 presents an example of the SPV transformation of continuous values into a job permutation. Assuming six jobs and six dimensions, the position values of the crow are (1.35, −2.46, −1.52, 2.31, 0.52, −1.68), as shown in the second column. The smallest position value is −2.46, so the first job in the order is job 2. The second smallest position value is −1.68, so the second job is job 6. Continuing in this way, the ranking of the position values yields the output permutation [2, 6, 3, 5, 1, 4].
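The SPV rule amounts to an argsort over the position values; a minimal sketch reproducing the Table 1 example from the text:

```python
def spv(position):
    """Smallest position value rule: rank the dimensions by ascending
    position value; the resulting rank order (1-based job indices)
    is the job permutation."""
    order = sorted(range(len(position)), key=lambda j: position[j])
    return [j + 1 for j in order]
```

Applied to the example above, spv([1.35, -2.46, -1.52, 2.31, 0.52, -1.68]) yields [2, 6, 3, 5, 1, 4].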

4.2. Initial Population

The NEH algorithm is an efficient heuristic method for solving the PFSP. Thus, to guarantee that the initial population contains a quality solution while retaining high diversity, one of the crows uses an NEH-based algorithm to generate a job permutation, which is then mapped to position values according to the SPV rule. Table 2 shows how a job permutation is transformed back into continuous position values via the SPV mapping. To limit the search space, Equation (5) is applied to generate the initial crow population:
$$x_{i,d} = x_{min} + (x_{max} - x_{min}) \cdot r_i, \quad d = 1, 2, \ldots, n, \quad (5)$$

where $d$ indexes the job dimensions of the agent position and $x_{i,d}$ is the $d$-th position value of crow $i$. Further, $r_i$ is a random variable uniformly distributed in the interval [0, 1], and $x_{min} = -5.0$ and $x_{max} = 5.0$ are, respectively, the lower and upper bounds of the position values.
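Equation (5) combined with the NEH seed can be sketched as below. Treating the NEH-derived position (obtained by mapping its permutation back through the SPV rule) as an optional argument is our assumption about how the seeding is wired, not a detail stated in the paper.

```python
import random

def init_population(n_crows, n_jobs, x_min=-5.0, x_max=5.0, seed_position=None):
    """Equation (5): uniform random positions in [x_min, x_max].
    If an NEH-derived position vector is supplied, it seeds crow 0."""
    pop = [[x_min + (x_max - x_min) * random.random() for _ in range(n_jobs)]
           for _ in range(n_crows)]
    if seed_position is not None:
        pop[0] = list(seed_position)
    return pop
```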

4.3. SA-VNS Local Search

The VNS procedure can be combined with other heuristic algorithms to ameliorate the solution quality [23]. Thus, in this study, the SA algorithm was combined with VNS for a local search after the CSA procedure. This improved the local search ability and avoided premature convergence. Incorporating the SA algorithm into the VNS process enables an appropriate balance between exploration and exploitation to be maintained at different temperatures.
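One plausible way to combine the two is sketched below, under our own assumptions about the control flow (the paper does not spell out how the neighborhoods are cycled): candidate moves from the VNS operators are accepted with the SA Metropolis rule, and the neighborhood structure advances when a move fails.

```python
import math
import random

def sa_vns(seq, cost, operators, t=100.0, beta=0.95, t_end=0.1):
    """Sketch of an SA-VNS local search: cycle through the VNS
    neighborhood operators, accept candidates via the Metropolis
    rule, and retain the best solution seen."""
    best = cur = seq
    k = 0
    while t > t_end:
        cand = operators[k % len(operators)](cur)
        delta = cost(cand) - cost(cur)
        if delta < 0 or random.random() < math.exp(-delta / t):
            cur = cand
            if cost(cur) < cost(best):
                best = cur
                k = 0          # improvement: restart from the first neighborhood
        else:
            k += 1             # no move accepted: try the next neighborhood
        t *= beta              # geometric cooling, as in Algorithm 3
    return best
```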

4.4. The Proposed HCSA Algorithm

Figure 2 presents a flowchart of the hybrid crow search algorithm (HCSA). Initially, an NEH-based algorithm is used to obtain a solution for one crow, and the positions of the remaining crows are generated randomly. The SPV rule and the evolutionary CSA are then used to generate numerous candidate solutions, and the system delivers the global best solution found once the stopping criterion is met. The SA-VNS local search then improves the quality of the global best solution obtained from the CSA. Finally, the approximate best permutation is obtained.

5. Experimental Results

In this section, the performance of the proposed HCSA is presented.

5.1. Environment Setting

The proposed HCSA’s performance was evaluated by solving the PFSP benchmarks generated by Taillard [50], which comprise 12 instances ranging from 20 jobs with five machines (Ta01) to 500 jobs with 20 machines (Ta111), as shown in Table 3. All programs were implemented in Java and executed on a Windows 10 operating system running on a computer with an AMD Ryzen 3 1200 Quad-Core 3.10 GHz CPU, and 8 GB RAM.

5.2. Parameter Setting

Table 4 lists the parameter settings for all experiments. The performance of the CSA was assessed based on Equations (6) and (7):
$$\mathrm{ARPD} = \frac{1}{R} \sum_{i=1}^{R} \left( \frac{S_i - UB}{UB} \times 100\% \right), \quad (6)$$

$$\mathrm{BRPD} = \frac{1}{R} \sum_{i=1}^{R} \left( \frac{S_{bst} - UB}{UB} \times 100\% \right), \quad (7)$$

where ARPD is the average percentage relative deviation, BRPD is the best percentage relative deviation, $S_i$ is the makespan found by the algorithm in the $i$-th run, $S_{bst}$ is the best makespan found by the algorithm, $UB$ is the upper bound of the benchmark, and $R$ is the number of independent runs.
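Equations (6) and (7) reduce to the following sketch; since S_bst is constant over the R runs, the BRPD average collapses to a single ratio, which is how we implement it here.

```python
def arpd(makespans, ub):
    """Equation (6): average relative deviation (%) from the upper bound."""
    return sum((s - ub) / ub * 100.0 for s in makespans) / len(makespans)

def brpd(makespans, ub):
    """Equation (7): relative deviation (%) of the best makespan found."""
    return (min(makespans) - ub) / ub * 100.0
```

For two runs with makespans 105 and 110 against UB = 100, ARPD is 7.5% and BRPD is 5.0%.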

5.3. Comparison of CSA Algorithms Incorporated with Variants of the NEH

To validate the effectiveness of each initialization approach, the CSA was combined with several variants of the NEH-based algorithm, and their respective performance levels were compared. A new priority rule and a new tie-breaking rule based on the NEH [16] algorithm were used; the variants included NEHD [17], NEHKK1 [18], NEHKK2 [19], CL [20], and NEHLJP1 [22]. The BRPD and ARPD values for the CSA coupled with the six NEH initialization algorithms are summarized in Table 5, where the values in bold font represent the algorithms that achieve superior performance.
The best BRPD and ARPD values of the tested NEH-based algorithms were 1.88 and 2.18, respectively. From the aforementioned simulation results, it can be concluded that the CSA combined with the NEHLJP1 heuristic is more effective than other NEH-based heuristic algorithms. In addition, the original NEH algorithm had better ARPD values than the NEHD and NEHKK1 variants. Therefore, we incorporated NEHLJP1 into the CSA algorithm as the initialization strategy. To present the previous experimental results more clearly, the data from Table 5 are depicted visually in Figure 3. The x-axis shows the different sizes of the benchmarks, and the y-axis shows the ARPD and BRPD values for each test function.

5.4. Comparison of CSA Algorithms Incorporated with the SA-Based Local Search

NEHLJP1 combined with the CSA algorithm provided the best BRPD and ARPD values. In this section, we validate the effect of the number of iterations in the SA local search. We compared two iteration budgets for the local search, 1000 and $jobs \times (jobs - 1)$ [49], using the BRPD and ARPD values and the average computation time $t_{avg}$, where $jobs$ denotes the number of jobs in the benchmark; the values in bold font represent the algorithms that achieve superior performance.
The computed BRPD and ARPD results are listed in Table 6 and Table 7. The $jobs \times (jobs - 1)$ strategy achieved better BRPD and ARPD values for small jobs (0.54 and 0.69, respectively). However, the BRPD and ARPD values were almost the same for large jobs, and the computation times were almost identical for the two iteration budgets. In summary, the $jobs \times (jobs - 1)$ strategy achieved the most desirable BRPD and ARPD values over all benchmarks, specifically 0.61 and 0.81, respectively.

5.5. Comparison of Meta-Heuristic Algorithms

We compared the performance level of the HCSA with those of one GSA-based and two PSO-based algorithms, namely, MGELS [37], PSOVNS [49], and PSOMA [51], for 1000 iterations of the 12 Taillard benchmarks. The settings employed for HCSA are detailed in Table 4. MGELS is a GSA-based algorithm with an emulated local search algorithm. PSOVNS is an SPV-rule-based algorithm that combines PSO and VNS. PSOMA is an SPV-rule-based algorithm that combines PSO and NEH, and also adopts an SA-based local search.
The results are shown in Table 8, where the values in bold font represent the algorithms that achieve superior performance. In terms of the BRPD values, the HCSA was superior to the MGELS, PSOVNS, and PSOMA algorithms, except for ta081 (100*20). In terms of the ARPD values, the HCSA was better than the MGELS, PSOVNS, and PSOMA algorithms, except for ta081 (100*20) and ta091 (200*10). The average ARPD value of the HCSA was 0.61%, compared with 6.00%, 2.05%, and 0.90% for MGELS, PSOVNS, and PSOMA, respectively. In addition, because MGELS does not apply any initialization algorithm or local search, its performance differed significantly from the other algorithms on large benchmarks. Regarding the local search, combining an SA-based search with the metaheuristic achieved better results than omitting SA. In summary, the simulation results indicate that the HCSA is more effective than the MGELS, PSOVNS, and PSOMA metaheuristic algorithms. To present these results more clearly, the data from Table 8 are depicted visually in Figure 4; the x-axis shows the different benchmark sizes, and the y-axis shows the ARPD and BRPD values for each test instance.

6. Conclusions

This paper proposed an HCSA approach to solve the PFSP. The proposed HCSA adopts an SPV rule to convert the continuous position values to job permutations and incorporates an NEH-based heuristic algorithm for population initialization. It also adopts the SA-based VNS local search method to balance exploitation and exploration, and ultimately enhance the quality of the solution. Simulation results demonstrate that the proposed HCSA, which incorporates NEHLJP1 and an SA-based VNS local search, produced a better makespan for all benchmark datasets than other algorithms. In our future research, we hope to analyze the following four aspects of the HCSA: (1) applying the HCSA to large real-world problems, such as nurse scheduling, basketball tournament scheduling, automotive manufacturing, and the vehicle routing scheduling problem; (2) adapting the Lévy searching strategy to improve the local search; (3) using graph structures and neighborhood structures to speed up the local search; and (4) implementing a new NEH-based heuristic algorithm and solving the larger benchmarks.

Author Contributions

Conceptualization and methodology, K.-W.H., A.S.G. and Z.-X.W.; writing—original draft preparation, K.-W.H., A.S.G., Z.-X.W. and Y.-W.C.; writing—review and editing, K.-W.H. and Y.-W.C.

Funding

This work was supported in part by the Ministry of Science and Technology, Taiwan, R.O.C., under grant MOST 107-2218-E-992-303.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Fournier-Viger, P.; Lin, J.C.W.; Kiran, R.U.; Koh, Y.S.; Thomas, R. A survey of sequential pattern mining. Data Sci. Pattern Recognit. 2017, 1, 54–77. [Google Scholar]
  2. Gan, W.; Lin, J.C.W.; Fournier-Viger, P.; Chao, H.C.; Philip, S.Y. HUOPM: High-Utility Occupancy Pattern Mining. IEEE Trans. Cybern. 2019, 1–14. [Google Scholar] [CrossRef] [PubMed]
  3. Lin, J.C.W.; Zhang, Y.; Zhang, B.; Fournier-Viger, P.; Djenouri, Y. Hiding sensitive itemsets with multiple objective optimization. Soft Comput. 2019, 1–19. [Google Scholar] [CrossRef]
  4. Lin, J.C.W.; Yang, L.; Fournier-Viger, P.; Hong, T.P. Mining of skyline patterns by considering both frequent and utility constraints. Eng. Appl. Artif. Intell. 2019, 77, 229–238. [Google Scholar] [CrossRef]
  5. Huang, K.W.; Chen, J.L.; Yang, C.S.; Tsai, C.W. A memetic particle swarm optimization algorithm for solving the DNA fragment assembly problem. Neural Comput. Appl. 2014, 26, 495–506. [Google Scholar] [CrossRef]
  6. Allaoui, M.; Ahiod, B.; El Yafrani, M. A hybrid crow search algorithm for solving the DNA fragment assembly problem. Expert Syst. Appl. 2018, 102, 44–56. [Google Scholar] [CrossRef]
  7. Zhu, Z.; Zhou, J.; Ji, Z.; Shi, Y.H. DNA Sequence Compression Using Adaptive Particle Swarm Optimization-Based Memetic Algorithm. IEEE Trans. Evol. Comput. 2011, 15, 643–658. [Google Scholar] [CrossRef]
  8. Fan, Q.; Ansari, N. Application aware workload allocation for edge computing-based IoT. IEEE Internet Things J. 2018, 5, 2146–2153. [Google Scholar] [CrossRef]
  9. Lin, J.C.W.; Wu, J.M.T.; Fournier-Viger, P.; Djenouri, Y.; Chen, C.H.; Zhang, Y. A Sanitization Approach to Secure Shared Data in an IoT Environment. IEEE Access 2019. [Google Scholar] [CrossRef]
  10. López, L.F.M.; Blas, N.G.; Albert, A.A. Multidimensional knapsack problem optimization using a binary particle swarm model with genetic operations. Soft Comput. 2018, 22, 2567–2582. [Google Scholar] [CrossRef]
  11. De Mingo López, L.F.; Gómez Blas, N.; Arteta, A. Optimal Performance: Underlying Octahedron Graph of Evolutionary Processors. Comput. Inform. 2016, 34, 858–876. [Google Scholar]
  12. Kadri, R.L.; Boctor, F.F. An efficient genetic algorithm to solve the resource-constrained project scheduling problem with transfer times: The single mode case. Eur. J. Oper. Res. 2018, 265, 454–462. [Google Scholar] [CrossRef]
  13. Chen, W.N.; Chung, H.S.H.; Zhong, W.L.; Wu, W.G.; Shi, Y.H. A Novel Set-Based Particle Swarm Optimization Method for Discrete Optimization Problems. IEEE Trans. Evol. Comput. 2010, 14, 278–300. [Google Scholar] [CrossRef]
  14. Johnson, S.M. Optimal two and three-stage production schedules with setup times included. Naval Res. Logist. Q. 1954, 1, 61–68. [Google Scholar] [CrossRef]
  15. Rinnooy Kan, A.H.G. Machine Scheduling Problems: Classification, Complexity And Computations; Martinus Nijhoff: The Hague, The Netherlands, 1976. [Google Scholar]
  16. Nawaz, M.; Enscore, E.E., Jr.; Ham, I. A heuristic algorithm for the m-machine, n-job flow-shop sequencing problem. Omega 1983, 11, 91–95. [Google Scholar] [CrossRef]
  17. Dong, X.; Huang, H.; Chen, P. An improved NEH-based heuristic for the permutation flowshop problem. Comput. Oper. Res. 2008, 35, 3962–3968. [Google Scholar] [CrossRef]
  18. Kalczynski, P.J.; Kamburowski, J. An improved NEH heuristic to minimize makespan in permutation flow shops. Comput. Oper. Res. 2008, 35, 3001–3008. [Google Scholar] [CrossRef]
  19. Kalczynski, P.J.; Kamburowski, J. An empirical analysis of the optimality rate of flow shop heuristics. Eur. J. Oper. Res. 2009, 198, 93–101. [Google Scholar] [CrossRef]
  20. Ying, K.C.; Lin, S.W. A high-performing constructive heuristic for minimizing makespan in permutation flowshops. J. Ind. Prod. Eng. 2013, 30, 355–362. [Google Scholar] [CrossRef]
  21. Xiong, F.; Xing, K.; Wang, F. Scheduling a hybrid assembly-differentiation flowshop to minimize total flow time. Eur. J. Oper. Res. 2015, 240, 338–354. [Google Scholar] [CrossRef]
  22. Liu, W.; Jin, Y.; Price, M. A new improved NEH heuristic for permutation flowshop scheduling problems. Int. J. Prod. Econ. 2017, 193, 21–30. [Google Scholar] [CrossRef]
  23. Gendreau, M.; Potvin, J.Y. Handbook of Metaheuristics, 2nd ed.; Springer: New York, NY, USA, 2010. [Google Scholar]
  24. Karaboga, D. An Idea Based on Honey Bee Swarm for Numerical Optimization; Technical Report TR06; Erciyes University: Kayseri, Turkey, 2005. [Google Scholar]
  25. Dorigo, M.; Maniezzo, V.; Colorni, A. Ant system: Optimization by a colony of cooperating agents. IEEE Trans. Syst. Man Cybern. Part B 1996, 26, 29–41. [Google Scholar] [CrossRef] [PubMed]
  26. Yang, X.S. Chapter 9—Cuckoo Search. In Nature-Inspired Optimization Algorithms; Yang, X.S., Ed.; Elsevier: Oxford, UK, 2014; pp. 129–139. [Google Scholar]
27. Das, S.; Suganthan, P. Differential Evolution: A Survey of the State-of-the-Art. IEEE Trans. Evol. Comput. 2011, 15, 4–31.
28. Yang, X.S. Firefly Algorithms for Multimodal Optimization. In Stochastic Algorithms: Foundations and Applications; Watanabe, O., Zeugmann, T., Eds.; Lecture Notes in Computer Science; Springer: Berlin/Heidelberg, Germany, 2009; Volume 5792, pp. 169–178.
29. Rashedi, E.; Nezamabadi-pour, H.; Saryazdi, S. GSA: A Gravitational Search Algorithm. Inf. Sci. 2009, 179, 2232–2248.
30. Kennedy, J.; Eberhart, R. Particle swarm optimization. In Proceedings of the IEEE International Conference on Neural Networks, Perth, Australia, 27 November–1 December 1995; Volume 4, pp. 1942–1948.
31. Mirjalili, S.; Lewis, A. The whale optimization algorithm. Adv. Eng. Softw. 2016, 95, 51–67.
32. Chiang, M.C.; Tsai, C.W.; Yang, C.S. A time-efficient pattern reduction algorithm for k-means clustering. Inf. Sci. 2011, 181, 716–731.
33. Fong, S.; Lou, H.L.; Zhuang, Y.; Deb, S.; Hanne, T. Solving the Permutation Flow Shop Problem with Firefly Algorithm. In Proceedings of the 2014 2nd International Symposium on Computational and Business Intelligence (ISCBI), New Delhi, India, 7–8 December 2014; pp. 25–29.
34. Ding, J.Y.; Song, S.; Gupta, J.N.; Zhang, R.; Chiong, R.; Wu, C. An improved iterated greedy algorithm with a Tabu-based reconstruction strategy for the no-wait flowshop scheduling problem. Appl. Soft Comput. 2015, 30, 604–613.
35. Marichelvam, M.; Tosun, Ö.; Geetha, M. Hybrid monkey search algorithm for flow shop scheduling problem under makespan and total flow time. Appl. Soft Comput. 2017, 55, 82–92.
36. Abdel-Basset, M.; Manogaran, G.; El-Shahat, D.; Mirjalili, S. A hybrid whale optimization algorithm based on local search strategy for the permutation flow shop scheduling problem. Future Gener. Comput. Syst. 2018, 85, 129–145.
37. Sanjeev Kumar, R.; Padmanaban, K.; Rajkumar, M. Minimizing makespan and total flow time in permutation flow shop scheduling problems using modified gravitational emulation local search algorithm. Proc. Inst. Mech. Eng. Part B J. Eng. Manuf. 2018, 232, 534–545.
38. Askarzadeh, A. A novel metaheuristic method for solving constrained engineering optimization problems: Crow search algorithm. Comput. Struct. 2016, 169, 1–12.
39. Abdelaziz, A.Y.; Fathy, A. A novel approach based on crow search algorithm for optimal selection of conductor size in radial distribution networks. Eng. Sci. Technol. 2017, 20, 391–402.
40. Jain, M.; Rani, A.; Singh, V. An improved Crow Search Algorithm for high-dimensional problems. J. Intell. Fuzzy Syst. 2017, 33, 3597–3614.
41. Kirkpatrick, S.; Gelatt, C.D.; Vecchi, M.P. Optimization by simulated annealing. Science 1983, 220, 671–680.
42. Taillard, E. Some efficient heuristic methods for the flow shop sequencing problem. Eur. J. Oper. Res. 1990, 47, 65–74.
43. Prior, H.; Schwarz, A.; Güntürkün, O. Mirror-Induced Behavior in the Magpie (Pica pica): Evidence of Self-Recognition. PLoS Biol. 2008, 6, 1–9.
44. Osman, I.H.; Potts, C.N. Simulated annealing for permutation flow-shop scheduling. Omega 1989, 17, 551–557.
45. Low, C.; Yeh, J.Y.; Huang, K.I. A robust simulated annealing heuristic for flow shop scheduling problems. Int. J. Adv. Manuf. Technol. 2004, 23, 762–767.
46. Low, C. Simulated annealing heuristic for flow shop scheduling problems with unrelated parallel machines. Comput. Oper. Res. 2005, 32, 2013–2025.
47. Dai, M.; Tang, D.; Giret, A.; Salido, M.A.; Li, W. Energy-efficient scheduling for a flexible flow shop using an improved genetic-simulated annealing algorithm. Robot. Comput. Integr. Manuf. 2013, 29, 418–429.
48. Bean, J.C. Genetic Algorithms and Random Keys for Sequencing and Optimization. ORSA J. Comput. 1994, 6, 154–160.
49. Tasgetiren, M.F.; Liang, Y.C.; Sevkli, M.; Gencyilmaz, G. A particle swarm optimization algorithm for makespan and total flowtime minimization in the permutation flowshop sequencing problem. Eur. J. Oper. Res. 2007, 177, 1930–1947.
50. Taillard, E. Benchmarks for basic scheduling problems. Eur. J. Oper. Res. 1993, 64, 278–285.
51. Liu, B.; Wang, L.; Jin, Y. An Effective PSO-Based Memetic Algorithm for Flow Shop Scheduling. IEEE Trans. Syst. Man Cybern. Part B 2007, 37, 18–27.
Figure 1. Examples of the different operations in the variable neighborhood search (VNS) local search algorithm.
Figure 2. Proposed hybrid CSA (HCSA) for solving the permutation flow shop scheduling problem (PFSP).
Figure 3. Comparison of the results for all different NEH-based algorithms.
Figure 4. Comparison of the results for various metaheuristic algorithms.
Table 1. Example of the smallest position value (SPV) approach.

Dimension d | Position Value | Job Permutation
1 | 1.35 | 2
2 | −2.46 | 6
3 | −1.52 | 3
4 | 2.31 | 5
5 | 0.52 | 1
6 | −1.68 | 4
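The SPV decoding in Table 1 can be sketched in a few lines of Python: the dimension holding the smallest position value is scheduled first, the next smallest second, and so on (the function name and layout are illustrative, not taken from the paper).

```python
def spv_to_permutation(position_values):
    """Smallest position value (SPV) rule: sort dimensions by ascending
    position value; the resulting dimension order is the job sequence.
    Jobs are numbered 1..n, matching the dimension indices."""
    # argsort: dimension indices ordered by ascending position value
    order = sorted(range(len(position_values)), key=lambda d: position_values[d])
    return [d + 1 for d in order]

# Position values from Table 1 (dimensions 1..6)
values = [1.35, -2.46, -1.52, 2.31, 0.52, -1.68]
print(spv_to_permutation(values))  # [2, 6, 3, 5, 1, 4]
```

The output matches the job permutation column of Table 1: dimension 2 holds the smallest value, so job 2 is processed first.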
Table 2. Mapping a job permutation to position values.

Dimension d | Job Permutation from NEH Heuristic | Initial Value | Remapped Value
1 | 2 | −1.02 | 1.31
2 | 6 | 0.45 | −1.02
3 | 3 | 1.31 | 0.45
4 | 5 | −0.76 | 2.87
5 | 1 | 2.87 | 0.56
6 | 4 | 0.56 | −0.76
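The inverse mapping in Table 2 can be sketched the same way: the sorted initial values are reassigned to dimensions so that SPV decoding reproduces the NEH permutation exactly (again, the function name is an illustrative choice, not from the paper).

```python
def remap_values(permutation, initial_values):
    """Reassign the sorted initial values so that the SPV rule decodes
    exactly `permutation`. Job j corresponds to dimension j, so the
    k-th smallest value goes to the dimension of the job in position k."""
    ranked = sorted(initial_values)
    remapped = [0.0] * len(initial_values)
    for k, job in enumerate(permutation):
        remapped[job - 1] = ranked[k]
    return remapped

# Values from Table 2: NEH permutation and initial crow position
perm = [2, 6, 3, 5, 1, 4]
init = [-1.02, 0.45, 1.31, -0.76, 2.87, 0.56]
print(remap_values(perm, init))  # [1.31, -1.02, 0.45, 2.87, 0.56, -0.76]
```

The result matches the remapped-value column of Table 2, so decoding the remapped vector with the SPV rule returns the NEH sequence.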
Table 3. Taillard [50] benchmarks for the permutation flow shop scheduling problem (PFSP).

Problem | Jobs | Machines
ta01 | 20 | 5
ta11 | 20 | 10
ta21 | 20 | 20
ta31 | 50 | 5
ta41 | 50 | 10
ta51 | 50 | 20
ta61 | 100 | 5
ta71 | 100 | 10
ta81 | 100 | 20
ta91 | 200 | 10
ta101 | 200 | 20
ta111 | 500 | 20
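The makespan minimized on these benchmarks follows the standard flow shop completion-time recurrence, C[j][m] = max(C[j−1][m], C[j][m−1]) + p[j][m]. A minimal sketch (the tiny instance below is a toy example, not a Taillard instance):

```python
def makespan(permutation, proc_times):
    """Makespan C_max of a job permutation on a permutation flow shop.
    proc_times[j][m] is the processing time of job j (0-indexed) on
    machine m. A rolling row of completion times suffices, since each
    job only depends on the previous job and the previous machine."""
    n_machines = len(proc_times[0])
    completion = [0] * n_machines  # completion times of the previous job
    for job in permutation:
        prev = 0  # completion on the previous machine for the current job
        for m in range(n_machines):
            completion[m] = max(completion[m], prev) + proc_times[job][m]
            prev = completion[m]
    return completion[-1]

# Toy 3-job, 2-machine instance
p = [[3, 2], [1, 4], [2, 2]]
print(makespan([0, 1, 2], p))  # 11
print(makespan([1, 0, 2], p))  # 9 -- order matters
```

Evaluating one permutation costs O(jobs × machines), which is why the job counts in Table 3 (up to 500) make the per-iteration cost of any metaheuristic nontrivial.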
Table 4. Parameter settings of the proposed algorithm.

Algorithm | Parameter | Value
CSA | Number of crows | 20
CSA | Number of iterations t_max | 1000
CSA | Number of independent runs R | 20
CSA | Flight length fL | 10
CSA | Awareness probability AP | 0.25
SA | Initial temperature T | 100
SA | Cooling coefficient β | 0.99
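With these parameters, one crow position update in the original CSA [38] works as follows: crow i tails crow j; if j is unaware (probability 1 − AP), i moves toward j's memory with step length fL, otherwise i is fooled and relocates randomly. The vector layout and bound handling below are illustrative assumptions, not the paper's exact implementation.

```python
import random

def crow_update(x_i, mem_j, fl, ap, lower, upper):
    """One CSA position update: x_new = x_i + r * fl * (m_j - x_i) when
    crow j does not notice it is followed (prob. 1 - ap); otherwise
    crow i jumps to a random position within the search bounds."""
    if random.random() >= ap:  # crow j is unaware: follow its memory
        r = random.random()
        return [xi + r * fl * (mj - xi) for xi, mj in zip(x_i, mem_j)]
    # crow j is aware: random relocation within the bounds
    return [random.uniform(lower, upper) for _ in x_i]

random.seed(1)
print(crow_update([0.5, -1.0], [1.0, 2.0], fl=10, ap=0.25, lower=-4, upper=4))
```

In the HCSA the updated continuous vector is then decoded into a job sequence via the SPV rule before its makespan is evaluated.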
Table 5. Comparison of different variants of initialization algorithms. ARPD: average percentage relative deviation; BRPD: best percentage relative deviation.

Problem | Metric | NEH | NEHD | NEHKK1 | NEHKK2 | CL | NEHLJP1
ta01 | BRPD | 0.70 | 1.49 | 0.47 | 1.48 | 0.70 | 1.48
ta01 | ARPD | 0.73 | 1.55 | 1.36 | 1.48 | 0.71 | 1.48
ta11 | BRPD | 2.52 | 2.84 | 2.14 | 2.46 | 2.72 | 2.66
ta11 | ARPD | 3.28 | 3.66 | 4.79 | 3.73 | 4.08 | 3.12
ta21 | BRPD | 3.22 | 2.82 | 2.39 | 1.92 | 2.05 | 2.66
ta21 | ARPD | 4.22 | 2.82 | 3.93 | 2.42 | 2.73 | 3.37
ta31 | BRPD | 0.18 | 0.00 | 1.02 | 0.18 | 0.18 | 0.18
ta31 | ARPD | 0.18 | 0.39 | 1.98 | 0.21 | 0.25 | 0.18
ta41 | BRPD | 4.74 | 3.81 | 3.51 | 4.54 | 4.54 | 3.51
ta41 | ARPD | 5.05 | 5.62 | 4.33 | 4.54 | 5.48 | 4.57
ta51 | BRPD | 3.45 | 5.61 | 4.52 | 4.55 | 4.50 | 4.16
ta51 | ARPD | 3.65 | 7.20 | 5.21 | 4.96 | 4.71 | 4.38
ta61 | BRPD | 0.38 | 0.38 | 1.36 | 0.43 | 0.04 | 0.09
ta61 | ARPD | 0.38 | 0.43 | 1.37 | 0.43 | 0.04 | 0.45
ta71 | BRPD | 1.06 | 1.87 | 0.88 | 0.92 | 0.74 | 0.88
ta71 | ARPD | 1.19 | 2.10 | 1.06 | 0.94 | 1.14 | 0.98
ta81 | BRPD | 2.85 | 3.14 | 3.64 | 2.77 | 3.19 | 3.17
ta81 | ARPD | 3.21 | 4.14 | 3.93 | 3.31 | 3.56 | 3.31
ta91 | BRPD | 0.74 | 0.20 | 0.72 | 0.52 | 0.68 | 0.75
ta91 | ARPD | 0.75 | 0.49 | 0.72 | 0.61 | 0.68 | 0.75
ta101 | BRPD | 2.46 | 1.57 | 2.13 | 2.76 | 2.28 | 1.89
ta101 | ARPD | 2.66 | 1.69 | 2.24 | 2.88 | 2.59 | 2.24
ta111 | BRPD | 1.47 | 1.59 | 1.54 | 1.22 | 1.52 | 1.15
ta111 | ARPD | 1.79 | 1.69 | 1.70 | 1.43 | 1.77 | 1.38
Average | BRPD | 1.98 | 2.11 | 2.03 | 1.98 | 1.93 | 1.88
Average | ARPD | 2.26 | 2.65 | 2.72 | 2.25 | 2.31 | 2.18
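The BRPD and ARPD values in Tables 5–8 can be reproduced with the usual relative-percentage-deviation formula, RPD = 100 × (C − C*) / C*, assuming C* is the best-known makespan of the instance; the run data in this sketch are hypothetical, not taken from the experiments.

```python
def rpd(makespan, best_known):
    """Percentage relative deviation of a makespan from the best-known value."""
    return 100.0 * (makespan - best_known) / best_known

def summarize(run_makespans, best_known):
    """BRPD takes the best run; ARPD averages the deviation over all runs."""
    devs = [rpd(c, best_known) for c in run_makespans]
    return min(devs), sum(devs) / len(devs)

# Hypothetical makespans over 4 runs, measured against a best-known value of 1278
brpd, arpd = summarize([1278, 1290, 1285, 1301], best_known=1278)
print(round(brpd, 2), round(arpd, 2))  # 0.0 0.82
```

A BRPD of 0.00 in the tables therefore means at least one of the 20 independent runs reached the best-known makespan.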
Table 6. Comparison of different iteration limits of the local search for small job sizes.

Problem | Metric | 1000 iterations | jobs × (jobs − 1) iterations
ta01 | BRPD | 0.00 | 0.00
ta01 | ARPD | 0.00 | 0.00
ta01 | t_avg | 1.43 | 1.43
ta11 | BRPD | 0.00 | 0.00
ta11 | ARPD | 0.06 | 0.00
ta11 | t_avg | 1.67 | 1.65
ta21 | BRPD | 0.00 | 0.00
ta21 | ARPD | 0.27 | 0.18
ta21 | t_avg | 2.26 | 2.23
ta31 | BRPD | 0.00 | 0.00
ta31 | ARPD | 0.00 | 0.00
ta31 | t_avg | 3.08 | 3.21
ta41 | BRPD | 1.47 | 1.24
ta41 | ARPD | 2.29 | 1.70
ta41 | t_avg | 3.75 | 3.82
ta51 | BRPD | 2.18 | 1.95
ta51 | ARPD | 2.42 | 2.25
ta51 | t_avg | 5.22 | 5.31
Average | BRPD | 0.61 | 0.54
Average | ARPD | 0.84 | 0.69
Average | t_avg | 2.90 | 2.94
Table 7. Comparison of different iteration limits of the local search for large job sizes.

Problem | Metric | 1000 iterations | jobs × (jobs − 1) iterations
ta61 | BRPD | 0.00 | 0.00
ta61 | ARPD | 0.02 | 0.02
ta61 | t_avg | 6.02 | 6.22
ta71 | BRPD | 0.26 | 0.23
ta71 | ARPD | 0.51 | 0.50
ta71 | t_avg | 7.32 | 7.56
ta81 | BRPD | 1.72 | 1.59
ta81 | ARPD | 2.11 | 2.10
ta81 | t_avg | 10.28 | 10.32
ta91 | BRPD | 0.16 | 0.16
ta91 | ARPD | 0.48 | 0.56
ta91 | t_avg | 14.70 | 15.32
ta101 | BRPD | 1.08 | 1.26
ta101 | ARPD | 1.41 | 1.39
ta101 | t_avg | 19.88 | 20.79
ta111 | BRPD | 1.47 | 0.89
ta111 | ARPD | 1.79 | 1.06
ta111 | t_avg | 54.14 | 55.92
Average | BRPD | 0.69 | 0.69
Average | ARPD | 0.94 | 0.94
Average | t_avg | 18.72 | 19.36
Table 8. Comparison tests of various metaheuristic algorithms.

Problem | Metric | HCSA | MGELS | PSOVNS | PSOMA
ta01 | BRPD | 0.00 | 1.48 | 0.23 | 0.00
ta01 | ARPD | 0.00 | 2.45 | 1.63 | 0.00
ta11 | BRPD | 0.00 | 2.84 | 0.26 | 0.00
ta11 | ARPD | 0.00 | 4.95 | 2.28 | 0.10
ta21 | BRPD | 0.00 | 2.84 | 0.74 | 0.00
ta21 | ARPD | 0.18 | 4.14 | 1.77 | 0.20
ta31 | BRPD | 0.00 | 0.77 | 0.00 | 0.00
ta31 | ARPD | 0.00 | 1.65 | 0.51 | 0.04
ta41 | BRPD | 1.24 | 8.42 | 1.61 | 1.61
ta41 | ARPD | 1.70 | 8.82 | 1.75 | 2.43
ta51 | BRPD | 1.98 | 7.74 | 2.91 | 2.15
ta51 | ARPD | 2.25 | 10.26 | 4.31 | 2.54
ta61 | BRPD | 0.00 | 0.52 | 0.00 | 0.00
ta61 | ARPD | 0.02 | 1.10 | 0.02 | 0.11
ta71 | BRPD | 0.23 | 4.92 | 1.19 | 0.26
ta71 | ARPD | 0.50 | 5.70 | 1.98 | 0.50
ta81 | BRPD | 1.59 | 9.38 | 2.33 | 1.29
ta81 | ARPD | 2.10 | 10.51 | 3.82 | 1.62
ta91 | BRPD | 0.16 | 2.17 | 1.12 | 0.16
ta91 | ARPD | 0.56 | 4.17 | 1.17 | 0.21
ta101 | BRPD | 1.26 | 8.50 | 1.96 | 1.45
ta101 | ARPD | 1.39 | 9.47 | 3.03 | 1.53
ta111 | BRPD | 0.80 | 7.40 | 1.64 | 1.35
ta111 | ARPD | 1.06 | 8.82 | 2.37 | 1.48
Average | BRPD | 0.61 | 4.75 | 1.15 | 0.69
Average | ARPD | 0.81 | 6.00 | 2.05 | 0.90

Huang, K.-W.; Girsang, A.S.; Wu, Z.-X.; Chuang, Y.-W. A Hybrid Crow Search Algorithm for Solving Permutation Flow Shop Scheduling Problems. Appl. Sci. 2019, 9, 1353. https://doi.org/10.3390/app9071353