Article Open Access

Improved material generation algorithm by opposition-based learning and Laplacian crossover for global optimization and advances in real-world engineering problems

Pranav Mehta, Sumit Kumar, Sadiq M. Sait, Betül S. Yildiz and Ali Riza Yildiz
Published/Copyright: March 6, 2025

Abstract

The current study utilizes a unique hybrid optimizer, the opposition-based learning and Laplacian crossover augmented material generation algorithm (MGA-OBL-LP), to solve engineering design problems. The opposition-based learning and Laplacian crossover approaches are added to the fundamental MGA structure to address the local-optima trap weakness of the recently developed MGA algorithm. The proposed hybridization strategy is aimed at improving the exploration-exploitation behavior of the MGA algorithm. To determine its practicality in real-world applications, the performance of the hybridized algorithm was compared with that of other notable metaheuristics from the literature on four constrained engineering design problems. A comparative analysis confirms the MGA-OBL-LP algorithm's competence in terms of solution quality and stability and shows it to be robust in addressing difficult practical problems.

1 Introduction

Optimization is an integral part of any system design and is one of the active areas of research in engineering, civil, computer, and operational research applications. Optimization manifests itself in every aspect of life, and almost all real-world problems can be formulated as optimization problems. The three fundamental elements of an optimization problem are the "objective function" to be maximized or minimized, the "set of variables" that can be manipulated to optimize the objective, and the "group of constraints" that bounds the variables' values [1]. Fundamentally, the process of finding the best solution among all available alternatives with minimum utilization of resources is known as optimization. Broadly, two types of methods have been applied for solving optimization problems: classical and metaheuristic approaches. The former are mathematical or gradient-based approaches that are susceptible to the initial starting point and lose ground due to computational challenges and incompetency in solving high-dimensional, complex, and large-scale optimization problems with multiple design variables and non-linear constraints [2], [3]. On the other hand, metaheuristics (MHs) are powerful stochastic optimization techniques that have demonstrated their potential to solve optimization problems across versatile domains, even without sufficient problem information [4], [5], [6]. Though MHs do not guarantee the optimal solution, they can successfully find good-enough, acceptable solutions in a reasonable time with relatively few assumptions, which makes them highly flexible and suitable for challenging NP-hard problems [7], [8].

Over the last few decades, many MHs have been developed and applied to a variety of real-world optimization problems. MH development is inspired by a wide range of sources, and as a result, MHs are equally diverse [9]. Different classifications of MHs have been suggested in the literature from time to time, but "trajectory-based versus population-based" and "nature-inspired versus non-nature-inspired" are probably the two most common categorizations [10]. In trajectory-based MHs, such as simulated annealing, the search starts with a single solution that is replaced by a newly updated solution at each iteration. Population-based MHs, including particle swarm optimization and the firefly algorithm, on the other hand, initialize the search with a randomly generated set of solutions that is substituted (completely or in part) by newly generated best solutions at each iteration. Typically, trajectory-based methods are more centered on search exploitation (intensification), whereas population-based methods focus on exploration (diversification) [11]. The second categorization, nature-inspired versus non-nature-inspired, is depicted in Figure 1: nature is the inspiration source for the former, while the latter draws on sources other than nature, including social and emotional phenomena [12].

Figure 1: Classification of metaheuristic algorithms.

Nature-inspired MHs are nowadays the most popular and widely used algorithms due to their efficient strategies, including learning during iterations, sharing information across numerous agents, co-evolution, and self-organization, which are adapted from the best aspects of natural phenomena [13], [14], [15], [16]. They have been proven to solve complex real-world optimization problems, including truss design [17], [18], [19], wind turbine layout optimization [20], [21], aircraft wing design [17], [22], traveling salesman problems [23], [24], medical image processing [25], [26], feature selection [27], [28], [29], and many others. Nature has influenced numerous scholars in a variety of ways, making it a rich source of inspiration; as a result, nearly all new algorithms can be classified as nature-inspired. A few of the recently developed nature-inspired MHs include the whale optimization algorithm [30], salp swarm algorithm [31], snake optimizer [32], horse herd algorithm [33], Harris hawks algorithm [34], Lévy flight distribution algorithm [35], Ebola optimization search [36], reptile search algorithm [37], starfish optimization algorithm [38], ship rescue optimization algorithm [39], and Aquila optimizer [40].

MHs are developed as black-box algorithms or generic frameworks that harmonize the search exploration and exploitation mechanisms so that they can be applied to almost any optimization issue. The exploration strategy is in charge of probing diverse regions of the search space for new promising locations, while exploitation intensifies the search around the best solutions found so far [41], [42]. However, this balance is missing in most of the developed MHs, which often leads to their inferior performance on real-world high-dimensional problems [43]. As MHs are stochastic, they cannot guarantee globally optimal, high-quality solutions. Moreover, poor convergence rates, stranding in local solutions, and high computing costs still plague MHs, especially for non-linear complex problems [19], [44]. Typically, the solution quality of MHs is determined by parameter settings, which are often based on standard values already used to solve other problems; this does not, however, ensure the same effectiveness on all problems, and thus appropriate parameter control settings are required [45]. According to the "no free lunch" theorem, no single MH can solve every category of optimization problem [46]. This theorem and the other aforementioned factors are the main motivation for scholars to develop novel high-performance optimization methods that can solve practical, challenging design optimization problems.

The material generation algorithm (MGA) is a recent MH inspired by the chemical reaction processes involved in the creation of new materials in chemistry [47]. Based on preliminary testing and evaluation, MGA was found to be more effective than other popular algorithms in solving numerical benchmarks and engineering design problems. Nonetheless, the MGA algorithm exhibits premature convergence during the search process when evaluated on benchmark functions, suggesting that it cannot effectively manage both exploration and exploitation. It is also recognized that the primary approach has limitations, such as stagnation or premature convergence, even though it has theoretical potential that could be improved further [10]. Furthermore, in algorithms of this nature, a higher degree of exploration potential corresponds to a relatively lower ability to exploit, and vice versa [10]. Therefore, achieving a suitable equilibrium between the algorithm's potential for global diversification and local intensification is crucial to obtaining superior outcomes. Since MGA is a recently developed optimizer, discovering modifications that can enhance its efficiency and performance is always intriguing.

In light of the challenges and opportunities mentioned above, a refined version of MGA, called MGA-OBL-LP, has been introduced to enhance its performance. The primary contributions of the study are:

  1. An opposition-based learning mechanism is employed in the basic MGA to control and improve population diversity, which allows for a more thorough exploration of the entire search space.

  2. The Laplace crossover operator is incorporated to improve the ability to exploit local information and effectively search the solution space.

  3. The effectiveness of MGA-OBL-LP was evaluated by applying it to four real-world engineering problems in different domains, including mechanical, civil, and industrial design.

  4. The performance of MGA-OBL-LP was compared with that of six other established metaheuristic algorithms.

The rest of the paper is organized as follows: Section 2 presents the details of the fundamental MGA method, Section 3 mathematically illustrates the opposition-based learning and Laplacian crossover enhancements, Section 4 applies the proposed algorithm to engineering design problems and discusses the results, and Section 5 presents the overall conclusions and prospects.

2 MGA algorithm

The MGA optimizer is constructed with inspiration from various chemical reactions, chain reactions, and the specifications of chemical compounds. The generation of new material by combining two or more existing materials to improve its characteristics is also a benchmark concept behind the algorithm. Ultimately, three concepts (composites, chemical reactions, and stability) were involved in structuring the MGA. The mathematical structure is described in the following sub-sections [47].

2.1 Initialization of MGA

In this phase, the initial materials (Mt) are generated as random solutions whose components are elements of the chemical periodic table; the decision variables of the problem are thus treated as periodic table elements (PTEs). The detailed matrix structure can be found in the literature. The initialization of a PTE is modeled as Equation (1) [47]:

(1) PTE_i^j(0) = PTE_{i,min}^j + rand(0,1) · (PTE_{i,max}^j − PTE_{i,min}^j)

where rand(0,1) is a uniform random number in (0, 1), PTE_i^j(0) is the initial value of the j-th element of the i-th material, and PTE_{i,min}^j and PTE_{i,max}^j denote its lower and upper bounds, respectively.
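The initialization phase can be sketched in a few lines of Python (a minimal illustration under the notation above; the function name and NumPy usage are my own, not taken from a reference implementation):

```python
import numpy as np

def initialize_materials(n_materials, n_elements, lower, upper, rng=None):
    """Eq. (1): PTE_i^j(0) = PTE_min + rand(0,1) * (PTE_max - PTE_min).

    Returns an (n_materials x n_elements) matrix; each row is one
    candidate material, each column one periodic-table element
    (decision variable)."""
    rng = np.random.default_rng(rng)
    lower = np.asarray(lower, dtype=float)
    upper = np.asarray(upper, dtype=float)
    return lower + rng.random((n_materials, n_elements)) * (upper - lower)
```

Each row can then be evaluated by the objective function to rank the initial materials.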

2.2 Chemical compound modeling

Chemical compounding, which includes sharing, losing, or gaining electrons, can be triggered by external excitation of PTEs in many ways, for instance, by light or photon excitation, magnetic fields, or other individual electrons. This may result in ionic or covalent bonds with other elements of the periodic table, converting them into optimized compounds. Elements selected from the initial population of materials are modeled according to probability theory, and new elements are generated using Equation (2) [47].

(2) PTE_new^k = PTE_{r1}^{r2} ± e

(3) Mt_new^1 = [PTE_new^1, PTE_new^2, …, PTE_new^k, …, PTE_new^d]

where PTE_new^k is the newly generated k-th element and PTE_{r1}^{r2} is an element selected at random from the available materials; e is the probabilistic factor (electron gain or loss) that governs the modeling, and the random integers r1 and r2 range over [1, n] and [1, d], respectively. The newly generated elements are then assembled into the new material Mt_new^1 of Equation (3) [47], which has superior characteristics compared to the previous one.
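A sketch of this compound-generation step follows. The paper models the gain/loss term e probabilistically; the zero-mean Gaussian choice, the sign draw, and all names here are illustrative assumptions, not the paper's exact scheme:

```python
import numpy as np

def generate_compound(materials, e_scale=0.1, rng=None):
    """Eqs. (2)-(3): build one new material element by element.

    Each element k copies the value at a random material r1 and random
    position r2, then gains or loses a small amount e (electron
    gain/loss analogue)."""
    rng = np.random.default_rng(rng)
    n, d = materials.shape
    new_material = np.empty(d)
    for k in range(d):
        r1 = rng.integers(n)                  # random material index
        r2 = rng.integers(d)                  # random element index
        e = e_scale * abs(rng.standard_normal())
        sign = rng.choice([-1.0, 1.0])        # the "+/-" in Eq. (2)
        new_material[k] = materials[r1, r2] + sign * e
    return new_material
```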

2.3 Chemical reactions modeling

In this phase, the process of obtaining new materials from selected reactants, a chemical reaction, is idealized. A random integer l is defined based on the number of materials that take part in the chemical reaction; it identifies the location of the generated material in the initial material matrix. Additionally, a participation factor p_m is considered, which specifies the involvement of the various materials and their content. The material resulting from the chemical reaction is modeled by Equation (4) [47].

(4) Mt_new^2 = (Σ_{m=1}^{l} p_m · Mat_m^j) / (Σ_{m=1}^{l} Mat_m^j),  j = 1, 2, 3, …, n

where Mat_m denotes the m-th randomly selected material from the material matrix and p_m is its participation factor.
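Equation (4) can be sketched as follows. The reactant-selection scheme and the uniform draw of the participation factors are my assumptions; the element-wise fraction follows Eq. (4) directly, so the element sums over the chosen reactants must be nonzero:

```python
import numpy as np

def chemical_reaction(materials, l=2, rng=None):
    """Eq. (4): combine l randomly chosen materials into one new material.

    For each element j the new value is
    sum_m p_m * Mat_m^j / sum_m Mat_m^j."""
    rng = np.random.default_rng(rng)
    n, _ = materials.shape
    idx = rng.choice(n, size=l, replace=False)   # reactant materials
    reactants = materials[idx]
    p = rng.random(l)                            # participation factors
    return (p[:, None] * reactants).sum(axis=0) / reactants.sum(axis=0)
```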

2.4 Chemical stability modeling

The newly developed materials are chemically stable and of better quality, which provides the capability to balance local and global search. Accordingly, the material pool is updated by retaining the best solutions and discarding the worst, according to the stability and characteristics of the newly developed materials. The optimized materials obtained from the three phases are modeled as follows [47].

(5) Mt = [Mt_1; Mt_2; …; Mt_n; Mt_new^1; Mt_new^2]

where Mt_new^1 and Mt_new^2 are the newly developed materials from the three phases of MGA; after evaluation, the pool is truncated back to the n best materials.
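The stability-based update of Eq. (5) amounts to appending the new materials to the pool and keeping the n fittest, as in this greedy elitist sketch (minimization assumed; names are illustrative):

```python
import numpy as np

def update_population(materials, fitness_fn, new_materials):
    """Eq. (5): stack the current materials with the newly generated
    ones, rank by fitness, and keep the best n (minimization)."""
    n = materials.shape[0]
    pool = np.vstack([materials, np.atleast_2d(new_materials)])
    scores = np.apply_along_axis(fitness_fn, 1, pool)
    return pool[np.argsort(scores)[:n]]
```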

3 Opposition-based learning (OBL) and Laplacian crossover (LP) techniques

In order to reach the global optimum effectively, the base algorithm is enhanced with the OBL and LP techniques. The OBL technique improves the probability of attaining the best solution by generating two candidate solutions at once: if Y is a candidate solution for the given objective function, then the opposite of Y is also considered a solution for the same function. A detailed treatment of OBL is given by Wang et al. [48]. Mathematically, for a real number Y in the interval [a, b], its opposite is given by Equation (6).

(6) Y′ = a + b − Y

where a and b are the interval bounds and Y′ denotes the opposite solution. In a D-dimensional search domain, Equation (6) generalizes to Equation (7).

(7) Y′_j = a_j + b_j − Y_j,  j = 1, 2, …, D

In this technique, the candidate solution is selected from the set of solutions (i.e., Y = (Y_1, Y_2, …, Y_D)) based on the fitness function: both Y and Y′ are evaluated, and whichever has the better fitness value is retained in the population.
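The OBL selection step can be sketched for a whole population as follows (minimization assumed; names are illustrative):

```python
import numpy as np

def opposition_based_selection(Y, fitness_fn, lower, upper):
    """Eqs. (6)-(7): for each candidate Y, form its opposite
    a + b - Y and keep whichever of the pair has better fitness."""
    Y = np.atleast_2d(np.asarray(Y, dtype=float))
    opposite = np.asarray(lower) + np.asarray(upper) - Y   # Eq. (7)
    f_y = np.apply_along_axis(fitness_fn, 1, Y)
    f_opp = np.apply_along_axis(fitness_fn, 1, opposite)
    better = (f_opp < f_y)[:, None]      # True where the opposite wins
    return np.where(better, opposite, Y)
```

For a sphere objective on [−1, 5], for example, the candidate (4,) loses to its opposite (0,), which is retained.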

Deep and Thakur [49] proposed a crossover technique named Laplacian crossover for improving the search dynamics of any MH algorithm. It is founded on two functions, the density and distribution functions of the Laplace distribution, given in Equations (8) and (9), respectively.

(8) f(x) = (1/(2b)) · exp(−|x − a| / b),  −∞ < x < ∞

(9) F(x) = (1/2) · exp((x − a)/b) for x ≤ a;  F(x) = 1 − (1/2) · exp(−(x − a)/b) for x > a

where a ∈ ℝ is the location parameter and b > 0 is the scale parameter. A pair of offspring is generated from the parents, placed symmetrically with respect to the parents' positions; the value of b decides how far from the parents the offspring lie.
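Following Deep and Thakur, the operator draws a Laplace-distributed number β and places the offspring as y1 = x1 + β|x1 − x2| and y2 = x2 + β|x1 − x2|. A sketch (the parameter defaults a = 0, b = 0.5 are illustrative):

```python
import numpy as np

def laplace_crossover(x1, x2, a=0.0, b=0.5, rng=None):
    """Laplace crossover: y1 = x1 + beta*|x1 - x2|,
    y2 = x2 + beta*|x1 - x2|, with beta Laplace(a, b) distributed.

    Smaller b keeps offspring near the parents (exploitation);
    larger b spreads them out (exploration)."""
    rng = np.random.default_rng(rng)
    x1 = np.asarray(x1, dtype=float)
    x2 = np.asarray(x2, dtype=float)
    u = rng.random(x1.shape)
    r = rng.random(x1.shape)
    beta = np.where(r <= 0.5, a - b * np.log(u), a + b * np.log(u))
    gap = np.abs(x1 - x2)
    return x1 + beta * gap, x2 + beta * gap
```

Both offspring share the same offset β·|x1 − x2| per dimension, so they sit symmetrically with respect to the parent pair.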

4 Applications of MGA-OBL-LP to engineering design problems

A numerical investigation of the proposed algorithm was carried out in the previous section. To further assess its effectiveness, MGA-OBL-LP is applied to four benchmark engineering problems, and the statistical results for each design problem are compared with those of six effective MH algorithms.

4.1 Compression/tension spring design problem

The objective is to minimize the weight of a compression/tension spring (Arora, 1989). A 3D schematic of the spring is shown in Figure 2. The variables to be optimized are the spring wire diameter d, the coil diameter D, and the number of coils N. Critical constraints on deflection, surge frequency, and shear stress are imposed on the objective function.
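For reference, the standard benchmark formulation of this problem from the optimization literature (weight f = (N + 2)·D·d² subject to four inequality constraints on deflection, shear stress, surge frequency, and outer diameter) can be written as follows; this is the common literature definition, not code from the paper:

```python
def spring_weight(d, D, N):
    """Objective: weight of the spring, f = (N + 2) * D * d**2."""
    return (N + 2.0) * D * d ** 2

def spring_constraints(d, D, N):
    """Standard constraints g_i(x) <= 0 (feasible when non-positive):
    g1 deflection, g2 shear stress, g3 surge frequency, g4 outer diameter."""
    g1 = 1.0 - (D ** 3 * N) / (71785.0 * d ** 4)
    g2 = ((4.0 * D ** 2 - d * D) / (12566.0 * (D * d ** 3 - d ** 4))
          + 1.0 / (5108.0 * d ** 2) - 1.0)
    g3 = 1.0 - 140.45 * d / (D ** 2 * N)
    g4 = (D + d) / 1.5 - 1.0
    return [g1, g2, g3, g4]
```

Evaluating the best MGA-OBL-LP design from Table 1 (d = 0.051701, D = 0.357029, N = 11.27074) reproduces the reported objective value of about 0.012665 with the constraints numerically satisfied.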

Figure 2: Schematic view of the tension/compression spring problem.

Table 1 presents the best output obtained by MGA-OBL-LP and each compared optimizer for the objective function and the selected design variables (d, D, N). MGA-OBL-LP obtained the superior functional value of 0.012665. Statistical results for all MHs are displayed in Table 2. It can be noted that MGA-OBL-LP finds the best objective function value of 0.0126652358401 for the spring test example with the least SD (3.589E–06). Moreover, a 100 % success rate (SR) is observed for MGA-OBL-LP, in contrast to the rest of the MHs.

Table 1:

Best solutions obtained by the applied algorithms on compression/tension spring design problem.

Variables Algorithms
MGA- OBL-LP MGA MRFO Chimp BMO SAR GBO
d 0.051701 0.051689061 0.051444 0.050231 0.052448 0.051746 0.051814
D 0.357029 0.35671774 0.350851 0.322049 0.375256 0.358085 0.359741
N 11.27074 11.28896576 11.64295 13.72903 10.28056 11.21006 11.11389
f best 0.01266523 0.01266523 0.012667 0.012781 0.012677 0.012666 0.012665
Table 2:

Comparison of various statistical values by the applied algorithms on compression/tension spring design problem.

Methods Best Mean Worst SD FEs
MGA- OBL-LP 0.0126652358401 0.0126674012369 0.0126743256987 3.589E–06 18,000
MGA 0.012665523 0.0127664474257 0.0131204224119 1.124E–04 18,000
Chimp 0.0127811837289 0.0135908682588 0.0167672527834 9.526E–04 18,000
BMO 0.0126770592778 0.0130432883699 0.0154604535736 4.650E–04 18,000
SAR 0.0126664622629 0.0127067731478 0.0127517074681 2.279E–05 18,000
GBO 0.0126655220331 0.0127664474257 0.0131204224119 1.124E–04 18,000
MRFO 0.0126678422932 0.0127746820923 0.0130644512874 8.452E–05 18,000

4.2 Design problem of spur gear (SG)

In this problem, weight minimization is the major objective. Five design variables, Z_1 (number of teeth on the pinion), H (height), b (face width), and d_1, d_2 (diameters of the pinion and wheel shafts), are considered while keeping the module m constant. Eight inequality constraints are accounted for in the problem definition.

Table 3 depicts the best design variable and functional values for all MHs for the second case of the SG design problem. According to Table 4, which illustrates statistical results, the proposed algorithm found superior best and mean objective function values amongst all other MHs for SG-Case-2 (Figure 3).

Table 3:

Best solutions obtained by the applied algorithms on spur gear design problem-2.

Variables Algorithms
MGA-OBL-LP MGA MRFO Chimp BMO SAR GBO
b 26.893775 26.893775986 26.893775 26.906755 26.893789 26.894713 26.8937758
d 1 30.000000 30.000000000 29.999999 30.000000 30.000000 29.997272 30.0000000
d 2 17.174962 17.189388216 17.174962 17.821566 17.174990 17.179803 17.1749623
Z 1 18.000000 18.000000000 18.448848 18.000000 18.000129 18.438884 18.0147456
m 2 2.0000000000 2 2 2 2 2
H 400 400.00000000 399.79672 400 400 399.89866 399.989880
f best 1,538.9446 1,539.0543551 1,538.9446 1,544.3255 1,538.9454 1,539.0443 1,538.94468
Table 4:

Comparison of various statistical values by the applied algorithms on spur gear design problem-2.

Algorithms Best Mean Worst SD FEs SR (%)
MGA-OBL-LP 1,538.9446818 1,538.9446818 1,538.9446818 1.304E–12 25,000 100
MGA 1,539.0543551 1701.9047796 2,877.9899055 192.996307 25,000 0
Chimp 1,544.3255052 1,573.6606289 1,683.0684900 1.972E+01 25,000 0
BMO 1,538.9454182 1,606.4447272 2,870.1569595 2.620E+02 25,000 26
SAR 1,539.0443250 1,539.6226677 1,541.6678840 5.622E–01 25,000 0
GBO 1,538.9446818 1,538.9446818 1,538.9446818 1.271E–12 25,000 100
MRFO 1,538.9446871 1,547.9319207 1,608.8195371 1.597E+01 25,000 66
Figure 3: Schematic view of spur gear geometry of web-type structure.

4.3 Reinforced concrete beam design

The detailed analysis of reinforced concrete beam design, with overall cost reduction as the prime optimization objective, is discussed in the literature. A 3D view of the simply supported RCB (reinforced concrete beam) is shown in Figure 4; the design variables are h (depth of the beam), a (area of reinforcement), and w (breadth of the beam).

Figure 4: Reinforced concrete beam structure.

Table 5 depicts the best output achieved for the considered parameters by each optimizer. The statistical results in Table 6 show that MGA-OBL-LP found the best value for the set objective, and it was superior to the rest in terms of SR (100 %), SD (1.102E–9), and mean value (359.208). SAR, GBO, and MRFO provide competitive SRs of 50 %, 100 %, and 100 %, respectively, with best values close to that achieved by MGA-OBL-LP.

Table 5:

Best solutions obtained by the applied algorithms on the reinforced concrete beam design problem.

Variables Algorithms
MGA-OBL-LP MGA COA Chimp BMO SAR GBO
a 2.8987509 6.320000000 3.0300774 2.8611962 3.3410072 3.08740651 2.6790078
w 33.912841 34.00000000 33.851854 33.723199 33.710545 33.7582627 34.038247
h 8.5000000 8.521888534 8.5000000 8.5140487 8.5000152 8.50000377 8.5000000
f 359.20800 359.6545261 359.20800 359.49459 359.20831 359.208077 359.20800
Table 6:

Comparison of various statistical values by the applied algorithms on the reinforced concrete beam design problem.

Algorithms Best Mean Worst SD FEs SR (%)
MGA-OBL-LP 359.208000000 359.208000000 359.20800000 1.102E–9 10,000 100
MGA 359.65452611275 368.5330618233 408.881477608 7.91878806 10,000 0
Chimp 359.494593839 362.809805892 366.28534467 1.810E+00 10,000 0
BMO 359.208310509 359.968274751 362.65935170 1.281E+00 10,000 10
SAR 359.208077041 359.209961559 359.21570379 2.135E–03 10,000 50
GBO 359.208000000 359.208003787 359.20817500 2.475E–05 10,000 100
MRFO 359.208000000 359.208000013 359.20800040 6.396E–08 10,000 100

4.4 Piston lever design

The objective of this problem is to minimize the volume of oil inside the PL (piston lever) as it is lifted from 0° to 45°. The geometry and design variables are illustrated in Figure 5.

Figure 5: Piston problem design.

Table 7 lists the optimized operating variables of the PL and the best objective function value achieved by each algorithm; detailed statistical data are given in Table 8. MGA-OBL-LP shows its robustness in terms of best functional value (8.41269832358378) and SD (1.254E–01) with a 100 % SR. GBO and MRFO are also competitive in terms of best-achieved values, with a 26 % SR each.

Table 7:

Best solutions obtained by the applied algorithms on piston lever problem.

Variables Algorithms
MGA-OBL-LP MGA MRFO Chimp BMO SAR GBO
Q 0.0500000 0.050000000 0.0500052 0.0541652 0.0500000 0.0540348 0.0500000
L 2.0415135 2.052477907 2.0419650 2.0550735 2.0470086 2.0759710 2.0415135
B 120.00000 120.0000000 119.97263 120.00000 119.97163 119.99921 119.99999
H 4.0830271 4.084141617 4.0839231 4.0848070 4.0845040 4.1004079 4.0830271
f 8.4126983 8.460640111 8.4182520 8.5127196 8.4405427 8.6599430 8.4126983
Table 8:

Comparison of various statistical values by the applied algorithms on piston lever problem.

Algorithms Best Mean Worst SD FEs SR (%)
MGA-OBL-LP 8.41269832358378 8.41269838628300 8.41269882829863 1.254E–01 15,000 100
MGA 8.460640111435 9.0194381502393 9.7828137073630 0.31692008 25,000 6
Chimp 8.51271963850081 8.85733997255565 9.52960452746221 2.114E–01 15,000 0
BMO 8.44054270414359 105.901248236575 544.937949743053 1.218E+02 15,000 4
SAR 8.65994300451169 9.67810281867552 11.6398901625688 7.693E–01 15,000 0
GBO 8.41269836293783 126.117133205497 167.472820984244 7.048E+01 15,000 26
MRFO 8.41825207072613 40.7614060388657 167.561620170694 5.349E+01 15,000 26

5 Conclusions

The present article introduced an improved optimization algorithm, the material generation algorithm augmented with opposition-based learning and Laplacian crossover. In some challenging cases, metaheuristic algorithms find it difficult to locate the global optimum. To overcome such cases, the material generation algorithm was augmented with an opposition-based technique together with the Laplacian crossover operator, which improves the efficiency of the algorithm in exploring the search domain and helps it reach the best global solution with a superior convergence rate. This suggests the wide applicability of the algorithm and, at the same time, a broad scope for researchers to adopt techniques such as Laplacian crossover, chaotic maps, and opposition-based learning to improve the efficiency of base algorithms in the future.


Corresponding author: Ali Riza Yildiz, Department of Mechanical Engineering, Bursa Uludag Universitesi, Bursa, Türkiye, E-mail:

About the authors

Pranav Mehta

Pranav Mehta is an Assistant Professor at the Department of Mechanical Engineering, Dharmsinh Desai University, Nadiad-387001, Gujarat, India. He is currently a PhD research scholar at Dharmsinh Desai University, Nadiad, Gujarat, India. His major research interest includes metaheuristics techniques, multi-objective optimization, solar-thermal technologies, and renewable energy.

Sumit Kumar

Sumit Kumar received the BEng degree (Hons.) in mechanical engineering from Dr. A.P.J. Abdul Kalam Technical University, Lucknow, India, in 2012, and the Meng degree (Hons.) in design engineering from the Malaviya National Institute of Technology (NIT), Jaipur, India, in 2015. He is currently a PhD research scholar with the College of Sciences and Engineering, Australian Maritime College, University of Tasmania, Launceston, Australia. His major research interests include metaheuristics techniques, multiobjective optimization, evolutionary algorithm and renewable energy systems.

Sadiq M. Sait

Dr. Sadiq M. Sait received his Bachelor’s degree in Electronics Engineering from Bangalore University, India, in 1981, and his Master’s and PhD degrees in Electrical Engineering from the King Fahd University of Petroleum and Minerals (KFUPM), Dhahran, in 1983 and 1987, respectively. He is currently a Professor of Computer Engineering and Director of the Center for Communications and IT Research, KFUPM, Dhahran, Saudi Arabia.

Betül S. Yildiz

Dr. Betül S. Yildiz is an Associate Professor at Bursa Uludağ University, Bursa, Turkey. Dr. Betül Sultan Yıldız completedher BSc and MSc degrees at Uludağ University, Bursa, Turkey, and received her PhD in Mechanical Engineering from Bursa Technical University, Turkey. Her research interests are optimal design, shape optimization, topology optimization, topography optimization, structural optimization methods, meta-heuristic optimization algorithms, and applications to industrial problems.

Ali Riza Yildiz

Dr. Ali Riza Yildiz is a Professor in the Department of Mechanical Engineering, Bursa Uludağ University, Bursa, Turkey. His research interests are the finite element analysis of structural components, light-weight design, vehicle design, vehicle crashworthiness, shape and topology optimization of vehicle components, meta-heuristic optimization techniques, and additive manufacturing.

  1. Research ethics: Not applicable.

  2. Informed consent: Not applicable.

  3. Author contributions: All authors have accepted responsibility for the entire content of this manuscript and approved its submission.

  4. Use of Large Language Models, AI and Machine Learning Tools: None declared.

  5. Conflict of interest: The authors state no conflict of interest.

  6. Research funding: None declared.

  7. Data availability: Not applicable.

Published Online: 2025-03-06
Published in Print: 2025-04-28

© 2025 the author(s), published by De Gruyter, Berlin/Boston

This work is licensed under the Creative Commons Attribution 4.0 International License.
