Objective functions for which an analytical solution is difficult or time-consuming to obtain are typically solved using stochastic optimization algorithms, which compute an approximate solution. Over a given search space, an objective function may have one or more local optima in addition to the global optimum, and when algorithms are compared, one may be more effective than the others at solving certain objective functions. The most important factors affecting the success of an optimization algorithm are the size of the search space and the complexity of the objective function. Reaching the global optimum in a very large search space is difficult, and for complex objective functions that have many local optima, or where the differences between the global optimum and the local optima are very small, the probability of becoming trapped in a local optimum is high. Existing optimization algorithms could therefore perform better if the search space were scanned more thoroughly. To achieve this aim, we present a novel algorithm, called the Army-Inspired Genetic Algorithm (AIGA), which is inspired by military movements. Unlike other optimization algorithms, the presented algorithm searches for the global optimum by dividing the entire search space into territories instead of searching it as a single region. This reduces the probability of getting trapped in a local optimum and increases the probability of finding the global optimum. The presented algorithm was tested on well-known benchmark problems, and the results show that AIGA is more efficient than traditional algorithms at finding the global optimum.
Stochastic optimization, Evolutionary algorithm, Genetic algorithm, Army-inspired strategy, Local optima
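The territory-partitioning idea described in the abstract can be illustrated with a short sketch. The code below is not the authors' AIGA implementation; it is a minimal, hypothetical Python example assuming only the idea stated above: the search space is split into axis-aligned territories, a plain genetic algorithm runs inside each territory, and the best solution found across all territories is returned. The benchmark function (Rastrigin) and all parameter names (`splits`, `pop_size`, `generations`) are illustrative assumptions.

```python
# Illustrative sketch of a territory-based search, NOT the authors' AIGA.
import itertools
import math
import random

def rastrigin(x):
    # A standard multimodal benchmark with many local optima (minimum 0 at the origin).
    return 10 * len(x) + sum(xi * xi - 10 * math.cos(2 * math.pi * xi) for xi in x)

def ga_in_territory(objective, bounds, pop_size=30, generations=100):
    # Run a plain GA restricted to one territory given by per-dimension (low, high) bounds.
    def clip(v, lo, hi):
        return min(hi, max(lo, v))

    pop = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=objective)
        parents = pop[: pop_size // 2]                  # truncation selection
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = random.sample(parents, 2)
            child = []
            for (ai, bi), (lo, hi) in zip(zip(a, b), bounds):
                ci = (ai + bi) / 2                       # arithmetic crossover
                ci += random.gauss(0, 0.1 * (hi - lo))   # Gaussian mutation
                child.append(clip(ci, lo, hi))           # keep the child inside its territory
            children.append(child)
        pop = parents + children
    return min(pop, key=objective)

def territory_search(objective, lower=-5.12, upper=5.12, dim=2, splits=3):
    # Split each axis into `splits` intervals, search every resulting territory,
    # and return the best solution found across all of them.
    width = (upper - lower) / splits
    cells = [(lower + i * width, lower + (i + 1) * width) for i in range(splits)]
    best, best_val = None, float("inf")
    for box in itertools.product(cells, repeat=dim):     # splits**dim territories
        candidate = ga_in_territory(objective, list(box))
        value = objective(candidate)
        if value < best_val:
            best, best_val = candidate, value
    return best, best_val

if __name__ == "__main__":
    solution, value = territory_search(rastrigin)
    print("best point:", solution, "objective:", value)
```

Because each territory is searched by its own population, a deep local optimum in one region cannot absorb the whole population, which is the intuition behind the reduced risk of premature convergence claimed in the abstract.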
Primary Language | English |
---|---|
Subjects | Civil Engineering (Other) |
Journal Section | Articles |
Early Pub Date | July 5, 2024 |
Publication Date | July 28, 2024 |
Submission Date | December 31, 2023 |
Acceptance Date | February 29, 2024 |
Published in Issue | Year 2024 Volume: 8 Issue: 3 |