DSpace/Manakin Repository

A hybridizing-enhanced differential evolution for optimization


dc.rights.license CC BY eng
dc.contributor.author Ghasemi, Mojtaba cze
dc.contributor.author Zare, Mohsen cze
dc.contributor.author Trojovský, Pavel cze
dc.contributor.author Zahedibialvaei, Amir cze
dc.contributor.author Trojovská, Eva cze
dc.date.accessioned 2025-12-05T12:48:18Z
dc.date.available 2025-12-05T12:48:18Z
dc.date.issued 2023 eng
dc.identifier.issn 2376-5992 eng
dc.identifier.uri http://hdl.handle.net/20.500.12603/1805
dc.description.abstract Differential evolution (DE) is among the most widely used optimization algorithms and has appeared in many improved and modern versions in recent years. Its main drawback is generally a low convergence rate. In this article, the gray wolf optimizer (GWO) is used to accelerate the convergence rate and improve the final optimal results of the DE algorithm. The resulting algorithm is called Hunting Differential Evolution (HDE). The proposed HDE algorithm combines the convergence speed of the GWO algorithm with the strong searching capability of the DE algorithm. Furthermore, by adjusting the crossover rate and mutation probability parameters, the algorithm can be tuned to emphasize the strengths of either of the two component algorithms. HDE/current-to-rand/1 performed best on the CEC-2019 functions compared to the other eight HDE variants. HDE/current-to-best/1 is likewise the best-performing proposed HDE variant when compared with seven improved algorithms on the CEC-2014 functions, outperforming them on 15 test functions. Furthermore, jHDE performs well, improving on 17 of these functions compared with jDE. The simulations indicate that the proposed HDE algorithm provides reliable outcomes, finding optimal solutions with a rapid convergence rate while avoiding local minima, compared to the original DE algorithm. (An illustrative sketch of such a DE/GWO hybrid step is given after the record fields below.) eng
dc.format p. "Article number: e1420" eng
dc.language.iso eng eng
dc.publisher PeerJ Inc eng
dc.relation.ispartof PeerJ Computer Science, volume 9, issue: June 2023 eng
dc.subject CEC-2019 benchmark functions eng
dc.subject Differential evolution eng
dc.subject Exploitation eng
dc.subject Exploration eng
dc.subject Generalized gray wolf optimization eng
dc.subject Gray wolf optimizer eng
dc.subject Hybrid optimization eng
dc.subject Metaheuristic eng
dc.subject Optimization eng
dc.subject Stochastic optimization eng
dc.title A hybridizing-enhanced differential evolution for optimization eng
dc.type article eng
dc.identifier.obd 43880106 eng
dc.identifier.doi 10.7717/peerj-cs.1420 eng
dc.publicationstatus postprint eng
dc.peerreviewed yes eng
dc.source.url https://peerj.com/articles/cs-1420/ cze
dc.relation.publisherversion https://peerj.com/articles/cs-1420/ eng
dc.rights.access Open Access eng
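
The abstract above describes HDE only at a high level: GWO-style guidance is layered on a DE population, and the crossover rate and mutation probability balance the two components. The sketch below is a minimal, hypothetical illustration in Python of one way such a DE/GWO hybrid step can be wired together; the blend parameter p_gwo, the DE/rand/1 mutation, the three-leader GWO move, and the sphere test objective are assumptions for illustration, not the operators actually defined in the paper.

import numpy as np

def sphere(x):
    """Toy objective for the demo (not from the paper)."""
    return float(np.sum(x ** 2))

def hde_sketch(objective, dim=10, pop_size=30, iters=200,
               F=0.5, CR=0.9, p_gwo=0.5, bounds=(-5.0, 5.0), seed=0):
    """Illustrative DE/GWO hybrid loop (assumed operators, not the paper's HDE)."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    pop = rng.uniform(lo, hi, size=(pop_size, dim))
    fit = np.array([objective(x) for x in pop])

    for t in range(iters):
        # GWO-style leaders: the three best individuals (alpha, beta, delta).
        order = np.argsort(fit)
        alpha, beta, delta = pop[order[:3]]
        a = 2.0 * (1.0 - t / iters)  # GWO coefficient decaying linearly to 0

        for i in range(pop_size):
            if rng.random() < p_gwo:
                # GWO-style move: average of attractions toward the three leaders.
                cand = np.zeros(dim)
                for leader in (alpha, beta, delta):
                    A = a * (2.0 * rng.random(dim) - 1.0)
                    C = 2.0 * rng.random(dim)
                    D = np.abs(C * leader - pop[i])
                    cand += leader - A * D
                cand /= 3.0
            else:
                # DE/rand/1 mutation followed by binomial crossover.
                r1, r2, r3 = rng.choice(
                    [j for j in range(pop_size) if j != i], 3, replace=False)
                mutant = pop[r1] + F * (pop[r2] - pop[r3])
                cross = rng.random(dim) < CR
                cross[rng.integers(dim)] = True  # force at least one gene from the mutant
                cand = np.where(cross, mutant, pop[i])

            cand = np.clip(cand, lo, hi)
            f_cand = objective(cand)
            if f_cand < fit[i]:  # greedy DE-style selection
                pop[i], fit[i] = cand, f_cand

    best = int(np.argmin(fit))
    return pop[best], fit[best]

if __name__ == "__main__":
    x_best, f_best = hde_sketch(sphere)
    print(f"best f = {f_best:.6f}")

Running the script prints the best sphere value found. In an actual experiment one would substitute the CEC-2014/CEC-2019 benchmark functions and the specific HDE mutation strategies (e.g., current-to-rand/1, current-to-best/1) evaluated in the paper.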

