Research Article

A novel meta-optimizer for evolutionary algorithms: bipolar mating tendency

Year 2022, Volume: 28, Issue: 2, 313-323, 30.04.2022

Abstract

Recent studies show that the performance of Evolutionary Algorithms often depends on choosing appropriate parameter configurations. Researchers have therefore typically tuned these parameters either by consulting similar studies in the literature or by trying candidate values one by one, as in Grid Search. However, manual parameter search is laborious and time-consuming, so meta-optimization techniques have become widely used methods for adjusting the parameters of an algorithm. These techniques fall into two broad classes: off-line, which tunes the parameters of an algorithm before it starts, and on-line, which tunes them while it runs. In this paper, the Bipolar Mating Tendency (BMT) algorithm is adopted as the selection method of a Genetic Algorithm (GA). The resulting algorithm, named GA-BMT, is used for the first time as an on-line meta-optimizer. The paper further applies two search algorithms (Grid Search, Coarse-to-Fine Search) and three meta-optimization methods (Standard GA, Particle Swarm Optimization, GA-BMT) to find the best parameter settings of the Standard GA (SGA) on 17 test functions, and compares their results. Non-parametric statistical tests, the Friedman and Wilcoxon Signed-Rank tests, were performed to assess the significance of the results. Taken together, the results show that GA-BMT delivers reasonable performance.
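The abstract distinguishes off-line tuning (choosing parameters before the run, as Grid Search does) from on-line tuning (adjusting them during the run, as GA-BMT does). As a rough illustration of the off-line case only, here is a minimal Python sketch: a bare-bones real-coded GA stands in for the SGA being tuned, and an exhaustive grid over mutation and crossover rates stands in for Grid Search. All names, parameter grids, and the sphere objective are assumptions for illustration; the BMT selection operator itself is not reproduced, since its definition is given in the full paper rather than in this abstract.

```python
import random

def sphere(x):
    """Sphere test function; global minimum 0 at the origin."""
    return sum(xi * xi for xi in x)

def run_sga(mutation_rate, crossover_rate, pop_size=30, generations=100,
            dim=10, bounds=(-5.0, 5.0), objective=sphere, seed=None):
    """A bare-bones real-coded GA standing in for the SGA being tuned.

    Hypothetical stand-in, not the paper's implementation. Returns the
    best fitness found with the given parameter configuration.
    """
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]

    def tournament():
        # Binary tournament selection: the fitter of two random picks wins.
        a, b = rng.sample(pop, 2)
        return a if objective(a) < objective(b) else b

    for _ in range(generations):
        offspring = []
        while len(offspring) < pop_size:
            p1, p2 = tournament(), tournament()
            if rng.random() < crossover_rate:
                # Uniform crossover: each gene comes from either parent.
                child = [g1 if rng.random() < 0.5 else g2
                         for g1, g2 in zip(p1, p2)]
            else:
                child = p1[:]
            # Gaussian mutation, clipped back into the search bounds.
            child = [min(hi, max(lo, g + rng.gauss(0.0, 0.1)))
                     if rng.random() < mutation_rate else g
                     for g in child]
            offspring.append(child)
        pop = offspring

    return min(objective(ind) for ind in pop)

# Off-line meta-optimization via Grid Search: try every configuration
# up front, keep the best one, then run the "real" experiment with it.
if __name__ == "__main__":
    best = min((run_sga(m, c, seed=42), m, c)
               for m in (0.01, 0.05, 0.1, 0.2)   # candidate mutation rates
               for c in (0.6, 0.7, 0.8, 0.9))    # candidate crossover rates
    print(f"best fitness {best[0]:.4g} at "
          f"mutation={best[1]}, crossover={best[2]}")
```

An on-line meta-optimizer such as GA-BMT would instead update `mutation_rate` and `crossover_rate` inside the generation loop while the search is running; that variant is deliberately not sketched here, because the BMT update rule is defined in the full paper. For the statistical comparison step, the Friedman and Wilcoxon Signed-Rank tests mentioned in the abstract are available in SciPy; the numbers below are purely illustrative placeholders, not the paper's results.

```python
from scipy.stats import friedmanchisquare, wilcoxon

# Placeholder best-fitness values per test function for three tuners.
ga     = [0.012, 0.310, 1.250, 0.054, 0.730]
pso    = [0.020, 0.280, 1.400, 0.049, 0.800]
ga_bmt = [0.008, 0.250, 1.100, 0.040, 0.690]

print(friedmanchisquare(ga, pso, ga_bmt))  # omnibus test across all tuners
print(wilcoxon(ga, ga_bmt))                # pairwise signed-rank follow-up
```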

References

  • [1] Lobo FG, Goldberg DE. “The parameter-less genetic algorithm in practice”. Information Sciences, 167(1-4), 217-232, 2004.
  • [2] Mercer RE, Sampson JR. “Adaptive search using a reproductive meta-plan”. Kybernetes, 7(3), 215-228, 1978.
  • [3] Grefenstette JJ. “Optimization of Control Parameters for Genetic Algorithms”. IEEE Transactions on Systems, Man, and Cybernetics, 16(1), 122-128, 1986.
  • [4] Freisleben B, Härtfelder M. “Optimization of genetic algorithms by genetic algorithms”. International Conference on Artificial Neural Nets and Genetic Algorithms-ANNGA’93, Innsbruck, Austria, 1-5 May 1993.
  • [5] Bäck T. “Parallel optimization of evolutionary algorithms”. The Third Conference on Parallel Problem Solving from Nature-PPSN III, Jerusalem, Israel, 9-14 October 1994.
  • [6] Hinterding R. “Gaussian mutation and self-adaption for numeric genetic algorithms”. IEEE International Conference on Evolutionary Computation, Perth, Australia, 29 November-1 December 1995.
  • [7] Keane AJ. “Genetic algorithm optimization of multi-peak problems: studies in convergence and robustness”. Artificial Intelligence Engineering, 9(2), 75-83, 1995.
  • [8] Cortez P, Rocha M, Neves J. “A meta-genetic algorithm for time series forecasting”. 10th Portuguese Conference on Artificial Intelligence, Porto, Portugal, 1-3 December 2001.
  • [9] Hong SH, Cornelius J, Wang Y, Pant K. “Fault compensation by online updating of genetic algorithm-selected neural network model for model predictive control”. Applied Science, 1(11), 1-16, 2019.
  • [10] Camilleri M, Neri F, Papoutsidakis M. “An algorithmic approach to parameter selection in machine learning using meta-optimization techniques”. WSEAS Transactions on Systems, 13(1), 203-212, 2014.
  • [11] Li H, Huang Z, Liu X, Zeng C, Zou P. “Multi-Fidelity meta-optimization for nature inspired optimization algorithms”. Applied Soft Computing, 2020. https://doi.org/10.1016/j.asoc.2020.106619.
  • [12] Łapa K. “Meta-optimization of multi-objective population-based algorithms using multi-objective performance metrics”. Information Sciences, 489, 193-204, 2019.
  • [13] Magliani F, Cagnoni S, Sani L, Prati A. “Genetic algorithms for the optimization of diffusion parameters in content-based image retrieval”. Proceedings of the 13th International Conference on Distributed Smart Cameras, Trento, Italy, 9-11 September 2019.
  • [14] Magliani F, Cagnoni S, Sani L, Prati A. “Diffusion Parameters Analysis in a Content-Based Image Retrieval Task for Mobile Vision”. Sensors, 20(16), 4449-4472, 2020.
  • [15] Matrenin PV, Sekaev VG. “Particle Swarm optimization with velocity restriction and evolutionary parameters selection for scheduling problem”. 2015 International Siberian Conference on Control and Communications, Omsk, Russia, 21-23 May 2015.
  • [16] Meissner M, Schmuker M, Schneider G. “Optimized particle swarm optimization (OPSO) and its application to artificial neural network training”. BMC Bioinformatics, 7(1), 1-11, 2006.
  • [17] Pedersen MEH, Chipperfield AJ. “Simplifying Particle Swarm Optimization”. Applied Soft Computing, 10(2), 618-628, 2010.
  • [18] Pedersen MEH, Chipperfield AJ. “Tuning Differential Evolution for Artificial Neural Networks”. Hvass Laboratories, Technical Report, HL0803, 2008.
  • [19] Mason K, Duggan J, Howley E. “A meta optimization analysis of particle swarm optimization velocity update equations for watershed management learning”. Applied Soft Computing, 62, 148-161, 2018.
  • [20] Jaafari A, Panahi M, Pham BT, Shahabi H, Bui DT, Rezaie F. “Meta optimization of an adaptive neuro-fuzzy inference system with grey wolf optimizer and biogeography-based optimization algorithms for spatial prediction of landslide susceptibility”. Catena, 175, 430-445, 2019.
  • [21] Chen W, Hong H, Panahi M, Shahabi H, Wang Y, Shirzadi A. “Spatial prediction of landslide susceptibility using GIS-based data mining techniques of ANFIS with whale optimization algorithm (WOA) and grey wolf optimizer (GWO)”. Applied Sciences, 9(18), 3755-3787, 2019.
  • [22] Birattari M, Stützle T, Paquete L, Varrentrapp K. “A racing algorithm for configuring metaheuristics”. GECCO 2002 Proceedings of the Genetic and Evolutionary Computation Conference, New York, USA, 9-13 July 2002.
  • [23] Balaprakash P, Birattari M, Stuetzle T. “Improvement strategies for the F-Race algorithm: Sampling design and iterative refinement”. Hybrid Metaheuristics, 4771, 108-122, 2007.
  • [24] Birattari M, Yuan Z, Balaprakash P, Stützle T. “F-Race and Iterated F-Race: An overview”. Institut de Recherches Interdisciplinaires et de Développements en Intelligence Artificielle, Technical Report, TR/IRIDIA/2009-018, 2009.
  • [25] Trujillo L, González EÁ, Galván E, Tapia JJ, Ponsich A. “On the analysis of hyper-parameter space for a genetic programming system with iterated F-Race”. Soft Computing, 24(19), 14757-14770, 2020.
  • [26] François O, Lavergne C. “Design of evolutionary algorithms - A statistical perspective”. IEEE Transactions on Evolutionary Computation, 5(2), 129-148, 2001.
  • [27] Nannen V, Eiben AE. “A method for parameter calibration and relevance estimation in evolutionary algorithms”. Proceedings of the 8th annual conference on Genetic and evolutionary computation, GECCO ’06, New York, USA, 8-12 July 2006.
  • [28] Adenso-Díaz B, Laguna M. “Fine-Tuning of algorithms using fractional experimental designs and local search”. Operations Research, 54(1), 99-114, 2006.
  • [29] Hutter F, Hoos HH, Stützle T. “Automatic algorithm configuration based on local search”. Proceedings of the Twenty-Second AAAI Conference on Artificial Intelligence, Vancouver, Canada, 22-26 July 2007.
  • [30] Bartz-Beielstein T, Lasarczyk CWG, Preuß M. “Sequential parameter optimization”. 2005 IEEE Congress on Evolutionary Computation, Edinburgh, UK, 2-5 September 2005.
  • [31] Hutter F, Hoos HH, Leyton-Brown K, Stützle T. “ParamILS: An automatic algorithm configuration framework”. Journal of Artificial Intelligence Research, 36, 267-306, 2009.
  • [32] Hutter F, Hoos HH, Leyton-Brown K. “Sequential model-based optimization for general algorithm configuration”. Learning and Intelligent Optimization-5th International Conference, LION 5, Rome, Italy, 17-21 January 2011.
  • [33] Branke J, Elomari J. “Meta-Optimization for parameter tuning with a flexible computing budget”. Proceedings of the Fourteenth International Conference on Genetic and Evolutionary Computation-GECCO ’12, Philadelphia, USA, 7-11 July 2012.
  • [34] Francesca G, Pellegrini P, Stützle T, Birattari M. “Off-line and on-line tuning: a study on operator selection for a memetic algorithm applied to the QAP”. European Conference on Evolutionary Computation in Combinatorial Optimization, Torino, Italy, 27-29 April 2011.
  • [35] Aburomman AA, Ibne Reaz M Bin. “A novel SVM-kNN-PSO ensemble method for intrusion detection system”. Applied Soft Computing, 38, 360-372, 2016.
  • [36] Meyer-Nieberg S, Beyer HG. “Self-adaptation in evolutionary algorithms”. Studies in Computational Intelligence, 54, 47-75, 2007.
  • [37] Bendre N, Marín HT, Najafirad P. “Learning from few samples: A survey”. arXiv, 2020. https://arxiv.org/pdf/2007.15484.pdf.
  • [38] Staelin C. “Parameter Selection for Support Vector Machines”. Hewlett-Packard Company, Technical Report, HPL-2002-354, 2003.
  • [39] Holland JH. Adaptation in Natural and Artificial Systems. 2nd ed. Ann Arbor, Michigan, USA, University of Michigan Press, 1975.
  • [40] Goldberg DE. Genetic Algorithms in Search, Optimization, and Machine Learning. 3rd ed. Boston, USA, Addison Wesley Publishing Company, 1989.
  • [41] Mitchell M. An Introduction to Genetic Algorithms. 2nd ed. London, England, MIT Press, 1996.
  • [42] Blickle T, Thiele L. “A comparison of selection schemes used in evolutionary algorithms”. Evolutionary Computation, 4, 361-394, 1996.
  • [43] Jiacheng L, Lei L. “A hybrid genetic algorithm based on information entropy and game theory”. IEEE Access, 8, 36602-36611, 2020.
  • [44] Eberhart R, Kennedy J. “A new optimizer using particle swarm theory”. Micro Machine and Human Science, 1995 MHS’95, Proceedings of the Sixth International Symposium, Nagoya, Japan, 4-6 October 1995.
  • [45] Oral M, Gençal MC. “Harmony between the best and the worst individuals in tournament selection”. Majlesi Journal of Mechatronic Systems, 6(3), 8-25, 2017.
  • [46] Michalewicz Z, Janikow CZ. “Handling constraints in genetic algorithms”. Adaptive Computing in Design and Manufacture, 1991. https://doi.org/10.1007/978-0-85729-345-9_23.
  • [47] Mirjalili S, Lewis A. “Grey wolf optimizer”. Advances in Engineering Software, 69, 46-61, 2014.
  • [48] Molga M, Smutnicki C. “Test functions for optimization needs”. https://marksmannet.com/RobertMarks/Classes/ENGR5358/Papers/functions.pdf (23.01.2021).
  • [49] Conover WJ. Practical Nonparametric Statistics. 3rd ed. New Delhi, India, Wiley India Pvt. Limited, 2006.
  • [50] Everitt B, Skrondal A. The Cambridge Dictionary of Statistics. 4th ed. London, England, Cambridge University Press, 2002.
  • [51] Friedman M. “The use of ranks to avoid the assumption of normality implicit in the analysis of variance”. Journal of the American Statistical Association, 32(200), 675-701, 1937.
  • [52] Sheskin DJ. Handbook of Parametric and Nonparametric Statistical Procedures. 5th ed. New York, USA, Chapman and Hall/CRC, 2003.
  • [53] Wilcoxon F. “Individual comparisons by ranking methods”. In Breakthroughs in Statistics, 1(6), 196-202, 1992.
  • [54] Karaboga D, Akay B. “A comparative study of artificial bee colony algorithm”. Applied Mathematics and Computation, 214(1), 108-132, 2009.

There are 54 references in total.

Details

Primary Language: Turkish
Subjects: Engineering
Section: Electrical-Electronics Eng. / Computer Eng.
Authors

Mashar Cenk Gencal

Mustafa Oral

Publication Date: 30 April 2022
Published Issue: Year 2022, Volume: 28, Issue: 2

How to Cite

APA Gencal, M. C., & Oral, M. (2022). Evrimsel algoritmalar için yeni bir meta-iyileştirici: bipolar eşleşme eğilimi. Pamukkale Üniversitesi Mühendislik Bilimleri Dergisi, 28(2), 313-323.
AMA Gencal MC, Oral M. Evrimsel algoritmalar için yeni bir meta-iyileştirici: bipolar eşleşme eğilimi. Pamukkale Üniversitesi Mühendislik Bilimleri Dergisi. Nisan 2022;28(2):313-323.
Chicago Gencal, Mashar Cenk, ve Mustafa Oral. “Evrimsel Algoritmalar için Yeni Bir Meta-iyileştirici: Bipolar eşleşme eğilimi”. Pamukkale Üniversitesi Mühendislik Bilimleri Dergisi 28, sy. 2 (Nisan 2022): 313-23.
EndNote Gencal MC, Oral M (01 Nisan 2022) Evrimsel algoritmalar için yeni bir meta-iyileştirici: bipolar eşleşme eğilimi. Pamukkale Üniversitesi Mühendislik Bilimleri Dergisi 28 2 313–323.
IEEE M. C. Gencal ve M. Oral, “Evrimsel algoritmalar için yeni bir meta-iyileştirici: bipolar eşleşme eğilimi”, Pamukkale Üniversitesi Mühendislik Bilimleri Dergisi, c. 28, sy. 2, ss. 313–323, 2022.
ISNAD Gencal, Mashar Cenk - Oral, Mustafa. “Evrimsel Algoritmalar için Yeni Bir Meta-iyileştirici: Bipolar eşleşme eğilimi”. Pamukkale Üniversitesi Mühendislik Bilimleri Dergisi 28/2 (Nisan 2022), 313-323.
JAMA Gencal MC, Oral M. Evrimsel algoritmalar için yeni bir meta-iyileştirici: bipolar eşleşme eğilimi. Pamukkale Üniversitesi Mühendislik Bilimleri Dergisi. 2022;28:313–323.
MLA Gencal, Mashar Cenk ve Mustafa Oral. “Evrimsel Algoritmalar için Yeni Bir Meta-iyileştirici: Bipolar eşleşme eğilimi”. Pamukkale Üniversitesi Mühendislik Bilimleri Dergisi, c. 28, sy. 2, 2022, ss. 313-23.
Vancouver Gencal MC, Oral M. Evrimsel algoritmalar için yeni bir meta-iyileştirici: bipolar eşleşme eğilimi. Pamukkale Üniversitesi Mühendislik Bilimleri Dergisi. 2022;28(2):313-23.





Creative Commons License
This journal is licensed under a Creative Commons Attribution 4.0 International License.