Research Article

SINIFLANDIRMA PROBLEMLERİ İÇİN AGDE-TABANLI META-SEZGİSEL BOYUT İNDİRGEME ALGORİTMASININ GELİŞTİRİLMESİ

DEVELOPMENT OF AGDE-BASED META-HEURISTIC DIMENSION REDUCTION ALGORITHM FOR CLASSIFICATION PROBLEMS

Year 2020, Volume: 8 Issue: 5, 206 - 217, 29.12.2020
https://doi.org/10.21923/jesd.828518

Abstract

Classification problems represent a broad application area of artificial intelligence. The performance of an algorithm can vary depending on the characteristics of the classification problem, and no algorithm is unaffected when the dimensionality, that is, the number of features, of a problem changes. For almost all algorithms, an increase in problem dimensionality degrades performance. Dimensionality reduction is therefore an important and intensively studied topic in classification. Building the most suitable model for a problem is particularly difficult in high-dimensional feature spaces, and finding the global optimum in highly complex search spaces is considered intractable. Meta-heuristic search algorithms are therefore the most suitable approach for such search problems. In this study, a meta-heuristic feature selection method based on the AGDE (adaptive guided differential evolution) algorithm is developed for dimensionality reduction in classification problems. The proposed method weights the features and identifies those whose weights fall below a threshold value. The experimental results show that the AGDE-based feature selection method is an effective dimensionality reduction tool for classification problems.
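
The abstract describes a weight-and-threshold scheme: each feature receives a continuous weight optimized by a metaheuristic, and features whose weights stay below a threshold are dropped. The sketch below illustrates that general idea only; since AGDE is not available as an off-the-shelf library, SciPy's standard differential evolution stands in for it, and the k-NN wrapper fitness, the 5-fold cross-validation, the sparsity penalty, the 0.5 threshold and the Wisconsin breast-cancer data (used only because it ships with scikit-learn) are illustrative assumptions rather than the configuration reported in the paper.

```python
# Minimal sketch of metaheuristic feature weighting with a selection threshold.
# NOTE: standard differential evolution stands in for AGDE; classifier, fitness
# and threshold are illustrative assumptions, not the paper's exact setup.
import numpy as np
from scipy.optimize import differential_evolution
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

X, y = load_breast_cancer(return_X_y=True)
n_features = X.shape[1]
THRESHOLD = 0.5  # features weighted below this value are discarded

def fitness(weights):
    """Negative cross-validated accuracy of the feature subset kept by the weights."""
    mask = weights >= THRESHOLD
    if not mask.any():           # degenerate case: never evaluate an empty subset
        return 1.0
    clf = KNeighborsClassifier(n_neighbors=5)
    acc = cross_val_score(clf, X[:, mask], y, cv=5).mean()
    # small penalty on subset size nudges the search toward fewer features
    return -(acc - 0.01 * mask.sum() / n_features)

result = differential_evolution(
    fitness,
    bounds=[(0.0, 1.0)] * n_features,  # one continuous weight per feature
    maxiter=30,
    popsize=15,
    seed=42,
)

selected = np.flatnonzero(result.x >= THRESHOLD)
print(f"kept {selected.size}/{n_features} features: {selected.tolist()}")
```

Swapping in the actual AGDE mutation and parameter-adaptation rules (Mohamed & Mohamed, 2019) would only change the optimizer; the weighting and thresholding logic above stays the same.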

Supporting Institution

TÜBİTAK

Project Number

1919B011904088

References

  • Abualigah, L. M., Khader, A. T., Al-Betar, M. A., & Alomari, O. A. (2017). Text feature selection with a robust weight scheme and dynamic dimension reduction to text document clustering. Expert Systems with Applications, 84, 24-36.
  • Aljarah, I., Mafarja, M., Heidari, A. A., Faris, H., & Mirjalili, S. (2020). Multi-verse optimizer: theory, literature review, and application in data clustering. In Nature-Inspired Optimizers (pp. 123-141). Springer, Cham.
  • Antal, B., & Hajdu, A. (2014). An ensemble-based system for automatic screening of diabetic retinopathy. Knowledge-Based Systems, 60, 20-27.
  • Booker, L. B., Goldberg, D. E., & Holland, J. H. (1989). Classifier systems and genetic algorithms. Artificial intelligence, 40(1-3), 235-282.
  • Cerrada, M., Aguilar, J., Altamiranda, J., & Sánchez, R. V. (2019). A hybrid heuristic algorithm for evolving models in simultaneous scenarios of classification and clustering. Knowledge and Information Systems, 61(2), 755-798.
  • Dash, M., & Liu, H. (1997). Feature selection for classification. Intelligent Data Analysis, 1(3), 131-156.
  • Fausto, F., Reyna-Orta, A., Cuevas, E., Andrade, Á. G., & Perez-Cisneros, M. (2020). From ants to whales: metaheuristics for all tastes. Artificial Intelligence Review, 53(1), 753-810.
  • Inbarani, H. H., Bagyamathi, M., & Azar, A. T. (2015). A novel hybrid feature selection method based on rough set and improved harmony search. Neural Computing and Applications, 26(8), 1859-1880.
  • Johnson, B. (2013). High resolution urban land cover classification using a competitive multi-scale object-based approach. Remote Sensing Letters, 4(2), 131-140.
  • Kahraman, H. T. (2016). A novel and powerful hybrid classifier method: Development and testing of heuristic k-nn algorithm with fuzzy distance metric. Data & Knowledge Engineering, 103, 44-59.
  • Kahraman, H. T., Bayindir, R., & Sagiroglu, S. (2012). A new approach to predict the excitation current and parameter weightings of synchronous machines based on genetic algorithm-based k-NN estimator. Energy Conversion and Management, 64, 129-138.
  • Kahraman, H. T., Sagiroglu, S., & Colak, I. (2013). The development of intuitive knowledge classifier and the modeling of domain dependent data. Knowledge-Based Systems, 37, 283-295.
  • Kulkarni, A. J., Singh, P. K., Satapathy, S. C., Kashan, A. H., & Tai, K. (Eds.). (2019). Socio-cultural Inspired Metaheuristics (Vol. 828). Springer.
  • Kwak, N., & Choi, C. H. (2002). Input feature selection for classification problems. IEEE transactions on neural networks, 13(1), 143-159.
  • Li, K., Cao, X., Ge, X., Wang, F., Lu, X., Shi, M., ... & Chang, S. (2020). Meta-Heuristic Optimization Based Two-stage Residential Load Pattern Clustering Approach Considering Intracluster Compactness and Inter-cluster Separation. IEEE Transactions on Industry Applications.
  • Lin, K. C., Zhang, K. Y., Huang, Y. H., Hung, J. C., & Yen, N. (2016). Feature selection based on an improved cat swarm optimization algorithm for big data classification. The Journal of Supercomputing, 72(8), 3210-3221.
  • Mohamed, A. W., & Mohamed, A. K. (2019). Adaptive guided differential evolution algorithm with novel mutation for numerical optimization. International Journal of Machine Learning and Cybernetics, 10(2), 253-277.
  • Mangasarian, O. L., Street, W. N., & Wolberg, W. H. (1995). Breast cancer diagnosis and prognosis via linear programming. Operations Research, 43(4), 570-577.
  • Alizadehsani, R., Habibi, J., Hosseini, M. J., Mashayekhi, H., Boghrati, R., Ghandeharioun, A., et al. (2013). A data mining approach for diagnosis of coronary artery disease. Computer Methods and Programs in Biomedicine, 111, 52-61.
  • Ramos-Figueroa, O., Quiroz-Castellanos, M., Mezura-Montes, E., & Schütze, O. (2020). Metaheuristics to solve grouping problems: A review and a case study. Swarm and Evolutionary Computation, 100643.
  • Rouhi, A., & Nezamabadi-pour, H. (2017, March). A hybrid feature selection approach based on ensemble method for high-dimensional data. In 2017 2nd Conference on Swarm Intelligence and Evolutionary Computation (CSIEC) (pp. 16-20). IEEE.
  • Santhanam, T., & Padmavathi, M. S. (2015). Application of K-means and genetic algorithms for dimension reduction by integrating SVM for diabetes diagnosis. Procedia Computer Science, 47, 76-83.
  • Sibalija, T. V. (2019). Particle swarm optimisation in designing parameters of manufacturing processes: A review (2008–2018). Applied Soft Computing, 84, 105743.
  • Storn, R., & Price, K. (1997). Differential evolution–a simple and efficient heuristic for global optimization over continuous spaces. Journal of global optimization, 11(4), 341-359.
  • Tang, J., Alelyani, S., & Liu, H. (2014). Feature selection for classification: A review. Data classification: Algorithms and applications, 37.
  • Thangavel, K., & Pethalakshmi, A. (2009). Dimensionality reduction based on rough set theory: A review. Applied Soft Computing, 9(1), 1-12.
  • Tran, B., Xue, B., & Zhang, M. (2018). Variable-length particle swarm optimization for feature selection on high-dimensional classification. IEEE Transactions on Evolutionary Computation, 23(3), 473-487.
  • Xue, B., Zhang, M., & Browne, W. N. (2012). Particle swarm optimization for feature selection in classification: A multi-objective approach. IEEE transactions on cybernetics, 43(6), 1656-1671.
  • Yadav, M., & Prakash, V. P. (2020). A comparison of the effectiveness of two novel clustering-based heuristics for the p-centre problem. In Advances in Data and Information Sciences (pp. 247-255). Springer, Singapore.
  • Yilmaz, C., Kahraman, H. T., & Söyler, S. (2018). Passive mine detection and classification method based on hybrid model. IEEE Access, 6, 47870-47888.
  • Yusta, S. C. (2009). Different metaheuristic strategies to solve the feature selection problem. Pattern Recognition Letters, 30(5), 525-534.
  • Zhou, Q., Benlic, U., Wu, Q., & Hao, J. K. (2019). Heuristic search to the capacitated clustering problem. European Journal of Operational Research, 273(2), 464-487.

Details

Primary Language Turkish
Subjects Computer Software
Journal Section Research Articles
Authors

Hamdi Kahraman 0000-0001-9985-6324

Büşra Aras 0000-0001-9214-771X

Orhun Yıldız 0000-0002-6191-766X

Project Number 1919B011904088
Publication Date December 29, 2020
Submission Date November 19, 2020
Acceptance Date December 29, 2020
Published in Issue Year 2020 Volume: 8 Issue: 5

Cite

APA Kahraman, H., Aras, B., & Yıldız, O. (2020). SINIFLANDIRMA PROBLEMLERİ İÇİN AGDE-TABANLI META-SEZGİSEL BOYUT İNDİRGEME ALGORİTMASININ GELİŞTİRİLMESİ. Mühendislik Bilimleri Ve Tasarım Dergisi, 8(5), 206-217. https://doi.org/10.21923/jesd.828518