
Özellik Seçim Yöntemleri ve Yeni Bir Yaklaşım (Feature Selection Methods and a New Approach)

Year 2018, Volume: 22 Issue: Special, 21 - 31, 05.10.2018

Abstract

One of the important stages of the data mining process is dimensionality reduction. Dimensionality reduction is briefly defined as removing irrelevant or redundant variables from a dataset in order to overcome the problems encountered in storing and analyzing large datasets. Feature selection is the foremost method used for dimensionality reduction. It is defined as choosing the best subset of features capable of representing the original dataset. This process aims to reduce the number of features in the dataset, i.e. to lower its dimensionality, by selecting the features that are most useful and most important for the problem at hand. In this study, feature selection methods are reviewed and an alternative method is proposed.
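The filter-style selection the abstract describes, scoring each feature individually and keeping only the highest-scoring subset, can be sketched in plain Python. This is a minimal hypothetical illustration using a Pearson-correlation ranking, not the method proposed in the paper:

```python
import math

def pearson(xs, ys):
    """Pearson correlation of two equal-length sequences (0.0 if either is constant)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy) if sx and sy else 0.0

def select_features(X, y, k):
    """Rank features by |correlation with the target| and keep the top k.

    X: list of samples, each a list of feature values; y: target values.
    Returns the indices of the selected features, best first.
    """
    n_features = len(X[0])
    scores = []
    for j in range(n_features):
        col = [row[j] for row in X]
        scores.append((abs(pearson(col, y)), j))
    # Sort by descending score; break ties on the lower feature index.
    scores.sort(key=lambda t: (-t[0], t[1]))
    return [j for _, j in scores[:k]]

# Toy data: feature 0 tracks y, feature 1 is anti-correlated with y,
# feature 2 is constant and therefore uninformative.
X = [[1, 9, 5], [2, 7, 5], [3, 5, 5], [4, 3, 5]]
y = [1, 2, 3, 4]
print(select_features(X, y, 2))  # → [0, 1]: the constant feature is dropped
```

Such univariate filters are cheap but ignore feature interactions, which is why the literature cited below also covers wrapper and hybrid search strategies (e.g. sequential forward selection, floating search, genetic algorithms).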

References

  • [1] Bozdağ, H. 2004. Intelligent Statistical Data Mining with Information Complexity and Genetic Algorithms, Statistical Data Mining and Knowledge Discovery, Chapman and Hall/CRC, Florida.
  • [2] Forman, G. 2003. An Extensive Empirical Study of Feature Selection Metrics for Text Classification, Journal of Machine Learning Research, 3, 1289–1305.
  • [3] Ladha, L., Deepa, T. 2011. Feature Selection Methods And Algorithms, International Journal on Computer Science and Engineering, 3(5), 1787-1797.
  • [4] Liu, H., Yu, L. 2005. Toward Integrating Feature Selection Algorithms for Classification and Clustering, IEEE Transactions on Knowledge and Data Engineering, 17(4), 491-502.
  • [5] Narendra, P., Fukunaga, K. 1977. A Branch and Bound Algorithm for Feature Subset Selection, IEEE Transactions on Computers, 26(9), 917-922.
  • [6] Doak, J. 1992. An Evaluation of Feature Selection Methods and Their Application to Computer Security, University of California at Davis, Technical Report, California.
  • [7] Brassard, G., Bratley, P. 1996. Fundamentals of Algorithms, Prentice Hall Professional, New Jersey.
  • [8] Saeys, Y., Inza, I., Larranaga, P. 2007. A review of feature selection techniques in bioinformatics, Bioinformatics, 23(19), 2507-2517.
  • [9] Yıldız, O., Tez, M., Bilge, H.Ş., Akcayol, M.A., Güler, İ. 2012. Meme Kanseri Sınıflandırması için Gen Seçimi, IEEE 20. Sinyal İşleme ve İletişim Uygulamaları Kurultayı, April 18-20, Muğla.
  • [10] SPSS Inc. 2007. SPSS Clementine 12.0 Algorithms Guide, SPSS Inc., Chicago.
  • [11] Ünver, Ö., Gamgam, H. 2006. Uygulamalı Temel İstatistik Yöntemler, Seçkin Yayıncılık, Ankara.
  • [12] Hall, M. 1999. Correlation-based Feature Selection for Machine Learning, The University of Waikato, PhD Thesis, Hamilton.
  • [13] Novakavic, J., Strbac, P., Bulatovic, D. 2011. Toward Optimal Feature Selection Using Ranking Methods and Classification Algorithms, Yugoslav Journal of Operations Research, 21(1), 119-135.
  • [14] http://en.wikipedia.org/wiki/Relief_(feature_selection) (Accessed: 31.08.2014).
  • [15] Biricik, G. 2012. Sınıf Bilgisini Kullanan Boyut İndirgeme Yöntemlerinin Metin Sınıflandırmadaki Etkilerinin Karşılaştırılması, IEEE 20. Sinyal İşleme ve İletişim Uygulamaları Kurultayı, April 18-20, Muğla.
  • [16] Kira, K., Rendell, L. 1992. The Feature Selection Problem: Traditional Methods and a New Algorithm, AAAI-92, 129-134.
  • [17] Holte, R. 1993. Very simple classification rules perform well on most commonly used datasets, Machine Learning, 11, 63-91.
  • [18] Holmes, G., Nevill-Manning, C. 1995. Feature selection via the discovery of simple classification rules, Proceedings of the Symposium on Intelligent Data Analysis (IDA-95), August 17-19, Baden-Baden.
  • [19] Guyon, I., Elisseeff, A. 2003. An Introduction to Variable and Feature Selection, Journal of Machine Learning Research, 3, 1157-1182.
  • [20] Whitney, A. 1972. A direct method of nonparametric measurement selection, IEEE Transactions on Computers, 20(9), 1100-1103.
  • [21] Pratama, S., Muda, A., Choo, Y., Muda, N. 2011. Computationally Inexpensive Sequential Forward Floating Selection for Acquiring Significant Features for Authorship Invarianceness in Writer Identification, International Journal of New Computer Architectures and their Applications, 1(9), 581-598.
  • [22] Marill, T., Green, D. 1963. On the effectiveness of receptors in recognition systems, IEEE Transactions on Information Theory, 9, 11-17.
  • [23] Stearns, S. 1976. On selecting features for pattern classifiers, 3rd International Conference on Pattern Recognition, November 8-11, Coronado.
  • [24] http://www.facweb.iitkgp.ernet.in/~sudeshna/courses/ML06/featsel.pdf (Accessed: 29.10.2014).
  • [25] Pudil, P., Novovicova, J., Kittler, J. 1994. Floating search methods in feature selection, Pattern Recognition Letters, 15, 1119-1125.
  • [26] Eskidere, Ö. 2012. Ses Ölçümlerinden Parkinson Hastalığının Teşhisi İçin Öznitelik Seçme Yöntemlerinin Karşılaştırılması, Sigma, 20, 402-414.
  • [27] Guyon, I., Weston, J., Barnhill, S., Vapnik, V. 2002. Gene selection for cancer classification using support vector machines, Machine Learning, 46, 389-422.
  • [28] http://www.nipsfsc.ecs.soton.ac.uk/datasets/ (Accessed: 11.01.2015).
  • [29] http://www.nipsfsc.ecs.soton.ac.uk/papers/NIPS2003-Datasets.pdf (Accessed: 11.01.2015).
  • [30] Petricoin, E. F., Ardekani, A. M., Hitt, B. A., Levine, P. J., Fusaro, V. A., Steinberg, S. M., ... & Liotta, L. A., 2002. Use of proteomic patterns in serum to identify ovarian cancer, The Lancet, 359(9306), 572-577.
  • [31] LeCun, Y., Bottou, L., Bengio, Y. and Haffner, P., 1998. Gradient-based learning applied to document recognition, Proceedings of the IEEE, 86(11), 2278-2324.
  • [32] Dash, M., Liu, H. 1997. Feature Selection for Classification, Intelligent Data Analysis, Elsevier, 131-156.
  • [33] Liu, T., Liu, S., Chen, Z. and Ma, W.Y., 2003. An Evaluation on Feature Selection for Text Clustering, Proceedings of the Twentieth International Conference on Machine Learning (ICML-2003), August 21-24, Washington D.C.
  • [34] Pouramirarsalani, A., Khalilian, M. and Nikravanshalmani, A. 2017. Fraud detection in E-banking by using the hybrid feature selection and evolutionary algorithms, IJCSNS, 17(8), 271-279.
  • [35] Subramanya, K.B., Somani, A. 2017. Enhanced feature mining and classifier models to predict customer churn for an E-retailer, Cloud Computing, Data Science & Engineering - Confluence, 2017 7th International Conference on, January 12-13, Noida.
  • [36] Mohamad, M. and Selamat, A. 2015. An evaluation on the efficiency of hybrid feature selection in spam email classification, Computer, Communications, and Control Technology (I4CT), 2015 International Conference on, April 21-23, Kuching.
  • [37] Wang, D., Zhang, Z., Bai, R. and Mao, Y. 2017. A hybrid system with filter approach and multiple population genetic algorithm for feature selection in credit scoring, Journal of Computational and Applied Mathematics, 329, 307-321.
  • [38] Bolon-Canedo, V., Sanchez-Marono, N., Alonso-Betanzos, A., Benítez, J.M. and Herrera, F. 2014. A review of microarray datasets and applied feature selection methods, Information Sciences, 282, 111–135.
  • [39] Furey, T.S., Cristianini, N., Duffy, N., Bednarski, D.W., Schummer, M. and Haussler, D. 2000. Support vector machine classification and validation of cancer tissue samples using microarray expression data, Bioinformatics, 16(10), 906–914.
  • [40] Kumar, V., Minz, S. 2014. Feature Selection: A Literature Review, Smart Computing Review, 4(3), 211-229.
There are 40 citations in total.

Details

Journal Section Articles
Authors

Hüseyin Budak

Publication Date October 5, 2018
Published in Issue Year 2018 Volume: 22 Issue: Special

Cite

APA Budak, H. (2018). Özellik Seçim Yöntemleri ve Yeni Bir Yaklaşım. Süleyman Demirel Üniversitesi Fen Bilimleri Enstitüsü Dergisi, 22, 21-31.
AMA Budak H. Özellik Seçim Yöntemleri ve Yeni Bir Yaklaşım. J. Nat. Appl. Sci. October 2018;22:21-31.
Chicago Budak, Hüseyin. “Özellik Seçim Yöntemleri Ve Yeni Bir Yaklaşım”. Süleyman Demirel Üniversitesi Fen Bilimleri Enstitüsü Dergisi 22, October (October 2018): 21-31.
EndNote Budak H (October 1, 2018) Özellik Seçim Yöntemleri ve Yeni Bir Yaklaşım. Süleyman Demirel Üniversitesi Fen Bilimleri Enstitüsü Dergisi 22 21–31.
IEEE H. Budak, “Özellik Seçim Yöntemleri ve Yeni Bir Yaklaşım”, J. Nat. Appl. Sci., vol. 22, pp. 21–31, 2018.
ISNAD Budak, Hüseyin. “Özellik Seçim Yöntemleri Ve Yeni Bir Yaklaşım”. Süleyman Demirel Üniversitesi Fen Bilimleri Enstitüsü Dergisi 22 (October 2018), 21-31.
JAMA Budak H. Özellik Seçim Yöntemleri ve Yeni Bir Yaklaşım. J. Nat. Appl. Sci. 2018;22:21–31.
MLA Budak, Hüseyin. “Özellik Seçim Yöntemleri Ve Yeni Bir Yaklaşım”. Süleyman Demirel Üniversitesi Fen Bilimleri Enstitüsü Dergisi, vol. 22, 2018, pp. 21-31.
Vancouver Budak H. Özellik Seçim Yöntemleri ve Yeni Bir Yaklaşım. J. Nat. Appl. Sci. 2018;22:21-3.

e-ISSN :1308-6529
Linking ISSN (ISSN-L): 1300-7688

All published articles in the journal can be accessed free of charge and are open access under the Creative Commons CC BY-NC (Attribution-NonCommercial) license. All authors and other journal users are deemed to have accepted this situation. Click here to access detailed information about the CC BY-NC license.