Research Article

A New Fast Filter-based Unsupervised Feature Selection Algorithm Using Cumulative and Shannon Entropy

Year 2024, Volume: 5 Issue: 1, 11 - 23
https://doi.org/10.55195/jscai.1464638

Abstract

Feature selection is indispensable in machine learning for avoiding the curse of dimensionality, and feature selection techniques aim to address this issue. Yet these techniques have several weaknesses: (i) the performance of machine learning methods can differ considerably on the chosen features; (ii) depending on the selected subset, substantial differences in the effectiveness of the machine learning algorithms can also be observed; (iii) feature selection algorithms can be very time-consuming on massive data. In this work, to address the issues above, we propose a new and fast unsupervised feature selection method that is filter-based and univariate. The proposed approach jointly considers the Shannon entropy computed from the symmetry of the distribution and the cumulative entropy of the distribution. Comparisons with several state-of-the-art feature selection strategies show empirically that the proposed algorithm handles these problems better than the competing methods.
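
To make the scoring idea concrete, the following is a minimal Python sketch of a univariate, filter-based unsupervised feature scorer that combines the Shannon entropy of a feature's discretized distribution with the empirical cumulative entropy of Di Crescenzo and Longobardi. The abstract does not spell out the authors' exact scoring or combination rule, so the square-root-of-n binning, the additive combination, and the descending ranking below are illustrative assumptions rather than the published algorithm.

    # Hedged sketch: univariate unsupervised feature scoring from Shannon entropy
    # plus empirical cumulative entropy. The combination rule is an assumption,
    # not the method of Demirel and Aydın.
    import numpy as np

    def shannon_entropy(x, bins=None):
        """Shannon entropy (nats) of a feature discretized with a histogram."""
        n = len(x)
        if bins is None:
            bins = max(2, int(np.sqrt(n)))      # assumed binning rule
        counts, _ = np.histogram(x, bins=bins)
        p = counts[counts > 0] / n
        return -np.sum(p * np.log(p))

    def cumulative_entropy(x):
        """Empirical cumulative entropy CE(X) = -integral F(x) log F(x) dx,
        estimated from the order statistics of the sample."""
        xs = np.sort(np.asarray(x, dtype=float))
        n = len(xs)
        F = np.arange(1, n) / n                 # empirical CDF at x_(1..n-1)
        gaps = np.diff(xs)                      # x_(i+1) - x_(i)
        return -np.sum(gaps * F * np.log(F))

    def rank_features(X):
        """Score each column of X; return indices from highest to lowest score."""
        scores = np.array([shannon_entropy(c) + cumulative_entropy(c) for c in X.T])
        return np.argsort(scores)[::-1], scores

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        # toy data: a spread-out feature versus a nearly constant one
        X = np.column_stack([rng.normal(0, 1, 500), rng.normal(0, 0.01, 500)])
        order, scores = rank_features(X)
        print("feature ranking:", order, "scores:", scores)

In this sketch the histogram entropy is largely scale-invariant (the bins adapt to each feature's range), so the nearly constant feature is down-ranked mainly through its small cumulative entropy; whether the published method behaves the same way is not stated in the abstract.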

References

  • S. Solorio-Fernández, J.A. Carrasco-Ochoa, J.F. Martínez-Trinidad, A systematic evaluation of filter Unsupervised Feature Selection methods, Expert Syst. Appl. 162 (2020) 113745. https://doi.org/10.1016/j.eswa.2020.113745.
  • Z.A. Zhao, H. Liu, Spectral Feature Selection for Data Mining, Chapman and Hall/CRC, 2011. https://doi.org/10.1201/b11426.
  • P. Mitra, S.K. Pal, Pattern Recognition Algorithms for Data Mining, 1st ed., Chapman & Hall/CRC, 2004.
  • E. Hancer, B. Xue, M. Zhang, A survey on feature selection approaches for clustering, Artif. Intell. Rev. 53 (2020) 4519–4545. https://doi.org/10.1007/s10462-019-09800-w.
  • X. He, D. Cai, P. Niyogi, Laplacian score for feature selection, in: NIPS’05 Proc. 18th Int. Conf. Neural Inf. Process. Syst., 2005: pp. 507–514.
  • Y. Liu, D. Ye, W. Li, H. Wang, Y. Gao, Robust neighborhood embedding for unsupervised feature selection, Knowledge-Based Syst. 193 (2020) 105462. https://doi.org/10.1016/j.knosys.2019.105462.
  • D. Cai, C. Zhang, X. He, Unsupervised feature selection for multi-cluster data, in: Proc. 16th ACM SIGKDD Int. Conf. Knowl. Discov. Data Min. - KDD ’10, ACM Press, New York, New York, USA, 2010: p. 333. https://doi.org/10.1145/1835804.1835848.
  • P. Zhu, W. Zuo, L. Zhang, Q. Hu, S.C.K. Shiu, Unsupervised feature selection by regularized self-representation, Pattern Recognit. 48 (2015) 438–446. https://doi.org/10.1016/j.patcog.2014.08.006.
  • Y. Liu, K. Liu, C. Zhang, J. Wang, X. Wang, Unsupervised feature selection via Diversity-induced Self-representation, Neurocomputing. 219 (2017) 350–363. https://doi.org/10.1016/j.neucom.2016.09.043.
  • S.-L. Huang, L. Zhang, L. Zheng, An information-theoretic approach to unsupervised feature selection for high-dimensional data, in: 2017 IEEE Inf. Theory Work., IEEE, 2017: pp. 434–438. https://doi.org/10.1109/ITW.2017.8277927.
  • H. Lim, D.-W. Kim, Pairwise dependence-based unsupervised feature selection, Pattern Recognit. 111 (2021) 107663. https://doi.org/10.1016/j.patcog.2020.107663.
  • Z. Zhao, H. Liu, Spectral feature selection for supervised and unsupervised learning, in: Proc. 24th Int. Conf. Mach. Learn. - ICML ’07, ACM Press, New York, New York, USA, 2007: pp. 1151–1157. https://doi.org/10.1145/1273496.1273641.
  • D. Huang, X. Cai, C.-D. Wang, Unsupervised feature selection with multi-subspace randomization and collaboration, Knowledge-Based Syst. 182 (2019) 104856. https://doi.org/10.1016/j.knosys.2019.07.027.
  • X. Yan, S. Nazmi, B.A. Erol, A. Homaifar, B. Gebru, E. Tunstel, An efficient unsupervised feature selection procedure through feature clustering, Pattern Recognit. Lett. 131 (2020) 277–284. https://doi.org/10.1016/j.patrec.2019.12.022.
  • F. Wang, L. Zhu, J. Li, H. Chen, H. Zhang, Unsupervised soft-label feature selection, Knowledge-Based Syst. 219 (2021) 106847. https://doi.org/10.1016/j.knosys.2021.106847.
  • A. Di Crescenzo, M. Longobardi, On cumulative entropies, J. Stat. Plan. Inference. 139 (2009) 4072–4087. https://doi.org/10.1016/j.jspi.2009.05.038.
  • C.E. Shannon, A Mathematical Theory of Communication, Bell Syst. Tech. J. 27 (1948) 379–423. https://doi.org/10.1002/j.1538-7305.1948.tb01338.x.

Details

Primary Language English
Subjects Computer Vision and Multimedia Computation (Other)
Journal Section Research Articles
Authors

Samet Demirel 0000-0002-7531-1124

Fatih Aydın 0000-0001-9679-0403

Early Pub Date June 3, 2024
Publication Date
Submission Date April 3, 2024
Acceptance Date May 27, 2024
Published in Issue Year 2024 Volume: 5 Issue: 1

Cite

APA Demirel, S., & Aydın, F. (2024). A New Fast Filter-based Unsupervised Feature Selection Algorithm Using Cumulative and Shannon Entropy. Journal of Soft Computing and Artificial Intelligence, 5(1), 11-23. https://doi.org/10.55195/jscai.1464638
AMA Demirel S, Aydın F. A New Fast Filter-based Unsupervised Feature Selection Algorithm Using Cumulative and Shannon Entropy. JSCAI. June 2024;5(1):11-23. doi:10.55195/jscai.1464638
Chicago Demirel, Samet, and Fatih Aydın. “A New Fast Filter-Based Unsupervised Feature Selection Algorithm Using Cumulative and Shannon Entropy”. Journal of Soft Computing and Artificial Intelligence 5, no. 1 (June 2024): 11-23. https://doi.org/10.55195/jscai.1464638.
EndNote Demirel S, Aydın F (June 1, 2024) A New Fast Filter-based Unsupervised Feature Selection Algorithm Using Cumulative and Shannon Entropy. Journal of Soft Computing and Artificial Intelligence 5 1 11–23.
IEEE S. Demirel and F. Aydın, “A New Fast Filter-based Unsupervised Feature Selection Algorithm Using Cumulative and Shannon Entropy”, JSCAI, vol. 5, no. 1, pp. 11–23, 2024, doi: 10.55195/jscai.1464638.
ISNAD Demirel, Samet - Aydın, Fatih. “A New Fast Filter-Based Unsupervised Feature Selection Algorithm Using Cumulative and Shannon Entropy”. Journal of Soft Computing and Artificial Intelligence 5/1 (June 2024), 11-23. https://doi.org/10.55195/jscai.1464638.
JAMA Demirel S, Aydın F. A New Fast Filter-based Unsupervised Feature Selection Algorithm Using Cumulative and Shannon Entropy. JSCAI. 2024;5:11–23.
MLA Demirel, Samet and Fatih Aydın. “A New Fast Filter-Based Unsupervised Feature Selection Algorithm Using Cumulative and Shannon Entropy”. Journal of Soft Computing and Artificial Intelligence, vol. 5, no. 1, 2024, pp. 11-23, doi:10.55195/jscai.1464638.
Vancouver Demirel S, Aydın F. A New Fast Filter-based Unsupervised Feature Selection Algorithm Using Cumulative and Shannon Entropy. JSCAI. 2024;5(1):11-23.