Research Article

MACHINE LEARNING APPROACH TOWARDS TELEMARKETING ESTIMATION

Year 2024, Volume: 10 Issue: 1, 21 - 40, 30.06.2024
https://doi.org/10.51477/mejs.1427004

Abstract

Machine learning makes it possible to extract insights from datasets too large for manual analysis: computers are trained to identify patterns in data and to apply what they learn to new tasks. This study analyzes a specific telemarketing dataset with several machine learning algorithms to determine whether predictions accurate enough to support company decision-making can be made. The findings indicate that customer “Age” and “Product ID” are the primary factors influencing “Sales”, marking them as the most significant features in the predictive model.
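The article itself includes no code. As an illustration of the kind of feature-importance analysis that could support such a conclusion, the sketch below fits a random-forest regressor (scikit-learn) to a synthetic stand-in dataset and ranks its features. The column names `Age`, `ProductID`, `CallDuration`, and `Sales`, as well as the data-generating process, are assumptions for demonstration only, not the paper's actual schema or method.

```python
# Hypothetical sketch: ranking feature importance for telemarketing sales.
# All column names and the synthetic data are illustrative assumptions.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(42)
n = 1000
df = pd.DataFrame({
    "Age": rng.integers(18, 70, n).astype(float),
    "ProductID": rng.integers(1, 6, n).astype(float),
    "CallDuration": rng.normal(5.0, 2.0, n),  # deliberately unrelated to Sales
})
# Synthetic target: Sales driven mainly by Age and ProductID, plus noise.
df["Sales"] = 0.5 * df["Age"] + 10.0 * df["ProductID"] + rng.normal(0.0, 1.0, n)

X = df[["Age", "ProductID", "CallDuration"]]
y = df["Sales"]
model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)

# Impurity-based importances sum to 1; higher means more predictive weight.
importances = pd.Series(model.feature_importances_, index=X.columns)
print(importances.sort_values(ascending=False))
```

On data generated this way, `Age` and `ProductID` dominate the ranking while the unrelated `CallDuration` receives a near-zero importance, mirroring the shape of the study's reported finding.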

References

  • Moro, S., Cortez, P., Rita, P., “A data-driven approach to predict the success of bank telemarketing”, Decision Support Systems, 62, 22-31, 2014.
  • Kotler, P., Keller, K.L., “Framework for Marketing Management (6th edition)”, Pearson, London-UK, 2015.
  • Rust, R.T., Moorman, C., Bhalla, G., “Rethinking Marketing”, Harvard Business Review, 1, 1, 2010.
  • The reader can find detailed information about the European Organization for Nuclear Research (CERN) at https://www.home.cern.
  • Krawczyk, R., Colombo, T., Neufeld, N., Pisani, F., Valat, S., “Ethernet for high-throughput computing at CERN”, IEEE Transactions on Parallel and Distributed Systems, 33, 3640-3650, 2022.
  • Buhl, H.U., Roglinger, M., Moser, F., Heidemann, J., “Big Data”, Business & Information Systems Engineering, 5, 65-69, 2013.
  • Verleysen, M., Francois, D., “The Curse of Dimensionality in Data Mining and Time Series Prediction”, Computational Intelligence and Bioinspired Systems, 3512, 758-770, 2005.
  • Marron, B.A., de Maine, P.A.D., “Automatic data compression”, Communications of the ACM, 10, 711-715, 1967.
  • Heavens, A. F., Jimenez, R., Lahav, O., “Massive lossless data compression and multiple parameter estimation from galaxy spectra”, Monthly Notices of the Royal Astronomical Society, 317, 965-972, 2000.
  • Zhaoping, L., “Theoretical understanding of the early visual processes by data compression and data selection”, Network: Computation in Neural Systems, 17, 301-334, 2006.
  • Suarjaya, I.M.A.D., “A New Algorithm for Data Compression Optimization”, Int. J. Adv. Comp. Sci. and Appl., 3, 14-17, 2012.
  • Adek, R.T., Ula, M., “A Survey on The Accuracy of Machine Learning Techniques for Intrusion and Anomaly Detection on Public Data Sets”, 2020 International Conference on Data Science, Artificial Intelligence, and Business Analytics (DATABIA), 19-27, 2020.
  • Thomas, R.N., Gupta, R., “A Survey on Machine Learning Approaches and Its Techniques”, 2020 IEEE International Students' Conference on Electrical, Electronics and Computer Science (SCEECS), 1-6, 2020.
  • Dixon, M.F., Halperin, I., Bilokon, P., “Machine Learning in Finance from Theory to Practice”, Springer Nature Press, 2020.
  • Bose, I., Mahapatra, R.K., “Business data mining - a machine learning perspective”, Inf. Management, 39, 211-225, 2001.
  • Stirnberg, R., Cermak, J., Kotthaus, S., Haeffelin, M., Andersen, H., Fuchs, J., Kim, M., Petit, J.E., Favez, O., “Meteorology-driven variability of air pollution (PM1) revealed with explainable machine learning”, Atmos. Chem. Phys., 21, 3919-3948, 2021.
  • Vogt, N., “Machine learning in neuroscience”, Nature Methods, 15, 33, 2018.
  • Bektas, J., Bektas, Y., Kangal, E.E., “Integrating a novel SRCRN network for segmentation with representative batch-mode experiments for detecting melanoma”, Biomedical Signal Processing and Control, 71, 103218, 2022.
  • Ramakrishnan, R., von Lilienfeld, O.A., “Chapter 5 in Reviews in Computational Chemistry”, Wiley-VCH, Weinheim, Germany, 225–256, 2017.
  • Ibrahim, S.K., Ahmed, A., Zeidan, M.A.E., Ziedan, I.E., “Machine Learning Methods for Spacecraft Telemetry Mining”, IEEE Trans. Aerosp. Electron. Syst., 55, 1816-1827, 2019.
  • Caldeira, J., Wu, W.L.K., Nord, B., Avestruz, C., Trivedi, S., Story, K.T., “DeepCMB: Lensing reconstruction of the cosmic microwave background with deep neural networks”, Astron. Comput., 28, 100307, 2019.
  • Ntampaka, M., Trac, H., Sutherland, D.J., Battaglia, N., Póczos, B., Schneider, J., “A Machine Learning Approach for Dynamical Mass Measurements of Galaxy Clusters”, Astrophys. J., 803, 50, 2015.
  • Salti, M., Kangal, E.E., Aydogdu, O., “Evolution of CMB temperature in a Chaplygin gas model from deep learning perspective”, Astronomy and Computing, 37, 100504, 2021.
  • Salti, M., Kangal, E.E., “Deep learning of CMB radiation temperature”, Annals of Physics, 439, 168799, 2022.
  • Kangal, E.E., Salti, M., Aydogdu, O., “Machine learning algorithm in a caloric view point of cosmology”, Phys. Dark Univ., 26, 100369, 2019.
  • Escamilla-Rivera, C., Quintero, M.A.C., Capozziello, S., “A deep learning approach to cosmological dark energy models”, JCAP, 03, 008, 2020.
  • Tilaver, H., Salti, M., Aydogdu, O., Kangal, E.E., “Deep learning approach to Hubble parameter.”, Comp. Phys. Commun., 261, 107809, 2021.
  • Donalek, C., “Supervised and unsupervised learning”, Astronomy Colloquia, California Institute of Technology, USA, 2011.
  • Benvenuto, F., Piana, M., Campi, C., Massone, A.M., “A Hybrid Supervised/Unsupervised Machine Learning Approach to Solar Flare Prediction”, ApJ, 853, 90, 2018.
  • Arnott, D., Pervan, G., “Eight key issues for the decision support systems discipline”, Decision Support Systems, 44, 657-672, 2008.
  • Turban, E., Sharda, R., Delen, D., “Decision Support and Business Intelligence Systems (9th edition)”, Pearson, London-UK, 2011.
  • Hastie, T., Tibshirani, R., Friedman, J., “The Elements of Statistical Learning: Data Mining, Inference, and Prediction (2nd edition)”, Springer-Verlag, NY, USA, 2008.
  • Kowsari, K., Meimandi, K.J., Heidarysafa, M., Sanjana, M., Laura, B., Brown, D., “Text Classification Algorithms: A Survey”, Information, 10, 150, 2019.
  • Harper, P.R., “A review and comparison of classification algorithms for medical decision making”, Health Policy, 71, 315-331, 2005.
  • Kumar, R., Verma, R., “Classification Algorithms for Data Mining: A Survey”, International Journal of Innovations in Engineering and Technology, 1, 2319, 2012.
  • Ilham, A., Khikmah, L., Ulumuddin, I., Indra I., “Long-term deposits prediction: a comparative framework of classification model for predict the success of bank telemarketing”, IOP Conf. Series - Journal of Physics: Conf. Series, 1175, 012035, 2019.
  • Mustapha, S.M.F.D.S., Alsufyani, A., “Application of Artificial Neural Network and information gain in building case-based reasoning for telemarketing prediction”, International Journal of Advanced Computer Science and Applications, 10, 300-306, 2019.
  • Tekouabou, S.C.K., Cherif, W., Silkan, H., “A data modeling approach for classification problems: application to bank telemarketing prediction”, Proceedings of the 2nd International Conference on Networking, Information Systems & Security (NISS19), Rabat-Morocco, 2019.
  • Keles, A., Keles, A., “IBMMS Decision Support Tool for Management of Bank Telemarketing Campaigns”, International Journal of Database Management Systems, 17, 1, 2015.
  • Kocoglu, F.O., Esnaf, S., “Machine Learning Approach and Model Performance Evaluation for Tele-Marketing Success Classification”, International Journal of Business Analytics, 9, 1-18, 2022.
  • Halim, K.N.A., Jaya, A.S.M., Fadzil, A.F.A., “Data Pre-Processing Algorithm for Neural Network Binary Classification Model in Bank Tele-Marketing”, International Journal of Innovative Technology and Exploring Engineering, 9, 272-277, 2020.
  • Shashidhara, B.M., Jain, S., Rao, V.D., Patil, N., Raghavendra, G.S., “Evaluation of machine learning frameworks on bank marketing and Higgs datasets”, Proceedings of Second International Conference on Advances in Computing and Communication Engineering, Dehradun-India, 2015.
  • Wiemer, H., Drowatzky, L., Ihlenfeldt, S., “Data Mining Methodology for Engineering Applications (DMME)-A Holistic Extension to the CRISP-DM Model”, Appl. Sci., 9, 2407, 2019.
  • Jaggia, S., Kelly, A., Lertwachara, K., Chen, L., “Applying the CRISP‐DM framework for teaching business analytics”, Decision Sciences Journal of Innovative Education, 18, 612-634, 2020.
  • Mohamed, A., “Data Analysis for Telemarketing Case”, www.kaggle.com. Date of access: 04 Jan 2022.
  • Alfeilat, H.A.A., Hassanat, A.B.A., Lasassmeh, O., Tarawneh, A.S., Alhasanat, M.B., Salman, H.S.E., Prasath, V.B.S., “Effects of Distance Measure Choice on K-Nearest Neighbor Classifier Performance: A Review”, Big Data, 7, 221-248, 2019.
  • Deza, E., Deza, M.M., “Encyclopedia of distances”, Springer, 2009.
  • Hall, M.A., “Correlation-based Feature Selection for Machine Learning”, PhD Thesis, The University of Waikato, Hamilton, New Zealand, 1999.
  • Cunningham, S. J., Littin, J., Witten. I. H., “Applications of machine learning in information retrieval”, Technical Report 97/6, University of Waikato, 1997.
  • Prasath, V.B.S., Alfeilat, H.A.A., Hassanat, A.B.A., Lasassmeh, O., Tarawneh, A.S., Alhasanat, M.B., Salman, H.S.E., “Effects of Distance Measure Choice on KNN Classifier Performance - A Review”, e-Print: 1708.04321v3, 2019.
  • Cha, S.H., “Probabilistic, Statistical and Algorithmic Aspects of the Similarity of Texts and Application to Gospels Comparison”, International Journal of Mathematical Models and Methods in Applied Sciences, 1, 300-307, 2007.
  • Jain, A., Zongker, D., “Feature selection: evaluation, application, and small sample performance”, IEEE Transactions on Pattern Analysis and Machine Intelligence, 19, 153-158, 1997.
  • Pavlenko, T., “On feature selection, curse-of-dimensionality and error probability in discriminant analysis”, Journal of Statistical Planning and Inference, 115, 565-584, 2003.
  • Myakalwar, A.K., Spegazzini, N., Zhang, C., Anubham, S.K., Dasari, R.R., Barman, I., Gundawar, M.K., “Less is more: Avoiding the LIBS dimensionality curse through judicious feature selection for explosive detection”, Scientific Reports, 5, 13169, 2015.
  • Elkhani, N., Muniyandi, R.C., “Membrane computing inspired feature selection model for microarray cancer data”, Intell. Data Anal., 21, 137-157, 2017.
  • Last, M., Kandel, A., Maimon, O., “Information-theoretic algorithm for feature selection”, Pattern Recognition Letters, 22, 799-811, 2001.
  • Muni, D.P., Pal, N.R., Das, J., “Genetic programming for simultaneous feature selection and classifier design”, IEEE Trans. Syst. Man Cybern. Part B, 36, 106-117, 2006.
  • Stearns, S.D., “On Selecting Features for Pattern Classifiers”, 3rd International Conference on Pattern Recognition, Coronado-CA, USA, 1976.
  • Pudil, P., Ferri, F.J., Novovicova, J., Kittler, J., “Floating search methods for feature selection with nonmonotonic criterion functions”, Proceedings of the 12th IAPR International Conference on Pattern Recognition, Vol. II-Conference B: Pattern Recognition and Neural Networks, Jerusalem-Israel, 1994.
  • Pudil, P., Novovicova, J., Kittler, J., “Floating search methods in feature selection”, Pattern. Recogn. Lett., 15, 1119-1125, 1994.
  • Caruana, R., Niculescu-Mizil, A., “Data mining in metric space: an empirical analysis of supervised learning performance criteria”, Proceedings of the 10th ACM SIGKDD international conference on Knowledge discovery and data mining, Seattle-WA, USA, 2004.
  • Li, H., Li, H., Wei, K., “Automatic fast double KNN classification algorithm based on ACC and hierarchical clustering for big data”, Int. J. Commun. Syst., 31, e3488, 2018.
  • Madray, I., Suire, J., Desforges, J., Madani, M.R., “Relative angle correction for distance estimation using K-nearest neighbors”, IEEE Sensors Journal, 20, 8155, 2020.
  • Sánchez-Crisóstomo, J., Alejo, R., López-González, E., Valdovinos, R.M., Pacheco-Sánchez, J.H., “Empirical analysis of assessments metrics for multi-class imbalance learning on the back-propagation context”, in “Advances in Swarm Intelligence”, Lecture Notes in Computer Science, Springer, 8795, 17-23, 2014.
  • Chawla, N.V., Bowyer, K.W., Hall, L.O., Kegelmeyer, W.P., “SMOTE: Synthetic Minority Over-sampling Technique”, Journal of Artificial Intelligence Research, 16, 321-357, 2002.
  • Torgo, L., Ribeiro, R. P., Pfahringer, B., Branco, P., “SMOTE for Regression”, Progress in Artificial Intelligence, Lecture Notes in Computer Science, Springer, Berlin, Heidelberg, 8154, 2013.
  • Fernández, A., Garcia, S., Herrera, F., Chawla, N.V., “SMOTE for learning from imbalanced data: progress and challenges, marking the 15-year anniversary”, Journal of Artificial Intelligence Research, 61, 863-905, 2018.
  • Pavithra, P., Babu, S., International Journal of Scientific Research and Engineering Development, 2, 86-90, 2019.
  • Jager, M., “Improving data imbalance using Synthetic Minority Over-sampling (SMOTE)”, www.medium.com, Date of access: 04.06.2022.
  • Probst, P., Bischl, B., Boulesteix, A.L., “Tunability: Importance of Hyperparameters of Machine Learning Algorithms”, e-Print: 1802.09596, 2018.
  • Wang, B., Gong, N.Z., “Stealing Hyperparameters in Machine Learning”, 2018 IEEE Symposium on Security and Privacy (SP), San Francisco-CA, USA, 2018.
  • Yang, L., Shami, A., “On hyperparameter optimization of machine learning algorithms: Theory and practice”, Neurocomputing, 415, 295-316, 2020.
There are 72 citations in total.

Details

Primary Language English
Subjects Statistical Data Science
Journal Section Article
Authors

Mehmet Saltı 0000-0002-1122-2464

Evrim Ersin Kangal 0000-0001-5906-3143

Bilgin Zengin 0000-0002-9355-8294

Publication Date June 30, 2024
Submission Date January 28, 2024
Acceptance Date April 3, 2024
Published in Issue Year 2024 Volume: 10 Issue: 1

Cite

IEEE M. Saltı, E. E. Kangal, and B. Zengin, “MACHINE LEARNING APPROACH TOWARDS TELEMARKETING ESTIMATION”, MEJS, vol. 10, no. 1, pp. 21–40, 2024, doi: 10.51477/mejs.1427004.

Creative Commons License

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License
