Research Article

The Multicollinearity Effect on the Performance of Machine Learning Algorithms: Case Examples in Healthcare Modelling

Year 2024, Volume: 12, Issue: 3, pp. 68-80, 25.09.2024
https://doi.org/10.21541/apjess.1371070

Abstract

Background: With the exponential growth in data volume driven by technological advances, data collected in many fields inherently contain highly correlated measurements. This problem, known as multicollinearity, degrades the performance of both statistical and machine learning algorithms. The statistical models proposed as a potential remedy have not been sufficiently evaluated in the literature, so a comprehensive comparison of statistical and machine learning models under multicollinearity is needed.
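
For readers who want to check the problem on their own data, the sketch below shows one standard multicollinearity diagnostic, the variance inflation factor (VIF). The file name, column names, and the VIF > 10 threshold are illustrative assumptions, not details taken from the paper.

```python
# Minimal diagnostic sketch (not from the paper): variance inflation
# factors (VIF) flag multicollinearity before modelling. "bodyfat.csv"
# and "BodyFat" are hypothetical stand-ins for the Body Fat data; a VIF
# above ~10 is a common rule of thumb for serious multicollinearity.
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

df = pd.read_csv("bodyfat.csv")                    # hypothetical file name
X = sm.add_constant(df.drop(columns=["BodyFat"]))  # intercept + predictors

vif = pd.Series(
    [variance_inflation_factor(X.values, i) for i in range(1, X.shape[1])],
    index=X.columns[1:],                           # skip the intercept term
)
print(vif.sort_values(ascending=False))            # largest VIFs first
```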

Methods: Four statistical models (Ridge, Liu, Lasso and Elastic Net regression) and eight widely used machine learning algorithms (CART, KNN, MLP, MARS, Cubist, SVM, Bagging and XGBoost) are comprehensively compared on two healthcare datasets that exhibit multicollinearity (Body Fat and Cancer). Model performance is assessed through cross-validation using the root mean square error (RMSE), mean absolute error (MAE) and R-squared criteria.
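
As a rough illustration of this protocol, the hedged sketch below compares Ridge, Lasso, Elastic Net, and a hand-rolled Liu estimator on synthetic correlated data. The shrinkage parameters, the simple train/test split, and the closed-form Liu estimator beta_d = (X'X + I)^(-1)(X'y + d * beta_OLS) from Kejian (1993) are assumptions for the example, not the paper's exact pipeline.

```python
# Sketch of the comparison protocol on synthetic collinear data.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import Ridge, Lasso, ElasticNet
from sklearn.metrics import mean_squared_error, mean_absolute_error, r2_score
from sklearn.model_selection import train_test_split

class LiuRegression:
    """Liu (1993) estimator: beta_d = (X'X + I)^{-1} (X'y + d * beta_OLS)."""
    def __init__(self, d=0.5):
        self.d = d          # shrinkage parameter, 0 < d < 1 (assumed value)
    def fit(self, X, y):
        XtX, Xty = X.T @ X, X.T @ y
        beta_ols = np.linalg.solve(XtX, Xty)
        self.coef_ = np.linalg.solve(XtX + np.eye(X.shape[1]),
                                     Xty + self.d * beta_ols)
        return self
    def predict(self, X):
        return X @ self.coef_

# Correlated toy data standing in for the Body Fat / Cancer sets.
X, y = make_regression(n_samples=250, n_features=10, effective_rank=3,
                       noise=5.0, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

for name, model in [("Ridge", Ridge(alpha=1.0)), ("Lasso", Lasso(alpha=0.1)),
                    ("ElasticNet", ElasticNet(alpha=0.1)),
                    ("Liu", LiuRegression(d=0.5))]:
    pred = model.fit(X_tr, y_tr).predict(X_te)
    print(f"{name:10s} RMSE={mean_squared_error(y_te, pred) ** 0.5:8.3f} "
          f"MAE={mean_absolute_error(y_te, pred):8.3f} "
          f"R2={r2_score(y_te, pred):6.3f}")
```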

Results: The statistical models outperformed the machine learning models in terms of RMSE, MAE and R-squared in both training and testing. In particular, Liu regression often achieved the best relative performance, with improvements of 7.60% to 46.08% (Body Fat) and 1.55% to 21.53% (Cancer) in training, and 1.56% to 38.08% (Body Fat) and 3.50% to 23.29% (Cancer) in testing, both among the regression methods and against the machine learning algorithms.
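
The abstract does not spell out how these percentages are computed; the conventional definition, assumed here, is the relative reduction in an error criterion such as RMSE (for R-squared, where larger is better, the terms in the numerator are reversed):

$$\text{relative improvement (\%)} = \frac{\mathrm{RMSE}_{\text{competitor}} - \mathrm{RMSE}_{\text{Liu}}}{\mathrm{RMSE}_{\text{competitor}}} \times 100$$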

Conclusions: Liu regression is largely overlooked in the machine learning literature, yet because it outperforms the most powerful and widely used machine learning algorithms here, it appears to be a promising tool in almost all fields, especially for regression-based studies on data with multicollinearity.

Ethical Statement

The authors declare that the study complies with the ethical principles stated by the journal.

Supporting Institution

All authors declare that the study was not supported by any institution or project.

Acknowledgements

The authors thank the journal staff and the reviewers in advance for their contributions to the peer-review process.

References

  • R. Ortiz, M. Contreras, and C. Mellado, ‘Regression, multicollinearity and Markowitz’, Finance Research Letters, vol. 58, p. 104550, 2023.
  • T. Haavelmo, ‘The probability approach in econometrics’, Econometrica: Journal of the Econometric Society, pp. iii–115, 1944.
  • J. Y. L. Chan, S. M. H. Leow, K. T. Bea, W. K. Cheng, S. W. Phoong, Z. W. Hong, and Y. L. Chen, ‘Mitigating the multicollinearity problem and its machine learning approach: a review’, Mathematics, vol. 10, no. 8, p. 1283, 2022.
  • A. Garg and K. Tai, ‘Comparison of statistical and machine learning methods in modelling of data with multicollinearity’, IJMIC, vol. 18, no. 4, p. 295, 2013, doi: 10.1504/IJMIC.2013.053535.
  • C. M. Stein, ‘Multiple regression contributions to probability and statistics’, Essays in Honor of Harold Hotelling, vol. 103, 1960.
  • C. M. Stein, ‘Confidence sets for the mean of a multivariate normal distribution’, Journal of the Royal Statistical Society Series B: Statistical Methodology, vol. 24, no. 2, pp. 265–285, 1962.
  • A. E. Hoerl and R. W. Kennard, ‘Ridge Regression: Applications to Nonorthogonal Problems’, Technometrics, vol. 12, no. 1, pp. 69–82, Feb. 1970, doi: 10.1080/00401706.1970.10488635.
  • K. Liu, ‘A new class of biased estimate in linear regression’, Communications in Statistics - Theory and Methods, vol. 22, no. 2, pp. 393–402, Jan. 1993, doi: 10.1080/03610929308831027.
  • R. Tibshirani, ‘Regression shrinkage and selection via the lasso’, Journal of the Royal Statistical Society Series B: Statistical Methodology, vol. 58, no. 1, pp. 267–288, 1996.
  • H. Zou and T. Hastie, ‘Regularization and variable selection via the elastic net’, Journal of the Royal Statistical Society Series B: Statistical Methodology, vol. 67, no. 2, pp. 301–320, 2005.
  • G. Li and P. Niu, ‘An enhanced extreme learning machine based on ridge regression for regression’, Neural Computing and Applications, vol. 22, pp. 803–810, 2013.
  • T. E. Panduro and B. J. Thorsen, ‘Evaluating two model reduction approaches for large scale hedonic models sensitive to omitted variables and multicollinearity’, Letters in Spatial and Resource Sciences, vol. 7, pp. 85–102, 2014.
  • G. G. Dumancas and G. Bello, ‘Comparison of machine-learning techniques for handling multicollinearity in big data analytics and high-performance data mining’, in SC15: The International Conference for High Performance Computing Networking Storage and Analysis, 2015, pp. 41–42.
  • B. Kilinc, B. Aşıkgil, A. Erar, and B. Yazici, ‘Variable selection with genetic algorithm and multivariate adaptive regression splines in the presence of multicollinearity’, International Journal of Advanced and Applied Sciences, vol. 3, no. 12, 2016.
  • A. Katrutsa and V. Strijov, ‘Comprehensive study of feature selection methods to solve multicollinearity problem according to evaluation criteria’, Expert Systems with Applications, vol. 76, pp. 1–11, Jun. 2017, doi: 10.1016/j.eswa.2017.01.048.
  • E. Hoseinzade and S. Haratizadeh, ‘CNNpred: CNN-based stock market prediction using a diverse set of variables’, Expert Systems with Applications, vol. 129, pp. 273–285, 2019.
  • J.-M. Kim, N. Wang, Y. Liu, and K. Park, ‘Residual control chart for binary response with multicollinearity covariates by neural network model’, Symmetry, vol. 12, no. 3, p. 381, 2020.
  • C. P. Obite, N. P. Olewuezi, G. U. Ugwuanyim, and D. C. Bartholomew, ‘Multicollinearity Effect in Regression Analysis: A Feed Forward Artificial Neural Network Approach’, Asian Journal of Probability and Statistics, pp. 22–33, Jan. 2020, doi: 10.9734/ajpas/2020/v6i130151.
  • Y. Hua, ‘An efficient traffic classification scheme using embedded feature selection and LightGBM’, in 2020 Information Communication Technologies Conference (ICTC), IEEE, May 2020, pp. 125–130.
  • M. Qaraad, S. Amjad, I. I. Manhrawy, H. Fathi, B. A. Hassan, and P. El Kafrawy, ‘A hybrid feature selection optimization model for high dimension data classification’, IEEE Access, vol. 9, pp. 42884–42895, 2021.
  • Y. Bi, C. Li, Y. Benezeth, and F. Yang, ‘Impacts of multicollinearity on CAPT modalities: An heterogeneous machine learning framework for computer-assisted French phoneme pronunciation training’, PLoS ONE, vol. 16, no. 10, p. e0257901, 2021.
  • A. Abubakar, U. F. Abbas, and K. E. Lasisi, ‘Remedying Multicollinearity in Quantitative Analysis: A Simulation Studies’, 2022.
  • M. Mahadi, T. Ballal, M. Moinuddin, and U. M. Al-Saggaf, ‘A recursive least-squares with a time-varying regularization parameter’, Applied Sciences, vol. 12, no. 4, p. 2077, 2022.
  • H. Kaneko, ‘Interpretation of machine learning models for data sets with many features using feature importance’, ACS Omega, vol. 8, no. 25, pp. 23218–23225, 2023.
  • M. Genç, ‘An enhanced extreme learning machine based on square-root lasso method’, Neural Processing Letters, vol. 56, no. 1, p. 5, 2024.
  • D. C. Montgomery, E. A. Peck, and G. G. Vining, Introduction to linear regression analysis. John Wiley & Sons, 2021.
  • D. A. Belsley, E. Kuh, and R. E. Welsch, Regression Diagnostics: Identifying Influential Data and Sources of Collinearity, 1st ed. in Wiley Series in Probability and Statistics. Wiley, 1980. doi: 10.1002/0471725153.
  • S. Weisberg, Applied Linear Regression, 1st ed. in Wiley Series in Probability and Statistics. Wiley, 2005. doi: 10.1002/0471704091.
  • G. Strang, Introduction to Linear Algebra. Wellesley-Cambridge Press, 2022.
  • E. Fix and J. L. Hodges, ‘Discriminatory analysis. Nonparametric discrimination: Consistency properties’, International Statistical Review/Revue Internationale de Statistique, vol. 57, no. 3, pp. 238–247, 1989.
  • N. S. Altman, ‘An introduction to kernel and nearest-neighbor nonparametric regression’, The American Statistician, vol. 46, no. 3, pp. 175–185, 1992.
  • C. Cortes and V. Vapnik, ‘Support-vector networks’, Machine learning, vol. 20, pp. 273–297, 1995.
  • H. Drucker, C. J. Burges, L. Kaufman, A. Smola, and V. Vapnik, ‘Support vector regression machines’, Advances in neural information processing systems, vol. 9, 1996.
  • L. Breiman, J. Friedman, R. Olshen, and C. Stone, ‘Classification and regression trees (CART)’, Biometrics, vol. 40, no. 3, pp. 358–361, 1984.
  • L. Breiman, ‘Bagging predictors’, Machine learning, vol. 24, pp. 123–140, 1996.
  • L. Breiman, ‘Random forests’, Machine learning, vol. 45, pp. 5–32, 2001.
  • T. Chen and C. Guestrin, ‘XGBoost: A scalable tree boosting system’, in Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, 2016, pp. 785–794.
  • J. R. Quinlan, ‘Learning with continuous classes’, in 5th Australian joint conference on artificial intelligence, World Scientific, 1992, pp. 343–348.
  • J. R. Quinlan, ‘Combining instance-based and model-based learning’, in Proceedings of the tenth international conference on machine learning, 1993, pp. 236–243.
  • J. R. Quinlan, C4.5: Programs for Machine Learning. Elsevier, 2014.
  • J. H. Friedman, ‘Multivariate adaptive regression splines’, The annals of statistics, vol. 19, no. 1, pp. 1–67, 1991.
  • S. Haykin, Neural Networks: A Comprehensive Foundation, 2nd ed. Prentice Hall, 2004.
  • R. W. Johnson, ‘Fitting percentage of body fat to simple body measurements’, Journal of Statistics Education, vol. 4, no. 1, 1996.
  • ‘Kaggle: Your Machine Learning and Data Science Community’. https://www.kaggle.com/ (accessed Sep. 22, 2023).
  • ‘Data World’. https://data.world/nrippner/cancer-trials (accessed Jul. 18, 2024).
  • D. C. Frankenfield, W. A. Rowe, R. N. Cooney, J. S. Smith, and D. Becker, ‘Limits of body mass index to detect obesity and predict body composition’, Nutrition, vol. 17, no. 1, pp. 26–30, 2001.
  • Z. G. Fthenakis, D. Balaska, and V. Zafiropulos, ‘Uncovering the FUTREX-6100XL prediction equation for the percentage body fat’, Journal of Medical Engineering & Technology, vol. 36, no. 7, pp. 351–357, 2012.
  • P. Deurenberg, J. A. Weststrate, and J. C. Seidell, ‘Body mass index as a measure of body fatness: age- and sex-specific prediction formulas’, British Journal of Nutrition, vol. 65, no. 2, pp. 105–114, 1991.
  • A. S. Jackson, P. R. Stanforth, J. Gagnon, T. Rankinen, A. S. Leon, D. C. Rao, et al., ‘The effect of sex, age and race on estimating percentage body fat from body mass index: The Heritage Family Study’, International Journal of Obesity, vol. 26, no. 6, pp. 789–796, 2002.
  • S. Meeuwsen, G. W. Horgan, and M. Elia, ‘The relationship between BMI and percent body fat, measured by bioelectrical impedance, in a large adult sample is curvilinear and influenced by age and sex’, Clinical Nutrition, vol. 29, no. 5, pp. 560–566, 2010.
  • H. Sung and J. Mun, ‘Development and cross-validation of equation for estimating percent body fat of Korean adults according to body mass index’, Journal of Obesity & Metabolic Syndrome, vol. 26, no. 2, p. 122, 2017.
  • M. K. Uçar, Z. Ucar, F. Köksal, and N. Daldal, ‘Estimation of body fat percentage using hybrid machine learning algorithms’, Measurement, vol. 167, p. 108173, 2021.
  • Y. E. Shao, ‘Body fat percentage prediction using intelligent hybrid approaches’, The Scientific World Journal, vol. 2014, 2014.
  • E. Carrizosa, L. H. Mortensen, D. R. Morales, and M. R. Sillero-Denamiel, ‘The tree based linear regression model for hierarchical categorical variables’, Expert Systems with Applications, vol. 203, p. 117423, 2022.
  • K. Kourou, T. P. Exarchos, K. P. Exarchos, M. V. Karamouzis, and D. I. Fotiadis, ‘Machine learning applications in cancer prognosis and prediction’, Computational and Structural Biotechnology Journal, vol. 13, pp. 8–17, 2015.
  • J. A. Cruz and D. S. Wishart, ‘Applications of machine learning in cancer prediction and prognosis’, Cancer Informatics, vol. 2, 2006.
  • R. B. Parikh, C. Manz, C. Chivers, S. H. Regli, J. Braun, M. E. Draugelis, et al., ‘Machine learning approaches to predict 6-month mortality among patients with cancer’, JAMA Network Open, vol. 2, no. 10, p. e1915997, 2019.
  • W. Zhu, L. Xie, J. Han, and X. Guo, ‘The application of deep learning in cancer prognosis prediction’, Cancers, vol. 12, no. 3, p. 603, 2020.
  • A. Yaqoob, R. Musheer Aziz, and N. K. Verma, ‘Applications and techniques of machine learning in cancer classification: A systematic review’, Human-Centric Intelligent Systems, vol. 3, no. 4, pp. 588–615, 2023.
  • K. Swanson, E. Wu, A. Zhang, A. A. Alizadeh, and J. Zou, ‘From patterns to patients: Advances in clinical machine learning for cancer diagnosis, prognosis, and treatment’, Cell, vol. 186, no. 8, pp. 1772–1791, 2023.

Details

Primary Language: English
Subjects: Supervised Learning, Machine Learning Algorithms, Machine Learning (Other)
Section: Research Articles
Authors

Hasan Yıldırım 0000-0003-4582-9018

Early View Date: September 25, 2024
Publication Date: September 25, 2024
Submission Date: October 4, 2023
Published Issue: Year 2024, Volume: 12, Issue: 3

How to Cite

IEEE: H. Yıldırım, "The Multicollinearity Effect on the Performance of Machine Learning Algorithms: Case Examples in Healthcare Modelling", APJESS, vol. 12, no. 3, pp. 68–80, 2024, doi: 10.21541/apjess.1371070.

Academic Platform Journal of Engineering and Smart Systems