Logistic regression is a popular statistical technique for modeling binary response variables. However, the maximum likelihood estimator underlying standard logistic regression is sensitive to correlated predictor variables and outliers, and traditional estimation techniques can become unwieldy for high-dimensional datasets. To overcome these limitations, we propose a sparse and robust logistic regression model that combines the elastic net penalty, as a sparsity-inducing regularizer, with the density power divergence for robustness. Our method employs the majorization-minimization (MM) algorithm for efficient parameter estimation and k-fold cross-validation to guarantee model stability and generalizability. The efficacy of the proposed approach in handling outliers in high dimensions is demonstrated on simulated datasets and a real-data example using the breast cancer dataset.
Keywords: Density power divergence, elastic net penalty, k-fold cross-validation, logistic regression, MM algorithm, robust, sparse
| Primary Language | English |
|---|---|
| Subjects | Computational Statistics |
| Journal Section | Research Article |
| Authors | |
| Submission Date | June 26, 2025 |
| Acceptance Date | November 8, 2025 |
| Early Pub Date | November 19, 2025 |
| Publication Date | December 30, 2025 |
| Published in Issue | Year 2025 Volume: 54 Issue: 6 |
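To make the abstract's ingredients concrete, the following is a minimal sketch of a robust sparse logistic regression estimator that minimizes the density power divergence (DPD) loss with an elastic net penalty. It is not the authors' implementation: where the paper uses an MM algorithm, this sketch substitutes a plain proximal-gradient scheme (soft-thresholding handles the l1 part of the penalty), and all function names, tuning values (`alpha`, `lam`, `l1_ratio`, `step`), and defaults are illustrative assumptions.

```python
import numpy as np

def sigmoid(z):
    # Clip to avoid overflow in exp for extreme linear predictors.
    return 1.0 / (1.0 + np.exp(-np.clip(z, -30, 30)))

def dpd_loss(beta, X, y, alpha):
    """Empirical DPD loss for binary response (additive constant dropped).

    Per observation: f(0|x)^(1+a) + f(1|x)^(1+a) - (1 + 1/a) * f(y|x)^a,
    where f(1|x) = sigmoid(x'beta). As alpha -> 0 this approaches the
    (negative) log-likelihood, recovering ordinary MLE.
    """
    p = sigmoid(X @ beta)
    f_obs = np.where(y == 1, p, 1 - p)          # f(y_i | x_i)
    return np.mean(p**(1 + alpha) + (1 - p)**(1 + alpha)
                   - (1 + 1 / alpha) * f_obs**alpha)

def dpd_grad(beta, X, y, alpha):
    """Analytic gradient of dpd_loss with respect to beta."""
    p = sigmoid(X @ beta)
    s = (1 + alpha) * (p * (1 - p) * (p**alpha - (1 - p)**alpha)
                       - y * p**alpha * (1 - p)
                       + (1 - y) * (1 - p)**alpha * p)
    return X.T @ s / len(y)

def soft_threshold(v, t):
    # Proximal operator of the l1 norm: produces exact zeros (sparsity).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def fit_dpd_enet(X, y, alpha=0.5, lam=0.05, l1_ratio=0.5,
                 step=0.1, iters=1000):
    """Proximal-gradient fit of the DPD loss + elastic net penalty.

    (Illustrative stand-in for the paper's MM algorithm; in practice lam
    and l1_ratio would be chosen by k-fold cross-validation.)
    """
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        # Smooth part: DPD gradient plus the ridge component of the penalty.
        g = dpd_grad(beta, X, y, alpha) + lam * (1 - l1_ratio) * beta
        # Nonsmooth l1 component handled by the soft-threshold (prox) step.
        beta = soft_threshold(beta - step * g, step * lam * l1_ratio)
    return beta
```

The DPD tuning parameter `alpha` trades efficiency for robustness: larger values downweight observations with small fitted probability of their observed label (e.g. mislabeled points), which is what makes the estimator resistant to outliers.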