Research Article

Robust Group Identification and Variable Selection in Sliced Inverse Regression Using Tukey's Biweight Criterion and Ball Covariance

Year 2022, Volume 35, Issue 2, 748 - 763, 01.06.2022
https://doi.org/10.35378/gujs.735503

Abstract

SSIR-PACS is a model-free variable selection and group identification method in the sufficient dimension reduction (SDR) setting. It combines Pairwise Absolute Clustering and Sparsity (PACS) with sliced inverse regression (SIR) to produce sparse solutions that can also identify groups of predictors. However, SSIR-PACS relies on classical location and dispersion estimates, a squared loss function, and weights that are not robust to outliers. In this paper, a robust version of SSIR-PACS (RSSIR-PACS) is proposed. The squared loss is replaced by Tukey's biweight criterion, and the outlier-sensitive weights, which depend on Pearson's correlations, are replaced by robust weights based on the recently developed ball correlation. Moreover, the estimates of the mean and covariance matrix are replaced by the median and the ball covariance, respectively. RSSIR-PACS is robust to outliers in both the response and the covariates. Simulation results show that RSSIR-PACS performs very well; when outliers are present, it is considerably more effective than its competitors. In addition, a robust criterion for estimating the structural dimension d is proposed. These changes make SSIR-PACS practically feasible for contaminated data. Real data are also used to demonstrate the utility of RSSIR-PACS.

References

  • [1] Li, K., “Sliced inverse regression for dimension reduction (with discussion)”, Journal of the American Statistical Association, 86: 316–342, (1991).
  • [2] Cook, R. D., “Regression graphics: ideas for studying regressions through graphics”, New York, Wiley, (1998).
  • [3] Xia, Y., Tong, H., Li, W., Zhu, L., “An adaptive estimation of dimension reduction space”, Journal of the Royal Statistical Society: Series B, 64: 363–410, (2002).
  • [4] Cook, R. D., “Testing predictor contributions in sufficient dimension reduction”, Annals of Statistics, 32: 1061–1092, (2004).
  • [5] Ni, L., Cook, R. D., Tsai, C. L., “A note on shrinkage sliced inverse regression”, Biometrika, 92: 242–247, (2005).
  • [6] Li, L., Nachtsheim, C. J., “Sparse sliced inverse regression”, Technometrics, 48: 503–510, (2006).
  • [7] Li, L., “Sparse sufficient dimension reduction”, Biometrika, 94: 603–613, (2007).
  • [8] Zhu, L., Miao, B., Peng, H., “On sliced inverse regression with large dimensional covariates”, Journal of the American Statistical Association, 101: 630–643, (2006).
  • [9] Lin, Q., Zhao, Z., Liu, J., “Sparse sliced inverse regression via lasso”, Journal of the American Statistical Association, 114: 1726–1739, (2019).
  • [10] Tibshirani, R., “Regression shrinkage and selection via the Lasso”, Journal of the Royal Statistical Society: Series B, 58: 267–288, (1996).
  • [11] Fan, J., Li, R. Z., “Variable selection via non-concave penalized likelihood and its oracle properties”, Journal of the American Statistical Association, 96: 1348–1360, (2001).
  • [12] Zou, H., Hastie, T., “Regularization and variable selection via the elastic net”, Journal of the Royal Statistical Society: Series B, 67: 301–320, (2005).
  • [13] Yuan, M., Lin, Y., “Model selection and estimation in regression with grouped variables”, Journal of the Royal Statistical Society: Series B, 68: 49–67, (2006).
  • [14] Zou, H., “The adaptive Lasso and its oracle properties”, Journal of the American Statistical Association, 101: 1418–1429, (2006).
  • [15] Bondell, H. D., Reich, B. J., “Simultaneous regression shrinkage, variable selection and clustering of predictors with OSCAR”, Biometrics, 64: 115–123, (2008).
  • [16] Zhang, C. H., “Nearly unbiased variable selection under minimax concave penalty”, Annals of Statistics, 38: 894–942, (2010).
  • [17] Sharma, D. B., Bondell, H. D., Zhang, H. H., “Consistent group identification and variable selection in regression with correlated predictors”, Journal of Computational and Graphical Statistics, 22: 319–340, (2013).
  • [18] Naik, P. A., Tsai, C. L., “Single-index model selections”, Biometrika, 88: 821–832, (2001).
  • [19] Li, L., Yin, X., “Sliced Inverse Regression with regularizations”, Biometrics, 64: 124–131, (2008).
  • [20] Alkenani, A., “Group identification and variable selection in sliced inverse regression with correlated predictors”, Journal of Modern Applied Statistical Methods, (2020).
  • [21] Gather, U., Hilker, T., Becker C., “A note on outlier sensitivity of sliced inverse regression”, Statistics, 36: 271–281, (2002).
  • [22] Yohai, V., Sertter, M., “A robust proposal for sliced inverse regression”, International Conference on Robust Statistics, abstract, (2005).
  • [23] Prendergast, L., “Influence functions for sliced inverse regression”, Scandinavian Journal of Statistics, 32: 385–404, (2005).
  • [24] Alkenani, A., Dikheel, T., “Robust Group Identification and Variable Selection in Regression”, Journal of Probability and Statistics, 2017, Article ID 2170816, 8 pages, (2017).
  • [25] Rousseeuw, P., Yohai, V., “Robust regression by means of S-estimators”, Robust and Nonlinear Time Series Analysis, 256–272, (1984).
  • [26] Tukey, J. W., “A survey of sampling from contaminated distributions”, Contributions to Probability and Statistics, 2: 448–485, (1960).
  • [27] Pan, W., Wang, X., Xiao, W., Zhu, H., “A generic sure independence screening procedure”, Journal of the American Statistical Association, 1–29, (2018), (just accepted).
  • [28] Zhang, J., Chen, X., “Robust sufficient dimension reduction via ball covariance”, Computational Statistics and Data Analysis, 140: 144–154, (2019).
  • [29] Yao, W., Wang, Q., “Robust variable selection through MAVE”, Computational Statistics and Data Analysis, 63: 42–49, (2013).
  • [30] Schott, J. R., “Determining the dimensionality in sliced inverse regression”, Journal of the American Statistical Association, 89: 141–148, (1994).
  • [31] Bura, E., Cook, R. D., “Extending sliced inverse regression: The weighted chi-squared test”, Journal of the American Statistical Association, 96: 996–1003, (2001).
  • [32] Cook, R. D., Yin, X., “Dimension reduction and visualization in discriminant analysis”, Australian and New Zealand Journal of Statistics, 43: 147–177, (2001).
  • [33] Zhu, Y., Zeng, P., “Fourier methods for estimating the central subspace and the central mean subspace in regression”, Journal of the American Statistical Association, 101: 1638–1651, (2006).
  • [34] McDonald, G. C., Schwing, R. C., “Instabilities of regression estimates relating air pollution to mortality”, Technometrics, 15: 463–481, (1973).

Details

Primary Language: English
Subjects: Engineering
Journal Section: Statistics
Authors: Ali Alkenani (ORCID: 0000-0001-5067-2321)
Publication Date: June 1, 2022
Published in Issue: Year 2022, Volume 35, Issue 2

Cite

APA Alkenani, A. (2022). Robust Group Identification and Variable Selection in Sliced Inverse Regression Using Tukey’s Biweight Criterion and Ball Covariance. Gazi University Journal of Science, 35(2), 748-763. https://doi.org/10.35378/gujs.735503
AMA Alkenani A. Robust Group Identification and Variable Selection in Sliced Inverse Regression Using Tukey’s Biweight Criterion and Ball Covariance. Gazi University Journal of Science. June 2022;35(2):748-763. doi:10.35378/gujs.735503
Chicago Alkenani, Ali. “Robust Group Identification and Variable Selection in Sliced Inverse Regression Using Tukey’s Biweight Criterion and Ball Covariance”. Gazi University Journal of Science 35, no. 2 (June 2022): 748-63. https://doi.org/10.35378/gujs.735503.
EndNote Alkenani A (June 1, 2022) Robust Group Identification and Variable Selection in Sliced Inverse Regression Using Tukey’s Biweight Criterion and Ball Covariance. Gazi University Journal of Science 35 2 748–763.
IEEE A. Alkenani, “Robust Group Identification and Variable Selection in Sliced Inverse Regression Using Tukey’s Biweight Criterion and Ball Covariance”, Gazi University Journal of Science, vol. 35, no. 2, pp. 748–763, 2022, doi: 10.35378/gujs.735503.
ISNAD Alkenani, Ali. “Robust Group Identification and Variable Selection in Sliced Inverse Regression Using Tukey’s Biweight Criterion and Ball Covariance”. Gazi University Journal of Science 35/2 (June 2022), 748-763. https://doi.org/10.35378/gujs.735503.
JAMA Alkenani A. Robust Group Identification and Variable Selection in Sliced Inverse Regression Using Tukey’s Biweight Criterion and Ball Covariance. Gazi University Journal of Science. 2022;35(2):748–763.
MLA Alkenani, Ali. “Robust Group Identification and Variable Selection in Sliced Inverse Regression Using Tukey’s Biweight Criterion and Ball Covariance”. Gazi University Journal of Science, vol. 35, no. 2, 2022, pp. 748-63, doi:10.35378/gujs.735503.
Vancouver Alkenani A. Robust Group Identification and Variable Selection in Sliced Inverse Regression Using Tukey’s Biweight Criterion and Ball Covariance. Gazi University Journal of Science. 2022;35(2):748-63.