Research Article

An estimation of Phi divergence and its application in testing normality

Year 2020, Volume: 49 Issue: 6, 2104 - 2118, 08.12.2020
https://doi.org/10.15672/hujms.629192

Abstract

In this article, a new goodness-of-fit test for normality based on Phi divergence is introduced. The test statistic is estimated using spacings, and the consistency of the test is proved. Several special cases of Phi divergence are then substituted into the statistic, and the power of each resulting test is compared by Monte Carlo simulation with competitors based on Phi divergence estimated through a kernel density function, as well as with some classical tests. It is shown that, within each group of alternatives (grouped by symmetry and support), one of the special cases of the Phi-divergence-based test is the most powerful.
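The published statistic itself is not reproduced on this page. As a rough illustration of the general spacing-based approach (not the authors' estimator), the sketch below implements a Vasicek-type normality test built on the Kullback-Leibler special case of Phi divergence, phi(t) = t log t: the sample entropy is estimated from spacings of the order statistics and compared with the entropy of a fitted normal model. The window width m, the Monte Carlo calibration of the null distribution, and all function names are illustrative assumptions.

```python
import numpy as np

def vasicek_entropy(x, m=None):
    """Spacing-based estimate of differential entropy (Vasicek, 1976):
    H_mn = (1/n) * sum_i log( n * (x_(i+m) - x_(i-m)) / (2m) ),
    where order statistics below x_(1) or above x_(n) are clamped."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    if m is None:
        m = int(np.sqrt(n)) + 1                  # heuristic window width
    upper = x[np.minimum(np.arange(n) + m, n - 1)]
    lower = x[np.maximum(np.arange(n) - m, 0)]
    return np.mean(np.log(n * (upper - lower) / (2 * m)))

def kl_normality_statistic(x, m=None):
    """KL-type statistic: entropy of the fitted normal minus the sample entropy.
    Under normality it is close to zero; large values signal departure from
    the normal model (the phi(t) = t*log(t) case of Phi divergence)."""
    x = np.asarray(x, dtype=float)
    sigma = x.std(ddof=0)                        # MLE of the normal scale
    normal_entropy = 0.5 * np.log(2 * np.pi * np.e * sigma ** 2)
    return normal_entropy - vasicek_entropy(x, m)

def monte_carlo_pvalue(x, m=None, n_rep=2000, seed=None):
    """Approximate p-value: the null distribution of the statistic is
    simulated from standard-normal samples of the same size."""
    rng = np.random.default_rng(seed)
    stat = kl_normality_statistic(x, m)
    null = np.array([kl_normality_statistic(rng.standard_normal(len(x)), m)
                     for _ in range(n_rep)])
    return float(np.mean(null >= stat))

# Example: an exponential sample (skewed, positive support) should be rejected.
rng = np.random.default_rng(0)
print(monte_carlo_pvalue(rng.exponential(size=50), seed=1))
```

Because the statistic is location-scale invariant, its null distribution depends only on the sample size, so critical values can be simulated once from standard normal samples and reused for any data set of that size.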

References

  • [1] I.A. Ahmad and P.E. Lin, A nonparametric estimation of the entropy for absolutely continuous distributions, IEEE Trans. Inform. Theory 22, 372-375, 1976.
  • [2] H. Alizadeh Noughabi, A new estimator of Kullback–Leibler information and its application in goodness of fit tests, J. Stat. Comput. Simul. 89 (10), 1914-1934, 2019.
  • [3] H. Alizadeh Noughabi, A new estimator of entropy and its application in testing normality, J. Stat. Comput. Simul. 80, 1151-1162, 2010.
  • [4] H. Alizadeh Noughabi and N.R. Arghami, Monte Carlo comparison of seven normality tests, J. Stat. Comput. Simul. 81, 965-972, 2011a.
  • [5] H. Alizadeh Noughabi and N.R. Arghami, Testing exponentiality based on characterizations of the exponential distribution, J. Stat. Comput. Simul. 81, 1641-1651, 2011b.
  • [6] H. Alizadeh Noughabi and N. Balakrishnan, Tests of goodness of fit based on Phi-divergence, J. Appl. Stat. 43 (3), 412-429, 2016.
  • [7] T.W. Anderson and D.A. Darling, A test of goodness of fit, J. Amer. Statist. Assoc. 49, 765-769, 1954.
  • [8] I. Arizono and H. Ohta, A test for normality based on Kullback-Leibler information, Amer. Statist. 43, 20-23, 1989.
  • [9] V. Balakrishnan and L.D. Sanghvi, Distance between populations on the basis of attribute, Biometrics 24, 859-865, 1968.
  • [10] C.I. Bliss, Statistics in Biology: Statistical methods for research in the natural sciences, McGraw-Hill Book Company, New York, 1967.
  • [11] B. Choi, Improvement of goodness of fit test for normal distribution, J. Stat. Comput. Simul. 78, 781-788, 2008.
  • [12] J.C. Correa, A new estimator of entropy, Comm. Statist. Theory Methods 24, 2439-2449, 1995.
  • [13] H. Cramér, On the composition of elementary errors, Scand. Actuar. J. 1, 13-74, 1928.
  • [14] E.S. Dudewicz and E.C. Van der Meulen, Entropy-based tests of uniformity, J. Amer. Statist. Assoc. 76, 967-974, 1981.
  • [15] A.J. Duncan, Quality Control and Industrial Statistics, Irwin, Homewood (IL), 1974.
  • [16] N. Ebrahimi, M. Habibullah and E.S. Soofi, Testing exponentiality based on Kullback-Leibler information, J. R. Stat. Soc. Ser. B. Stat. Methodol. 54, 739-748, 1992.
  • [17] N. Ebrahimi, K. Pflughoeft and E.S. Soofi, Two measures of sample entropy, Statist. Probab. Lett. 20, 225-234, 1994.
  • [18] M.D. Esteban, M.E. Castellanos, D. Morales and I. Vajda, Monte Carlo comparison of four normality tests using different entropy estimates, Comm. Statist. Simulation Comput. 30, 761-785, 2001.
  • [19] C.M. Jarque and A.K. Bera, A test for normality of observations and regression residuals, Int. Stat. Rev. 55, 163-172, 1987.
  • [20] H. Joe, Estimation of entropy and other functionals of a multivariate density, Ann. Inst. Statist. Math. 41, 683-697, 1989.
  • [21] O. Karadag and S. Aktas, Goodness of fit tests for generalized gamma distribution, International Conference of Numerical Analysis and Applied Mathematics 2015, AIP Conference Proceedings, 1738, 2016.
  • [22] A.N. Kolmogorov, Sulla determinazione empirica di una legge di distribuzione, Giornale dell'Istituto Italiano degli Attuari 4, 83-91, 1933.
  • [23] N.H. Kuiper, Tests concerning random points on a circle, Proc. K. Ned. Akad. Wet. 63, 38-47, 1960.
  • [24] S. Lee, Entropy-based goodness of fit test for a composite hypothesis, Bull. Korean Math. Soc. 53 (2), 351-363, 2016.
  • [25] J. Lequesne, Entropy-based goodness-of-fit test: Application to the Pareto distribution, AIP Conf. Proc. of the International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering, 1553, 155-162, 2013.
  • [26] J. Lequesne, A goodness-of-fit test of Student distributions based on Rényi entropy, AIP Conf. Proc. of the International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering, 1641, 487-494, 2015.
  • [27] J. Lequesne and P. Regnault, Goodness-of-fit tests based on entropy: R package KL-goftest, Work in progress, 2017.
  • [28] S. Park, A goodness-of-fit test for normality based on the sample entropy of order statistics, Statist. Probab. Lett. 44 (4), 359-363, 1999.
  • [29] K. Pearson, On the criterion that a given system of deviations from the probable in the case of a correlated system of variables is such that it can be reasonably supposed to have arisen from random sampling, Philos. Mag. 50, 157-175, 1900.
  • [30] P. Puig and M.A. Stephens, Tests of fit for the Laplace distribution with applications, Technometrics 42 (4), 417-424, 2000.
  • [31] X. Romao, R. Delgado and A. Costa, An empirical power comparison of univariate goodness-of-fit tests for normality, J. Stat. Comput. Simul. 80 (5), 545-591, 2010.
  • [32] C.E. Shannon, A mathematical theory of communication, Bell Syst. Tech. J. 27, 379-423 and 623-656, 1948.
  • [33] S.S. Shapiro and M.B. Wilk, An analysis of variance test for normality (complete samples), Biometrika 52, 591-611, 1965.
  • [34] N. Smirnov, Table for estimating the goodness of fit of empirical distributions, Ann. Math. Stat. 19 (2), 279-281, 1948.
  • [35] K.S. Song, Goodness-of-fit tests based on Kullback-Leibler discrimination information, IEEE Trans. Inf. Theory 48 (5), 1103-1117, 2002.
  • [36] B. Van Es, Estimating functionals related to a density by a class of statistics based on spacings, Scand. J. Stat. 19, 61-72, 1992.
  • [37] O. Vasicek, A test for normality based on sample entropy, J. R. Stat. Soc. Ser. B. Stat. Methodol. 38, 730-737, 1976.
  • [38] R.E. Von Mises, Wahrscheinlichkeit, Statistik und Wahrheit, Julius Springer, 1928.
  • [39] X. Wang, Y. Liu and B. Han, Goodness-of-fit tests based on Bernstein distribution estimator, J. Nonparametr. Stat., 2018.
  • [40] G.S. Watson, Goodness of fit tests on a circle, Biometrika 48, 109-114, 1961.
  • [41] P. Wieczorkowski and P. Grzegorzewski, Entropy estimators - improvements and comparisons, Comm. Statist. Simulation Comput. 28, 541-567, 1999.
  • [42] F. Yousefzadeh and N.R. Arghami, Testing exponentiality based on type II censored data and a new cdf estimator, Comm. Statist. Simulation Comput. 37, 1479-1499, 2008.
There are 42 citations in total.

Details

Primary Language: English
Subjects: Statistics
Journal Section: Statistics
Authors

Mahsa Tavakoli (ORCID: 0000-0002-8099-9007)

Hadi Alizadeh Noughabi (ORCID: 0000-0002-7515-1896)

Gholam Reza Mohtashami Borzadaran (ORCID: 0000-0002-8841-1386)

Publication Date: December 8, 2020
Published in Issue: Year 2020, Volume: 49, Issue: 6

Cite

APA Tavakoli, M., Alizadeh Noughabi, H., & Mohtashami Borzadaran, G. R. (2020). An estimation of Phi divergence and its application in testing normality. Hacettepe Journal of Mathematics and Statistics, 49(6), 2104-2118. https://doi.org/10.15672/hujms.629192
AMA Tavakoli M, Alizadeh Noughabi H, Mohtashami Borzadaran GR. An estimation of Phi divergence and its application in testing normality. Hacettepe Journal of Mathematics and Statistics. December 2020;49(6):2104-2118. doi:10.15672/hujms.629192
Chicago Tavakoli, Mahsa, Hadi Alizadeh Noughabi, and Gholam Reza Mohtashami Borzadaran. “An Estimation of Phi Divergence and Its Application in Testing Normality”. Hacettepe Journal of Mathematics and Statistics 49, no. 6 (December 2020): 2104-18. https://doi.org/10.15672/hujms.629192.
EndNote Tavakoli M, Alizadeh Noughabi H, Mohtashami Borzadaran GR (December 1, 2020) An estimation of Phi divergence and its application in testing normality. Hacettepe Journal of Mathematics and Statistics 49 6 2104–2118.
IEEE M. Tavakoli, H. Alizadeh Noughabi, and G. R. Mohtashami Borzadaran, “An estimation of Phi divergence and its application in testing normality”, Hacettepe Journal of Mathematics and Statistics, vol. 49, no. 6, pp. 2104–2118, 2020, doi: 10.15672/hujms.629192.
ISNAD Tavakoli, Mahsa et al. “An Estimation of Phi Divergence and Its Application in Testing Normality”. Hacettepe Journal of Mathematics and Statistics 49/6 (December 2020), 2104-2118. https://doi.org/10.15672/hujms.629192.
JAMA Tavakoli M, Alizadeh Noughabi H, Mohtashami Borzadaran GR. An estimation of Phi divergence and its application in testing normality. Hacettepe Journal of Mathematics and Statistics. 2020;49:2104–2118.
MLA Tavakoli, Mahsa et al. “An Estimation of Phi Divergence and Its Application in Testing Normality”. Hacettepe Journal of Mathematics and Statistics, vol. 49, no. 6, 2020, pp. 2104-18, doi:10.15672/hujms.629192.
Vancouver Tavakoli M, Alizadeh Noughabi H, Mohtashami Borzadaran GR. An estimation of Phi divergence and its application in testing normality. Hacettepe Journal of Mathematics and Statistics. 2020;49(6):2104-18.