Research Article
Renyi entropy in continuous case is not the limit of discrete case

Year 2016, , 113 - 117, 15.04.2016
https://doi.org/10.36753/mathenot.421418

Abstract

Shannon entropy was introduced by Shannon (1948), drawing on his experience at Bell Telephone Laboratories during and after the Second World War. Renyi (1961) later generalized it to a one-parameter family of entropies. For discrete random variables this entropy is non-negative, but in the continuous case it can be negative.
In this paper, we show that the Renyi entropy of a continuous random variable is not equal to the limit of the Renyi entropies of its discretized versions. Some notes on variant versions of entropy criteria are also derived.
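The gap between the discrete and continuous cases can be seen in a small numerical sketch (the function name and setup below are my own illustration, not the paper's construction). For the uniform density on [0,1], the differential Renyi entropy is 0 for every order α, yet the discrete Renyi entropy of an n-bin discretization equals log n, which diverges as the partition is refined instead of converging to 0:

```python
import math

def renyi_discrete(p, alpha):
    """Discrete Renyi entropy (natural log) of order alpha != 1:
    H_alpha(p) = log(sum p_i^alpha) / (1 - alpha)."""
    return math.log(sum(q ** alpha for q in p)) / (1 - alpha)

# Uniform density on [0,1]: differential Renyi entropy is 0 for every alpha.
# Discretizing into n equal bins gives p_i = 1/n, so
# H_alpha = log(n * n^(-alpha)) / (1 - alpha) = log n, unbounded in n.
for n in (10, 100, 1000):
    p = [1.0 / n] * n
    print(n, renyi_discrete(p, alpha=2.0), math.log(n))
```

The discrete entropy of the discretization exceeds the continuous value by roughly the log of the number of bins, consistent with the paper's thesis that the continuous entropy is not the plain limit of the discrete one.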

References

  • [1] Aczel, J., Forte, B. and Ng, C. T., Why Shannon and Hartley entropies are "natural". Adv. Appl. Probab. (1970), 6, 131-146.
  • [2] Campbell, L. L., Exponential entropy as a measure of extent of a distribution. Zeitschr. für Wahrsch. und verw. Geb. (1966), 5, 217-255.
  • [3] Cover, T. M. and Thomas, J. A., Elements of Information Theory. Second Edition. Wiley-Interscience. (2006).
  • [4] Daroczy, Z. and Maksa, Gy., Non-negative information functions. In: Analytic Function Methods in Probability and Statistics. Colloq. Math. Soc. J. Bolyai 21, Gyires, B., Ed.; North-Holland: Amsterdam. (1979), 65-76.
  • [5] Diderrich, G., The role of boundedness in characterizing Shannon entropy. Information and Control. (1975), 29, 140-161.
  • [6] Faddeev, D. K., On the concept of entropy of a finite probability scheme (in Russian). Uspehi Mat. Nauk. (1956), 11, 227-231.
  • [7] Hartley, R. V. L., Transmission of information. Bell System Technical Journal. (1928), 7, 535-563.
  • [8] Havrda, J. and Charvat, F., Quantification method of classification processes: Concept of structural α-entropy. Kybernetika. (1967), 3, 30-35.
  • [9] Koski, T. and Persson, L. E., Some properties of generalized exponential entropies with applications to data compression. Information Theory. 50 (1992), 6, 1220-1228.
  • [10] Lee, P. M., On the axioms of information theory. Ann. Math. Statist. (1964), 35, 415-418.
  • [11] Renyi, A., On measures of entropy and information. Proc. Berkeley Symposium, Statist. Probability. (1961), 1, 547-561.
  • [12] Shannon, C. E., A mathematical theory of communication. Bell System Technical Journal. (1948), 27, 379-423.
  • [13] Tsallis, C., Possible generalizations of Boltzmann-Gibbs statistics. Journal of Statistical Physics. (1988), 52, 479-487.
  • [14] Tverberg, H., A new derivation of the information function. Math. Scand. (1958), 6, 297-298.
There are 14 citations in total.

Details

Primary Language English
Journal Section Articles
Authors

M. Sanei Tabass

G. R. Mohtashami Borzadaran

M. Amini

Publication Date April 15, 2016
Submission Date September 11, 2015
Published in Issue Year 2016

Cite

APA Tabass, M. S., Borzadaran, G. R. M., & Amini, M. (2016). Renyi entropy in continuous case is not the limit of discrete case. Mathematical Sciences and Applications E-Notes, 4(1), 113-117. https://doi.org/10.36753/mathenot.421418
AMA Tabass MS, Borzadaran GRM, Amini M. Renyi entropy in continuous case is not the limit of discrete case. Math. Sci. Appl. E-Notes. April 2016;4(1):113-117. doi:10.36753/mathenot.421418
Chicago Tabass, M. Sanei, G. R. Mohtashami Borzadaran, and M. Amini. “Renyi Entropy in Continuous Case Is Not the Limit of Discrete Case”. Mathematical Sciences and Applications E-Notes 4, no. 1 (April 2016): 113-17. https://doi.org/10.36753/mathenot.421418.
EndNote Tabass MS, Borzadaran GRM, Amini M (April 1, 2016) Renyi entropy in continuous case is not the limit of discrete case. Mathematical Sciences and Applications E-Notes 4 1 113–117.
IEEE M. S. Tabass, G. R. M. Borzadaran, and M. Amini, “Renyi entropy in continuous case is not the limit of discrete case”, Math. Sci. Appl. E-Notes, vol. 4, no. 1, pp. 113–117, 2016, doi: 10.36753/mathenot.421418.
ISNAD Tabass, M. Sanei et al. “Renyi Entropy in Continuous Case Is Not the Limit of Discrete Case”. Mathematical Sciences and Applications E-Notes 4/1 (April 2016), 113-117. https://doi.org/10.36753/mathenot.421418.
JAMA Tabass MS, Borzadaran GRM, Amini M. Renyi entropy in continuous case is not the limit of discrete case. Math. Sci. Appl. E-Notes. 2016;4:113–117.
MLA Tabass, M. Sanei et al. “Renyi Entropy in Continuous Case Is Not the Limit of Discrete Case”. Mathematical Sciences and Applications E-Notes, vol. 4, no. 1, 2016, pp. 113-7, doi:10.36753/mathenot.421418.
Vancouver Tabass MS, Borzadaran GRM, Amini M. Renyi entropy in continuous case is not the limit of discrete case. Math. Sci. Appl. E-Notes. 2016;4(1):113-7.

The published articles in MSAEN are licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.