TY - JOUR
T1 - Renyi entropy in continuous case is not the limit of discrete case
AU - Tabass, M. Sanei
AU - Borzadaran, G. R. Mohtashami
AU - Amini, M.
PY - 2016
DA - April
DO - 10.36753/mathenot.421418
JF - Mathematical Sciences and Applications E-Notes
JO - Math. Sci. Appl. E-Notes
PB - Murat TOSUN
WT - DergiPark
SN - 2147-6268
SP - 113
EP - 117
VL - 4
IS - 1
LA - en
AB - Shannon entropy is the creation of Shannon (1948), based on his experiences at the Bell System Company during and after the Second World War. Renyi (1961) later generalized it to a one-parameter family of entropies. This entropy is non-negative for discrete random variables, but it can be negative in the continuous case. In this paper, we show that the Renyi entropy of a continuous random variable is not equal to the limit of the Renyi entropy of its discretized versions. Some notes are also derived in view of variate versions of the entropy criteria.
CR - [1] Aczel, J., Forte, B. and Ng, C. T., Why Shannon and Hartley entropies are "natural". Adv. Appl. Probab. (1970), 6, 131-146.
CR - [2] Campbell, L. L., Exponential entropy as a measure of extent of a distribution. Zeitschr. fur Wahrsch. und verw. Geb. (1966), 5, 217-255.
CR - [3] Cover, T. M. and Thomas, J. A., Elements of Information Theory. Second Edition. Wiley Interscience. (2006).
CR - [4] Daroczy, Z. and Maksa, Gy., Non-negative information functions, in Analytic Function Methods in Probability and Statistics. Colloq. Math. Soc. J. Bolyai 21, Gyires, B., Ed.; North Holland: Amsterdam. (1979), 65-76.
CR - [5] Diderrich, G., The role of boundedness in characterizing Shannon entropy. Information and Control. (1975), 29, 140-161.
CR - [6] Faddeev, D. K., On the concept of entropy of a finite probability scheme (in Russian). Uspehi Mat. Nauk. (1956), 11, 227-231.
CR - [7] Hartley, R. V. L., Transmission of information. Bell System Technical Journal. (1928), 7, 535-563.
CR - [8] Havrda, J. and Charvat, F., Quantification method of classification processes: Concept of structural α-entropy. Kybernetika. (1967), 3, 30-35.
CR - [9] Koski, T. and Persson, L. E., Some properties of generalized exponential entropies with applications to data compression. Information Theory. 50 (1992), 6, 1220-1228.
CR - [10] Lee, P. M., On the axioms of information theory. Ann. Math. Statist. (1964), 35, 415-418.
CR - [11] Renyi, A., On measures of entropy and information. Proc. Berkeley Symposium on Mathematical Statistics and Probability. (1961), 1, 547-561.
CR - [12] Shannon, C. E., A mathematical theory of communication. Bell System Technical Journal. (1948), 27, 379-423.
CR - [13] Tsallis, C., Possible generalizations of Boltzmann-Gibbs statistics. Journal of Statistical Physics. (1988), 52, 479-487.
CR - [14] Tverberg, H., A new derivation of the information function. Math. Scand. (1958), 6, 297-298.
UR - https://doi.org/10.36753/mathenot.421418
L1 - https://dergipark.org.tr/en/download/article-file/468595
ER -