Performance Investigation of the Entropy Measures Estimations
Year 2021, pp. 435-457, 24.05.2021
Özge Canlı, Serkan Günel
Abstract
Information flow between systems can be quantified with entropy-derived measures computed from observed state variables. In this study, the mutual information and the transfer entropy of coupled Gaussian processes, together with the entropy of a bistable system, have been estimated for this purpose. The information measures have been obtained with the histogram, the kernel density estimator, the k-nearest-neighbor entropy estimator, and the kpN entropy estimator over a range of estimator parameters. Normalized bias and normalized standard error have been calculated to assess the estimation performance, and the optimal parameters of each estimator have been determined. The k-nearest-neighbor entropy estimator is shown to perform best for the coupled Gaussian processes. Additionally, the entropy of a nonlinear system, the bistable system, has been estimated: while the kernel density estimator and the kpN entropy estimator estimate its entropy successfully, the histogram and the k-nearest-neighbor estimators exhibit high normalized bias and normalized standard error. When the optimal parameters are used, the entropies of the coupled dynamical systems are estimated with high accuracy. The transfer entropy of a real dataset has also been estimated with these methods, and the statistical significance of the resulting estimates has been assessed with hypothesis tests. The study provides a way to determine, appropriately, the parameters of the information measures that quantify the uncertainty of systems and the information flow between them.
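The abstract compares several entropy estimators; as a concrete illustration, the sketch below implements two of them, the histogram (plug-in) estimator and the Kozachenko-Leonenko k-nearest-neighbor estimator, on a one-dimensional Gaussian whose differential entropy H = 0.5 ln(2πeσ²) is known in closed form, and then computes normalized bias and normalized standard error over repeated trials. This is a minimal Python sketch, not the authors' code: the sample size, bin count, k, trial count, and the normalization by the true entropy are illustrative assumptions.

```python
import numpy as np
from scipy.special import digamma
from scipy.spatial import cKDTree

def entropy_histogram(x, n_bins=32):
    """Plug-in differential entropy from a normalized histogram."""
    p, edges = np.histogram(x, bins=n_bins, density=True)
    w = np.diff(edges)
    nz = p > 0
    return -np.sum(p[nz] * np.log(p[nz]) * w[nz])

def entropy_knn(x, k=4):
    """Kozachenko-Leonenko kNN differential entropy estimator (1-D)."""
    n = len(x)
    tree = cKDTree(x[:, None])
    eps = tree.query(x[:, None], k=k + 1)[0][:, k]  # k-th neighbor distance
    return digamma(n) - digamma(k) + np.log(2.0) + np.mean(np.log(eps))

rng = np.random.default_rng(0)
sigma, n, trials = 1.0, 2000, 100
h_true = 0.5 * np.log(2 * np.pi * np.e * sigma**2)  # analytic Gaussian entropy

for name, est in [("histogram", entropy_histogram), ("kNN", entropy_knn)]:
    h = np.array([est(rng.normal(0.0, sigma, n)) for _ in range(trials)])
    bias = (h.mean() - h_true) / h_true              # normalized bias
    se = h.std(ddof=1) / abs(h_true)                 # normalized standard error
    print(f"{name:9s}: bias = {bias:+.4f}, SE = {se:.4f}")
```

The histogram estimate depends on the bin count and the kNN estimate on k, which is exactly the parameter dependence the paper quantifies.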
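Similarly, the mutual information of two jointly Gaussian variables with correlation ρ has the closed form I = -0.5 ln(1-ρ²), which makes it a convenient benchmark for the k-nearest-neighbor estimator of Kraskov, Stögbauer and Grassberger (cited below). The sketch is a simplified, hedged rendition of KSG algorithm 1 standing in for the paper's coupled-process setup; the correlation, sample size, and k are arbitrary choices, not values from the paper.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma

def mutual_info_ksg(x, y, k=4):
    """KSG mutual information estimator (algorithm 1), 1-D signals, in nats."""
    n = len(x)
    pts = np.c_[x, y]
    # epsilon_i: max-norm distance to the k-th neighbor in the joint space
    eps = cKDTree(pts).query(pts, k=k + 1, p=np.inf)[0][:, k]
    tx, ty = cKDTree(x[:, None]), cKDTree(y[:, None])
    # marginal neighbor counts strictly within eps (self excluded)
    nx = np.array([len(tx.query_ball_point([xi], r)) - 1
                   for xi, r in zip(x, eps - 1e-12)])
    ny = np.array([len(ty.query_ball_point([yi], r)) - 1
                   for yi, r in zip(y, eps - 1e-12)])
    return digamma(k) + digamma(n) - np.mean(digamma(nx + 1) + digamma(ny + 1))

rng = np.random.default_rng(2)
rho = 0.8
cov = [[1.0, rho], [rho, 1.0]]
x, y = rng.multivariate_normal([0.0, 0.0], cov, size=4000).T
print(f"KSG: {mutual_info_ksg(x, y):.4f} nats, "
      f"analytic: {-0.5 * np.log(1 - rho**2):.4f} nats")
```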
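Finally, the transfer entropy and the surrogate-based hypothesis test mentioned in the abstract can be illustrated with the simplest of the compared estimators, the histogram. The sketch below bins (y_{t+1}, y_t, x_t), computes TE(X→Y) from the joint histogram, and tests significance by re-estimating TE on time-shuffled sources. A coupled autoregressive pair stands in for the paper's coupled Gaussian processes; the bin count, coupling strengths, and surrogate count are illustrative assumptions.

```python
import numpy as np

def transfer_entropy_binned(x, y, n_bins=8):
    """TE(X -> Y) in nats from the joint histogram of (y_{t+1}, y_t, x_t)."""
    yt1, yt, xt = y[1:], y[:-1], x[:-1]
    joint, _ = np.histogramdd((yt1, yt, xt), bins=n_bins)
    p_xyz = joint / joint.sum()
    p_yz = p_xyz.sum(axis=0)          # p(y_t, x_t)
    p_y1y = p_xyz.sum(axis=2)         # p(y_{t+1}, y_t)
    p_y = p_y1y.sum(axis=0)           # p(y_t)
    te = 0.0
    for i, j, k in zip(*np.nonzero(p_xyz)):
        num = p_xyz[i, j, k] * p_y[j]
        den = p_yz[j, k] * p_y1y[i, j]
        te += p_xyz[i, j, k] * np.log(num / den)
    return te

def surrogate_test(x, y, n_surrogates=200, seed=0):
    """p-value: is TE(X->Y) larger than TE from time-shuffled sources?"""
    rng = np.random.default_rng(seed)
    te_obs = transfer_entropy_binned(x, y)
    te_sur = np.array([transfer_entropy_binned(rng.permutation(x), y)
                       for _ in range(n_surrogates)])
    return te_obs, (np.sum(te_sur >= te_obs) + 1) / (n_surrogates + 1)

# Coupled pair: y is driven by the past of x.
rng = np.random.default_rng(1)
n = 5000
x = rng.normal(size=n)
y = np.empty(n)
y[0] = rng.normal()
for t in range(1, n):
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + 0.1 * rng.normal()

te, p = surrogate_test(x, y)
print(f"TE(X->Y) = {te:.4f} nats, p = {p:.4f}")
```

Shuffling the source destroys the directed coupling while preserving its marginal distribution, so the surrogate distribution gives a null against which the observed transfer entropy can be tested, in the spirit of the hypothesis tests applied to the real dataset.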
References
- Shannon, C. E., 1948. A mathematical theory of communication. Bell System Technical Journal, Vol. 27(3), pp. 379-423. DOI: 10.1002/j.1538-7305.1948.tb01338.x
- Cover, T. M., Thomas, J. A., 2012. Elements of Information Theory, 2nd Edition, John Wiley & Sons, 748 pp.
- Schreiber, T., 2000. Measuring information transfer. Physical Review Letters, Vol. 85(2), pp. 461-464. DOI: 10.1103/PhysRevLett.85.461
- Duan, P., Yang, F., Chen, T., Shah, S. L., 2013. Direct causality detection via the transfer entropy approach. IEEE Transactions on Control Systems Technology, Vol. 21(6), pp. 2052-2066. DOI: 10.1109/TCST.2012.2233476
- Wibral, M., Vicente, R., Lindner, M., 2014. Transfer entropy in neuroscience. pp. 3-36 in Wibral, M., Vicente, R., Lizier, J. T. (eds.), Directed Information Measures in Neuroscience, Springer, Berlin, Heidelberg, 224 pp.
- Spinney, R. E., Prokopenko, M., Lizier, J. T., 2017. Transfer entropy in continuous time, with applications to jump and neural spiking processes. Physical Review E, Vol. 95(3), 032319. DOI: 10.1103/PhysRevE.95.032319
- Xiong, W., Faes, L., Ivanov, P. C., 2017. Entropy measures, entropy estimators, and their performance in quantifying complex dynamics: Effects of artifacts, nonstationarity, and long-range correlations. Physical Review E, Vol. 95(6), 062114. DOI: 10.1103/PhysRevE.95.062114
- Kraskov, A., Stögbauer, H., Grassberger, P., 2004. Estimating mutual information. Physical Review E, Vol. 69(6), 066138. DOI: 10.1103/PhysRevE.69.066138
- Lombardi, D., Pant, S., 2016. Nonparametric k-nearest-neighbor entropy estimator. Physical Review E, Vol. 93(1), 013310. DOI: 10.1103/PhysRevE.93.013310
- Gencaga, D., Knuth, K. H., Rossow, W. B., 2015. A recipe for the estimation of information flow in a dynamical system. Entropy, Vol. 17(1), pp. 438-470. DOI: 10.3390/e17010438
- Diego, D., Haaga, K. A., Hannisdal, B., 2019. Transfer entropy computation using the Perron-Frobenius operator. Physical Review E, Vol. 99(4), 042212. DOI: 10.1103/PhysRevE.99.042212
- Wand, M. P., Jones, M. C., 1994. Kernel Smoothing, Chapman and Hall/CRC, 224 pp.
- Kozachenko, L. F., Leonenko, N. N., 1987. Sample estimate of the entropy of a random vector. Problemy Peredachi Informatsii, Vol. 23(2), pp. 9-16.
- Zhu, J., Bellanger, J. J., Shu, H., Le Bouquin Jeannès, R., 2015. Contribution to transfer entropy estimation via the k-nearest-neighbors approach. Entropy, Vol. 17(6), pp. 4173-4201. DOI: 10.3390/e17064173
- Kaiser, A., Schreiber, T., 2002. Information transfer in continuous processes. Physica D: Nonlinear Phenomena, Vol. 166(1-2), pp. 43-62. DOI: 10.1016/S0167-2789(02)00432-3
- Soong, T. T., 1973. Random Differential Equations in Science and Engineering, Academic Press, New York, 333 pp.
- Shanmugan, K. S., Breipohl, A. M., 1988. Random Signals: Detection, Estimation and Data Analysis, John Wiley & Sons, 688 pp.