Review

Entropy in Artificial Intelligence Methods

Year 2019, Volume: 1 Issue: 1, 15 - 22, 30.12.2019

Abstract

The importance of measuring information, assessing its quality, and attributing meaning to it becomes clearer when one considers factors such as the historical growth in the capacity of information and the development of systems built to manage and direct it. Information Theory, introduced by Claude E. Shannon, revolutionized the controlled management of information. The concept of entropy, which many subsequent methods use to quantify the uncertainty of information, was developed on the basis of the properties Shannon introduced. In this context entropy, as a measure of disorder in the quantification of information, is important in many areas, and the rapidly growing field of Artificial Intelligence is one of them. With the development of Big Data and Deep Learning in particular, Artificial Intelligence methods can now operate on ever larger datasets. This study examines how the concept of entropy is used in several methods from the field of Artificial Intelligence. The review finds that, although the underlying logic and concept are the same across methods, they differ in how entropy is applied methodologically. A main objective of the study is to provide a perspective that supports the development of novel methods in the context of entropy and artificial intelligence.
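
As a concrete illustration of the entropy concept reviewed above, the short Python sketch below computes Shannon entropy and the entropy-based information-gain criterion that ID3-style decision trees (cf. [9], [16]) build on. It is a minimal example written for this summary, not code from the reviewed methods; the function names and toy data are illustrative.

import math
from collections import Counter

def shannon_entropy(labels):
    # H(X) = -sum(p * log2(p)) over the empirical label distribution
    counts = Counter(labels)
    total = len(labels)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def information_gain(labels, feature_values):
    # Entropy reduction obtained by splitting the labels on one feature,
    # i.e. the split criterion used by ID3-style decision trees.
    total = len(labels)
    groups = {}
    for value, label in zip(feature_values, labels):
        groups.setdefault(value, []).append(label)
    conditional = sum(len(g) / total * shannon_entropy(g) for g in groups.values())
    return shannon_entropy(labels) - conditional

# Toy data: two balanced classes split on a single categorical feature.
labels = ["yes", "yes", "no", "no", "yes", "no"]
feature = ["sunny", "sunny", "rain", "rain", "rain", "sunny"]
print(round(shannon_entropy(labels), 3))           # 1.0: maximal uncertainty
print(round(information_gain(labels, feature), 3)) # 0.082: entropy removed by the split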

References

  • [1] R. Clausius, "Über verschiedene für die Anwendung bequeme Formen der Hauptgleichungen der mechanischen Wärmetheorie," Annalen der Physik, vol. 201, pp. 353-400, 1865.
  • [2] M. Badino, "Mechanistic slumber vs. statistical insomnia: the early history of Boltzmann’s H-theorem (1868–1877)," The European Physical Journal H, vol. 36, pp. 353-378, 2011.
  • [3] J. W. Gibbs, On the equilibrium of heterogeneous substances, 1879.
  • [4] H. Nyquist, "Certain factors affecting telegraph speed," Transactions of the American Institute of Electrical Engineers, vol. 43, pp. 412-422, 1924.
  • [5] R. V. Hartley, "Transmission of information," Bell System Technical Journal, vol. 7, pp. 535-563, 1928.
  • [6] C. E. Shannon, "A mathematical theory of communication," Bell System Technical Journal, vol. 27, pp. 379-423, 1948.
  • [7] J. E. Shore and R. W. Johnson, "Axiomatic derivation of the principle of maximum entropy and the principle of minimum cross-entropy," IEEE Transactions on Information Theory, vol. 26, pp. 26-37, 1980.
  • [8] S. F. Gull, "Bayesian inductive inference and maximum entropy," in Maximum-Entropy and Bayesian Methods in Science and Engineering, Springer, 1988, pp. 53-74.
  • [9] B. Hssina, A. Merbouha, H. Ezzikouri, and M. Erritali, "A comparative study of decision tree ID3 and C4.5," International Journal of Advanced Computer Science and Applications, vol. 4, 2014.
  • [10] A. Janecek, W. Gansterer, M. Demel, and G. Ecker, "On the relationship between feature selection and classification accuracy," in New Challenges for Feature Selection in Data Mining and Knowledge Discovery, 2008, pp. 90-105.
  • [11] N. Murata, S. Yoshizawa, and S.-i. Amari, "Network information criterion-determining the number of hidden units for an artificial neural network model," IEEE Transactions on Neural Networks, vol. 5, pp. 865-872, 1994.
  • [12] C.-Y. J. Peng, K. L. Lee, and G. M. Ingersoll, "An introduction to logistic regression analysis and reporting," The Journal of Educational Research, vol. 96, pp. 3-14, 2002.
  • [13] C. Liu and H.-Y. Shum, "Kullback-Leibler boosting," in Proceedings of the 2003 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR), 2003, pp. I-I.
  • [14] R. Jenssen, K. Hild, D. Erdogmus, J. C. Principe, and T. Eltoft, "Clustering using Renyi's entropy," in Proceedings of the International Joint Conference on Neural Networks (IJCNN), 2003, pp. 523-528.
  • [15] R.-P. Li and M. Mukaidono, "A maximum-entropy approach to fuzzy clustering," in Proceedings of the 1995 IEEE International Conference on Fuzzy Systems and the Second International Fuzzy Engineering Symposium (FUZZ-IEEE/IFES '95), 1995, pp. 2227-2232.
  • [16] C. Jin, L. De-Lin, and M. Fen-Xiang, "An improved ID3 decision tree algorithm," in Proceedings of the 4th International Conference on Computer Science & Education (ICCSE 2009), 2009, pp. 127-130.
  • [17] C.-L. Chang and M.-Y. Hsu, "The study that applies artificial intelligence and logistic regression for assistance in differential diagnostic of pancreatic cancer," Expert Systems with Applications, vol. 36, pp. 10663-10672, 2009.
  • [18] G. Li and J. Shi, "On comparing three artificial neural networks for wind speed forecasting," Applied Energy, vol. 87, pp. 2313-2320, 2010.
  • [19] D. Yu, K. Yao, H. Su, G. Li, and F. Seide, "KL-divergence regularized deep neural network adaptation for improved large vocabulary speech recognition," in Proceedings of the 2013 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2013, pp. 7893-7897.

Yapay Zekâ Yöntemlerinde Entropi

Year 2019, Volume: 1 Issue: 1, 15 - 22, 30.12.2019

Abstract

When aspects such as the change in the capacity of information over the course of history and the development of the systems created to manage and direct that information are taken into account, the importance of measuring information, of its quality, and of attributing meaning to it is better understood. At this point, Information Theory, put forward by Claude E. Shannon, opened a new era for the controlled management of information. The concept of entropy, used for the uncertainty of information by many methods developed subsequently, was in turn built on the properties introduced by Shannon. In this context, entropy, the measure of disorder in the measurement of information, is important in many areas, and the rapidly growing field of Artificial Intelligence of recent years is one of them. Especially with the development of the Big Data and Deep Learning areas, Artificial Intelligence has become a field able to operate on ever larger data. In this study, a review is carried out on the concept of entropy as used in some of the methods of the Artificial Intelligence field. As a result, although the methods are the same in terms of basic logic and concept, differences are observed in their methodological application. One of the main objectives of the study is also to provide a perspective that paves the way for the development of novel methods in the context of the entropy and artificial intelligence methods examined.


Details

Primary Language: Turkish
Subjects: Computer Software
Journal Section: Vol 1 - Issue 1 - 30 December 2019
Authors

Muhammed Ali Koşan (ORCID: 0000-0002-1422-6006)

Aysun Coşkun (ORCID: 0000-0002-8541-9497)

Hacer Karacan (ORCID: 0000-0001-6788-008X)

Publication Date: December 30, 2019
Published in Issue: Year 2019, Volume: 1, Issue: 1

Cite

APA Koşan, M. A., Coşkun, A., & Karacan, H. (2019). Yapay Zekâ Yöntemlerinde Entropi. Journal of Information Systems and Management Research, 1(1), 15-22.