The Weibull and Log-Normal distributions are frequently used in reliability analysis to model lifetime (or failure-time) data. The ratio of maximized likelihoods (RML) has been used extensively to choose between the two distributions. The Kullback-Leibler divergence (KLD) measures the discrepancy between two densities. We examine the use of the KLD in discriminating between the Weibull and Log-Normal distributions. An advantage of the KLD is that it incorporates the entropy of each model. We illustrate the applicability of the KLD on a real data set and establish its consistency with the RML.
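To make the two criteria concrete, the following is a minimal sketch (not the authors' code) of how the RML and the KLD can be computed for a lifetime sample using SciPy. The sample `data` is hypothetical; with a real data set the decision rule is the same. The KLD is estimated here by numerical integration, using the identity KL(f||g) = -H(f) - E_f[log g], which is how the entropy of each fitted model enters the criterion.

```python
# Hedged sketch: compare Weibull vs. Log-Normal fits via RML and KLD.
# The data below are simulated for illustration only.
import numpy as np
from scipy import stats
from scipy.integrate import quad

rng = np.random.default_rng(0)
data = stats.weibull_min.rvs(c=1.5, scale=2.0, size=100, random_state=rng)

# Maximum-likelihood fits of the two candidate lifetime models
# (origin fixed at 0, as is usual for lifetime data).
c, _, scale_w = stats.weibull_min.fit(data, floc=0)
s, _, scale_l = stats.lognorm.fit(data, floc=0)
weib = stats.weibull_min(c, loc=0, scale=scale_w)
logn = stats.lognorm(s, loc=0, scale=scale_l)

# Ratio of maximized likelihoods, on the log scale:
# a positive value favours the Weibull model.
log_rml = np.sum(weib.logpdf(data)) - np.sum(logn.logpdf(data))

def kld(f, g):
    """KL(f || g) = integral of f(x) * log(f(x)/g(x)) over x > 0,
    i.e. -H(f) - E_f[log g], estimated by numerical quadrature."""
    integrand = lambda x: f.pdf(x) * (f.logpdf(x) - g.logpdf(x))
    val, _ = quad(integrand, 1e-9, np.inf)
    return val

kl_w_to_l = kld(weib, logn)  # divergence from fitted Weibull to fitted Log-Normal
kl_l_to_w = kld(logn, weib)  # and the reverse direction

print(f"log RML = {log_rml:.3f}  (positive -> Weibull)")
print(f"KL(Weibull || Log-Normal) = {kl_w_to_l:.4f}")
print(f"KL(Log-Normal || Weibull) = {kl_l_to_w:.4f}")
```

Under this sketch, the model whose fitted density is "closer" to the alternative in the KLD sense can be compared against the sign of the log-RML, which is how consistency between the two criteria can be checked empirically.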
| Primary Language | English |
| --- | --- |
| Journal Section | Makaleler (Articles) |
| Publication Date | December 2, 2014 |
| Published in Issue | Year 2012, Issue 16 |