INFORMATION DISTINGUISHABILITY WITH APPLICATION TO ANALYSIS OF FAILURE DATA
Publication Type:
Article
Authors:
SOOFI, ES; EBRAHIMI, N; HABIBULLAH, M
Affiliations:
Northern Illinois University; University of Wisconsin System
Journal:
JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION
ISSN:
0162-1459
DOI:
10.2307/2291079
Publication Date:
1995
Pages:
657-668
Keywords:
maximum-entropy
distributions
tests
MODEL
Abstract:
In maximum entropy (ME) modeling, the information discrepancy between two distributions is measured in terms of their entropy difference. In discrimination information statistics, the information discrepancy between two distributions is measured in terms of the Kullback-Leibler function (i.e., relative entropy or cross-entropy). This article presents an equivalence between Kullback-Leibler functions and entropy differences involving an ME distribution. Based on this equivalence, the concept of information discrimination (ID) distinguishability is introduced as a unifying framework for the two methods of measuring information discrepancy between distributions. Applications of ID distinguishability as diagnostics for examining robustness of parametric procedures and sensitivity of nonparametric statistics across parametric families of distributions are proposed. The equivalence result facilitates estimation of Kullback-Leibler functions in terms of entropy estimates. Application of ID distinguishability to modeling failure data brings a new dimension into entropy estimation: entropy estimation based on the hazard function. ID statistics for modeling lifetime distributions with increasing failure rates are studied. Two illustrative examples are analyzed.
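A minimal numerical sketch of the equivalence the abstract describes: when g is the ME density under a set of moment constraints (here, the exponential, which maximizes entropy given a fixed mean) and f is any density satisfying the same constraints, the Kullback-Leibler function KL(f || g) reduces to the entropy difference H(g) - H(f). The specific choice of f as a gamma(2, 0.5) density (mean 1) is an illustrative assumption, not from the article.

```python
import numpy as np
from scipy import stats
from scipy.integrate import quad

# g: the ME density given mean = 1, i.e. the unit exponential.
# f: an arbitrary density with the same mean (gamma with shape 2,
#    scale 0.5, so mean = 2 * 0.5 = 1). Both choices are illustrative.
f = stats.gamma(a=2, scale=0.5)
g = stats.expon()

# Kullback-Leibler function KL(f || g) by direct numerical integration.
kl, _ = quad(lambda x: f.pdf(x) * np.log(f.pdf(x) / g.pdf(x)), 1e-9, 50)

# Entropy difference H(g) - H(f) from the closed-form entropies.
entropy_diff = g.entropy() - f.entropy()

# The two quantities agree, which is the equivalence underlying
# ID distinguishability: KL against an ME density is an entropy gap.
print(kl, entropy_diff)
```

This is also why entropy estimates suffice for estimating Kullback-Leibler functions in this setting: estimating H(f) from data immediately yields an estimate of KL(f || g) once the ME reference g is fixed.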