Bounding d̄-distance by informational divergence: A method to prove measure concentration
Publication type:
Article
Author(s):
Marton, K
Journal:
ANNALS OF PROBABILITY
ISSN/ISBN:
0091-1798
Publication date:
1996
Pages:
857-866
Keywords:
Abstract:
There is a simple inequality due to Pinsker between the variational distance and the informational divergence of probability measures defined on arbitrary probability spaces. We shall consider probability measures on sequences taken from countable alphabets and derive, from Pinsker's inequality, bounds on the d̄-distance in terms of informational divergence. Such bounds can be used to prove the "concentration of measure" phenomenon for some nonproduct distributions.
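For context, the inequalities the abstract refers to can be sketched as follows. The display uses the natural-logarithm convention for the divergence D, and the transportation bound is written only for the illustrative case of a product measure Q on X^n, so the constants and the level of generality treated in the paper itself may differ.

% Pinsker's inequality: variational distance is controlled by
% informational (Kullback-Leibler) divergence D, taken with natural logs.
\[
  \sup_{A}\,\bigl|P(A) - Q(A)\bigr| \;\le\; \sqrt{\tfrac{1}{2}\,D(P\,\|\,Q)}.
\]
% d-bar distance between measures P, Q on X^n: the minimal expected
% normalized Hamming distance over all couplings \pi of P and Q.
\[
  \bar{d}(P,Q) \;=\; \min_{\pi}\;\frac{1}{n}\,
    \mathrm{E}_{\pi}\Bigl[\,\sum_{i=1}^{n}\mathbf{1}\{X_i \neq Y_i\}\Bigr].
\]
% A transportation bound of the type described in the abstract, stated here
% for the illustrative case of a product measure Q = Q_1 \times \cdots \times Q_n.
\[
  \bar{d}(P,Q) \;\le\; \sqrt{\tfrac{1}{2n}\,D(P\,\|\,Q)}.
\]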