GAUSSIAN CONCENTRATION BOUNDS FOR STOCHASTIC CHAINS OF UNBOUNDED MEMORY
Publication type:
Article
Authors:
Chazottes, Jean-Rene; Gallo, Sandro; Takahashi, Daniel Y.
Affiliations:
Institut Polytechnique de Paris; Ecole Polytechnique; Centre National de la Recherche Scientifique (CNRS); Universidade Federal de Sao Carlos; Universidade Federal do Rio Grande do Norte
Journal:
ANNALS OF APPLIED PROBABILITY
ISSN:
1050-5164
DOI:
10.1214/22-AAP1893
Publication date:
2023
Pages:
3321-3350
Keywords:
complete connections
concentration inequalities
perfect simulation
Markov approximations
random variables
uniqueness
nonuniqueness
divergence
fields
Abstract:
Stochastic chains of unbounded memory (SCUMs) are a generalization of Markov chains, also known in the literature as chains with complete connections or g-measures. We obtain Gaussian concentration bounds (GCB) in this large class of models, for general alphabets, under two different conditions on the kernel: (1) when the sum of its oscillations is less than one, or (2) when the sum of its variations is finite, that is, when the sequence of variations belongs to $\ell^1(\mathbb{N})$. We also obtain explicit constants as functions of the parameters of the model. (A schematic form of such a bound is sketched in the note at the end of this record.) Our conditions are sharp in the sense that we exhibit examples of SCUMs that do not have GCB and for which the sum of oscillations is $1+\epsilon$, or the sequence of variations belongs to $\ell^{1+\epsilon}(\mathbb{N})$, for any $\epsilon > 0$. These examples are based on the existence of phase transitions.

We illustrate our results with four applications. First, we derive a Dvoretzky-Kiefer-Wolfowitz-type inequality which gives uniform control on the fluctuations of the empirical measure. Second, in the finite-alphabet case, we obtain an upper bound on the $\bar{d}$-distance between two stationary SCUMs and, as a by-product, new explicit bounds on the speed of Markovian approximation in $\bar{d}$. Third, we derive new bounds on the fluctuations of the plug-in estimator for entropy. Fourth, we obtain a new rate of convergence for the maximum likelihood estimator of conditional probabilities.
Source URL:
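Note (editorial sketch): for orientation, a Gaussian concentration bound of the type obtained in the paper is commonly stated in the following schematic form; the symbols $\delta_i(f)$ and $D$ and the exact constants are assumptions modeled on the standard formulation, not quoted from the paper:

\[
\mathbb{P}\big( |f(X_1,\dots,X_n) - \mathbb{E} f(X_1,\dots,X_n)| \ge t \big)
\;\le\; 2\exp\!\left( -\frac{t^2}{4D \sum_{i=1}^{n} \delta_i(f)^2} \right), \qquad t > 0,
\]

where $\delta_i(f) = \sup\{ |f(x)-f(y)| : x_j = y_j \text{ for all } j \ne i \}$ is the oscillation of $f$ in its $i$-th coordinate and $D > 0$ depends only on the kernel of the chain. Via Chernoff's bound, such a tail estimate follows from an exponential-moment inequality of the form $\mathbb{E}\, e^{\lambda (f - \mathbb{E} f)} \le e^{\lambda^2 D \sum_i \delta_i(f)^2}$ for all $\lambda \in \mathbb{R}$.

As a concrete illustration of the third application, here is a minimal sketch of a plug-in entropy estimator; the function name plugin_entropy and the block length k are illustrative choices, and the paper bounds the fluctuations of this kind of estimator rather than prescribing this code:

    import math
    import random
    from collections import Counter

    def plugin_entropy(sample, k):
        """Plug-in estimate of the entropy rate: empirical frequencies of
        length-k blocks, then -(1/k) * sum of p_hat * log(p_hat)."""
        blocks = [tuple(sample[i:i + k]) for i in range(len(sample) - k + 1)]
        n = len(blocks)
        counts = Counter(blocks)
        return -sum((c / n) * math.log(c / n) for c in counts.values()) / k

    # Example: i.i.d. fair coin flips; the estimate approaches log(2) = 0.693...
    sample = [random.randint(0, 1) for _ in range(10_000)]
    print(plugin_entropy(sample, k=3))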