WASSERSTEIN GENERATIVE ADVERSARIAL NETWORKS ARE MINIMAX OPTIMAL DISTRIBUTION ESTIMATORS

Publication Type:
Article
Authors:
Stephanovitch, Arthur; Aamari, Eddie; Levrard, Clement
Affiliations:
Centre National de la Recherche Scientifique (CNRS); Universite PSL; Ecole Normale Superieure (ENS); Centre National de la Recherche Scientifique (CNRS); Universite de Rennes
Journal:
ANNALS OF STATISTICS
ISSN/ISBN:
0090-5364
DOI:
10.1214/24-AOS2430
Publication Date:
2024
Pages:
2167-2193
Keywords:
manifold
Abstract:
We provide nonasymptotic rates of convergence of the Wasserstein Generative Adversarial Networks (WGAN) estimator. We build neural network classes representing the generators and discriminators which yield a GAN that achieves the minimax optimal rate for estimating a certain probability measure mu with support in R^p. The probability measure mu is taken to be the push forward of the Lebesgue measure on the d-dimensional torus T^d by a map g* : T^d -> R^p of smoothness beta + 1. Measuring the error with the gamma-Holder Integral Probability Metric (IPM), we obtain, up to logarithmic factors, the minimax optimal rate O(n^{-(beta+gamma)/(2beta+d)}), where n is the sample size, beta determines the smoothness of the target measure mu, gamma is the smoothness of the IPM (gamma = 1 is the Wasserstein case), and d <= p is the intrinsic dimension of mu. In the process, we derive a sharp interpolation inequality between Holder IPMs. This novel result in the theory of function spaces generalizes classical interpolation inequalities to the case where the measures involved have densities on different manifolds.
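For orientation, a minimal sketch of the gamma-Holder IPM underlying the stated rate, written in standard notation; the precise function class used in the paper is assumed here to be the unit ball of gamma-Holder functions on R^p, and \hat{\mu}_n denotes a generic estimator built from n i.i.d. samples of mu:

\[
d_{\mathcal{H}^{\gamma}}(\mu,\nu) \;=\; \sup_{\|f\|_{\mathcal{H}^{\gamma}} \le 1} \left( \int f \, d\mu - \int f \, d\nu \right),
\qquad
\mathbb{E}\, d_{\mathcal{H}^{\gamma}}\big(\hat{\mu}_n, \mu\big) \;\lesssim\; n^{-\frac{\beta+\gamma}{2\beta+d}} \quad \text{(up to logarithmic factors).}
\]

For gamma = 1 this IPM reduces, by Kantorovich-Rubinstein duality, to the Wasserstein-1 distance, which is the setting that gives the WGAN its name.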