Statistical foundations of deep generative models 

Abstract:

Deep generative models are probabilistic generative models in which the generator is parameterized by a deep neural network. They are popular for modeling high-dimensional data such as text, images, and speech, and have achieved impressive empirical success. Despite this empirical success, theoretical understanding of such models is largely lacking. We investigate the statistical properties of deep generative models from a nonparametric distribution estimation viewpoint. In the model considered, data are observed in a high-dimensional ambient space but concentrate around a low-dimensional structure such as a manifold. This talk presents a theoretical underpinning of deep generative models through the lens of statistical theory. In particular, it offers insights into i) how deep generative models can avoid the curse of dimensionality and outperform classical nonparametric estimators, and ii) how likelihood-based approaches work for high-dimensional distribution estimation, especially in adapting to the intrinsic geometry of the data.
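
To fix ideas, the following is a minimal sketch of the setting described above. It is illustrative only and not material from the talk: the toy map G, the dimensions, and the linear diagnostic at the end are all hypothetical choices, showing how a generator pushes a low-dimensional latent variable into a high-dimensional ambient space so that samples concentrate on a low-dimensional manifold.

    # Illustrative sketch (not the speaker's code): a generator
    # G: R^d -> R^D with d << D, so generated samples lie on the
    # image of G, a manifold of dimension at most d inside R^D.
    import numpy as np

    rng = np.random.default_rng(0)

    d, D = 2, 100                    # intrinsic (latent) vs. ambient dimension
    W1 = rng.normal(size=(d, 32))    # hypothetical weights of a toy
    W2 = rng.normal(size=(32, D))    # two-layer generator

    def generator(z):
        """A smooth toy map G: R^d -> R^D."""
        return np.tanh(z @ W1) @ W2

    z = rng.normal(size=(1000, d))   # low-dimensional latent samples
    x = generator(z)                 # ambient samples, shape (1000, D)

    # x lives in R^D but sits on an (at most) d-dimensional manifold.
    # A rough linear diagnostic: singular values of the centered data
    # decay quickly after the first few directions.
    s = np.linalg.svd(x - x.mean(axis=0), compute_uv=False)
    print(s[:5] / s[0])

A nonparametric estimator built directly in R^D would suffer the curse of the ambient dimension D, whereas an estimator that exploits the generator structure can, as the abstract indicates, adapt to the intrinsic dimension d.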


Presented By: Lizhen Lin (University of Maryland)