Abstract
In this paper, we complete our understanding of the role played by the limiting (or residue) function in the context of mod-Gaussian convergence. The question of the probabilistic interpretation of such functions was initially raised by Marc Yor. After recalling our recent result, which interprets the limiting function as a measure of “breaking of symmetry” in the Gaussian approximation in the framework of general central limit theorem-type results, we introduce the framework of $L^{1}$-mod-Gaussian convergence, in which the residue function is obtained (up to a normalizing factor) as the limiting probability density of a sequence of random variables converging in law after a change of probability measure. In particular, we recover some celebrated results due to Ellis and Newman on the convergence in law of dependent random variables arising in statistical mechanics. We complete our results by providing an alternative to Stein's method for obtaining the rate of convergence in the Ellis-Newman convergence theorem, and by proving a new local limit theorem. More generally, we illustrate our results with simple models from statistical mechanics.
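For orientation, we recall what mod-Gaussian convergence means in its standard formulation with characteristic functions (the precise setting used in the body of the paper may instead involve Laplace transforms on a domain of the complex plane): a sequence of real random variables $(X_n)_{n \geq 1}$ converges in the mod-Gaussian sense with parameters $t_n \to +\infty$ and limiting (or residue) function $\Phi$ if
\[
\lim_{n \to \infty} \mathbb{E}\big[e^{\mathrm{i} u X_n}\big]\, e^{t_n u^2/2} = \Phi(u)
\]
locally uniformly in $u \in \mathbb{R}$, where $\Phi$ is a continuous function with $\Phi(0) = 1$.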