

Barbour, A D; Johnson, O; Kontoyiannis, I; Madiman, M (2010). *Compound Poisson approximation via information functionals.* Electronic Journal of Probability, 15(42):1344-1369.

## Abstract

An information-theoretic development is given for the problem of compound Poisson approximation, which parallels earlier treatments for Gaussian and Poisson approximation. Nonasymptotic bounds are derived for the distance between the distribution of a sum of independent integer-valued random variables and an appropriately chosen compound Poisson law. In the case where all summands have the same conditional distribution given that they are non-zero, a bound on the relative entropy distance between their sum and the compound Poisson distribution is derived, based on the data-processing property of relative entropy and earlier Poisson approximation results. When the summands have arbitrary distributions, corresponding bounds are derived in terms of the total variation distance. The main technical ingredient is the introduction of two "information functionals" and the analysis of their properties. These information functionals play a role analogous to that of the classical Fisher information in normal approximation. Detailed comparisons are made between the resulting inequalities and related bounds.
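The setting of the abstract can be illustrated numerically. The sketch below (all parameters are illustrative choices, not taken from the paper) takes i.i.d. summands Y_i with P(Y_i = 0) = 1 - p and, given Y_i ≠ 0, Y_i ~ Q, and compares the exact distribution of their sum with the matching compound Poisson law CP(λ, Q), λ = np — the law of X_1 + … + X_N with N ~ Poisson(λ) and X_j ~ Q i.i.d. It then computes the total variation distance between the two; by the data-processing argument mentioned in the abstract, a classical Le Cam-type bound of np² applies.

```python
from math import exp, factorial

def convolve(a, b):
    """Convolution of two pmfs given as lists indexed from 0."""
    out = [0.0] * (len(a) + len(b) - 1)
    for i, x in enumerate(a):
        for j, y in enumerate(b):
            out[i + j] += x * y
    return out

def pmf_of_sum(pmf, n):
    """Exact pmf of the sum of n i.i.d. copies of `pmf`."""
    out = [1.0]                       # point mass at 0
    for _ in range(n):
        out = convolve(out, pmf)
    return out

def compound_poisson_pmf(lam, q_pmf, top):
    """CP(lam, Q) pmf on {0, ..., top} via the Poisson mixture of
    convolutions.  Exact on this range when Q puts no mass at 0,
    since the k-fold convolution of Q then lives on {k, k+1, ...}."""
    out = [0.0] * (top + 1)
    conv = [1.0]                      # 0-fold convolution of Q
    for k in range(top + 1):
        w = exp(-lam) * lam ** k / factorial(k)
        for m, pm in enumerate(conv[: top + 1]):
            out[m] += w * pm
        conv = convolve(conv, q_pmf)
    return out

def total_variation(p, q):
    n = max(len(p), len(q))
    p = p + [0.0] * (n - len(p))
    q = q + [0.0] * (n - len(q))
    return 0.5 * sum(abs(a - b) for a, b in zip(p, q))

n, p = 20, 0.05
q_pmf = [0.0, 0.7, 0.3]               # Q on {1, 2} (illustrative)
y_pmf = [1 - p, p * 0.7, p * 0.3]     # one summand Y_i
s_pmf = pmf_of_sum(y_pmf, n)
cp_pmf = compound_poisson_pmf(n * p, q_pmf, len(s_pmf) - 1)
dtv = total_variation(s_pmf, cp_pmf)
print(dtv, "<=", n * p * p)           # Le Cam-type bound n*p^2
```

The (tiny) compound Poisson mass above 2n is neglected here; with λ = 1 it is numerically irrelevant. The paper's contribution is sharper, information-theoretic bounds on exactly this kind of distance.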

## Additional indexing

| Field | Value |
|---|---|
| Item Type | Journal Article, refereed, original work |
| Communities & Collections | 07 Faculty of Science > Institute of Mathematics |
| Dewey Decimal Classification | 510 Mathematics |
| Language | English |
| Date | 31 August 2010 |
| Deposited On | 23 Dec 2010 13:53 |
| Last Modified | 05 Apr 2016 14:24 |
| Publisher | Institute of Mathematical Statistics |
| ISSN | 1083-6489 |
| Official URL | http://www.math.washington.edu/~ejpecp/EjpVol15/paper42.abs.html |
| Related URLs | http://arxiv.org/abs/1004.3692 |

## Download

- Published version (Verlags-PDF), PDF, 1MB
- Accepted Version, PDF, 292kB
