Lossless analog compression


Alberti, Giovanni; Bölcskei, Helmut; De Lellis, Camillo; Koliander, Günther; Riegler, Erwin (2019). Lossless analog compression. IEEE Transactions on Information Theory, 65(11):7480-7513.

Abstract

We establish the fundamental limits of lossless analog compression by considering the recovery of arbitrary random vectors x ∈ R^m from the noiseless linear measurements y = Ax with measurement matrix A ∈ R^(n×m). Our theory is inspired by the groundbreaking work of Wu and Verdú (2010) on almost lossless analog compression but applies to the nonasymptotic, i.e., fixed-m, case and considers zero error probability. Specifically, our achievability result states that, for Lebesgue-almost all A, the random vector x can be recovered with zero error probability provided that n > K(x), where K(x) is given by the infimum of the lower modified Minkowski dimension over all support sets U of x (i.e., sets U ⊆ R^m with P[x ∈ U] = 1). We then particularize this achievability result to the class of s-rectifiable random vectors as introduced in Koliander et al. (2016); these are random vectors of absolutely continuous distribution (with respect to the s-dimensional Hausdorff measure) supported on countable unions of s-dimensional C^1-submanifolds of R^m. Countable unions of C^1-submanifolds include essentially all signal models used in the compressed sensing literature, such as the standard union-of-subspaces model underlying much of compressed sensing theory and spectrum-blind sampling, smooth submanifolds, block-sparsity, and low-rank matrices as considered in the matrix completion problem. Specifically, we prove that, for Lebesgue-almost all A, s-rectifiable random vectors x can be recovered with zero error probability from n > s linear measurements. This threshold is, however, found not to be tight, as exemplified by the construction of an s-rectifiable random vector that can be recovered with zero error probability from n < s linear measurements. Motivated by this observation, we introduce the new class of s-analytic random vectors, which admit a strong converse in the sense of n ≥ s being necessary for recovery with probability of error smaller than one.
The central conceptual tools in the development of our theory are geometric measure theory and the theory of real analytic functions.
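The achievability regime can be illustrated with a minimal numerical sketch (not from the paper): a 1-sparse vector in R^3 lives on the union of the three coordinate axes, a 1-rectifiable set (s = 1), and n = 2 > s generic measurements identify it with zero error probability. The `decode` function below is a hypothetical brute-force decoder for this toy model, not the authors' construction.

```python
import numpy as np

rng = np.random.default_rng(0)

# Signal model: x is 1-sparse in R^3, i.e. supported on the union of the
# three coordinate axes (a 1-rectifiable set, s = 1); the nonzero entry
# is drawn from a continuous distribution.
m, n = 3, 2                      # ambient dimension, number of measurements (n > s = 1)
support = rng.integers(m)        # which axis carries the signal
x = np.zeros(m)
x[support] = rng.normal()

# Generic measurement matrix A in R^(n x m): for Lebesgue-almost all A,
# no two columns are collinear, so y = Ax pins down the support uniquely.
A = rng.normal(size=(n, m))
y = A @ x

def decode(A, y, tol=1e-9):
    """Test each candidate axis e_i, fit the best scalar coefficient,
    and accept the (almost surely unique) axis that explains y exactly."""
    for i in range(A.shape[1]):
        col = A[:, i]
        c = col @ y / (col @ col)            # least-squares coefficient
        if np.linalg.norm(y - c * col) < tol:
            x_hat = np.zeros(A.shape[1])
            x_hat[i] = c
            return x_hat
    raise ValueError("no consistent axis found")

x_hat = decode(A, y)
assert np.allclose(x_hat, x)   # exact recovery from n = 2 < m = 3 measurements
```

With a continuously distributed A, the event that y is collinear with a wrong column has Lebesgue measure zero, which is the toy analogue of the "for Lebesgue-almost all A" qualifier in the achievability result.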



Additional indexing

Item Type: Journal Article, refereed, original work
Communities & Collections: 07 Faculty of Science > Institute of Mathematics
Dewey Decimal Classification: 510 Mathematics
Uncontrolled Keywords: Library and Information Sciences, Information Systems, Computer Science Applications
Language: English
Date: 1 November 2019
Deposited On: 16 Dec 2019 08:42
Last Modified: 16 Dec 2019 08:42
Publisher: Institute of Electrical and Electronics Engineers
ISSN: 0018-9448
OA Status: Green
Publisher DOI: https://doi.org/10.1109/tit.2019.2923091

Download

Green Open Access

Published Version: PDF, 791 kB, English (registered users only)
Submitted Version: PDF, 767 kB, English