Compressing Bidirectional Texture Functions via Tensor Train Decomposition


Ballester-Ripoll, Rafael; Pajarola, R. (2016). Compressing Bidirectional Texture Functions via Tensor Train Decomposition. In: Proceedings Pacific Graphics Short Papers, Okinawa, 11 October 2016 - 14 October 2016, 1-14.

Abstract

Material reflectance properties play a central role in photorealistic rendering. Bidirectional texture functions (BTFs) can faithfully represent these complex properties, but their inherent high dimensionality (texture coordinates, color channels, view and illumination directions) requires many coefficients to encode. Numerous algorithms based on tensor decomposition have been proposed for efficient compression of multidimensional BTF arrays; however, these prior methods still grow exponentially in size with the number of dimensions. We tackle the BTF compression problem with a different model, the tensor train (TT) decomposition. The main difference is that TT compression scales linearly with the input dimensionality and is thus much better suited for high-dimensional data tensors. Furthermore, it allows faster random-access texel reconstruction than the previous Tucker-based approaches. We demonstrate the performance benefits of the TT decomposition in terms of accuracy and visual appearance, compression rate, and reconstruction speed.
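
As a hedged illustration of the two properties highlighted in the abstract (storage that grows linearly with the number of dimensions, and single-texel reconstruction via a short chain of small matrix products), the sketch below decomposes a toy 4-dimensional array with a basic TT-SVD and queries one entry directly from the cores. This is not the paper's implementation; the function names (tt_svd, tt_element), the toy tensor sizes, and the rank cap are illustrative assumptions, and the code uses plain NumPy.

import numpy as np

def tt_svd(tensor, max_rank):
    # Decompose a d-dimensional array into d TT cores via sequential truncated SVDs.
    dims = tensor.shape
    cores = []
    rank = 1
    mat = tensor.reshape(rank * dims[0], -1)
    for k in range(len(dims) - 1):
        u, s, vt = np.linalg.svd(mat, full_matrices=False)
        r_new = min(max_rank, len(s))
        cores.append(u[:, :r_new].reshape(rank, dims[k], r_new))
        mat = (np.diag(s[:r_new]) @ vt[:r_new]).reshape(r_new * dims[k + 1], -1)
        rank = r_new
    cores.append(mat.reshape(rank, dims[-1], 1))
    return cores

def tt_element(cores, index):
    # Reconstruct a single entry as a product of one (rank x rank) slice per core;
    # the cost is linear in the number of dimensions.
    result = np.ones((1, 1))
    for core, i in zip(cores, index):
        result = result @ core[:, i, :]
    return result[0, 0]

# Toy "BTF-like" 4D array (e.g. view x light x row x column), compressed and queried.
# With a small rank cap the lookup is only an approximation of the original entry.
data = np.random.rand(6, 6, 8, 8)
cores = tt_svd(data, max_rank=5)
print(data[1, 2, 3, 4], tt_element(cores, (1, 2, 3, 4)))

For a d-dimensional tensor with mode sizes around n and TT ranks around r, the cores occupy roughly d·n·r² values, whereas a Tucker core alone already needs r^d values, which is the linear-versus-exponential contrast the abstract refers to.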


Additional indexing

Item Type: Conference or Workshop Item (Paper), refereed, original work
Communities & Collections: 03 Faculty of Economics > Department of Informatics
Dewey Decimal Classification: 000 Computer science, knowledge & systems
Language: English
Event End Date: 14 October 2016
Deposited On: 27 Dec 2016 08:16
Last Modified: 26 Jul 2018 06:58
Publisher: The Eurographics Association
ISBN: 978-3-03868-024-6
OA Status: Green
Free access at: Publisher DOI. An embargo period may apply.
Publisher DOI: https://doi.org/10.2312/pg.20161329
Other Identification Number: merlin-id:14369

Download

Download PDF: 'Compressing Bidirectional Texture Functions via Tensor Train Decomposition'
Filetype: PDF
Size: 3MB