2021
DOI: 10.1002/andp.202000508

Toward Bayesian Data Compression

Abstract: In order to handle large datasets omnipresent in modern science, efficient compression algorithms are necessary. Here, a Bayesian data compression (BDC) algorithm that adapts to the specific measurement situation is derived in the context of signal reconstruction. BDC compresses a dataset under conservation of its posterior structure with minimal information loss given the prior knowledge on the signal, the quantity of interest. Its basic form is valid for Gaussian priors and likelihoods. For constant noise st…
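The Gaussian setting the abstract describes can be illustrated with a minimal sketch. The following is not the paper's exact algorithm but a hypothetical example under the standard linear Gaussian model d = R s + n: the data are projected onto the leading eigenvectors of the prior-predicted data covariance R S Rᵀ (an assumed compression choice for constant noise), and the information loss is measured as the relative entropy (KL divergence) between the full-data posterior and the compressed-data posterior. All variable names and the projection choice are assumptions of this sketch.

```python
# Sketch of Bayesian data compression for a linear Gaussian model
# d = R s + n, with prior s ~ N(0, S) and noise n ~ N(0, N).
# Hypothetical illustration, not the paper's exact algorithm.
import numpy as np

rng = np.random.default_rng(0)
n_s, n_d, k = 8, 32, 6          # signal dim, data dim, compressed dim

# Prior covariance S (smooth), response R, noise covariance N
idx = np.arange(n_s)
S = np.exp(-np.abs(idx[:, None] - idx[None, :]) / 2.0)
R = rng.standard_normal((n_d, n_s))
N = 0.1 * np.eye(n_d)

s_true = rng.multivariate_normal(np.zeros(n_s), S)
d = R @ s_true + rng.multivariate_normal(np.zeros(n_d), N)

def posterior(R_op, N_cov, data):
    """Gaussian posterior mean and covariance for data = R_op s + n."""
    D = np.linalg.inv(np.linalg.inv(S) + R_op.T @ np.linalg.inv(N_cov) @ R_op)
    m = D @ R_op.T @ np.linalg.inv(N_cov) @ data
    return m, D

# Full-data posterior
m_full, D_full = posterior(R, N, d)

# Compression: keep the k leading eigenvectors of R S R^T (the
# prior-predicted signal covariance in data space) -- an assumed
# choice for this sketch, reasonable for constant noise.
evals, evecs = np.linalg.eigh(R @ S @ R.T)
V = evecs[:, -k:]               # n_d x k projection matrix
m_comp, D_comp = posterior(V.T @ R, V.T @ N @ V, V.T @ d)

def kl_gauss(m0, C0, m1, C1):
    """KL( N(m0,C0) || N(m1,C1) ) for multivariate Gaussians."""
    C1inv = np.linalg.inv(C1)
    dm = m1 - m0
    return 0.5 * (np.trace(C1inv @ C0) + dm @ C1inv @ dm
                  - len(m0) + np.log(np.linalg.det(C1) / np.linalg.det(C0)))

loss = kl_gauss(m_full, D_full, m_comp, D_comp)
print(f"compressed {n_d} -> {k} numbers, KL information loss = {loss:.4f}")
```

Because KL divergence is non-negative and vanishes only when the two posteriors coincide, the printed loss directly quantifies how much posterior structure the compression discards.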

Cited by 2 publications (2 citation statements)
References 27 publications
“…As the data volume of online education resources is increasing, there is a need to improve the efficiency of data utilization while protecting the data operation and maintenance costs of online education resources. In a database system, the architecture is divided into application layer, logical layer, and physical layer [13]. For the online education management platform, the database infrastructure is widely used, as shown in Figure 1.…”
Section: Design of Data Compression Strategy
confidence: 99%
“…More specifically, in the problem of Bayesian data compression one tries to find compressed data that imply an approximate posterior that is as similar as possible to the original one, which is measured by their relative entropy. [62] However, there can be cases in which the relative attention entropy is a better choice as it permits for importance weighting of the potential situations.…”
Section: Discussion
confidence: 99%
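The relative entropy invoked in the statement above is the standard Kullback–Leibler divergence between the original posterior and the one implied by the compressed data; the symbols below (s for the signal, d for the data, d′ for the compressed data) are illustrative notation, not taken from the cited works:

```latex
% Relative entropy between the original posterior P(s|d) and the
% posterior P(s|d') implied by the compressed data d'.
\mathrm{D}_{\mathrm{KL}}\!\left( P(s\mid d) \,\middle\|\, P(s\mid d') \right)
  = \int \mathcal{D}s \; P(s\mid d)\,
    \ln \frac{P(s\mid d)}{P(s\mid d')} \;\geq\; 0
```

Minimizing this quantity over admissible compressions is what "as similar as possible to the original" means in the statement above; the relative attention entropy mentioned there modifies this objective by importance-weighting the situations being compared.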