2015
DOI: 10.1111/cgf.12725
Projective Blue‐Noise Sampling

Abstract: We propose projective blue-noise patterns that retain their blue-noise characteristics when undergoing one or multiple projections onto lower dimensional subspaces. These patterns are produced by extending existing methods, such as dart throwing and Lloyd relaxation, and have a range of applications. For numerical integration, our patterns often outperform state-of-the-art stochastic and low-discrepancy patterns, which have been specifically designed only for this purpose. For image reconstruction, our method …
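The dart-throwing extension mentioned in the abstract can be illustrated with a minimal sketch: candidates are rejected not only when they violate a minimum distance in the full domain, but also when any 1-D axis projection becomes too crowded. The radii and the per-axis heuristic below are illustrative assumptions, not the paper's exact parameters.

```python
import math
import random

def projective_dart_throwing(n, d=2, r_full=0.05, max_tries=100000, seed=0):
    """Dart throwing with an extra per-axis rejection test, sketching the
    projective blue-noise idea: a candidate must keep a minimum distance
    both in the full d-D domain and in every 1-D axis projection."""
    rng = random.Random(seed)
    # Assumed heuristic: n points projected to 1-D have average spacing
    # 1/n, so require a quarter of that as the per-axis minimum distance.
    r_axis = 0.25 / n
    points = []
    tries = 0
    while len(points) < n and tries < max_tries:
        tries += 1
        cand = [rng.random() for _ in range(d)]
        ok = all(
            math.dist(cand, p) >= r_full
            and all(abs(cand[k] - p[k]) >= r_axis for k in range(d))
            for p in points
        )
        if ok:
            points.append(cand)
    return points
```

The same acceptance test generalizes to higher-dimensional subspace projections; only the 1-D case is shown here for brevity.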

Cited by 23 publications (29 citation statements)
References 34 publications
“…In higher dimensions (dimension 4 for numerical comparisons) the Sobol sequence and its scrambling are also considered. For anti-aliasing evaluation, we have compared to Projective Blue Noise (PBN) [RRSG16] that also aims to control projections. Reinert et al. have defined two variants of PBN; the first one relies on a Dart Throwing (DT) strategy and the second one on a Lloyd's relaxation approach.…”
Section: Results
mentioning, confidence: 99%
“…Projective Blue‐Noise Sampling The idea of having s‐D sampling with good blue‐noise projections has been presented in [RRSG16], where the metric of distance between points used in any optimization process has been modified in order to keep track of the projective distance. Although the goal of getting projective BN properties has been achieved, that paper does not guarantee uniformity in higher‐dimensional space, which could be potentially harmful for integration tasks, especially when the number of samples grows.…”
Section: Related Work
mentioning, confidence: 99%
“…The idea is that the projections of the samples on each axis are uniformly distributed. Saka et al. [2007] studied Latinization as a post-processing of an optimized set, whereas Reinert et al. [2016] integrated the idea within the optimization process. Our technique also produces uniform distributions of projections, but with provable LD.…”
Section: Latin Hypercube Sampling
mentioning, confidence: 99%
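Latinization as a post-process, as described in this citation, can be sketched in a few lines: per dimension, each coordinate is replaced by the centre of its rank's stratum, so every 1-D projection has exactly one point per stratum while the relative ordering along each axis is preserved.

```python
def latinize(points):
    """Sketch of Latinization as a post-process: per dimension, replace
    each coordinate by the centre of its rank's stratum, so each of the
    n strata [i/n, (i+1)/n) holds exactly one projected point."""
    n = len(points)
    d = len(points[0])
    out = [list(p) for p in points]
    for k in range(d):
        # Rank the points along axis k, then snap to stratified positions.
        order = sorted(range(n), key=lambda i: points[i][k])
        for rank, i in enumerate(order):
            out[i][k] = (rank + 0.5) / n
    return [tuple(p) for p in out]
```

Variants keep each point's fractional offset within its stratum instead of snapping to the centre; the centre version is shown for simplicity.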
“…We prefer the latter option, since it imposes less bias (over- or under-crowding) near the edges of the strata, and is also easier to implement. We also occasionally Latinized the point-set during the optimization process in the spirit of Reinert et al. [2016]. This process almost circumvents the bias to the edges.…”
Section: Producing the Input Reference Sets
mentioning, confidence: 99%