2018 IEEE Winter Conference on Applications of Computer Vision (WACV)
DOI: 10.1109/wacv.2018.00112
DGSAC: Density Guided Sampling and Consensus

Cited by 13 publications (10 citation statements)
References 23 publications
“…An ensemble of these layer-wise generative classifiers is used to make the final prediction by performing a Borda count-based rank aggregation. Ranking preferences have been used extensively in robust fitting problems in computer vision (Chin et al., 2011; Chin et al., 2009; Tiwari & Anand, 2018), and we show their effectiveness in introducing robustness in DNNs against adversarial attacks.…”
Section: Regroup Methodology (supporting)
Confidence: 53%
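The Borda count aggregation mentioned above can be sketched briefly. This is a minimal, generic illustration of Borda count rank aggregation, not the cited paper's implementation; the classifier rankings below are made-up examples.

```python
# Hedged sketch: Borda count rank aggregation over per-classifier rankings.
# A candidate at position i in a best-first ranking of length n earns n-1-i points.

def borda_count(rankings):
    """Return the candidate with the highest aggregate Borda score."""
    scores = {}
    for ranking in rankings:
        n = len(ranking)
        for pos, candidate in enumerate(ranking):
            scores[candidate] = scores.get(candidate, 0) + (n - 1 - pos)
    return max(scores, key=scores.get)

# Example: three (hypothetical) layer-wise classifiers each rank three classes.
rankings = [
    ["cat", "dog", "bird"],
    ["cat", "bird", "dog"],
    ["dog", "cat", "bird"],
]
print(borda_count(rankings))  # "cat": Borda scores are cat=5, dog=3, bird=1
```

An ensemble prediction obtained this way depends only on rank positions, which is what makes rank aggregation robust to a few badly calibrated members.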
“…In this section, we first evaluate the influence of the different components of HRMP on the performance. Then, we compare the proposed HRMP with eleven state-of-the-art model fitting methods, including J-Linkage [15], KF [8], AKSWH [9], T-Linkage [16], RPA [17], DPA [20], RansaCov [18], TSMP [29], HVF [30], DGSAC [19] and MSHF [10], on both challenging synthetic data and real images. For fair comparison, the model hypotheses are generated by employing the proximity sampling [15] for all the fitting methods.…”
Section: Methods (mentioning)
Confidence: 99%
“…Generally speaking, robust geometric model fitting methods [8]-[10], [15]-[18] can be coarsely classified into consensus analysis based methods and preference analysis based methods. The consensus analysis based fitting methods (e.g., adaptive kernel-scale weighted hypotheses (AKSWH) [9], density guided sampling and consensus (DGSAC) [19], random sample coverage (RansaCov) [18] and mode-seeking on hypergraphs fitting (MSHF) [10]) exploit the consensus set corresponding to each model hypothesis to distinguish inliers from outliers. The preference analysis based fitting methods (e.g., kernel fitting (KF) [8], Jaccard distance based clustering (J-Linkage) [15], Tanimoto distance based clustering (T-Linkage) [16], robust preference analysis (RPA) [17] and density preference analysis (DPA) [20]) focus on the preference of each data point for the significant model hypotheses in order to cluster data points.…”
Section: Introduction (mentioning)
Confidence: 99%
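The consensus analysis idea described above (score each hypothesis by the size of its consensus set) can be sketched with a minimal RANSAC-style line-fitting loop. This is a generic illustration under simplifying assumptions, not the DGSAC algorithm or any of the cited methods; the data, threshold, and hypothesis count are arbitrary.

```python
# Hedged sketch of consensus analysis for robust line fitting:
# sample minimal subsets, hypothesize a line, and keep the hypothesis
# whose consensus set (points within an inlier threshold) is largest.
import random

def fit_line(p, q):
    # Line through two points as (slope, intercept); assumes p[0] != q[0].
    m = (q[1] - p[1]) / (q[0] - p[0])
    return m, p[1] - m * p[0]

def consensus_line_fit(points, n_hypotheses=100, threshold=0.1, seed=0):
    rng = random.Random(seed)
    best_model, best_inliers = None, []
    for _ in range(n_hypotheses):
        p, q = rng.sample(points, 2)          # minimal sample for a line
        if p[0] == q[0]:
            continue                          # skip vertical/degenerate pairs
        m, b = fit_line(p, q)
        # Consensus set: points whose residual falls within the threshold.
        inliers = [pt for pt in points
                   if abs(pt[1] - (m * pt[0] + b)) < threshold]
        if len(inliers) > len(best_inliers):
            best_model, best_inliers = (m, b), inliers
    return best_model, best_inliers

# Eight points on y = 2x plus two gross outliers.
pts = [(x, 2 * x) for x in range(8)] + [(1.0, 9.0), (5.0, -4.0)]
model, inliers = consensus_line_fit(pts)
print(model, len(inliers))
```

Preference analysis methods invert this view: instead of asking which points support a hypothesis, they record, per point, which hypotheses fit it well, and cluster points with similar preference vectors.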
“…For example, Barath et al. [4] presented the MAGSAC algorithm, which does not require a single inlier-outlier threshold as RANSAC does. By exploiting the residual density, Tiwari and Anand [56] introduced the DGSAC algorithm. In the work of Ranftl and Koltun [44], outliers are removed via geometric model estimation and the underlying fundamental matrix is computed using deep neural networks.…”
Section: Related Work (mentioning)
Confidence: 99%