2019 53rd Annual Conference on Information Sciences and Systems (CISS)
DOI: 10.1109/ciss.2019.8693043

Efficient learning of neighbor representations for boundary trees and forests

Abstract: We introduce a semiparametric approach to neighbor-based classification. We build on the recently proposed Boundary Trees algorithm by Mathy et al. (2015), which enables fast neighbor-based classification, regression and retrieval in large datasets. While boundary trees use a Euclidean measure of similarity, the Differentiable Boundary Tree algorithm by Zoran et al. (2017) was introduced to learn low-dimensional representations of complex input data, on which semantic similarity can be calculated to train bou…
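The abstract builds on the Boundary Trees algorithm of Mathy et al. (2015). To make that building block concrete, below is a minimal sketch of its greedy query/insert loop, assuming Euclidean distance and a classification setting; the class name BoundaryTree, the max_children cap, and the toy usage at the end are illustrative assumptions, not the authors' reference implementation.

```python
import numpy as np


class BoundaryTree:
    """Toy boundary tree: nodes store training examples; queries greedily
    descend toward the locally closest stored example."""

    def __init__(self, x, y, max_children=50):
        self.x = np.asarray(x, dtype=float)  # stored example
        self.y = y                           # stored label
        self.children = []
        self.max_children = max_children

    def _closest_node(self, q):
        # Greedy traversal: at each step compare q to the children of the
        # current node (and to the node itself unless it is already full),
        # move to the closest candidate, and stop when the current node wins.
        node = self
        while True:
            candidates = list(node.children)
            if len(node.children) < node.max_children:
                candidates.append(node)
            best = min(candidates, key=lambda n: np.linalg.norm(n.x - q))
            if best is node:
                return node
            node = best

    def query(self, q):
        """Predict the label of q as the label of the locally closest node."""
        return self._closest_node(np.asarray(q, dtype=float)).y

    def train(self, x, y):
        """Online update: add (x, y) as a new node only when the tree
        currently gets its label wrong."""
        x = np.asarray(x, dtype=float)
        node = self._closest_node(x)
        if node.y != y:
            node.children.append(BoundaryTree(x, y, self.max_children))


# Hypothetical usage on a streaming 2-D toy problem.
rng = np.random.default_rng(0)
tree = BoundaryTree(rng.normal(size=2), 0)
for _ in range(1000):
    x = rng.normal(size=2)
    tree.train(x, int(x[0] + x[1] > 0))
print(tree.query([1.0, 1.0]))  # expected: 1 for this toy decision boundary
```

Because nodes are only added on mistakes, the tree stays small and queries touch only a short root-to-leaf path; the paper under discussion learns the representation on which these distances are computed, rather than using raw Euclidean distance as in this sketch.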

Cited by 1 publication (1 citation statement)
References 9 publications
“…We are working on speeding up blind regression. Beyond the imputation step, advances in learning nearest neighbor representations could improve the overall prediction performance as well, for instance using differentiable boundary sets (Adikari and Draper, 2018).…”
Section: Improving Forecasts
confidence: 99%