2018
DOI: 10.1016/j.patrec.2017.11.008

Random forest regression for manifold-valued responses

Abstract: An increasing array of biomedical and computer vision applications requires the predictive modeling of complex data, for example images and shapes. The main challenge when predicting such objects lies in the fact that they do not comply to the assumptions of Euclidean geometry. Rather, they occupy non-linear spaces, a.k.a. manifolds, where it is difficult to define concepts such as coordinates, vectors and expected values. In this work, we construct a non-parametric predictive methodology for manifold-valued o…

Cited by 27 publications (12 citation statements)
References 15 publications
“…Given that amplicon sequencing datasets tend to lie on such manifolds, using learned dissimilarities could represent a potentially powerful way to analyze these datasets. Furthermore, since these dissimilarity matrices are derived from decision tree ensembles, interactions between ASVs are potentially accounted for, thereby overcoming one of the weaknesses of distance metrics (7,43,48). Therefore, using learned dissimilarities could result in the construction of more informative ordinations.…”
Section: Discussion (mentioning)
confidence: 99%
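The learned dissimilarities described in this excerpt can be sketched in a few lines. The leaf-index matrix below is hypothetical, standing in for the output of a trained tree ensemble (in practice it could come from something like scikit-learn's `apply` method); the point is only how a dissimilarity is derived from shared terminal leaves:

```python
import numpy as np

# Hypothetical leaf indices: rows = samples, columns = trees.
# In practice these would come from a fitted ensemble, not be hand-written.
leaves = np.array([
    [0, 2, 1],
    [0, 2, 1],
    [1, 0, 1],
    [1, 1, 0],
])

def rf_dissimilarity(leaves):
    """1 - proximity: the fraction of trees in which two samples
    do NOT land in the same terminal leaf."""
    # (n, 1, T) == (1, n, T) -> (n, n, T) boolean, then average over trees
    prox = (leaves[:, None, :] == leaves[None, :, :]).mean(axis=2)
    return 1.0 - prox

D = rf_dissimilarity(leaves)
```

The resulting matrix `D` is symmetric with a zero diagonal, so it can be fed directly into ordination methods such as classical multidimensional scaling.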
“…RF is a supervised learning approach, introduced by Breiman [47], that combines the bagging ensemble algorithm, built on classification and regression trees, with the random subspace technique. Despite its simplicity, it is an effective tool that relies on the "divide and conquer" principle to solve multi-output regression and prediction problems [48]–[50]. It has low sensitivity to multicollinearity and achieves stable performance on unbalanced datasets.…”
Section: B. Random Forest (RF) Model (mentioning)
confidence: 99%
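As a minimal illustration of why the bagging ensemble in this excerpt helps, the sketch below uses synthetic, hypothetical per-tree predictions, modeling each tree as an independent noisy estimator of the same target; averaging them yields a lower-variance forest prediction:

```python
import numpy as np

rng = np.random.default_rng(0)
truth = 5.0

# Hypothetical per-tree predictions: independent noisy estimates of `truth`.
tree_preds = truth + rng.normal(scale=1.0, size=10_000)

# Regression forests average over trees, shrinking the noise roughly
# by a factor of sqrt(n_trees).
forest_pred = tree_preds.mean()
per_tree_error = np.abs(tree_preds - truth).mean()
forest_error = abs(forest_pred - truth)
```

This is a sketch of the variance-reduction argument only; real trees are correlated, which is exactly what the random subspace technique is meant to mitigate.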
“…Random Forest is a method based on decision trees that is widely used thanks to its simplicity and robustness to changes in its parameters [17]. Although it is usually employed for classification [20], it can also be applied to regression [21,22]. The method builds several mutually independent decision trees, each created from a random, identically distributed subset of the data. Starting from a data matrix X (n × m) with n observations and m variables, the process consists of growing the branches of a tree using a random sample of p predictors, with p < m, chosen at random from among the m variables.…”
Section: Random Forests (unclassified)
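The sampling scheme in the excerpt above can be sketched directly. The data here are synthetic, and the helper below (a hypothetical name, not from any library) only draws one tree's training inputs, a bootstrap sample of the rows plus p < m randomly chosen predictors; it does not grow the tree itself:

```python
import numpy as np

def draw_tree_inputs(X, y, p, rng):
    """Inputs for one tree: a bootstrap sample of the n rows (same size,
    drawn with replacement) and a random subset of p of the m predictor
    columns, with p < m."""
    n, m = X.shape
    rows = rng.integers(0, n, size=n)             # bootstrap rows
    feats = rng.choice(m, size=p, replace=False)  # random subspace of predictors
    return X[np.ix_(rows, feats)], y[rows], feats

rng = np.random.default_rng(0)
n, m = 100, 10
X = rng.normal(size=(n, m))                 # synthetic data matrix X (n x m)
y = X[:, 0] + 0.1 * rng.normal(size=n)      # synthetic response

Xb, yb, feats = draw_tree_inputs(X, y, p=3, rng=rng)
```

Repeating this draw independently for each tree is what makes the resulting trees "independent of each other" in the sense the excerpt describes.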