2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) 2018
DOI: 10.1109/iros.2018.8593552
Active Learning based on Data Uncertainty and Model Sensitivity

Abstract: Robots can rapidly acquire new skills from demonstrations. However, during generalisation of skills or transitioning across fundamentally different skills, it is unclear whether the robot has the necessary knowledge to perform the task. Failing to detect missing information often leads to abrupt movements or to collisions with the environment. Active learning can quantify the uncertainty of performing the task and, in general, locate regions of missing information. We introduce a novel algorithm for active lea…

Cited by 11 publications (9 citation statements) · References 29 publications
“…We show that with increasing latent dimensionality, the search time does not increase, as it depends solely on the number of nodes. Compared to [10] and [2], our approach does not require second-order derivatives and is significantly faster. The approach of [2] takes more time than the NN-based method, so we only used the latter for this comparison.…”
Section: Pendulum Experiments
confidence: 99%
“…Such models have been used for similarity estimation from the perspective of Riemannian manifolds in the context of Gaussian process latent variable models [34]. As such nonparametric approaches scale poorly with data set size, several authors [2, 9, 10] proposed the marriage of Riemannian manifold-based metric learning with deep generative models. This results in a principled measure of similarity between two members of the data distribution by relating it to the shortest path, or geodesic, between the two corresponding points in latent space.…”
Section: Introduction
confidence: 99%
“…Active learning, where a learner actively poses queries to a teacher for input to reduce sample complexity, has been widely applied in supervised learning settings [13]. Our approach is most suitably situated in the literature on active learning from demonstration [9, 14–17], also referred to as active imitation learning [18, 19], in which the learner generates task instances for which the teacher may provide a demonstration. Active learning from demonstration has been applied to autonomous navigation [14], object seeking with a quadruped robot [15], grasping objects [16], reaching to task-space positions with a manipulator [9], and generating smooth robot motion from a latent-space encoding [17].…”
Section: Related Work — Leverage the Probabilistic Information Encoded
confidence: 99%
“…Our approach is most suitably situated in the literature on active learning from demonstration [9, 14–17], also referred to as active imitation learning [18, 19], in which the learner generates task instances for which the teacher may provide a demonstration. Active learning from demonstration has been applied to autonomous navigation [14], object seeking with a quadruped robot [15], grasping objects [16], reaching to task-space positions with a manipulator [9], and generating smooth robot motion from a latent-space encoding [17]. Also included in this area are approaches where the learner does not request full task demonstrations, but instead asks for the action to take in the particular state that it is in [18–20].…”
Section: Related Work — Leverage the Probabilistic Information Encoded
confidence: 99%
“…The geometry of the manifold has been shown to carry great value when systematically interacting with the latent representations, as it provides a stringent solution to the identifiability problem that plagues latent variable models [Tosi et al., 2014; Hauberg, 2018]. For example, this geometry has allowed VAEs to discover latent evolutionary signals in proteins [Detlefsen et al., 2020], provide efficient robot controls [Scannell et al., 2021; Chen et al., 2018b], improve latent clustering abilities [Yang et al., 2018], and more. The fundamental issue with these geometric approaches is that the studied manifold is inherently a stochastic object, yet classic differential geometry only supports the study of deterministic manifolds.…”
Section: Introduction
confidence: 99%