2018
DOI: 10.1177/0278364918816374

Small-variance asymptotics for non-parametric online robot learning

Abstract: Small variance asymptotics is emerging as a useful technique for inference in large scale Bayesian non-parametric mixture models. This paper analyses the online learning of robot manipulation tasks with Bayesian non-parametric mixture models under small variance asymptotics. The analysis yields a scalable online sequence clustering (SOSC) algorithm that is non-parametric in the number of clusters and the subspace dimension of each cluster. SOSC groups the new datapoint in low dimensional subspaces by online in…
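The non-parametric cluster-creation rule that small-variance asymptotics yields for Dirichlet process mixtures can be illustrated with a DP-means-style online update. The sketch below is a minimal NumPy illustration of that assignment step only, with an illustrative penalty `lam`; it deliberately omits the subspace-dimension estimation and sequence handling that the SOSC algorithm described in the abstract adds.

```python
import numpy as np

def online_dp_means(stream, lam):
    """Online DP-means-style clustering: assign each incoming point to the
    nearest cluster mean, or spawn a new cluster when the squared distance
    exceeds the penalty lam (the cluster-creation threshold that emerges
    from small-variance asymptotics on a DP mixture)."""
    means, counts = [], []
    for x in stream:
        x = np.asarray(x, dtype=float)
        if not means:
            means.append(x.copy())
            counts.append(1)
            continue
        d2 = [float(np.sum((x - m) ** 2)) for m in means]
        k = int(np.argmin(d2))
        if d2[k] > lam:            # too far from every cluster: open a new one
            means.append(x.copy())
            counts.append(1)
        else:                      # incremental mean update for cluster k
            counts[k] += 1
            means[k] += (x - means[k]) / counts[k]
    return means, counts

# Usage with synthetic data: two well-separated blobs yield roughly two clusters.
rng = np.random.default_rng(0)
data = np.vstack([rng.normal(0, 0.1, (50, 2)), rng.normal(3, 0.1, (50, 2))])
means, counts = online_dp_means(data, lam=1.0)
print(len(means), counts)
```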

Cited by 15 publications (17 citation statements)
References 46 publications

“…We present latent space formulations of our approach to exploit the task structure using: 1) mixture of factor analyzers decomposition of the covariance matrix [15], 2) semi-tied covariance matrices of the mixture model [25], and 3) Bayesian non-parametric formulation of the model with Hierarchical Dirichlet process (HDP) for online learning under small variance asymptotics [26]. The paper unifies and extends our previous work on encoding manipulation skills in a task-adaptive manner [24,25,26]. Our objective is to reduce the number of demonstrations required for learning a new task, while ensuring effective generalization in new environmental situations.…”
Section: Introduction (mentioning)
confidence: 99%
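For point 1) in the passage above, the mixture-of-factor-analyzers decomposition constrains each component covariance to Σ = ΛΛᵀ + Ψ, with a low-rank factor-loading matrix Λ spanning the component's subspace and a diagonal noise term Ψ. The NumPy sketch below uses hypothetical dimensions chosen only for illustration, not values from the cited work.

```python
import numpy as np

def mfa_covariance(Lambda, psi_diag):
    """Covariance of one mixture-of-factor-analyzers component:
    Sigma = Lambda @ Lambda.T + diag(psi), where Lambda is a D x d factor
    loading matrix (d << D) spanning the component's low-dimensional
    subspace and psi is the diagonal residual noise."""
    return Lambda @ Lambda.T + np.diag(psi_diag)

# Hypothetical dimensions for illustration: D = 7 observed variables,
# d = 2 latent factors per component.
D, d = 7, 2
rng = np.random.default_rng(1)
Lambda = rng.normal(size=(D, d))
psi = np.full(D, 0.01)
Sigma = mfa_covariance(Lambda, psi)
print(Sigma.shape, np.linalg.matrix_rank(Lambda @ Lambda.T))  # (7, 7) 2
```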
“…However, the parameters {μ, σ} have a big impact on the convergence property. We set the initial values of μ, σ with the K-means method [27], [40].…”
Section: B. Initialization of Task Generative Model (mentioning)
confidence: 99%
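A minimal sketch of the K-means initialization described in this passage: run Lloyd's algorithm and take the per-cluster means and standard deviations as the initial {μ, σ}. The data shapes, iteration count, and variance floor below are illustrative assumptions, not taken from the cited implementation.

```python
import numpy as np

def kmeans_init(X, K, iters=50, seed=0):
    """Initialize {mu, sigma} for a K-component model with plain K-means
    (Lloyd's algorithm): cluster the data, then use per-cluster means and
    standard deviations as starting values."""
    rng = np.random.default_rng(seed)
    mu = X[rng.choice(len(X), K, replace=False)].astype(float)
    for _ in range(iters):
        # Assign each point to its nearest centroid.
        labels = np.argmin(((X[:, None, :] - mu[None]) ** 2).sum(-1), axis=1)
        # Recompute centroids; keep the old one if a cluster goes empty.
        for k in range(K):
            if np.any(labels == k):
                mu[k] = X[labels == k].mean(axis=0)
    # Small floor (1e-6) keeps sigma strictly positive; illustrative choice.
    sigma = np.array([X[labels == k].std(axis=0) + 1e-6 if np.any(labels == k)
                      else np.ones(X.shape[1]) for k in range(K)])
    return mu, sigma

# Usage with synthetic 2-D data.
rng = np.random.default_rng(2)
X = np.vstack([rng.normal(0, 0.2, (100, 2)), rng.normal(2, 0.2, (100, 2))])
mu, sigma = kmeans_init(X, K=2)
print(mu.round(2), sigma.round(2))
```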
“…In [25], a parametric hidden Markov model (HMM) was used to encode the data from the demonstrations in the training phase. Tanwani and Calinon [26], [27] developed a task-parameterized HSMM to cope with the environmental situations in the process of manipulation tasks. From the above-mentioned works, robot learning can be regarded as a problem of feature extraction from the demonstrated training data for a specific skill in the process of HRI.…”
Section: Introduction (mentioning)
confidence: 99%
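The practical difference between a plain HMM and the (task-parameterized) HSMM mentioned above is the explicit duration model attached to each state. The sketch below samples a state sequence from a small HSMM with Gaussian duration models; the transition matrix and duration parameters are hypothetical and serve only to illustrate the duration mechanism, not the cited model.

```python
import numpy as np

def sample_hsmm_states(pi, A, dur_mu, dur_sigma, T, seed=0):
    """Sample a discrete state sequence of length T from a hidden semi-Markov
    model: each visited state persists for a duration drawn from an explicit
    per-state duration model (here a rounded Gaussian), which is what makes
    HSMMs suited to encoding demonstration phases of varying length."""
    rng = np.random.default_rng(seed)
    states = []
    s = rng.choice(len(pi), p=pi)
    while len(states) < T:
        d = max(1, int(round(rng.normal(dur_mu[s], dur_sigma[s]))))
        states.extend([s] * d)
        s = rng.choice(len(A), p=A[s])  # self-transitions handled by the duration model
    return np.array(states[:T])

# Hypothetical 3-state model for illustration.
pi = np.array([1.0, 0.0, 0.0])
A = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [1.0, 0.0, 0.0]])
seq = sample_hsmm_states(pi, A, dur_mu=[10, 5, 8], dur_sigma=[2, 1, 2], T=40)
print(seq)
```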
“…In [97], a visuo-haptic AR method was presented to learn object operation tasks, where the HRI system allowed the human operator to choose and operate virtual objects added to the real workspace. In [98], an online robot learning method with small variance asymptotics was proposed to recognise the human operator's intention and update the telerobot's motion according to the learned task model. In the above-mentioned approaches, the robot learning methods build on statistical probability theory, and PbD can model the cooperative tasks and cope with their variability to decrease the workload of humans and improve the work efficiency of the teleoperation system.…”
Section: Key Technologies and Challenges (mentioning)
confidence: 99%