Robotics: Science and Systems XV 2019
DOI: 10.15607/rss.2019.xv.071

Conditional Neural Movement Primitives

Abstract: Conditional Neural Movement Primitives (CNMPs) is a learning-from-demonstration framework designed as a robotic movement learning and generation system built on top of a recent deep neural architecture, namely Conditional Neural Processes (CNPs). Based on CNPs, CNMPs extract prior knowledge directly from the training data by sampling observations from it, and use it to predict a conditional distribution over any other target points. CNMPs specifically learn complex temporal multi-modal sensorimot…
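The conditioning mechanism the abstract describes — encoding sampled observations into a latent representation and decoding a predictive distribution at query points — can be sketched as follows. This is a minimal, untrained NumPy sketch; the layer sizes and function names are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(params, x):
    """Tiny MLP forward pass: list of (W, b) pairs, tanh hidden layers."""
    for W, b in params[:-1]:
        x = np.tanh(x @ W + b)
    W, b = params[-1]
    return x @ W + b

def init(sizes):
    return [(rng.normal(0, 0.5, (i, o)), np.zeros(o))
            for i, o in zip(sizes[:-1], sizes[1:])]

# Encoder maps each observed (t, y) pair to a representation r_i;
# the representations are mean-aggregated into a single latent r.
encoder = init([2, 16, 8])
# Decoder maps (r, t_query) to a predicted mean and log-variance of y.
decoder = init([9, 16, 2])

def cnp_predict(obs_t, obs_y, query_t):
    pairs = np.stack([obs_t, obs_y], axis=1)   # (n_obs, 2)
    r = mlp(encoder, pairs).mean(axis=0)       # permutation-invariant latent
    inp = np.concatenate([r, [query_t]])       # condition decoder on latent
    mu, log_var = mlp(decoder, inp)
    return mu, np.exp(0.5 * log_var)           # mean and std of the prediction

obs_t = np.array([0.0, 0.5, 1.0])              # observed timesteps
obs_y = np.sin(2 * np.pi * obs_t)              # observed sensorimotor values
mu, sigma = cnp_predict(obs_t, obs_y, 0.25)
print(mu, sigma)                               # untrained weights, so values are arbitrary
```

Because the per-observation representations are averaged, the prediction does not depend on the order in which observations are sampled, which is what lets the model condition on an arbitrary subset of points.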

Cited by 32 publications (18 citation statements)
References 17 publications
“…49,50 In such cases, the number of task parameters is large; therefore, kernel estimation may not be suitable and using a DNN may be required. Seker et al 3 suggest an alternative primitive representation based on the conditional neural processes DNN architecture. Augmenting such a representation with a learning and generation system facilitates encoding complex temporal multi-modal sensorimotor relations in connection with complex task constraints.…”
Section: Discussion
confidence: 99%
“…1 In light of this, several forms of motor primitives have been suggested for robot motion planning and control. 2,3 When motion is based on motor primitives, learning to perform a task is related to learning the suitable motion primitive parameters. Similarly, performance adaptation and improvement during runtime are related to adaptation of the motion primitive parameter values.…”
Section: Introduction
confidence: 99%
“…We include ProMPs because it is the foundational framework that motivates our work. CNMP can be viewed as a version of our model with fixed variance in the latent distribution, but that predicts output variance σ_y(z, x) as done in previous work [21]. VAE-CNMP is a variation of our model without Bayesian aggregation, the via-point, and context variable independence assumption, but still uses the Isotropic Gaussian as the prior in the KL divergence regularization.…”
Section: Empirical Analysis
confidence: 99%
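The distinction drawn in this citation — a fixed output variance versus a decoder that predicts σ_y(z, x) per query — comes down to the Gaussian log-likelihood used as the training objective. A minimal sketch (the specific numbers are illustrative, not from either paper):

```python
import numpy as np

def gaussian_nll(y, mu, sigma):
    """Negative log-likelihood of y under N(mu, sigma^2) — the loss
    minimized when a model outputs a predictive mean and variance."""
    return 0.5 * np.log(2 * np.pi * sigma**2) + (y - mu)**2 / (2 * sigma**2)

y, mu = 1.0, 0.8
# Fixed-variance model: sigma is a constant hyperparameter everywhere.
nll_fixed = gaussian_nll(y, mu, sigma=1.0)
# Heteroscedastic model: the decoder outputs sigma_y(z, x) per query,
# so it can shrink sigma where it is confident and lower the loss.
nll_pred = gaussian_nll(y, mu, sigma=0.3)
print(nll_fixed, nll_pred)
```

Letting the decoder predict the variance allows the model to express query-dependent uncertainty, which a fixed-variance model cannot.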
“…The pivotal work of Seker et al [21] overcomes the limitations of previous work by proposing a continuous-time, non-linear representation of motor skills. These motor primitives utilize Conditional Neural Processes (CNPs) [7], [8], an encoder-decoder model that allows aggregating multiple input-output pairs to form a latent representation of a function.…”
Section: Introduction
confidence: 99%
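The aggregation step this citation refers to — combining several per-pair representations into one latent — relies on a permutation-invariant operation such as the mean. A toy sketch with made-up representation vectors:

```python
import numpy as np

def aggregate(reps):
    # Mean over per-pair representations; the order of pairs does not matter.
    return reps.mean(axis=0)

reps = np.array([[1.0, 2.0], [3.0, 0.0], [-1.0, 4.0]])
r1 = aggregate(reps)
r2 = aggregate(reps[::-1])          # same pairs, reversed order
print(np.allclose(r1, r2))          # True: aggregation is permutation-invariant
```

Permutation invariance is what lets the model treat the observed pairs as an unordered set rather than a sequence.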
“…Recently, several path planning methods have been proposed using shallow (De Momi et al, 2016) or deep network approaches such as deep multi-layer perceptrons (DMLP; Qureshi et al, 2019), long short-term memory (LSTM) networks (Bency et al, 2019), and deep reinforcement learning (deep-RL; Tai et al, 2017; Panov et al, 2018), or mixed approaches that can be used for robot movement generation (Seker et al, 2019). All these methods plan paths iteratively by predicting the next state or next action (in case of RL) based on the current state, environment configuration, and the target position until the goal is reached.…”
Section: Introduction
confidence: 99%
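The iterative pattern this citation describes — repeatedly predicting the next state from the current one until the goal is reached — reduces to a loop of one-step predictions. A toy sketch where the learned predictor is replaced by a hand-written stand-in:

```python
import numpy as np

def step_toward(state, goal, lr=0.2):
    """Stand-in one-step predictor: in the cited methods this would be a
    learned network mapping (state, environment, goal) -> next state."""
    return state + lr * (goal - state)

state, goal = np.zeros(2), np.array([1.0, 1.0])
path = [state]
while np.linalg.norm(goal - path[-1]) > 1e-2:   # iterate until the goal is reached
    path.append(step_toward(path[-1], goal))
print(len(path), path[-1])
```

The loop structure is the same across the cited methods; only the predictor (MLP, LSTM, RL policy, or CNMP-style model) changes.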