2018
DOI: 10.3389/fnbot.2018.00044
Learning of Central Pattern Generator Coordination in Robot Drawing

Abstract: How do robots learn to perform motor tasks in a specific condition and apply what they have learned in a new condition? This paper proposes a framework for motor coordination acquisition of a robot drawing straight lines within a part of the workspace. Then, it addresses transferring the acquired coordination into another area of the workspace while performing the same task. Motor patterns are generated by a Central Pattern Generator (CPG) model. The motor coordination for a given task is acquired by using a m…
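The paper's specific CPG formulation is not given in this excerpt. As a rough illustration of the kind of limit-cycle oscillator commonly used as a CPG building block for rhythmic motor patterns, a Hopf oscillator can be sketched as follows (the function name, parameter values, and Euler integration scheme are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def hopf_step(x, y, dt=0.001, mu=1.0, omega=2.0 * np.pi):
    """One Euler step of a Hopf oscillator, a common CPG building block.

    The dynamics converge to a stable limit cycle of radius sqrt(mu),
    so x(t) settles into a rhythmic signal regardless of the initial
    state -- the property that makes such oscillators useful for
    generating motor patterns.
    """
    r2 = x * x + y * y
    dx = (mu - r2) * x - omega * y
    dy = (mu - r2) * y + omega * x
    return x + dt * dx, y + dt * dy

# Integrate from a small perturbation; with mu = 1 the output settles
# into an oscillation with amplitude close to 1.
x, y = 0.1, 0.0
trajectory = []
for _ in range(20000):
    x, y = hopf_step(x, y)
    trajectory.append(x)

amplitude = max(trajectory[-2000:])  # peak over the last two periods
```

In a drawing setting, the oscillator output (or a combination of several such outputs) would be mapped to joint commands; the mapping itself is what a learning framework like the one described would have to acquire.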

Cited by 8 publications (9 citation statements). References 30 publications.
“…The addition of different nonlinear terms in the CPG models could also enable the emergence of diverse behaviors like jumping with appropriate environmental scenarios. The proposed DeepCPG based policies could also be used with complex robot architectures like humanoid robots [84] and soft robots [85] for learning different behaviors. We believe these could be interesting directions to explore as part of future work.…”
Section: Discussion
confidence: 99%
“…A humanoid robot artist requires human-specific skills, which are challenging for humanoid robots. Because sketching is one of the most primitive drawing techniques, many researchers (e.g., Srikaew et al., 1998; Olsson et al., 2002; Calinon et al., 2005; Lin et al., 2009; Kudoh et al., 2009; Lau & Baltes, 2010; Sasaki et al., 2016; Singh et al., 2017; Atoofi, 2018) have tried to develop a robust humanoid robot that can produce pen-and-ink sketches. However, there are limitations: the drawing task is usually slow due to complicated motion control and the complexity of the input images.…”
Section: Introduction
confidence: 99%
“…However, this was only in a fixed, plotter-like setting. To improve drawing accuracy, some researchers have tried machine learning approaches such as deep neural networks (Sasaki et al., 2016; Singh et al., 2017; Atoofi, 2018). An in-depth review of these works shows that none of them used visual servoing in humanoid robots as feedback for drawing observation, and none provides drawing correction during the drawing process.…”
Section: Introduction
confidence: 99%
“…[6] Currently, the widely applied control methods for multi-joint robots include sliding mode control [7], fuzzy control [8], cross-coupling and contour-error coupled control [9–11], and central pattern generator (CPG) control [12–19]. Kamal et al. [7] applied sliding mode control to make a TDFR's two joints follow two desired trajectories. Combining proportional-integral-derivative (PID) control and fuzzy control, Mohan et al. [8] controlled a TDFR to achieve pre-appointed motions.…”
Section: Introduction
confidence: 99%
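The excerpt above lists CPG control among coordination methods for multi-joint robots. One standard mechanism behind CPG-based coordination, phase coupling between per-joint oscillators, can be sketched with a minimal Kuramoto-style model (the coupling form, gain, and target phase offset below are illustrative assumptions, not taken from the cited works):

```python
import math

def coupled_phase_step(phi1, phi2, dt=0.01, omega=2.0 * math.pi,
                       k=2.0, target=math.pi / 2.0):
    """One Euler step of two diffusively coupled phase oscillators.

    The coupling terms drive the phase difference phi2 - phi1 toward
    `target`, which is how a CPG network keeps multiple joints locked
    in a fixed rhythmic relationship.
    """
    d1 = omega + k * math.sin(phi2 - phi1 - target)
    d2 = omega + k * math.sin(phi1 - phi2 + target)
    return phi1 + dt * d1, phi2 + dt * d2

# Both oscillators start in phase; the coupling pulls them to the
# target offset of pi/2 while they keep rotating at frequency omega.
phi1, phi2 = 0.0, 0.0
for _ in range(5000):
    phi1, phi2 = coupled_phase_step(phi1, phi2)

diff = (phi2 - phi1) % (2.0 * math.pi)  # locked phase difference
```

Changing `target` reshapes the inter-joint timing without retuning each oscillator individually, which is one reason CPG control is attractive for multi-joint coordination.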