2016
DOI: 10.3390/s16050640

3D Visual Data-Driven Spatiotemporal Deformations for Non-Rigid Object Grasping Using Robot Hands

Abstract: Sensing techniques are important for solving problems of uncertainty inherent to intelligent grasping tasks. The main goal here is to present a visual sensing system based on range imaging technology for robot manipulation of non-rigid objects. Our proposal provides a suitable visual perception system for complex grasping tasks, supporting a robot controller when other sensor systems, such as tactile and force, are not able to obtain useful data relevant to the grasping manipulation task. In particular, a new vi…

Cited by 10 publications (6 citation statements) | References 39 publications
“…The most straightforward approach to move a robot is to design a velocity controller which relates the error function variations over time, ė, with the robot velocity V = [υ, ω], as described by Chaumette and Hutchinson in [19]. Formally, this is written as ė = ∇e V, where ∇e is the gradient of the error function e, υ ∈ R³ is the linear velocity and ω ∈ R³ is the angular velocity; thus the velocity vector is V ∈ R⁶.…”
Section: Task-space Robotic Control
confidence: 99%
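As an illustration of the velocity-control relation quoted above (ė = ∇e V), the following is a minimal Python sketch of a resolved-rate style controller that chooses V so the task error decays exponentially. The function name velocity_command, the gain, and the toy interaction matrix are assumptions made for illustration, not the cited paper's implementation.

```python
import numpy as np

def velocity_command(error, grad_e, gain=1.0):
    """Velocity command from a task error and its gradient.

    Assuming de/dt = grad_e @ V, choosing V = -gain * pinv(grad_e) @ error
    imposes de/dt = -gain * e, i.e. exponential decay of the error.
    """
    return -gain * np.linalg.pinv(grad_e) @ error

# Illustrative usage: a 3D positional error and a 3x6 gradient mapping the
# 6-DoF twist V = [v, w] (linear and angular velocity) to error variations.
e = np.array([0.05, -0.02, 0.10])                   # hypothetical task error
grad_e = np.hstack([np.eye(3), np.zeros((3, 3))])   # toy interaction matrix
V = velocity_command(e, grad_e, gain=0.5)
v, w = V[:3], V[3:]                                  # split into υ and ω
```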
“…Works [4,5] by Mateo et al. present a method for understanding how the surface changes during manipulation tasks, implementing a Dijkstra-based method to model the deformation of the object. The authors also present a method in [6] to predict when the object surface changes drastically during manipulation tasks, in order to achieve dexterous manipulation and prevent damage to objects. The previously cited works focus mainly on the perception of the objects, but works more related to the control of robots for the manipulation of elastic objects are [7,8] by Navarro-Alarcon, in which a strategy to control the shape of objects with robots, using the Fourier transform as features, is presented.…”
Section: Introduction
confidence: 99%
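The Dijkstra-based deformation modelling mentioned in the passage above is summarised only by name. As a generic sketch (not the authors' implementation from [4,5]), the following Python computes geodesic distances over a surface graph with Dijkstra's algorithm, a common building block when tracking how distances between surface points evolve as an object deforms; the adjacency format and the toy mesh are assumptions for illustration.

```python
import heapq

def geodesic_distances(adjacency, source):
    """Dijkstra shortest-path (geodesic) distances over a surface graph.

    adjacency: dict mapping vertex id -> list of (neighbour id, edge length).
    Returns the graph distance from `source` to every reachable vertex.
    """
    dist = {source: 0.0}
    heap = [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue                      # stale heap entry
        for v, w in adjacency.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

# Toy mesh graph: four vertices on a square patch of the object surface.
mesh = {0: [(1, 1.0), (2, 1.0)], 1: [(0, 1.0), (3, 1.0)],
        2: [(0, 1.0), (3, 1.0)], 3: [(1, 1.0), (2, 1.0)]}
print(geodesic_distances(mesh, 0))        # {0: 0.0, 1: 1.0, 2: 1.0, 3: 2.0}
```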
“…The use of a tactile sensor inside the gripper provides useful information for grasping [23,24], object recognition [25], and manipulation [26,27] tasks. This problem has not yet been solved [28].…”
Section: Related Work
confidence: 99%
“…The system can determine the location of the end-effector in the three-dimensional Cartesian coordinate system. In 2016, 3D visual data-driven spatiotemporal deformations for non-rigid object grasping using robot hands were introduced by Mateo et al. [4]. The experiments show that the proposed method can grasp several objects in various configurations.…”
Section: Introduction
confidence: 99%