2011 IEEE International Conference on Robotics and Automation
DOI: 10.1109/icra.2011.5980336
Classification of clothing using interactive perception

Abstract: We present a system for automatically extracting and classifying items in a pile of laundry. Using only visual sensors, the robot identifies and extracts items sequentially from the pile. When an item has been removed and isolated, a model is captured of the shape and appearance of the object, which is then compared against a database of known items. The classification procedure relies upon silhouettes, edges, and other low-level image measurements of the articles of clothing. The contributions of thi…
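The abstract describes a pipeline in which each isolated garment is characterized by silhouettes, edges, and other low-level image measurements and then matched against a database of known items. The snippet below is a minimal, hypothetical sketch of that kind of matching, not the authors' implementation: it assumes a single grayscale image of an already-isolated item, uses Hu moments of the thresholded silhouette plus a coarse Canny edge-density grid as stand-in features, and labels the query by nearest-neighbour distance to labelled exemplars.

```python
# Hedged sketch: silhouette + edge features matched against a small database.
# Feature choices, thresholds, and file names are illustrative assumptions.
import cv2
import numpy as np


def garment_features(gray_image):
    """Compute a small feature vector for one isolated clothing item."""
    gray = cv2.resize(gray_image, (128, 128))

    # Silhouette descriptor: Otsu-thresholded mask -> log-scaled Hu moments.
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    hu = cv2.HuMoments(cv2.moments(mask)).flatten()
    hu = -np.sign(hu) * np.log10(np.abs(hu) + 1e-12)

    # Edge descriptor: Canny edge density over a coarse 4x4 grid.
    edges = cv2.Canny(gray, 50, 150)
    grid = edges.reshape(4, 32, 4, 32).mean(axis=(1, 3)) / 255.0

    return np.concatenate([hu, grid.flatten()])


def classify(query_gray, database):
    """Nearest-neighbour match of a query item against labelled exemplars.

    database: list of (label, feature_vector) pairs built offline from
    images of known clothing items.
    """
    q = garment_features(query_gray)
    label, _ = min(database, key=lambda item: np.linalg.norm(item[1] - q))
    return label


if __name__ == "__main__":
    # Hypothetical file names; replace with captured images of isolated items.
    exemplars = [("t-shirt", "tshirt.png"), ("sock", "sock.png")]
    db = [(name, garment_features(cv2.imread(path, cv2.IMREAD_GRAYSCALE)))
          for name, path in exemplars]
    query = cv2.imread("unknown_item.png", cv2.IMREAD_GRAYSCALE)
    print(classify(query, db))
```

Nearest-neighbour matching is used here only because it is the simplest way to compare a query against a database of exemplars; the paper's actual classifier and descriptors may differ.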

Cited by 85 publications (69 citation statements)
References 25 publications
“…In this way, the approach bears some resemblance to interactive perception [8], [9], [25], [26], [27], except that we allow a human to perform the interaction due to the specific constraints of articulated motion in our objects. Automatically planning the end effector motion path for interactive perception in such situations remains an unsolved problem, because a preliminary model (at least) is needed in order to interact with the object, but the interaction is necessary to estimate the model.…”
Section: Occlusion-aware Reconstruction
confidence: 99%
“…Unlike dressing assistance, recognition methods for clothing manipulation have been widely reported in the literature [6,19]. Because objects made of cloth are so flexible, observing their motions during manipulation is essential.…”
Section: Clothing-state Estimation
confidence: 99%
“…Research on perception and manipulation tasks for robotic laundry systems consists of: isolating and classifying clothes from a laundry heap [8,5,9], finding a tractable state of the clothes in order to interact with and manipulate them [3], and then folding the cloth [1,2,5]. In that regard, Ramisa et al. [8] propose a supervised learning approach for grasping highly wrinkled garments.…”
Section: Literature Review
confidence: 99%