2017
DOI: 10.1016/j.cviu.2016.09.005

Left/right hand segmentation in egocentric videos

Abstract: Wearable cameras allow people to record their daily activities from a user-centered (First Person Vision) perspective. Due to their favorable location, wearable cameras frequently capture the hands of the user, and may thus represent a promising user-machine interaction tool for different applications. Existing First Person Vision methods handle hand segmentation as a background-foreground problem, ignoring two important facts: i) hands are not a single "skin-like" moving element, but a pair of interacting coope…

Cited by 20 publications (13 citation statements)
References 29 publications
“…Computer vision methods for the analysis of hand movements in egocentric videos have been the focus of a number of previous investigations [21] , dealing with problems including hand detection [22] , hand segmentation [23] , object detection/identification [24] , [25] , hand posture estimation [26] , [27] , and activity recognition [28] , [29] . In the context of rehabilitation, the hand use of individuals with cervical spinal cord injury (cSCI) was detected by applying computer vision to egocentric videos, with promising results [30] .…”
Section: Introduction (citation type: mentioning)
confidence: 99%
“…The quadrant with the highest sum of the Haar-like feature vector determined the hand type: "user's right" if right lower quadrant; "user's left" if left lower quadrant; "other hands" if upper quadrants [30]. The angle of the forearm/hand regions was also used by Betancourt et al. [53], [96]. The authors fitted an ellipse around the segmented region, calculating the angle between the arm and the lower frame border and the normalized distance of the ellipse center from the left border.…”
Section: Hand Identification (citation type: mentioning)
confidence: 99%
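
The ellipse-based left/right cue quoted above can be sketched with standard OpenCV primitives. The snippet below is a minimal, hypothetical illustration rather than the actual classifier of Betancourt et al. [53], [96]: it fits an ellipse to the largest connected region of a binary arm/hand mask and extracts the two cues mentioned in the quote (the ellipse orientation and the normalized distance of the ellipse centre from the left frame border); the final thresholding rule is an assumed stand-in for the learned decision used in the cited work.

import cv2
import numpy as np

def ellipse_hand_cues(mask):
    """Extract two cues from a binary arm/hand mask: the ellipse
    rotation angle reported by cv2.fitEllipse and the normalized
    horizontal distance of the ellipse centre from the left border."""
    _, w = mask.shape[:2]
    # OpenCV >= 4 returns (contours, hierarchy)
    contours, _ = cv2.findContours(mask.astype(np.uint8),
                                   cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    region = max(contours, key=cv2.contourArea)
    if len(region) < 5:                  # fitEllipse needs at least 5 points
        return None
    (cx, _), _, angle = cv2.fitEllipse(region)
    x_norm = cx / float(w)               # distance from left border, in [0, 1]
    return angle, x_norm

def naive_hand_side(mask):
    """Toy decision rule (an assumption, not the published method):
    a region whose ellipse centre lies in the left half of the frame
    is labelled the user's left hand, otherwise the right hand."""
    cues = ellipse_hand_cues(mask)
    if cues is None:
        return None
    _, x_norm = cues
    return "left" if x_norm < 0.5 else "right"

In practice both cues would be fed to a trained classifier rather than thresholded individually, since crossed hands and oblique forearm angles break a simple left-half/right-half rule.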
“…In particular, HSV is based on the way humans perceive colors while LAB and YCbCr use one of the components for lightness and the remaining ones for the color intensity. In egocentric vision, [30,31] use a mixture of color histograms and visual flow for hand-segmentation, while [4] combined HSV features, a Random Forest classifier and super-pixels for gesture recognition. Recently, Li and Kitani [12] analyzed the discriminative power of different color histograms with a Random Forest regressor.…”
Section: Feature Selection (citation type: mentioning)
confidence: 99%
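
As a concrete illustration of the colour-based features discussed in the quote above, the sketch below trains a per-pixel Random Forest on HSV values and applies it to new frames. It is a simplified, assumption-laden stand-in: the cited works additionally use flow, super-pixels, or histogram features, and the images/masks inputs here are hypothetical labelled training data.

import cv2
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def train_hand_pixel_classifier(images, masks, n_estimators=50):
    """Fit a Random Forest on per-pixel HSV colour values.
    images are BGR frames and masks are binary ground-truth hand
    masks of the same resolution (illustrative inputs)."""
    X, y = [], []
    for img, mask in zip(images, masks):
        hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)
        X.append(hsv.reshape(-1, 3))
        y.append((mask > 0).reshape(-1).astype(np.uint8))
    clf = RandomForestClassifier(n_estimators=n_estimators, n_jobs=-1)
    clf.fit(np.vstack(X), np.concatenate(y))
    return clf

def predict_hand_mask(clf, img, threshold=0.5):
    """Predict a binary hand mask for a new BGR frame from the
    per-pixel skin/hand probability."""
    hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)
    prob = clf.predict_proba(hsv.reshape(-1, 3))[:, 1]
    return (prob.reshape(img.shape[:2]) >= threshold).astype(np.uint8)

A per-pixel colour model of this kind is sensitive to illumination changes, which motivates combining it with motion cues or spatial grouping (super-pixels) as the cited approaches do.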