2017 IEEE Winter Conference on Applications of Computer Vision (WACV)
DOI: 10.1109/wacv.2017.132
Egocentric Height Estimation

Abstract: Egocentric, or first-person, vision, which became popular in recent years with the emergence of wearable technology, differs from exocentric (third-person) vision in several distinguishable ways, one being that the camera wearer is generally not visible in the video frames. Recent work has addressed action and object recognition in egocentric videos, as well as biometric extraction from first-person videos. Height estimation can be a useful feature for both soft biometrics and object tracking. …

Cited by 7 publications (2 citation statements); references 20 publications.
“…They computed block-wise optical flows from the given egocentric video and trained a small CNN for camera-wearer classification. Finocchiaro et al. [6] estimated the height of the camera above the ground using only the egocentric video, without any intrinsic camera information. They used a two-stream CNN-based regression model, which regresses the height of the camera wearer from the input RGB video and its derived optical flows.…”
Section: Related Work
Confidence: 99%
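The two-stream regression idea described in the citation statement above can be illustrated with a toy sketch: one stream summarizes appearance (RGB frames), the other summarizes motion (optical flow), and a regressor maps the fused features to a height estimate. All function names, toy "features", and hand-picked weights below are illustrative assumptions, not the authors' actual network.

```python
# Hypothetical late-fusion sketch of a two-stream regressor (NOT the paper's
# CNN): each "stream" is a stand-in feature extractor, and a linear layer
# regresses height from the concatenated features.

def appearance_features(rgb_frames):
    """Toy appearance stream: mean pixel intensity per frame (CNN stand-in)."""
    return [sum(frame) / len(frame) for frame in rgb_frames]

def motion_features(flow_fields):
    """Toy motion stream: mean absolute flow per frame pair (CNN stand-in)."""
    return [sum(abs(v) for v in field) / len(field) for field in flow_fields]

def fuse_and_regress(app, mot, weights, bias):
    """Late fusion: concatenate both streams, apply a linear regressor."""
    fused = app + mot  # concatenation of the two feature vectors
    return sum(w * x for w, x in zip(weights, fused)) + bias

# Two tiny synthetic "frames" and "flow fields" (values are arbitrary).
rgb = [[0.2, 0.4, 0.6], [0.3, 0.5, 0.7]]
flow = [[0.1, -0.2, 0.05], [0.0, 0.3, -0.1]]

app = appearance_features(rgb)
mot = motion_features(flow)
# Weights/bias are made up for the demo; a real model would learn them.
height_cm = fuse_and_regress(app, mot, weights=[10, 10, 50, 50], bias=100.0)
print(round(height_cm, 2))  # → 121.5
```

In the paper's actual setup, both streams are convolutional networks over RGB frames and derived optical flows respectively; the toy functions above only mirror the fusion-then-regression structure.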
“…Primary and secondary information captured by BWCs can be used to comprehensively describe the behaviors of the wearer. Such a description may be employed to profile wearers through their physical characteristics [8], [9], their activities [10], the foods they eat [11], and how they interact with others [12]. Each additional day of BWC usage adds to the collected information, also enabling the inference of insights that are not directly observable.…”
Confidence: 99%