2014
DOI: 10.1117/12.2067569
Focus-of-attention for human activity recognition from UAVs

Abstract: This paper presents a system to extract metadata about human activities from full-motion video recorded from a UAV. The pipeline consists of these components: tracking, motion features, representation of the tracks in terms of their motion features, and classification of each track as one of the human activities of interest. We consider these activities: walk, run, throw, dig, wave. Our contribution is that we show how a robust system can be constructed for human activity recognition from UAVs, and that focus-…
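The pipeline named in the abstract (tracking, motion features, a bag-of-words representation of each track, then per-track activity classification) can be sketched minimally. Everything below is an illustrative assumption, not the authors' implementation: the feature (per-step speed), the two-word codebook, and the nearest-centroid classifier stand in for whatever features and classifier the paper actually uses.

```python
# Hedged sketch: track -> motion features -> bag-of-words histogram -> classifier.
# Feature choice, codebook, and classifier are assumptions for illustration only.
import math

def motion_features(track):
    """Per-step speeds for a track given as a list of (x, y) positions per frame."""
    return [math.hypot(x2 - x1, y2 - y1)
            for (x1, y1), (x2, y2) in zip(track, track[1:])]

def bow_histogram(features, codebook):
    """Assign each feature to its nearest codeword; return a normalized histogram."""
    hist = [0] * len(codebook)
    for f in features:
        idx = min(range(len(codebook)), key=lambda i: abs(f - codebook[i]))
        hist[idx] += 1
    total = sum(hist) or 1
    return [h / total for h in hist]

def classify(hist, centroids):
    """Nearest-centroid classification over activity labels."""
    def sqdist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: sqdist(hist, centroids[label]))

# Toy usage: a slow track should come out "walk", a fast one "run".
codebook = [0.5, 2.0]                                # speed codewords: slow, fast
centroids = {"walk": [1.0, 0.0], "run": [0.0, 1.0]}  # per-activity BoW centroids
walk_track = [(i * 0.5, 0.0) for i in range(10)]     # speed 0.5 per frame
run_track = [(i * 2.0, 0.0) for i in range(10)]      # speed 2.0 per frame
print(classify(bow_histogram(motion_features(walk_track), codebook), centroids))  # walk
print(classify(bow_histogram(motion_features(run_track), codebook), centroids))   # run
```

The design point the citing papers pick up on is visible even here: adding a new activity means adding a centroid (or retraining the classifier), and the codebook may also need to change, which is the rigidity one citation attributes to applying BoW directly to the tracks.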

Cited by 5 publications (7 citation statements)
References 26 publications
“…The classification results are comparable with those shown in this research; its better accuracy is the trade-off for a simpler implementation that is not capable of recognizing more objects or actions. The results presented in [114] cover slightly more classes to be separated. Tracking is applied directly using BOW, which makes the architecture more rigid when adding new activities.…”
Section: Surveillance Mission
confidence: 88%
“…Three of these activities were static, i.e., performed in place (waving, digging, throwing), while two were dynamic (walking and running). These five activities were selected to allow a fair comparison with other works that used the same five activities for human activity classification [77,84,85]. The performance of the human detection models and human activity classification models was evaluated on these five activities.…”
Section: A Dataset Overview
confidence: 99%
“…The comparison was done in terms of activity classification accuracy. Burghouts et al [84] proposed the first HAR method that extracted motion features from videos and utilized these features for activity classification. This method yielded an accuracy of 57%.…”
Section: Human Activity Classification Experiments
confidence: 99%