Extracting keyframes for action recognition is a challenging task, as compression must be achieved without losing the impression of the action; insufficient or incorrect keyframes can cause confusion about the type of action. The proposed two-level keyframe extraction algorithm uses an adaptive threshold technique to identify the most dissimilar frames as keyframes. At the first level, global features based on the intensity histogram are used as the similarity measure, and at the second level, local features computed from wavelet decomposition are used. A new performance parameter, Compression Ratio-Normalized Fidelity (CRNF), is introduced for evaluating keyframe extraction algorithms. The proposed algorithm achieves an average CRNF of 0.81 on an action recognition dataset, compared to 0.37 for the existing histogram-based method, and a CRNF of 0.95 on the Open Video Project dataset, which is higher than that of existing methods. Qualitative results further demonstrate the effectiveness of the proposed algorithm.
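The sketch below illustrates only the first-level idea described above: consecutive frames are compared with an intensity-histogram dissimilarity, and frames whose dissimilarity exceeds an adaptive threshold are kept as keyframe candidates. The specific threshold rule (mean plus one standard deviation of the scores) and the use of the Bhattacharyya distance are assumptions made for illustration; the paper's exact two-level procedure and CRNF computation are not reproduced here.

```python
# Illustrative first-level keyframe pass (assumed thresholding rule, not the
# paper's exact algorithm).
import cv2
import numpy as np

def histogram_dissimilarity(video_path):
    """Bhattacharyya distance between intensity histograms of consecutive frames."""
    cap = cv2.VideoCapture(video_path)
    scores, prev_hist = [], None
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        hist = cv2.calcHist([gray], [0], None, [256], [0, 256])
        hist = cv2.normalize(hist, hist).flatten()
        if prev_hist is not None:
            scores.append(cv2.compareHist(prev_hist, hist,
                                          cv2.HISTCMP_BHATTACHARYYA))
        prev_hist = hist
    cap.release()
    return np.array(scores)

def select_keyframes(scores):
    """Adaptive threshold (assumed): keep frames with dissimilarity > mean + std."""
    threshold = scores.mean() + scores.std()
    # +1 because score i compares frame i with frame i+1
    return np.where(scores > threshold)[0] + 1
```

In such a scheme, the second level would re-examine the selected candidates with wavelet-based local features before producing the final keyframe set.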
Humans can perform an enormous number of actions, such as running, walking, pushing, and punching, and can perform each of them in multiple ways; recognizing a human action from a video is therefore a challenging task. In a supervised learning setting, actions are first represented using robust features, and a classifier is then trained for classification. The choice of classifier affects the performance of human action recognition. This work compares two neural network structures, the feed-forward neural network and the cascade-forward neural network, for human action recognition. Histogram of oriented gradients (HOG) and histogram of optical flow (HOF) are used as features for representing the actions: HOG captures the spatial appearance of the video, while HOF captures its motion. The two architectures are compared on the basis of recognition accuracy, using well-known publicly available datasets for action and interaction recognition. For human action recognition, the feed-forward neural network yields higher recognition accuracy than the cascade-forward neural network.
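A minimal sketch of the HOG + HOF representation paired with a feed-forward classifier is given below. The frame size, cell/bin settings, per-clip averaging of features, and the use of scikit-learn's MLPClassifier as the feed-forward network are assumptions for illustration; the paper's feature pooling and network configuration may differ, and the cascade-forward variant is not shown.

```python
# Sketch: HOG (appearance) + HOF (motion) clip descriptor, classified with a
# feed-forward neural network. Parameter choices are illustrative assumptions.
import cv2
import numpy as np
from skimage.feature import hog
from sklearn.neural_network import MLPClassifier  # feed-forward network

def video_descriptor(frames, flow_bins=9):
    """Concatenate mean HOG and mean HOF over the frames of one clip."""
    hogs, hofs, prev = [], [], None
    for frame in frames:
        gray = cv2.resize(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY), (64, 128))
        hogs.append(hog(gray, orientations=9, pixels_per_cell=(8, 8),
                        cells_per_block=(2, 2)))
        if prev is not None:
            flow = cv2.calcOpticalFlowFarneback(prev, gray, None,
                                                0.5, 3, 15, 3, 5, 1.2, 0)
            mag, ang = cv2.cartToPolar(flow[..., 0], flow[..., 1])
            hist, _ = np.histogram(ang, bins=flow_bins,
                                   range=(0, 2 * np.pi), weights=mag)
            hofs.append(hist / (hist.sum() + 1e-8))
        prev = gray
    return np.concatenate([np.mean(hogs, axis=0), np.mean(hofs, axis=0)])

# X: one descriptor per clip, y: action labels
# clf = MLPClassifier(hidden_layer_sizes=(100,), max_iter=500).fit(X, y)
```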
In this paper, a new local feature, called the Salient Wavelet Feature with Histogram of Oriented Gradients (SWFHOG), is introduced for human action recognition and behaviour analysis. In the proposed approach, the regions carrying the most information are selected based on their entropies. The SWF descriptor is formed from the wavelet sub-bands obtained by applying wavelet decomposition to the selected regions. To improve accuracy further, the SWF feature vector is combined with the Histogram of Oriented Gradients global descriptor to form the SWFHOG descriptor. The proposed algorithm is evaluated for action recognition on the publicly available KTH, Weizmann, UT Interaction, and UCF Sports datasets; the highest accuracy, 98.33%, is achieved on the UT Interaction dataset. The SWFHOG descriptor is also tested for behaviour analysis, classifying actions as normal or abnormal: the actions from the SBU Kinect and UT Interaction datasets are divided into two sets, normal behaviour and abnormal behaviour, and recognition accuracies of 95% and 97% are achieved on the SBU Kinect and UT Interaction datasets, respectively. The robustness of the proposed SWFHOG algorithm against camera view-angle changes and imperfect actions is tested using the Weizmann robustness testing datasets. The proposed SWFHOG method shows promising results compared with earlier methods.
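The following sketch approximates the SWFHOG idea: the highest-entropy image blocks are selected, described by statistics of their wavelet sub-bands, and concatenated with a global HOG descriptor. The block size, the number of selected blocks, and the sub-band statistics (mean and energy) are assumptions made for illustration, not the paper's exact recipe.

```python
# Illustrative approximation of the SWFHOG descriptor (assumed block size,
# block count, and sub-band statistics).
import numpy as np
import pywt
from skimage.feature import hog
from skimage.measure import shannon_entropy

def swf_descriptor(gray, block=32, top_k=4):
    """Wavelet sub-band statistics of the top_k highest-entropy blocks."""
    h, w = gray.shape
    blocks = [gray[r:r + block, c:c + block]
              for r in range(0, h - block + 1, block)
              for c in range(0, w - block + 1, block)]
    blocks.sort(key=shannon_entropy, reverse=True)  # most informative first
    feats = []
    for patch in blocks[:top_k]:
        cA, (cH, cV, cD) = pywt.dwt2(patch, 'haar')  # one-level decomposition
        for band in (cA, cH, cV, cD):
            feats.extend([band.mean(), (band ** 2).mean()])  # mean and energy
    return np.array(feats)

def swfhog_descriptor(gray):
    """Concatenate the local wavelet feature with a global HOG descriptor."""
    global_hog = hog(gray, orientations=9, pixels_per_cell=(8, 8),
                     cells_per_block=(2, 2))
    return np.concatenate([swf_descriptor(gray), global_hog])
```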