In this paper, a fully automatic 2.5D facial technique for forensic applications is presented. Feature extraction and classification are fundamental processes in any face identification technique, and two methods, one for each, are proposed here. The Active Appearance Model (AAM) is a widely used feature extraction method, but its fitting process has weaknesses. The artificial bee colony (ABC) algorithm is a suitable fitting solution because of its fast search ability; however, its neighborhood search is a drawback. PSO-SVM, on the other hand, is one of the more recent classification approaches, but its performance is weakened by the use of random values when calculating velocity. To address these problems, the research is conducted in three phases. In the first phase, Maximum Resource Neighborhood Search (MRNS), an enhanced ABC algorithm, is proposed to improve the fitting process of the current AAM. In the second phase, the Adaptively Accelerated PSO-SVM (AAPSO-SVM) classification technique is proposed, in which the acceleration coefficient values are selected using particle fitness values when searching for the optimal SVM parameters. In the third phase, the proposed methods, AAM-MRNS and AAPSO-SVM, together with the complete 2.5D facial technique, are evaluated against other methods on a new 2.5D face image data set. In addition, a sample from a real Malaysian criminal case involving CCTV facial investigation of a suspect is tested with the proposed technique. The experimental results show that the proposed techniques outperform the conventional ones. Furthermore, the 2.5D facial technique is able to recognize a sample from the Malaysian criminal case known as "Tepuk Bahu" using CCTV facial investigation.
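To illustrate the AAPSO-SVM idea, the minimal Python sketch below tunes the SVM parameters (C, gamma) with a particle swarm in which the acceleration coefficients are scaled by each particle's fitness relative to the swarm best, rather than being fixed constants. The scaling rule, swarm size, search ranges, and synthetic data are assumptions for illustration only, not the authors' exact formulation; scikit-learn's SVC stands in for the paper's classifier.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

# Illustrative data; the paper uses 2.5D face features instead.
rng = np.random.default_rng(0)
X, y = make_classification(n_samples=300, n_features=20, random_state=0)

def fitness(params):
    """Cross-validated accuracy of an SVM with the given (C, gamma)."""
    C, gamma = params
    return cross_val_score(SVC(C=C, gamma=gamma), X, y, cv=3).mean()

n_particles, n_iter = 10, 20
lo, hi = np.array([0.1, 1e-4]), np.array([100.0, 1.0])   # (C, gamma) bounds
pos = rng.uniform(lo, hi, size=(n_particles, 2))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_fit = np.array([fitness(p) for p in pos])
gbest = pbest[pbest_fit.argmax()].copy()
gbest_fit = pbest_fit.max()

w = 0.7        # inertia weight (assumed value)
c_base = 2.0   # baseline acceleration coefficient (assumed value)

for _ in range(n_iter):
    for i in range(n_particles):
        # Fitness-adaptive acceleration (assumed rule): particles lagging
        # behind the global best get a stronger pull toward the bests.
        scale = 1.0 + (gbest_fit - pbest_fit[i]) / (abs(gbest_fit) + 1e-12)
        c1 = c2 = c_base * scale
        r1, r2 = rng.random(2)
        vel[i] = (w * vel[i]
                  + c1 * r1 * (pbest[i] - pos[i])
                  + c2 * r2 * (gbest - pos[i]))
        pos[i] = np.clip(pos[i] + vel[i], lo, hi)
        f = fitness(pos[i])
        if f > pbest_fit[i]:
            pbest[i], pbest_fit[i] = pos[i].copy(), f
            if f > gbest_fit:
                gbest, gbest_fit = pos[i].copy(), f

print("best (C, gamma):", gbest, "cv accuracy:", round(gbest_fit, 3))
```

The sketch keeps the standard stochastic factors r1 and r2 and only makes the acceleration coefficients fitness-dependent; how the paper itself replaces the random velocity terms is not reproduced here.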
Gait recognition using the energy image representation, the average silhouette image over one complete gait cycle, has become a baseline in research on model-free approaches. Nevertheless, gait is sensitive to changes in appearance. In feature extraction, image representations based on the spatial gradient are still lacking in efficiency, especially for covariate cases such as carrying a bag or wearing a coat. Although the Histogram of Oriented Gradients (HoG) is highly effective for pedestrian detection, its accuracy remains low when tested on covariate datasets. This research therefore proposes a combination of frequency and spatial features based on the Inverse Fast Fourier Transform and Histogram of Oriented Gradients (IFFTG-HoG) for gait recognition. It consists of three phases: image processing, feature extraction producing the new image representation, and classification. The first phase comprises image binarization and energy image generation by averaging the gait images over one cycle. In the second phase, the IFFTG-HoG method is used for gait feature extraction from the generated energy image; the method is further improved by using the Chebyshev distance to calculate the gradient magnitude, which increases recognition accuracy. Lastly, a K-Nearest Neighbour (K-NN) classifier with K=1 is employed for individual classification in the third phase. A total of 124 subjects from the CASIA B dataset were tested using the proposed IFFTG-HoG method. It performed better in individual gait classification, achieving average accuracies of 96.7%, 93.1% and 99.6% on the standard dataset, compared with 94.1%, 85.9% and 96.2% for the HoG method, respectively. With similar motivation, we also tested the Rempit dataset to recognize motorcycle-rider anomaly events, and the proposed method outperformed the Dalal method.
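The following minimal Python sketch illustrates the pipeline described above under stated assumptions: a gait energy image is formed by averaging binarized silhouettes over one cycle, a low-pass inverse-FFT step provides the frequency-domain branch, a HoG-style descriptor uses the Chebyshev distance max(|gx|, |gy|) as the gradient magnitude, and a 1-NN classifier assigns the identity. Function names, cell size, bin count, and the exact form of the IFFT step are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def gait_energy_image(silhouettes):
    """Average a stack of binary silhouettes (one gait cycle) into a GEI."""
    return np.mean(np.stack(silhouettes).astype(float), axis=0)

def hog_chebyshev(img, cell=8, bins=9):
    """HoG-style descriptor whose gradient magnitude is the Chebyshev
    distance max(|gx|, |gy|) instead of the usual Euclidean norm."""
    gy, gx = np.gradient(img)
    mag = np.maximum(np.abs(gx), np.abs(gy))          # Chebyshev magnitude
    ang = np.rad2deg(np.arctan2(gy, gx)) % 180.0      # unsigned orientation
    h, w = img.shape
    feats = []
    for i in range(0, h - cell + 1, cell):
        for j in range(0, w - cell + 1, cell):
            a = ang[i:i + cell, j:j + cell].ravel()
            m = mag[i:i + cell, j:j + cell].ravel()
            hist, _ = np.histogram(a, bins=bins, range=(0, 180), weights=m)
            feats.append(hist / (np.linalg.norm(hist) + 1e-6))
    return np.concatenate(feats)

def ifftg_hog(silhouettes):
    """Assumed form of the IFFTG-HoG feature: low-pass the GEI spectrum,
    return to the spatial domain, then extract Chebyshev-magnitude HoG."""
    gei = gait_energy_image(silhouettes)
    spec = np.fft.fftshift(np.fft.fft2(gei))
    h, w = gei.shape
    mask = np.zeros_like(spec)
    mask[h // 4:3 * h // 4, w // 4:3 * w // 4] = 1.0   # keep low frequencies
    smooth = np.real(np.fft.ifft2(np.fft.ifftshift(spec * mask)))
    return hog_chebyshev(smooth)

def classify_1nn(query_feat, gallery_feats, gallery_labels):
    """K-NN with K=1: return the label of the nearest gallery feature."""
    d = np.linalg.norm(gallery_feats - query_feat, axis=1)
    return gallery_labels[int(np.argmin(d))]
```

In use, each gallery subject would contribute one IFFTG-HoG feature vector per gait cycle, and a probe sequence would be labelled by `classify_1nn` against the stacked gallery features.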