Human action recognition (HAR) technology is receiving considerable attention in the field of human-computer interaction. We present a HAR system that works stably in real-world applications. In real-world applications, a HAR system needs to identify detailed actions for specific purposes, and the action data include many variations. Accordingly, we conducted three experiments. First, we tested our recognition system's performance on the UTD-MHAD dataset. We compared our system's accuracy with results from previous research and confirmed that our system achieves an average accuracy of 91%, which is competitive with existing recognition systems. Furthermore, we hypothesized that a HAR system could be used to detect burglary. In the second experiment, we compared an existing benchmark dataset with our crime-detection dataset. We classified the test-scenario data using recognition systems trained on each dataset. The system trained on our dataset achieved higher accuracy than the one trained on the existing benchmark dataset. These results show that the training data should contain detailed actions for a real application. In the third experiment, we sought the motion-data representation that allows actions to be recognized stably regardless of data variation. In a real application, the action data vary across people. Thus, we introduced variations in the action data using the cross-subject protocol and a moving-area setting. We trained recognition systems on position data and on angle data, and compared the accuracy of each system. We found that the angle representation yields better accuracy because angle data convert action variations into a consistent pattern.
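As an informal illustration of why an angle representation can absorb subject-dependent variation, the following minimal sketch (hypothetical function and joint names, not taken from the paper) computes a joint angle from three 3D joint positions; the resulting angle does not depend on the subject's limb lengths or on where the subject stands in the capture area, unlike raw position data.

```python
import numpy as np

def joint_angle(parent, joint, child):
    """Angle (radians) at `joint` formed by the segments joint->parent and joint->child.

    Because the angle depends only on the directions of the two segments, it is
    unaffected by limb length or the subject's global position (assumed
    illustration, not the paper's exact feature pipeline).
    """
    u = np.asarray(parent, dtype=float) - np.asarray(joint, dtype=float)
    v = np.asarray(child, dtype=float) - np.asarray(joint, dtype=float)
    cos_theta = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.arccos(np.clip(cos_theta, -1.0, 1.0))

# Example: elbow angle from shoulder, elbow, and wrist positions (metres).
shoulder, elbow, wrist = [0.0, 1.4, 0.0], [0.3, 1.1, 0.0], [0.5, 1.3, 0.1]
print(np.degrees(joint_angle(shoulder, elbow, wrist)))
```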