Since the beginning of the 21st century, population aging has become increasingly severe. Accurately identifying the nursing needs of elderly people with disabilities, who often have limited mobility and impaired speech, has become a key challenge in elderly care and medical services. To address this problem, this study designs a human-computer interaction gesture recognition system for elderly nursing beds in the context of big data. The fused-feature + support vector machine (SVM) classifier adopted in this study achieves a recognition rate above 90% for every gesture category, with an average recognition rate of 96.35% on the test set, substantially higher than the single-feature + SVM classifiers, whose recognition rates remain below 90%. The recognition rate for gesture label c (turning the nursing bed to the left), which carries distinct gesture feature information, reaches 99.28%, while that for label h (lowering the bedpan), which has weaker gesture features, is 93.65%. The human-computer interaction system reliably recognizes the user's intent from both dynamic and static gestures, meets the goals set for this research, and provides a natural and dependable form of interaction. Future work can combine multiple input channels in a joint decision-making scheme to achieve a more comprehensive, accurate, and natural human-computer interaction design that better meets the needs of the elderly.
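As a rough illustration of the fused-feature + SVM pipeline described above, the sketch below concatenates two hypothetical gesture descriptors at the feature level and trains an RBF-kernel SVM with scikit-learn. The feature extractors, hyperparameters, and data layout are assumptions made for illustration and are not the paper's actual implementation.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, classification_report


# Placeholder feature extractors: the paper's concrete fused features are not
# specified here, so these stand in for two complementary gesture descriptors
# (e.g., a hand-shape descriptor and a motion/trajectory descriptor).
def extract_shape_features(sample):
    return np.asarray(sample["shape"], dtype=float)


def extract_motion_features(sample):
    return np.asarray(sample["motion"], dtype=float)


def fuse_features(sample):
    # Feature-level fusion by concatenating the two descriptor vectors.
    return np.concatenate(
        [extract_shape_features(sample), extract_motion_features(sample)]
    )


def train_gesture_svm(samples, labels):
    """Train an SVM gesture classifier on fused features and report accuracy."""
    X = np.stack([fuse_features(s) for s in samples])
    y = np.asarray(labels)
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.3, stratify=y, random_state=0
    )
    # RBF-kernel SVM with feature standardization; C and gamma are
    # illustrative defaults, not values taken from the paper.
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0, gamma="scale"))
    clf.fit(X_train, y_train)
    y_pred = clf.predict(X_test)
    print("Average recognition rate:", accuracy_score(y_test, y_pred))
    print(classification_report(y_test, y_pred))
    return clf
```

In this sketch, each training sample is assumed to be a dictionary holding precomputed "shape" and "motion" vectors for one gesture (e.g., labels a through h for the nursing-bed commands); the per-class rates in `classification_report` correspond to the per-gesture recognition rates reported above.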