SUMMARY

To help student nurses learn to transfer a patient from a bed to a wheelchair, this paper proposes a system for the automatic evaluation of trainees' skill in performing this task. Multiple Kinect sensors were employed, together with colored markers attached to the trainee's and the patient's clothing and to the wheelchair, to measure both participants' postures as they interacted closely during the transfer and to assess the correctness of the trainee's movements and use of the equipment. The measurement method identified the body joints and the relevant features of the wheelchair by the colors of the attached markers, and calculated their 3D positions by combining color and depth data from two sensors. We first developed an automatic segmentation method that divides a continuous recording of the patient transfer process into discrete steps by extracting, from the raw sensor data, the defining features of both participants' movements during each stage of the transfer. Next, a checklist of 20 evaluation items was defined to evaluate the trainee nurses' skill in performing the transfer. The items were divided into two types, and two corresponding classification methods were proposed for judging trainee performance as correct or incorrect. The first method checks whether the relevant body parts were positioned within a predefined spatial range considered 'correct' in terms of safety and efficacy (e.g., the feet placed appropriately for balance). The second method applies thresholds, determined by a Bayesian minimum-error criterion, to quantitative indexes describing the participants' postures and movements. A prototype system was constructed and experiments were performed to assess the proposed approach. The system evaluated the nurses' patient transfer skills automatically, and its results agreed with evaluations by human teachers with an accuracy exceeding 80%.
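
The summary names a Bayesian minimum-error method for setting the thresholds on the quantitative indexes but does not give its formulation here. The following is a minimal sketch, assuming each index is modeled by 1D Gaussian class-conditional densities for the 'correct' and 'incorrect' classes estimated from labeled training trials; the function name and the example statistics (trunk inclination angle, means, standard deviations, priors) are hypothetical, not values from the paper.

```python
import math

def bayes_min_error_threshold(mu_ok, sd_ok, p_ok, mu_ng, sd_ng, p_ng):
    """Threshold on a scalar index that minimizes Bayes error for two
    Gaussian class-conditional densities (correct vs. incorrect)."""
    # Equate the weighted densities p_ok*N(t; mu_ok, sd_ok) = p_ng*N(t; mu_ng, sd_ng)
    # and take logs; the result is a quadratic a*t^2 + b*t + c = 0.
    a = 1.0 / (2 * sd_ng**2) - 1.0 / (2 * sd_ok**2)
    b = mu_ok / sd_ok**2 - mu_ng / sd_ng**2
    c = (mu_ng**2 / (2 * sd_ng**2) - mu_ok**2 / (2 * sd_ok**2)
         + math.log((p_ok * sd_ng) / (p_ng * sd_ok)))
    if abs(a) < 1e-12:                      # equal variances: linear case
        return -c / b
    disc = math.sqrt(b**2 - 4 * a * c)
    roots = [(-b + disc) / (2 * a), (-b - disc) / (2 * a)]
    # Keep the root lying between the two class means, if one exists.
    lo, hi = sorted((mu_ok, mu_ng))
    for t in roots:
        if lo <= t <= hi:
            return t
    return min(roots, key=lambda t: abs(t - (mu_ok + mu_ng) / 2))

# Hypothetical example: trunk inclination angle (degrees) during the lift.
t = bayes_min_error_threshold(mu_ok=25.0, sd_ok=5.0, p_ok=0.6,
                              mu_ng=45.0, sd_ng=8.0, p_ng=0.4)
print(f"classify a trial as incorrect if its index exceeds {t:.1f} deg")
```

Under these assumptions the threshold reduces to solving a quadratic in the index value; with equal class variances it collapses to the familiar midpoint shifted by the log-prior ratio.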