Hand gesture recognition and classification play a pivotal role in automating Human-Computer Interaction (HCI) and have garnered substantial research attention. This study focuses on applying gesture recognition in surgical settings to provide valuable feedback during medical training. A tool-gesture classification system based on Deep Learning (DL) techniques is proposed, specifically a Long Short-Term Memory (LSTM) model with an attention mechanism. The research is structured in three key stages: data pre-processing, which removes outliers and smooths trajectories to mitigate the noise introduced during surgical-instrument data acquisition; data augmentation, which addresses data scarcity by generating new trajectories through controlled spatial transformations; and the implementation and evaluation of the DL-based classification strategy. The dataset comprises recordings from ten participants of varying surgical experience, covering three types of trajectories performed with both the right and left arms. The proposed classifier, combined with the data augmentation strategy, is assessed for its effectiveness in classifying all acquired gestures, and its performance is compared against other DL-based methodologies commonly employed in surgical gesture classification. The results indicate that the proposed approach outperforms these benchmark methods, achieving higher classification accuracy and greater robustness in distinguishing diverse surgical gestures.
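To make the augmentation stage concrete, the sketch below shows one way controlled spatial transformations could generate new trajectories from a recorded one. The abstract does not specify the exact transformations used, so the rotation range, scaling range, and jitter level here are illustrative assumptions, as is the `augment_trajectory` helper itself.

```python
import numpy as np

def augment_trajectory(traj, rng, max_angle_deg=10.0,
                       scale_range=(0.95, 1.05), jitter_std=0.5):
    """Generate a new trajectory from a (T, 3) array of 3-D tool positions
    by applying a small random rotation, an isotropic scaling, and Gaussian
    jitter. All parameter ranges are illustrative assumptions."""
    angle = np.deg2rad(rng.uniform(-max_angle_deg, max_angle_deg))
    c, s = np.cos(angle), np.sin(angle)
    # Rotation about the z-axis (assumed to be the vertical axis).
    rot = np.array([[c, -s, 0.0],
                    [s,  c, 0.0],
                    [0.0, 0.0, 1.0]])
    scale = rng.uniform(*scale_range)
    center = traj.mean(axis=0)  # rotate and scale about the trajectory centroid
    out = (traj - center) @ rot.T * scale + center
    out += rng.normal(0.0, jitter_std, size=traj.shape)  # small positional jitter
    return out

rng = np.random.default_rng(0)
traj = rng.normal(size=(200, 3))      # placeholder (T, 3) trajectory
new_traj = augment_trajectory(traj, rng)
```

Keeping the transformations small and centered on the trajectory preserves the gesture's overall shape while still diversifying the training set.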
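Likewise, a minimal PyTorch sketch of an LSTM classifier with attention pooling over time steps is shown below. The layer sizes, the additive attention form, and the three-class output are placeholders chosen to match the three trajectory types mentioned above, not the paper's reported architecture.

```python
import torch
import torch.nn as nn

class AttentionLSTMClassifier(nn.Module):
    """Illustrative LSTM + attention gesture classifier (sizes are assumptions)."""
    def __init__(self, input_dim=3, hidden_dim=64, num_classes=3):
        super().__init__()
        self.lstm = nn.LSTM(input_dim, hidden_dim, batch_first=True)
        self.attn = nn.Linear(hidden_dim, 1)      # scores each time step
        self.head = nn.Linear(hidden_dim, num_classes)

    def forward(self, x):                         # x: (batch, T, input_dim)
        h, _ = self.lstm(x)                       # h: (batch, T, hidden_dim)
        weights = torch.softmax(self.attn(h), dim=1)  # attention over time
        context = (weights * h).sum(dim=1)        # weighted summary of the sequence
        return self.head(context)                 # class logits

model = AttentionLSTMClassifier()
logits = model(torch.randn(8, 200, 3))            # batch of 8 trajectories
print(logits.shape)                               # torch.Size([8, 3])
```

The attention weights let the classifier emphasize the time steps most indicative of a gesture rather than relying solely on the final LSTM state.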