Summary
A novel security-aware surgical navigation system is proposed for accurate minimally invasive surgery, combining machine learning algorithms, haptic-enabled devices, and customized surgical tools to guide the operation with real-time force and visual navigation. To provide a direct, simplified user interface during the operation, we combined traditional surgical guide images with an AR-based view and implemented a 3D-reconstructed, patient-specific surgical environment that includes all requisite surgical details. In particular, we trained on the collected surgical biomechanics haptic data, employing an LSTM-based RNN for intraoperative force-manipulation prediction and a residual network for classification. Experimental evaluation on percutaneous therapy demonstrated higher performance and accuracy for the combined visual and haptic navigation than for the traditional navigation system. These preliminary findings suggest a new framework for minimally invasive surgical navigation and hint at the possibility of integrating haptics, AR, and machine learning algorithms in medical simulation. In addition, security was taken into account in the implementation of this new framework.
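The force-prediction stage mentioned above can be sketched as a recurrent model run over a haptic time series. The following is a minimal, illustrative single-cell LSTM regressor in NumPy; the feature dimension, hidden size, weight shapes, and linear readout are assumptions made for the sketch, not the architecture actually used in the paper.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W, U, b):
    """One LSTM cell step.
    x: (D,) input features; h, c: (H,) hidden/cell state.
    W: (4H, D) input weights, U: (4H, H) recurrent weights, b: (4H,) bias,
    stacked in gate order [input, forget, output, candidate]."""
    H = h.shape[0]
    z = W @ x + U @ h + b
    i = sigmoid(z[0:H])        # input gate
    f = sigmoid(z[H:2*H])      # forget gate
    o = sigmoid(z[2*H:3*H])    # output gate
    g = np.tanh(z[3*H:4*H])    # candidate cell state
    c_new = f * c + i * g
    h_new = o * np.tanh(c_new)
    return h_new, c_new

def predict_force(seq, W, U, b, W_out, b_out):
    """Run the LSTM over a haptic sequence and regress a scalar force
    from the final hidden state via a linear readout."""
    H = U.shape[1]
    h = np.zeros(H)
    c = np.zeros(H)
    for x in seq:
        h, c = lstm_step(x, h, c, W, U, b)
    return W_out @ h + b_out

# Illustrative dimensions: 6 haptic features, hidden size 16, 50 timesteps.
rng = np.random.default_rng(0)
D, H, T = 6, 16, 50
W = rng.normal(scale=0.1, size=(4 * H, D))
U = rng.normal(scale=0.1, size=(4 * H, H))
b = np.zeros(4 * H)
W_out = rng.normal(scale=0.1, size=(1, H))
b_out = np.zeros(1)

seq = rng.normal(size=(T, D))  # synthetic stand-in for recorded haptic data
force = predict_force(seq, W, U, b, W_out, b_out)
print(force.shape)
```

In a trained system the weights would be learned from the recorded biomechanics data; here they are random, so only the data flow (sequence in, scalar force estimate out) is demonstrated.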