We acknowledge the funding of the Werner Siemens Foundation through the MIRACLE (Minimally Invasive Robot-Assisted Computer-guided LaserosteotomE) project.

ABSTRACT

Today's mechanical tools for bone cutting (osteotomy) cause mechanical trauma that prolongs the healing process. Medical device manufacturers continuously strive to improve their tools to minimize such trauma. One example of such a new tool and procedure is minimally invasive surgery with a laser as the cutting element. This setup allows tissue to be ablated with laser light instead of mechanical tools, which reduces the post-surgery healing time. During surgery, a reliable feedback system is crucial to avoid collateral damage to the surrounding tissues. We therefore propose a tissue classification method that analyzes the acoustic waves produced during laser ablation, and we demonstrate its applicability in an ex-vivo experiment. The ablation process with a microsecond-pulsed Erbium-doped Yttrium Aluminium Garnet (Er:YAG) laser produces acoustic waves, which we captured with an air-coupled transducer. We then used these captured waves to classify five porcine tissue types: hard bone, soft bone, muscle, fat, and skin. For automated tissue classification of the measured acoustic waves, we propose three Neural Network (NN) approaches: a Fully-connected Neural Network (FcNN), a one-dimensional Convolutional Neural Network (CNN), and a Recurrent Neural Network (RNN). The time- and frequency-dependent representations of the measured waves' pressure variation were used as separate inputs to train and validate the designed NNs. In a final step, we used Grad-CAM to obtain the activation map over the frequencies and concluded that the low frequencies are the most important for this classification task. In our experiments, we achieved an accuracy of 100% for the five tissue types with all the proposed NNs. We also tested the different classifiers for robustness and concluded that using frequency-dependent data together with an FcNN is the most robust approach.
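The abstract reports that a fully-connected network fed with frequency-domain data was the most robust classifier. The following minimal sketch illustrates what such a frequency-domain FcNN for the five tissue classes could look like; the spectrum length, layer widths, and normalization below are assumptions for illustration only, not the configuration used in the paper.

```python
import numpy as np
import torch
import torch.nn as nn

# Assumed dimensions, not values from the paper.
N_FREQ_BINS = 512   # number of frequency bins kept from the magnitude spectrum
N_CLASSES = 5       # hard bone, soft bone, muscle, fat, skin

def to_spectrum(signal: np.ndarray) -> np.ndarray:
    """Convert a recorded acoustic pressure trace to a normalized magnitude spectrum."""
    spectrum = np.abs(np.fft.rfft(signal))[:N_FREQ_BINS]
    return spectrum / (spectrum.max() + 1e-12)

class FcNNClassifier(nn.Module):
    """Fully-connected classifier operating on frequency-domain input."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(N_FREQ_BINS, 256), nn.ReLU(),
            nn.Linear(256, 64), nn.ReLU(),
            nn.Linear(64, N_CLASSES),   # logits for the five tissue types
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

# Example forward pass on a placeholder acoustic trace.
model = FcNNClassifier()
dummy_signal = np.random.randn(4096)  # stand-in for a recorded pressure variation
x = torch.tensor(to_spectrum(dummy_signal), dtype=torch.float32).unsqueeze(0)
predicted_class = model(x).argmax(dim=1)
```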