With the emergence of the research field of Quantum Machine Learning, interest in finding advantageous real-world applications is growing as well. However, challenges concerning the number of qubits available on Noisy Intermediate-Scale Quantum (NISQ) devices and accuracy losses due to hardware imperfections remain and limit the applicability of such approaches in real-world scenarios. For simplification, most studies therefore assume nearly noise-free conditions, as expected from logical, i.e. error-corrected, qubits rather than the physical qubits provided by current hardware. However, the number of logical qubits is expected to scale slowly, since each requires a large number of physical qubits for error correction. This motivates us to treat noise as an unavoidable, non-negligible problem on NISQ devices. As an application, we use particle decay tree reconstruction, a highly complex combinatorial problem in High Energy Physics. We investigate methods to reduce the impact of device noise and propose a hybrid architecture that extends a classical graph neural network with a parameterized quantum circuit. While we have shown that such a hybrid architecture enables a reduction in the number of trainable parameters compared to the fully classical case, we are now specifically interested in its actual performance in more realistic, i.e. noise-prone, scenarios. Using simple synthetic decay trees, we train the network in classical simulations to allow for efficient optimization of the parameters. The trained parameters are validated in noisy simulations based on "IBM Quantum" devices and are used in interpretability and significance studies, enabling improvements in accuracy on real devices.
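To make the hybrid idea concrete, the following is a minimal sketch, not the authors' exact model, of a classical graph encoder whose node embeddings are fed through a parameterized quantum circuit. It assumes PennyLane with its PyTorch interface; the qubit count, layer sizes, and the StronglyEntanglingLayers ansatz are illustrative assumptions rather than details taken from the paper.

```python
import torch
import torch.nn as nn
import pennylane as qml

n_qubits = 4
dev = qml.device("default.qubit", wires=n_qubits)  # ideal simulator for training

@qml.qnode(dev, interface="torch")
def pqc(inputs, weights):
    # Encode classical node embeddings as rotation angles, apply a trainable
    # entangling ansatz, and read out Pauli-Z expectation values per qubit.
    qml.AngleEmbedding(inputs, wires=range(n_qubits))
    qml.StronglyEntanglingLayers(weights, wires=range(n_qubits))
    return [qml.expval(qml.PauliZ(w)) for w in range(n_qubits)]

weight_shapes = {"weights": (2, n_qubits, 3)}  # 2 ansatz layers (assumed)

class HybridGNN(nn.Module):
    def __init__(self, in_dim=8):
        super().__init__()
        # One graph-convolution-like step: aggregate neighbour features via
        # the adjacency matrix, then project down to the qubit register size.
        self.proj = nn.Linear(in_dim, n_qubits)
        self.quantum = qml.qnn.TorchLayer(pqc, weight_shapes)
        self.readout = nn.Linear(n_qubits, 1)  # e.g. a per-node score

    def forward(self, x, adj):
        h = torch.tanh(self.proj(adj @ x))  # classical message passing
        h = self.quantum(h)                 # PQC applied node-wise
        return self.readout(h)

# Toy usage: 5 nodes with 8 features each and an identity adjacency.
x = torch.randn(5, 8)
adj = torch.eye(5)
out = HybridGNN()(x, adj)
print(out.shape)  # torch.Size([5, 1])
```

In a noise study along the lines described above, the ideal `default.qubit` simulator could be swapped for a noisy backend, for example PennyLane's `default.mixed` device with explicit noise channels, while reusing the classically trained parameters.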