Uncertainty-based sensor management for positioning is an essential component of safe drone operations in urban environments with deep urban canyons. These canyons significantly restrict the line-of-sight signal conditions required for accurate positioning using Global Navigation Satellite Systems (GNSS). Sensor fusion solutions are therefore needed that can exploit alternative Positioning, Navigation and Timing (PNT) sensors, such as accelerometers and gyroscopes, to complement GNSS information. Recent state-of-the-art research has focused on Machine Learning (ML) techniques, such as Support Vector Machines (SVM), that learn a statistical mapping from inputs to outputs. However, understanding the uncertainty of the predictions made by Deep Learning (DL) models can help improve the integrity of fusion systems. There is therefore a need for a DL model that also provides uncertainty information as part of its output. This paper proposes a Bayesian-LSTM Neural Network (BLSTMNN) to fuse GNSS and Inertial Measurement Unit (IMU) data. Furthermore, a Protection Level (PL) is estimated from the uncertainty distribution given by the system. To test the algorithm, Hardware-In-the-Loop (HIL) simulation has been performed, using Spirent's GSS7000 simulator and OKTAL-SE Sim3D to simulate GNSS propagation in urban canyons, and SimSENSOR to simulate the accelerometer and gyroscope. Results show that the Bayesian-LSTM provides the best fusion performance compared with GNSS alone and with GNSS/IMU fusion using an Extended Kalman Filter (EKF) or an SVM. Regarding uncertainty estimates, the proposed algorithm estimates the positioning boundaries correctly in 99.6% of cases, i.e. with an error rate of 0.4%.
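To illustrate how a Protection Level can be derived from an uncertainty distribution, the sketch below computes a PL radius from Monte Carlo position samples, such as repeated stochastic forward passes of a Bayesian LSTM. This is a minimal, hypothetical construction (the paper's exact PL definition is not given in the abstract): the PL is taken here as the radius around the fused estimate that covers a chosen fraction of the posterior draws.

```python
import math
import random

def protection_level(samples, confidence=0.999):
    """Illustrative PL from posterior position samples.

    samples: list of (east, north) position draws in metres, e.g.
    Monte Carlo outputs of a Bayesian neural network.
    Returns the fused position estimate (sample mean) and the radius
    within which `confidence` of the draws fall. This is an assumed,
    simplified PL definition for demonstration only.
    """
    n = len(samples)
    east = sum(s[0] for s in samples) / n
    north = sum(s[1] for s in samples) / n
    # Distance of each draw from the fused estimate
    radii = sorted(math.hypot(s[0] - east, s[1] - north) for s in samples)
    idx = min(n - 1, math.ceil(confidence * n) - 1)
    return (east, north), radii[idx]

# Toy usage: 1000 Gaussian draws around a notional position (2.0, -1.0) m
rng = random.Random(0)
draws = [(rng.gauss(2.0, 0.5), rng.gauss(-1.0, 0.5)) for _ in range(1000)]
est, pl = protection_level(draws)
```

In a real integrity pipeline the confidence would be set from the target integrity risk, and the draws would come from the trained BLSTMNN rather than a synthetic Gaussian.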