This article proposes a novel decentralized, two-layered, multi-sensor fusion architecture for resilient pose estimation. The first layer of the architecture consists of a set of distributed nodes: every possible combination of pose information from the different sensors is integrated through a dedicated extended Kalman filter, yielding a corresponding set of pose estimates. Based on the estimated poses obtained from the first layer, a Fault-Resilient Optimal Information Fusion (FR-OIF) paradigm is introduced in the second layer to provide a trusted pose estimate. The second layer combines the output of each first-layer node in a weighted linear form, with the weights derived from the maximum-likelihood fusion criterion. Moreover, in the case of inaccurate measurements, the FR-OIF formulation provides self-resiliency through a built-in fault-isolation mechanism, enabling accurate localization in the presence of sensor failures or erroneous measurements. To demonstrate the effectiveness of the proposed fusion architecture, extensive experimental studies were conducted with a micro aerial vehicle equipped with multiple onboard pose sensors: a 3D lidar, a RealSense camera, an ultra-wideband (UWB) node, and an IMU. The efficiency of the proposed framework is evaluated through multiple experimental results, and its superiority is demonstrated through a comparison with the classical multi-sensor centralized fusion approach.
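
For concreteness, the sketch below illustrates, under an independence assumption between node estimates, the classical maximum-likelihood information fusion that the second layer builds on: each node i contributes a pose estimate x_i with covariance P_i, and the fused estimate is the covariance-weighted linear combination x = (sum_i P_i^{-1})^{-1} sum_i P_i^{-1} x_i. The chi-squared pairwise-consistency gate shown for fault isolation is a hypothetical stand-in, since the abstract does not detail the actual built-in mechanism of FR-OIF.

```python
import numpy as np

def ml_fuse(estimates, covariances):
    """Maximum-likelihood fusion of independent estimates (x_i, P_i):
    x = P @ sum_i(inv(P_i) @ x_i), with P = inv(sum_i inv(P_i)),
    i.e. a weighted linear combination with weights A_i = P @ inv(P_i)."""
    infos = [np.linalg.inv(P) for P in covariances]
    P_fused = np.linalg.inv(sum(infos))
    x_fused = P_fused @ sum(I @ x for I, x in zip(infos, estimates))
    return x_fused, P_fused

def isolate_and_fuse(estimates, covariances, gate=9.0):
    """Hypothetical fault-isolation step: keep the nodes whose estimates
    are pairwise consistent (chi-squared gate on the Mahalanobis distance)
    with a majority of the other nodes, then re-fuse the survivors."""
    n = len(estimates)
    healthy = []
    for i in range(n):
        votes = 0
        for j in range(n):
            if i == j:
                continue
            r = estimates[i] - estimates[j]
            if r @ np.linalg.inv(covariances[i] + covariances[j]) @ r < gate:
                votes += 1
        if votes >= (n - 1) / 2:
            healthy.append(i)
    return ml_fuse([estimates[k] for k in healthy],
                   [covariances[k] for k in healthy])

# Toy example: three 3-DoF position estimates, one of them faulty.
x_nodes = [np.array([1.00, 2.00, 0.50]),
           np.array([1.02, 1.98, 0.52]),
           np.array([4.00, 7.00, 3.00])]          # faulty node
P_nodes = [0.01 * np.eye(3) for _ in x_nodes]
x, P = isolate_and_fuse(x_nodes, P_nodes)
print("fused pose:", x)                           # ~[1.01, 1.99, 0.51]
```

With equal covariances the fused estimate reduces to the mean of the surviving nodes; in general, better-conditioned sensors dominate the weights, while the gate excludes grossly inconsistent nodes before fusion.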