With recent advancements in unmanned aerial vehicles (UAVs), many studies have focused on using multi-modal platforms for remote inspection of industrial and construction sites. Acquiring multiple data modalities provides inspectors with comprehensive information about the targeted components. Despite the benefits of multi-modal platforms, the calibration and fusion of the acquired data modalities present many challenges. One of the main approaches to the dissimilarity of feature appearances across spectra is to use a calibration board with geometrically known features to estimate intrinsic and extrinsic parameters and accurately align the images in the thermal and visible spectral bands. This study presents a comprehensive platform for drone-based multi-modal inspection of industrial and construction components, comprising three main components: 1) a sensor setup that can be used as a standalone system or as a payload for a drone; 2) a multi-modal embedded system; and 3) a novel calibration board for multi-modal data fusion. The multi-modal embedded system provides the features required to record, transmit, and visualize thermal, visible, and depth data synchronously. Additionally, the system presents a multi-modal fusion technique to form RGBD&T data containing the thermal and texture information of the obtained 3D view. Moreover, this study introduces a novel self-heating calibration board that uses thermoelectric Peltier modules to provide an identifiable and sharp pattern in both thermal and visible images. The calibration board is also designed to serve as a Ground Control Point (GCP) in drone surveys.

INDEX TERMS Multi-modal sensory platform, unmanned aerial vehicle, calibration board, data fusion, ground control point, thermography.