To address the low accuracy, perception degradation, and poor reliability of single-sensor SLAM in complex environments, this study presents IFAL-SLAM (\textbf{I}MU-centered multi-sensor \textbf{F}usion with \textbf{A}daptive \textbf{L}agrangian methods), a novel SLAM algorithm that fuses LiDAR, vision, and IMU measurements via factor graph elimination optimization. The proposed system builds an IMU-centered multi-factor graph and applies covariance-weighted fusion of the visual-inertial and LiDAR-inertial odometries to correct IMU biases, with loop closure factors providing global adjustments. To reduce the optimization cost after fusion, a sliding window mechanism is incorporated, coupled with a QR decomposition elimination method based on Householder transformations that converts the factor graph into a Bayesian network. Finally, an adaptive Lagrangian relaxation method employing matrix-form penalty parameters and adaptive update strategies is proposed to improve convergence speed and robustness under high rotational dynamics. Experimental results indicate that the proposed algorithm achieves absolute trajectory errors of approximately 0.58 m and 0.24 m in large and small complex scenes, respectively, surpassing classic algorithms in accuracy and reliability.
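The Householder-based QR elimination mentioned above can be sketched as follows. This is a minimal illustrative NumPy implementation that triangularizes a dense measurement Jacobian into the square-root information matrix $R$, the core linear-algebra step behind factor-graph elimination; the function name, the dense-matrix assumption, and the explicit accumulation of $Q$ are illustrative choices, not the paper's implementation.

```python
import numpy as np

def householder_qr(A):
    """Triangularize A via Householder reflections, returning Q and R
    with A = Q @ R and R upper triangular.

    Illustrative sketch only: in a SLAM solver, A would be the stacked
    factor-graph Jacobian and R its square-root information matrix;
    Q is accumulated here purely for verification and would normally
    be applied implicitly to the residual vector instead.
    """
    R = A.astype(float).copy()
    m, n = R.shape
    Q = np.eye(m)
    for k in range(min(m - 1, n)):
        x = R[k:, k]
        # Reflector v = x + sign(x0)*||x||*e1, normalized.
        v = x.copy()
        v[0] += np.copysign(np.linalg.norm(x), x[0])
        norm_v = np.linalg.norm(v)
        if norm_v < 1e-12:  # column already eliminated
            continue
        v /= norm_v
        # Apply H = I - 2 v v^T to the trailing block of R,
        # and accumulate Q = Q @ H.
        R[k:, :] -= 2.0 * np.outer(v, v @ R[k:, :])
        Q[:, k:] -= 2.0 * np.outer(Q[:, k:] @ v, v)
    return Q, R
```

Eliminating with Householder reflections rather than normal equations avoids squaring the condition number of the Jacobian, which matters for the ill-conditioned systems that arise under high rotational dynamics.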