Simultaneous Localization and Mapping (SLAM) is a pivotal technology for autonomous vehicle navigation and a significant research area in robotics. To address the point cloud registration inaccuracies of traditional LiDAR SLAM, which lead to localization and mapping errors, we propose a novel LiDAR-inertial odometry approach that integrates IMU measurements with a multi-feature joint registration strategy. First, we introduce an innovative ground segmentation method and feature categorization strategy, enhancing ground detection and streamlining the feature extraction process. Then, our multi-feature joint registration method computes the pose transformation between the current frame and the local map. Finally, we employ a global registration method based on Fast Point Feature Histogram (FPFH) descriptors for coarse alignment, providing the initial estimate for the Generalized Iterative Closest Point (GICP) algorithm and thereby mitigating cumulative error efficiently and accurately. Extensive evaluations on the KITTI dataset and in a real-world campus environment demonstrate that our approach significantly surpasses existing advanced LiDAR SLAM solutions, achieving more than a 24% improvement in pose estimation accuracy.
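To make the coarse-to-fine registration scheme mentioned above concrete, the sketch below illustrates the general FPFH-plus-GICP idea using the Open3D library; it is not the implementation proposed in this work, and all parameter values (voxel size, search radii, thresholds, iteration counts) are illustrative assumptions.

```python
# Generic FPFH -> GICP coarse-to-fine registration sketch using Open3D.
# Not the paper's pipeline; all numeric parameters are assumed for illustration.
import open3d as o3d


def coarse_to_fine_register(source, target, voxel=0.5):
    """Estimate the rigid transform aligning `source` onto `target` (both o3d PointClouds)."""
    # Downsample and estimate normals (needed for FPFH computation).
    src = source.voxel_down_sample(voxel)
    tgt = target.voxel_down_sample(voxel)
    for pc in (src, tgt):
        pc.estimate_normals(
            o3d.geometry.KDTreeSearchParamHybrid(radius=2 * voxel, max_nn=30))

    # FPFH descriptors drive the global (coarse) registration.
    fpfh_param = o3d.geometry.KDTreeSearchParamHybrid(radius=5 * voxel, max_nn=100)
    src_fpfh = o3d.pipelines.registration.compute_fpfh_feature(src, fpfh_param)
    tgt_fpfh = o3d.pipelines.registration.compute_fpfh_feature(tgt, fpfh_param)

    # RANSAC over FPFH correspondences yields a rough initial pose.
    coarse = o3d.pipelines.registration.registration_ransac_based_on_feature_matching(
        src, tgt, src_fpfh, tgt_fpfh, True, 1.5 * voxel,
        o3d.pipelines.registration.TransformationEstimationPointToPoint(False), 3,
        [o3d.pipelines.registration.CorrespondenceCheckerBasedOnEdgeLength(0.9),
         o3d.pipelines.registration.CorrespondenceCheckerBasedOnDistance(1.5 * voxel)],
        o3d.pipelines.registration.RANSACConvergenceCriteria(100000, 0.999))

    # GICP refines the coarse estimate using per-point covariances.
    src.estimate_covariances()
    tgt.estimate_covariances()
    fine = o3d.pipelines.registration.registration_generalized_icp(
        src, tgt, 0.6 * voxel, coarse.transformation,
        o3d.pipelines.registration.TransformationEstimationForGeneralizedICP())
    return fine.transformation
```

In a usage scenario, `source` and `target` would be `o3d.geometry.PointCloud` objects, e.g. loaded with `o3d.io.read_point_cloud(...)` or converted from LiDAR scans, and the returned 4x4 matrix would serve as the pose correction applied to the odometry estimate.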