Unmanned ground vehicles (UGVs) have made rapid progress in many application scenarios in recent years, such as exploring unknown wild terrain, working in precision agriculture, and serving in emergency rescue. Owing to the complex ground conditions and changeable surroundings of these unstructured environments, it is challenging for UGVs to obtain robust and accurate state estimates from sensor-fusion odometry without scenario-specific perception and optimization. In this paper, based on an error-state Kalman filter (ESKF) fusion model, we propose a robust lidar-inertial odometry with a novel ground-condition perception and optimization algorithm designed specifically for UGVs. The probability distribution of the raw inertial measurement unit (IMU) measurements over a certain time period and the state estimate of the ESKF are both used to evaluate the flatness of the ground in real time; then, by analyzing the relationship between the current ground condition and the accuracy of the state estimate, the tightly coupled lidar-inertial odometry is further optimized dynamically by adjusting the parameters of the lidar point-processing algorithm, yielding robust and accurate ego-motion state estimates for UGVs. The method was validated in various environments with changeable ground conditions, and its robustness and accuracy are demonstrated by consistently accurate state estimation across different ground conditions compared with state-of-the-art lidar-inertial odometry systems.
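The core idea above can be illustrated with a minimal sketch: estimate ground roughness from the spread of raw IMU measurements over a time window, then map that score to a parameter of the lidar point-processing pipeline. All function names, thresholds, and the choice of a voxel-filter leaf size as the adapted parameter are hypothetical assumptions for illustration, not the paper's actual implementation.

```python
import numpy as np

def ground_flatness_score(accel_z: np.ndarray) -> float:
    """Score ground roughness as the standard deviation of the vertical
    accelerometer channel over a sliding window. Flat ground yields a
    score near the sensor noise floor; bumpy terrain yields a larger one."""
    return float(np.std(accel_z))

def adapt_voxel_leaf_size(score: float,
                          smooth_leaf: float = 0.4,
                          rough_leaf: float = 0.1,
                          thresh: float = 0.5) -> float:
    """Pick a lidar downsampling leaf size from the roughness score:
    on rough ground keep more points (smaller leaf) so scan matching
    stays well constrained; on flat ground downsample more aggressively."""
    return rough_leaf if score > thresh else smooth_leaf

# Synthetic IMU windows (vertical acceleration in m/s^2):
flat = np.random.default_rng(0).normal(9.81, 0.05, 200)   # low variance
rough = np.random.default_rng(1).normal(9.81, 1.5, 200)   # high variance

leaf_flat = adapt_voxel_leaf_size(ground_flatness_score(flat))
leaf_rough = adapt_voxel_leaf_size(ground_flatness_score(rough))
print(leaf_flat, leaf_rough)  # larger leaf on flat ground, smaller on rough
```

In a real system the score would feed back each filter epoch, so the point-cloud density adapts continuously as the terrain changes.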