Accurate and reliable sensor calibration is essential for fusing LiDAR and inertial measurements, which are commonly available in robotic applications. In this paper, we propose a novel LiDAR-IMU calibration method within a continuous-time batch-optimization framework, in which the intrinsics of both sensors and the spatial-temporal extrinsics between them are calibrated without any calibration infrastructure such as fiducial tags. Compared to discrete-time approaches, the continuous-time formulation has natural advantages for fusing high-rate measurements from LiDAR and IMU sensors. To improve efficiency and cope with degenerate motions, two observability-aware modules are leveraged: (i) an information-theoretic data selection policy that, during data collection, retains only the most informative segments for calibration, significantly improving calibration efficiency; and (ii) an observability-aware state update mechanism in the nonlinear least-squares optimization that updates only the identifiable directions of the state space via truncated singular value decomposition (TSVD), enabling accurate calibration even in degenerate cases where informative data segments are unavailable. The proposed LiDAR-IMU calibration approach has been validated extensively in both simulated and real-world experiments on different robot platforms, demonstrating high accuracy and repeatability in common human-made environments. We also open-source our codebase to benefit the research community: https://github.com/APRIL-ZJU/OA-LICalib.
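To illustrate the idea behind the TSVD-based observability-aware update, the following is a minimal sketch, not the authors' implementation: it solves a single Gauss-Newton step from a Jacobian `J` and residual `r`, updating only singular directions whose singular value exceeds a hypothetical threshold `sigma_min`. The function name and threshold value are assumptions introduced here for illustration only.

```python
import numpy as np

def tsvd_update(J, r, sigma_min=1e-3):
    """Compute a Gauss-Newton increment dx, restricted to the
    identifiable (well-observed) directions of the state space."""
    # Thin SVD of the Jacobian: J = U * diag(S) * Vt
    U, S, Vt = np.linalg.svd(J, full_matrices=False)
    # Keep only directions with sufficiently large singular values.
    keep = S > sigma_min
    # Truncated pseudo-inverse applied to -r:
    # dx = sum_k  v_k * (u_k^T (-r)) / s_k  over the kept directions.
    dx = Vt[keep].T @ ((U[:, keep].T @ (-r)) / S[keep])
    return dx

# Example with a rank-deficient Jacobian, mimicking a degenerate motion
# where the third state direction is unobservable.
J = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 0.0]])
r = np.array([0.2, -0.1, 0.05])
print(tsvd_update(J, r))  # unobservable component of dx stays at zero
```

Under this sketch, state components that the collected data cannot constrain are simply left at their prior values instead of drifting due to noise, which mirrors the behavior described for degenerate cases in the abstract.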