Underwater positioning is a challenging problem because of the rapid attenuation of electromagnetic waves and the disturbances and uncertainties of the environment. Conventional methods usually employ acoustic devices to localize Unmanned Underwater Vehicles (UUVs), but these suffer from slow refresh rates and low resolution, and are susceptible to environmental noise. In addition, complex terrain can degrade the accuracy of acoustic navigation systems. Underwater positioning methods based on visual sensors, meanwhile, are hindered by the difficulty of acquiring depth maps, owing to sparse features, changing illumination conditions, and scattering. In this paper, a novel vision-based underwater positioning system is proposed, built on a Light Detection and Ranging (LiDAR) camera and an inertial measurement unit (IMU). The LiDAR camera, benefiting from laser scanning techniques, generates depth maps simultaneously with the captured images, while the IMU provides attitude information. By fusing the data from these sensors, the positions of the UUVs can be estimated. The Bundle Adjustment (BA) method is then applied to refine the rotation matrix and the translation vector, improving accuracy. Experiments are carried out in a tank to evaluate the effectiveness and accuracy of the proposed method, with an ultra-wideband (UWB) positioning system providing reference trajectories. The results show that the developed positioning system estimates the trajectories of UUVs accurately, while remaining stable and robust.