Autonomous navigation in unstructured vegetated environments remains an open challenge. To operate successfully in these settings, autonomous ground vehicles (AGVs) must assess the environment and determine which vegetation is pliable enough to be safely traversed. In this paper, we propose ForestTrav (Forest Traversability): a novel lidar-only (geometric), online traversability estimation (TE) method that accurately generates per-voxel traversability estimates for densely vegetated environments, demonstrated in dense subtropical forests. The method leverages a salient, probabilistic 3D voxel representation, continuously fusing lidar measurements to maintain multiple per-voxel ray statistics, in combination with the structural context and compactness of sparse convolutional neural networks (SCNNs), to perform accurate TE in densely vegetated environments. The proposed method is real-time capable and is shown to outperform state-of-the-art volumetric and 2.5D TE methods by a significant margin (0.62 vs. 0.41 Matthews correlation coefficient (MCC) at 0.1 m voxel resolution) in challenging scenes, and to generalize to unseen environments. ForestTrav demonstrates that lidar-only (geometric) methods can provide accurate, online TE in complex, densely vegetated environments, a capability not previously demonstrated in the literature for environments of this complexity. Further, we analyze the response of the TE methods to the temporal and spatial evolution of the probabilistic map as a function of the information accumulated during scene exploration. This analysis shows that our method performs well even with the limited information available in the early stages of exploration, providing an additional tool to assess expected performance during deployment. Finally, to train and assess TE methods in densely vegetated environments, we collected and labeled a novel real-world data set, which we provide to the community as an open-source resource.