Various 3D imaging techniques are routinely used to examine biological materials, the results of which are usually a stack of grayscale images. To quantify structural aspects of these materials, however, the structures of interest must first be extracted from the dataset in a process called segmentation. If the individual structures to be extracted are in contact or very close to each other, distance-based segmentation methods utilizing the Euclidean distance transform are commonly employed. The Euclidean distance transform, however, has major disadvantages: it is susceptible to noise (very common in biological data), which often leads to incorrect segmentations (i.e., poor separation of the objects of interest), and it is effective only for roundish objects. In the present work, we propose an alternative distance transform method, the random-walk distance transform, and demonstrate its effectiveness in high-throughput segmentation of three microCT datasets of biological tilings (i.e., structures composed of a large number of similar repeating units). In contrast to the Euclidean distance transform, the random-walk approach represents the global, rather than the local, geometric character of the objects to be segmented and is therefore less susceptible to noise. In addition, it is directly applicable to structures with anisotropic shape characteristics. Using three case studies (tessellated cartilage from a stingray, the dermal endoskeleton of a starfish, and the prismatic layer of a bivalve mollusc shell), we provide a typical workflow for the segmentation of tiled structures, describe core image processing concepts that are underused in biological research, and show that for each study system, large amounts of biologically relevant data can be rapidly segmented, visualized, and analyzed.
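
For orientation, the following is a minimal sketch of the conventional baseline mentioned above, i.e., segmentation of touching objects in a 3D binary stack via the Euclidean distance transform followed by a marker-based watershed, written with SciPy and scikit-image. It is not the authors' random-walk method; the function name edt_watershed_segmentation and the 3x3x3 peak footprint are illustrative choices, not taken from the paper.

    import numpy as np
    from scipy import ndimage as ndi
    from skimage.feature import peak_local_max
    from skimage.segmentation import watershed

    def edt_watershed_segmentation(binary_stack):
        """Separate touching objects in a 3D binary stack (True = foreground)."""
        # Euclidean distance of each foreground voxel to the nearest background voxel.
        distance = ndi.distance_transform_edt(binary_stack)
        # Local maxima of the distance map serve as watershed seeds; on noisy data
        # this step yields spurious maxima, the weakness a global measure such as
        # the random-walk distance transform is intended to address.
        peak_coords = peak_local_max(distance, footprint=np.ones((3, 3, 3)),
                                     labels=binary_stack)
        markers = np.zeros(binary_stack.shape, dtype=int)
        markers[tuple(peak_coords.T)] = np.arange(1, len(peak_coords) + 1)
        # Flood the inverted distance map from the seeds, restricted to the foreground.
        return watershed(-distance, markers, mask=binary_stack)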