Background: Animal-attached sensors are increasingly used to provide insights into behaviour and physiology. However, such tags usually lack information on the structure of the surrounding environment from the perspective of the study animal and may therefore fail to identify potentially important drivers of behaviour. Recent advances in robotics and computer vision have made integrated depth-sensing and motion-tracking mobile devices available. These enable the construction of detailed 3D models of an environment within which motion can be tracked without reliance on GPS. The potential of such techniques has yet to be explored in the field of animal biotelemetry. This report trials an animal-attached structured-light depth-sensing and visual-inertial odometry motion-tracking device in an outdoor environment (coniferous forest), using the domestic dog (Canis familiaris) as a compliant test species.

Results: A 3D model of the forest environment surrounding the subject animal was successfully constructed from point clouds. The forest floor was labelled using a progressive morphological filter. Tree trunks were modelled as cylinders and identified by random sample consensus. The predicted and actual presence of trees matched closely, with an object-level accuracy of 93.3%. Individual points were labelled as belonging to tree trunks with a precision, recall, and Fβ score of 1.00, 0.88, and 0.93, respectively. In addition, ground-truth tree trunk radius measurements did not differ significantly from values derived from the random sample consensus model coefficients. A first-person view of the 3D model was created, illustrating the coupling of animal movement and environment reconstruction.
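The abstract does not detail the cylinder-fitting step, so the following is only an illustrative analogue, not the authors' pipeline: a minimal NumPy sketch of random sample consensus that fits a circle to a simulated horizontal slice of trunk points contaminated with outliers, recovering the trunk radius in the same spirit as the cylinder-model coefficients described above. All data, thresholds, and function names here are assumptions for demonstration.

```python
import numpy as np

def circle_from_3pts(p1, p2, p3):
    # Circumcircle of three 2D points (the minimal sample for a circle model).
    ax, ay = p1; bx, by = p2; cx, cy = p3
    d = 2 * (ax * (by - cy) + bx * (cy - ay) + cx * (ay - by))
    if abs(d) < 1e-12:          # collinear sample: no valid circle
        return None
    ux = ((ax**2 + ay**2) * (by - cy) + (bx**2 + by**2) * (cy - ay)
          + (cx**2 + cy**2) * (ay - by)) / d
    uy = ((ax**2 + ay**2) * (cx - bx) + (bx**2 + by**2) * (ax - cx)
          + (cx**2 + cy**2) * (bx - ax)) / d
    center = np.array([ux, uy])
    return center, np.hypot(ax - ux, ay - uy)

def ransac_circle(points, n_iters=500, tol=0.01, seed=0):
    # RANSAC: repeatedly fit a circle to a random minimal sample and
    # keep the model with the most inliers (points within tol of the arc).
    rng = np.random.default_rng(seed)
    best, best_inliers = None, 0
    for _ in range(n_iters):
        idx = rng.choice(len(points), 3, replace=False)
        fit = circle_from_3pts(*points[idx])
        if fit is None:
            continue
        center, r = fit
        resid = np.abs(np.linalg.norm(points - center, axis=1) - r)
        inliers = int((resid < tol).sum())
        if inliers > best_inliers:
            best_inliers, best = inliers, (center, r)
    return best

# Synthetic trunk slice: a 0.15 m radius circle with sensor noise,
# plus scattered outlier points (e.g. foliage, ground returns).
rng = np.random.default_rng(1)
theta = rng.uniform(0, 2 * np.pi, 200)
ring = np.c_[0.15 * np.cos(theta), 0.15 * np.sin(theta)]
ring += rng.normal(0, 0.002, ring.shape)
outliers = rng.uniform(-0.5, 0.5, (40, 2))
cloud = np.vstack([ring, outliers])

center, radius = ransac_circle(cloud)
print(f"estimated trunk radius: {radius:.3f} m")  # close to 0.15
```

The robustness to outliers is the point of using random sample consensus here: a least-squares fit over the whole slice would be pulled off the trunk surface by the 40 spurious points, whereas the consensus step simply votes them out.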
Conclusions: Using data collected from an animal-borne device, the present study demonstrates how terrain and objects (in this case, tree trunks) surrounding a subject can be identified by model segmentation. The device pose (position and orientation) also enabled recreation of the animal's movement path within the 3D model. Although challenges such as device form factor, validation in a wider range of environments, and direct sunlight interference remain before routine field deployment, animal-borne depth sensing and visual-inertial odometry have great potential as visual biologging techniques, offering new insights into how terrestrial animals interact with their environments.