Abstract. We describe a system that tracks pairs of fruit flies and automatically detects and classifies their actions. We experimentally compare the value of a frame-level feature representation with the more elaborate notion of 'bout features' that capture the structure within actions. Similarly, we compare a simple sliding-window classifier architecture with a more sophisticated structured-output architecture, and find that window-based detectors outperform their much slower structured counterparts and approach human performance. In addition, we test our top-performing detector on the CRIM13 mouse dataset, finding that it matches the performance of the best published method. Our Fly-vs-Fly dataset contains 22 hours of video showing pairs of fruit flies engaging in 10 social interactions in three different contexts; it is fully annotated by experts and published with articulated pose trajectory features.
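The sliding-window detection scheme the abstract refers to can be illustrated with a minimal sketch. This is not the authors' implementation: it assumes per-frame feature vectors are already available, aggregates each window by its mean, and scores it with a hypothetical pre-trained linear classifier; overlapping detections are then pruned by greedy non-maximum suppression.

```python
import numpy as np

def window_scores(frame_feats, w, weights, bias):
    """Score every length-w window with a linear classifier.

    frame_feats: (T, D) array of per-frame features
    weights:     (D,) linear classifier weights (hypothetical, pre-trained)
    Returns an array of length T - w + 1, one score per window start frame.
    """
    T, _ = frame_feats.shape
    scores = np.empty(T - w + 1)
    for t in range(T - w + 1):
        # Aggregate the window into a single descriptor (here: the mean).
        desc = frame_feats[t:t + w].mean(axis=0)
        scores[t] = desc @ weights + bias
    return scores

def detect_bouts(scores, threshold, w):
    """Greedy non-maximum suppression: take the best-scoring window,
    suppress windows overlapping it, and repeat while any remaining
    score clears the threshold."""
    scores = scores.copy()
    bouts = []
    while scores.size and scores.max() > threshold:
        t = int(scores.argmax())
        bouts.append((t, t + w))                      # (start_frame, end_frame)
        lo, hi = max(0, t - w + 1), min(scores.size, t + w)
        scores[lo:hi] = -np.inf                       # suppress overlapping windows
    return bouts
```

A more faithful version would sweep several window lengths and use a non-linear classifier, but the scan-score-suppress structure is the same.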
How animals use sensory information to weigh the risks vs. benefits of behavioral decisions remains poorly understood. Inter-male aggression is triggered when animals perceive both the presence of an appetitive resource, such as food or females, and that of competing conspecific males. How such signals are detected and integrated to control the decision to fight is not clear. For instance, it is unclear whether food increases aggression directly, or as a secondary consequence of increased social interactions caused by attraction to food. Here we use the vinegar fly, Drosophila melanogaster, to investigate how food influences aggression. We show that food promotes aggression in flies, and that it does so independently of any effect on the frequency of contact between males, increased locomotor activity, or general enhancement of social interactions. Importantly, the level of aggression depends on the absolute amount of food, rather than on its surface area or concentration. When food resources exceed a certain level, aggression is diminished, suggestive of reduced competition. Finally, we show that detection of sugar via Gr5a+ gustatory receptor neurons (GRNs) is necessary for food-promoted aggression. These data demonstrate that food exerts a specific effect to promote aggression in male flies, and that this effect is mediated, at least in part, by sweet-sensing GRNs.
We present a multisensory method for estimating the transformation of a mobile phone between two images taken from its camera. Pose estimation is a necessary step for applications such as 3D reconstruction and panorama construction, but detecting and matching robust features can be computationally expensive. In this paper we propose a method that combines the inertial sensors (accelerometers and gyroscopes) of a mobile phone with its camera to provide fast and accurate pose estimation. We use the inertial-based pose to warp the two images into the same perspective frame. We then employ an adaptive FAST feature detector and use image patches, normalized with respect to illumination, as feature descriptors. After warping, the images are approximately aligned, so the search for matching key-points becomes faster and, in certain cases, more reliable. Our results show that incorporating the inertial sensors considerably speeds up detecting and matching key-points between two images, which is the most time-consuming step of the pose estimation.
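The warping step can be sketched in a few lines. For a pure camera rotation, the second view maps to the first through the infinite homography H = K R K⁻¹, where K is the camera intrinsics matrix. This is a minimal sketch under two assumptions not spelled out in the abstract: the gyroscope has already been integrated into a rotation matrix R, and the intrinsics K are known from calibration; a real implementation would apply H to the whole image with an image-processing library rather than point by point.

```python
import numpy as np

def rotation_homography(K, R):
    """Homography induced by a pure camera rotation R (3x3):
    H = K @ R @ K^{-1}, with K the 3x3 camera intrinsics matrix."""
    return K @ R @ np.linalg.inv(K)

def warp_point(H, x, y):
    """Apply homography H to pixel (x, y) in homogeneous coordinates."""
    p = H @ np.array([x, y, 1.0])
    return p[0] / p[2], p[1] / p[2]

# Example intrinsics (hypothetical): focal length 500 px,
# principal point at (320, 240).
K = np.array([[500.0,   0.0, 320.0],
              [  0.0, 500.0, 240.0],
              [  0.0,   0.0,   1.0]])
```

With the images pre-aligned this way, key-point matching only needs to search a small neighborhood around each detected feature, which is where the speed-up comes from.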