Automated visual tracking of animals is rapidly becoming an indispensable tool for the study of behavior. It offers a quantitative methodology by which organisms' sensing and decision-making can be studied in a wide range of ecological contexts. Despite this, existing solutions tend to be challenging to deploy in practice, especially for long and/or high-resolution video streams. Here, we present TRex, a fast and easy-to-use solution that tracks large numbers of individuals simultaneously using background subtraction, achieves real-time (60 Hz) tracking performance for up to approximately 256 individuals, and estimates 2D visual fields, outlines, and head/rear orientation of bilateral animals, both in open- and closed-loop contexts. Additionally, TRex offers highly accurate, deep-learning-based visual identification of up to approximately 100 unmarked individuals, where it is between 2.5 and 46.7 times faster, and requires 2-10 times less memory, than comparable software (with relative performance increasing for more organisms and longer videos), and provides interactive data exploration within an intuitive, platform-independent graphical user interface.
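For readers unfamiliar with the approach, the sketch below illustrates background-subtraction tracking in its most generic form: a background model is learned over time, foreground pixels are segmented, and connected blobs become candidate individuals. This is a minimal illustration using OpenCV with a hypothetical video file name, not TRex's actual implementation.

```python
# Minimal background-subtraction tracking sketch (illustration only, not TRex's code).
# Assumes OpenCV is installed; "arena.mp4" is a hypothetical input file.
import cv2

cap = cv2.VideoCapture("arena.mp4")
subtractor = cv2.createBackgroundSubtractorMOG2(history=500, detectShadows=False)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    # Foreground mask: pixels that deviate from the learned background model
    mask = subtractor.apply(frame)
    mask = cv2.medianBlur(mask, 5)  # suppress single-pixel noise
    # Connected components become candidate individuals (blobs)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    centroids = []
    for c in contours:
        if cv2.contourArea(c) < 20:  # ignore tiny specks
            continue
        m = cv2.moments(c)
        centroids.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    # centroids would then be matched frame-to-frame to maintain identities
cap.release()
```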
The purpose of this study was to investigate bacterial recovery and transfer from three biometric sensors and the survivability of bacteria on the devices. The modalities tested were fingerprint, hand geometry, and hand vein recognition, all of which require sensor contact with the hand or fingers to collect the biometric. Each sensor was tested separately with two species of bacteria, Staphylococcus aureus and Escherichia coli. Survivability was investigated by sterilizing the sensor surface, applying a known volume of diluted bacterial culture to the sensor, and allowing it to dry. Bacteria were recovered at 5, 20, 40, and 60 minutes after drying by touching the contaminated device with a sterile finger cot. The finger cot was resuspended in 5 mL of saline solution, and dilutions were plated to obtain live-cell counts from the recovered bacteria. The transferability of bacteria from each device surface was investigated by touching the contaminated device and then touching a plate, transferring the bacteria to growth medium to obtain live-cell counts. The time lapse between consecutive touches was one minute, and the number of touches was n = 50. Again, S. aureus and E. coli were used separately as detection organisms. This paper describes the results of the study in terms of survival curves and transfer curves of each bacterial strain for each device.
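As context for the plating procedure described above, live-cell counts from plated dilutions are conventionally back-calculated as CFU/mL = colonies / (dilution factor x volume plated). The short Python sketch below shows this standard calculation with hypothetical numbers; it is not data or code from the study.

```python
# Standard dilution-plating arithmetic (illustrative only; values are hypothetical).
def cfu_per_ml(colonies: int, dilution_factor: float, volume_plated_ml: float) -> float:
    """Estimate viable cells (CFU) per mL of the original suspension."""
    return colonies / (dilution_factor * volume_plated_ml)

# Example: 42 colonies on a plate of 0.1 mL taken from a 10^-4 dilution
print(cfu_per_ml(colonies=42, dilution_factor=1e-4, volume_plated_ml=0.1))  # 4.2e6 CFU/mL
```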