Filtergraph is a web application developed and maintained by the Vanderbilt Initiative in Data-intensive Astrophysics (VIDA) to flexibly and rapidly visualize a large variety of astronomy datasets of various formats and sizes. The user loads a flat-file dataset into Filtergraph, which automatically generates an interactive data portal that can be easily shared with others. From this portal, the user can immediately generate scatter plots of up to five dimensions as well as histograms and tables based on the dataset. Key features of the portal include intuitive controls with auto-completed variable names, the ability to filter the data in real time through user-specified criteria, the ability to select data by dragging on the screen, and the ability to perform arithmetic operations on the data in real time. To enable seamless data visualization and exploration, changes are quickly rendered on screen and visualizations can be exported as high-quality graphics files. The application is optimized for speed in the context of large datasets: for instance, a plot generated from a stellar database of 3.1 million entries renders in less than 2 seconds on a standard web server platform. This web application has been created using the Web2py web framework based on the Python programming language. Filtergraph is free to use at http://filtergraph.vanderbilt.edu/.
Highlights:
- We developed a web-based application for visualization of astronomy data.
- The user can generate publication-quality multi-dimensional plots and tables.
- Designed for speed, the user can instantly interact with and visualize the data.
- To enhance collaboration, data and visualizations can be shared via a simple URL.
- This web application can accept data in a wide variety of file formats.
We describe a new neural-net-based light curve classifier and provide it with documentation as a ready-to-use tool for the community. While optimized for the identification and classification of eclipsing binary stars, the classifier is general purpose, and has been developed for speed in the context of upcoming massive surveys such as LSST. A challenge for classifiers in the context of neural-net training and massive data sets is to minimize the number of parameters required to describe each light curve. We show that a simple and fast geometric representation that encodes the overall light curve shape, together with a chi-square parameter to capture higher-order morphology information, results in efficient yet robust light curve classification, especially for eclipsing binaries. Testing the classifier on the ASAS light curve database, we achieve a retrieval rate of 98% and a false-positive rate of 2% for eclipsing binaries. We achieve similarly high retrieval rates for most other periodic variable-star classes, including RR Lyrae, Mira, and delta Scuti. However, the classifier currently has difficulty discriminating between different subclasses of eclipsing binaries, and suffers a relatively low (∼60%) retrieval rate for multi-mode delta Cepheid stars. We find that it is imperative to train the classifier's neural network with exemplars that include the full range of light curve quality on which the classifier will be expected to perform; the classifier performs well on noisy light curves only when trained with noisy exemplars. The classifier source code, ancillary programs, a trained neural net, and a guide for use are provided.
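The core idea above, a compact geometric shape vector plus a chi-square statistic per light curve, can be sketched as follows. This is an illustrative reconstruction, not the paper's actual code: the function name `lightcurve_features`, the choice of 16 phase bins, median binning, and amplitude normalization are all assumptions made for the example.

```python
import numpy as np

def lightcurve_features(time, mag, mag_err, period, n_bins=16):
    """Illustrative sketch: encode a phase-folded light curve as a small
    shape vector (the 'geometric representation') plus a reduced
    chi-square against a constant-brightness model (the 'higher-order'
    variability term). Bin count and normalization are assumptions."""
    time = np.asarray(time, dtype=float)
    mag = np.asarray(mag, dtype=float)
    mag_err = np.asarray(mag_err, dtype=float)

    # Fold the light curve on the known period.
    phase = (time / period) % 1.0

    # Geometric representation: median magnitude in fixed phase bins,
    # shifted to zero mean and scaled to unit amplitude so that shape,
    # not mean brightness or amplitude, drives the classification.
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    idx = np.clip(np.digitize(phase, edges) - 1, 0, n_bins - 1)
    shape = np.array([np.median(mag[idx == b]) if np.any(idx == b) else np.nan
                      for b in range(n_bins)])
    shape = np.nan_to_num(shape, nan=np.nanmean(shape))
    shape -= shape.mean()
    amp = np.abs(shape).max()
    if amp > 0:
        shape /= amp

    # Reduced chi-square of the data against a constant model: large
    # values flag genuine variability beyond the noise level.
    w = 1.0 / mag_err**2
    const = np.average(mag, weights=w)
    chi2 = float(np.sum(((mag - const) / mag_err)**2) / max(len(mag) - 1, 1))
    return shape, chi2
```

A vector of this kind (here 16 + 1 values) would then serve as the low-dimensional input layer for the neural net, in the spirit of the parameter-minimization goal described above.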
Virtual reality (VR) is a promising tool that is increasingly used in many fields in which virtual walking can be generalized through detailed modeling of the physical environment, such as sports science and medicine. However, the visualization of a virtual environment through a head-mounted display (HMD) differs from reality, and it is still not clear whether visual perception works equivalently in VR. The purpose of the current study was to compare spatial orientation between the real world (RW) and VR. To this end, participants walked blindfolded to differently placed objects in a real and a virtual environment that did not differ in physical properties. They were equipped with passive markers to track the position of the back of the hand, which was used to specify each object's location. The first task was to walk blindfolded from one starting position to differently placed sport-specific objects requiring different degrees of rotation (0°, 45°, 180°, and 225°) after observing them for 15 s. A three-way repeated-measures ANOVA indicated no significant difference between RW and VR across the different degrees of rotation (p > 0.05). In addition, the participants were asked to walk blindfolded three times from a new starting position to two objects, which were ordered differently across the conditions. Except for one case, no significant differences in the pathways between RW and VR were found (p > 0.05). This study indicates that participants behave in VR much as they do in real-world interactions, supporting its use.
Virtual reality (VR) has become a common tool and is often considered for sport-specific purposes. Despite the increased usage, the transfer of VR-adapted skills into the real world (RW) has not yet been sufficiently studied, and it is still unknown how much of one's own body must be visible to complete motor tasks within VR. In addition, it should be clarified whether older adults need to perceive their body within VR scenarios to the same extent as younger people, which would extend VR's usability. Therefore, younger (18–30 years old) and older adults (55 years and older) were tested (n = 42), performing a balance, grasping, and throwing task in VR (HMD-based) with different body visualization types, and in the RW with regular visual input of the body. To compare performance between the age groups, the time to completion, the number of steps (balance task), the subjective rating of difficulty, the number of errors, and a rating system assessing movement quality were used as outcome parameters. A one-way repeated-measures ANOVA/Friedman test with the factor [body visualization] was conducted to test the influence of varying body visualizations on task completion. Comparisons between the conditions [RW, VR] were performed using paired t-tests/Wilcoxon tests, and the age groups [young, old] were compared using independent-samples t-tests/Mann-Whitney U tests. The analysis of the effect of body visualization showed a significant loss in movement quality when no body part was visualized (p < .05). This did not occur for the older adults, for whom no influence of body visualization on performance could be shown. Comparing the age groups, the older adults performed significantly worse than the younger group in both conditions (p < .05).
In VR, both groups showed longer times to completion, higher difficulty ratings in the balance and throwing tasks, and lower performance quality in the grasping task. Overall, the results suggest using VR with older adults with caution regarding task demands, whereas the visualization of the body appeared less crucial for task completion. In summary, the task demands in VR could be met by older adults, even if losses in movement quality must be expected. Although a wider range of movements should be tested, basic movement elements are also feasible for older adults, expanding possible areas of VR application.