Ground- and space-based sky surveys enable powerful cosmological probes based on measurements of galaxy properties and the distribution of galaxies in the Universe. These probes include weak lensing, baryon acoustic oscillations, the abundance of galaxy clusters, and redshift-space distortions; they are essential to improving our knowledge of the nature of dark energy. On the theory and modeling front, large-scale simulations of cosmic structure formation play an important role in interpreting the observations and in the challenging task of extracting cosmological physics at the needed precision. These simulations must cover a parameter range beyond the standard six cosmological parameters and need to be run at high mass and force resolution. One key simulation-based task is the generation of accurate theoretical predictions for observables, via the method of emulation. Using a new sampling technique, we explore an 8-dimensional parameter space including massive neutrinos and a variable dark energy equation of state. We construct trial emulators using two surrogate models (the linear power spectrum and an approximate halo mass function). The new sampling method allows us to build precision emulators from just 26 cosmological models and to increase emulator accuracy by adding new sets of simulations in a prescribed way. This allows emulator fidelity to be improved systematically as new observational data become available and higher accuracy is required. Finally, using one ΛCDM cosmology as an example, we study the demands imposed on a simulation campaign to achieve the required statistics and accuracy when building emulators for dark energy investigations.
Subject headings: methods: statistical – cosmology: large-scale structure of the universe
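The emulation idea described above can be sketched in miniature: evaluate a model on a small space-filling design and fit a Gaussian-process surrogate that interpolates between the runs. Everything below is an illustrative assumption, not the paper's actual pipeline: the 1-D "model" is a toy sine function standing in for a surrogate such as the linear power spectrum, and the kernel length scale, noise jitter, and design size (26 points, a nod to the 26 cosmological models) are arbitrary choices.

```python
import numpy as np

def latin_hypercube(n, d, rng):
    """Space-filling design: one sample per stratum in each dimension."""
    samples = np.empty((n, d))
    for j in range(d):
        samples[:, j] = (rng.permutation(n) + rng.random(n)) / n
    return samples

def rbf(A, B, ell):
    """Squared-exponential kernel with unit prior variance."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ell**2)

class GPEmulator:
    """Minimal Gaussian-process interpolator over a set of model runs."""
    def __init__(self, X, y, ell=0.15, noise=1e-6):
        self.X, self.ell = X, ell
        K = rbf(X, X, ell) + noise * np.eye(len(X))
        self.L = np.linalg.cholesky(K)
        self.alpha = np.linalg.solve(self.L.T, np.linalg.solve(self.L, y))

    def predict(self, Xs):
        Ks = rbf(Xs, self.X, self.ell)
        mean = Ks @ self.alpha
        v = np.linalg.solve(self.L, Ks.T)
        var = 1.0 - (v**2).sum(0)      # prior variance is 1 for this kernel
        return mean, np.maximum(var, 0.0)

rng = np.random.default_rng(0)
X = latin_hypercube(26, 1, rng)            # 26 "cosmological models" (toy)
y = np.sin(2 * np.pi * X[:, 0])            # stand-in for the surrogate output
emu = GPEmulator(X, y)
mean, var = emu.predict(X)                 # emulator reproduces training runs
```

Refining the emulator as described in the abstract would amount to adding new design points where the predictive variance is largest and refitting.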
Abstract-In this paper we address the difficult problem of parameter-finding in image segmentation. We replace a tedious manual process, often based on guesswork and luck, with a principled approach that systematically explores the parameter space. Our core idea is the following two-stage technique: we start with a sparse sampling of the parameter space and apply a statistical model to estimate the response of the segmentation algorithm. The statistical model incorporates a model of the uncertainty of the estimation, which we use in conjunction with the actual estimate to (visually) guide the user towards areas that need refinement by placing additional sample points. In the second stage, the user navigates through the parameter space to determine areas where the response value (goodness of segmentation) is high. In our exploration we rely on existing ground-truth images to evaluate the "goodness" of an image segmentation technique. We demonstrate the usefulness of this technique on two image segmentation algorithms: a three-parameter model to detect microtubules in electron tomograms and an eight-parameter model to identify functional regions in dynamic Positron Emission Tomography scans.
Index Terms-Parameter exploration, Image segmentation, Gaussian Process Model.
MOTIVATION
For visual analysis, image data often need to be segmented. Segmentation refers to the process of partitioning the image into multiple segments, i.e., sets of pixels or voxels that form contiguous and semantically meaningful regions. If each of these regions is marked by a unique identifier, image segmentation simply means labelling of pixels or voxels.
In biomedical imaging, where images are acquired using some kind of tomography or microscopy, segmented regions might correspond to anatomical structures in the case of non-functional imaging, and to regions with specific physiological activity in the case of functional imaging. In recent years, a variety of semi- and fully automatic techniques have been developed to address the segmentation problem [32]. However, even the current state-of-the-art approaches fall short of providing a "silver bullet" for image segmentation. There are several reasons for this. One reason is that, for a given image, the segmentation problem is not well defined; which regions are semantically meaningful depends on the application. Another reason is that, due to different image degradation factors such as low signal-to-noise ratio, imaging artifacts, partial volume effects, and shape variability, different kinds of a priori knowledge need to be included. Additionally, the majority of existing segmentation methods rely on, and are sensitive to, the setting of a number of parameters. For example, most algorithms contain weighting parameters between multiple competing image-driven or prior-driven cost terms in an attempt to mimic the cognitive capabilities of expert users (e.g., radiologists for medical images). A good parameter setting is usually found by a manual trial-and-error procedure. Th...
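The uncertainty-guided refinement at the heart of the abstract's first stage can be sketched as follows: fit a Gaussian process to a sparse set of evaluated parameter settings and propose the next sample where the predictive variance is largest. The 1-D parameter range, kernel length scale, and initial sample placement below are illustrative assumptions, not the paper's actual settings; note that for a fixed kernel the GP predictive variance depends only on where samples lie, not on the observed responses.

```python
import numpy as np

def rbf(a, b, ell=0.2):
    """Squared-exponential kernel on scalar inputs, unit prior variance."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell**2)

def next_sample(x_seen, grid, noise=1e-8):
    """Return the grid point where the GP predictive variance is largest."""
    K = rbf(x_seen, x_seen) + noise * np.eye(len(x_seen))
    Ks = rbf(grid, x_seen)
    Kinv = np.linalg.inv(K)
    # var(x) = k(x,x) - k(x,X) K^{-1} k(X,x), with k(x,x) = 1 here
    var = 1.0 - np.einsum("ij,jk,ik->i", Ks, Kinv, Ks)
    return grid[np.argmax(var)]

# Sparse initial sampling clustered near the ends of the parameter range:
x_seen = np.array([0.0, 0.05, 0.95, 1.0])
grid = np.linspace(0.0, 1.0, 201)
x_new = next_sample(x_seen, grid)   # proposed refinement: the unexplored middle
```

In the paper's setting this variance field is what is shown to the user, who then decides visually where refinement is worthwhile; the argmax here is just the fully automatic analogue.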
DimStiller is a system for dimensionality reduction and analysis. It frames the task of understanding and transforming input dimensions as a series of analysis steps where users transform data tables by chaining together different techniques, called operators, into pipelines of expressions. The individual operators have controls and views that are linked together based on the structure of the expression. Users interact with the operator controls to tune parameter choices, with immediate visual feedback guiding the exploration of local neighborhoods of the space of possible data tables. DimStiller also provides global guidance for navigating data-table space through expression templates called workflows, which permit re-use of common patterns of analysis.
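The operator-and-expression structure described above can be illustrated with a tiny pipeline sketch. The class names and API below are hypothetical stand-ins for the idea of chaining data-table transformations, not DimStiller's actual interface; the two example operators (standardization and an SVD-based reduction) are common choices assumed for illustration.

```python
import numpy as np

class Operator:
    """One step in a DimStiller-style expression: table in, table out."""
    def apply(self, table):
        raise NotImplementedError

class Standardize(Operator):
    """Center each dimension and scale it to unit variance."""
    def apply(self, table):
        return (table - table.mean(0)) / table.std(0)

class ReduceSVD(Operator):
    """Project the table onto its top-k principal directions."""
    def __init__(self, k):
        self.k = k
    def apply(self, table):
        _, _, vt = np.linalg.svd(table, full_matrices=False)
        return table @ vt[: self.k].T

class Expression:
    """A pipeline of operators; each step feeds the next."""
    def __init__(self, *operators):
        self.operators = operators
    def apply(self, table):
        for op in self.operators:
            table = op.apply(table)
        return table

rng = np.random.default_rng(1)
data = rng.normal(size=(100, 10))          # toy data table: 100 rows, 10 dims
expr = Expression(Standardize(), ReduceSVD(2))
reduced = expr.apply(data)                 # 100 rows, 2 derived dimensions
```

A workflow, in this sketch, would simply be a reusable `Expression` template such as the standardize-then-reduce pattern above.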
Abstract-In this paper we introduce paraglide, a visualization system designed for interactive exploration of the parameter spaces of multi-variate simulation models. To find the right parameter configuration, model developers frequently have to go back and forth between setting parameters and qualitatively judging the outcomes of their model. During this process, they build up a grounded understanding of the parameter effects in order to pick the right setting. Current state-of-the-art tools and practices, however, fail to provide a systematic way of exploring these parameter spaces, making informed decisions about parameter settings a tedious and workload-intensive task. Paraglide endeavors to overcome this shortcoming by assisting in the sampling of the parameter space and the discovery of qualitatively different model outcomes. This results in a decomposition of the model parameter space into regions of distinct behaviour. We developed paraglide in close collaboration with experts from three different domains, all of whom were involved in developing new models for their domain. We first analyzed the current practices of six domain experts and derived a set of design requirements, then engaged in a longitudinal user-centered design process, and finally conducted three in-depth case studies underlining the usefulness of our approach.
LINKING FORMAL AND REAL SYSTEMS
At the heart of computational science is the simulation of real-world scenarios. As it becomes possible to mimic increasingly comprehensive effects, it remains crucial to ensure a close correspondence between the formal model and the real system in order to draw any practically relevant conclusions. A well-established practical problem in this setting is the calibration of good parameter configurations that strengthen the fitness of the model [33, Ch. 1]. Even after matching model output with measured field data, there may still be free parameters that can be controlled to adjust the behaviour of the computer simulation.
This can happen if the expressive power of the model exceeds the number of available measurements, or if the measurements are so noisy that several different model instances are equally acceptable. In such a case, a domain expert could be involved to interactively tune the free parameters of the model in order to favour solutions that match prior experience, theoretical insight, or intuition. Towards that goal, we recognize that the optimization of parameters for some notion of performance is distinct from the objective of discovering regions in parameter space that exhibit qualitatively different system behaviour, such as fluid vs. gaseous state, or the formation of various movement patterns in a swarm simulation. Optimization is one focus of statistical methods in experimental design and has great potential for integration with visual tools, as demonstrated recently by Torsney-Weir et al. [36]. The focus of this paper is on the latter aspect of qualitative discovery. This can support the understanding of the studied system, strengthen confidence ...
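The decomposition into regions of distinct behaviour that paraglide aims for can be sketched in miniature: sample a parameter range, run the model at each sample, extract a feature of the output, and cluster samples by that feature. The damped-oscillator "model", the overshoot feature, and the two-means step below are all illustrative assumptions chosen so that two regimes (oscillatory vs. non-oscillatory) genuinely exist.

```python
import numpy as np

def simulate(damping, steps=4000, dt=0.005):
    """Toy model: x'' = -x - damping*x', from x=1, v=0 (symplectic Euler)."""
    x, v = 1.0, 0.0
    lowest = x
    for _ in range(steps):
        v += (-x - damping * v) * dt
        x += v * dt
        lowest = min(lowest, x)
    return lowest   # strongly negative iff the response overshoots (oscillates)

def two_means(feats, iters=20):
    """1-D k-means with k=2, seeded at the feature extremes."""
    centers = np.array([feats.min(), feats.max()])
    for _ in range(iters):
        labels = np.abs(feats[:, None] - centers[None, :]).argmin(1)
        for j in (0, 1):                      # guard against empty clusters
            if np.any(labels == j):
                centers[j] = feats[labels == j].mean()
    return labels

dampings = np.linspace(0.1, 5.0, 25)          # sampled parameter range
feats = np.array([simulate(c) for c in dampings])
labels = two_means(feats)   # splits strongly oscillatory from damped settings
```

In paraglide the analogous clustering is what gets visualized, letting the expert inspect and relabel the proposed behaviour regions rather than accepting them automatically.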
Abstract-In this paper we investigate the effects of function composition in the form g(f(x)) = h(x) by means of a spectral analysis of h. We decompose the spectral description of h(x) into a scalar product of the spectral description of g(x) and a term that depends solely on f(x) and is independent of g(x). We then use the method of stationary phase to derive the essential maximum frequency of g(f(x)) bounding the main portion of the energy of its spectrum. This limit is the product of the maximum frequency of g(x) and the maximum derivative of f(x). This leads to a proper sampling of the composition h of the two functions g and f. We apply our theoretical results to a fundamental open problem in volume rendering: the proper sampling of the rendering integral after the application of a transfer function. In particular, we demonstrate how the sampling criterion can be incorporated in adaptive ray integration, visualization with multi-dimensional transfer functions, and pre-integrated volume rendering.
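The stated bound, essential max frequency of h = (max frequency of g) × (max |f'|), can be checked numerically on a simple composition. The choices below are illustrative: g(x) = sin(2π ν_g x), f(x) = x² on [0, 1] (so max |f'| = 2 and h is a chirp), and "essential" is taken as the 99%-energy cutoff of the FFT spectrum.

```python
import numpy as np

nu_g = 10.0                  # maximum frequency of g
max_df = 2.0                 # max |f'(x)| for f(x) = x**2 on [0, 1]
bound = nu_g * max_df        # predicted essential max frequency of h

n = 4096
x = np.arange(n) / n
h = np.sin(2 * np.pi * nu_g * x**2)     # h(x) = g(f(x)), a linear chirp

power = np.abs(np.fft.rfft(h)) ** 2
freqs = np.arange(power.size)           # cycles per unit length
cum = np.cumsum(power)
essential = freqs[np.searchsorted(cum, 0.99 * cum[-1])]
# `essential` should track the predicted bound; sampling h at a rate of at
# least 2 * bound samples per unit length then satisfies the criterion.
```

This is exactly the situation in the volume-rendering application: g plays the role of the transfer function and f the role of the scalar field along the ray, so a steep field (large |f'|) forces a proportionally finer ray step.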