Biomedical research and drug development increasingly involve the extraction of quantitative data from digital microscope images, such as those obtained using fluorescence microscopy. Here, we describe a novel approach for both managing and analyzing such images. The Open Microscopy Environment (OME) is a sophisticated open-source scientific image management database that coordinates the organization, storage, and analysis of the large volumes of image data typically generated by modern imaging methods. We describe FindSpots, a powerful image-analysis package integrated into OME that will be of use to those who wish to identify and measure objects within microscope images or time-lapse movies. The algorithm used in FindSpots is in fact only one of many possible segmentation (object detection) algorithms, and the underlying data model used by OME to capture and store its results can also be used to store results from other segmentation algorithms. In this report, we illustrate how image segmentation can be achieved in OME using one such implementation of a segmentation algorithm, and how its output can subsequently be displayed graphically or processed numerically in a spreadsheet.
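To make the general workflow concrete, the following is a minimal sketch of what an intensity-threshold segmentation step and a spreadsheet-ready export might look like. It is not the FindSpots implementation or the OME API; the `find_spots` function, the threshold value, and the `spots.csv` filename are illustrative assumptions only.

```python
# Hedged sketch: threshold-based object detection plus CSV export of
# per-object measurements (centroid, area, mean intensity). Illustrative
# only; not the FindSpots algorithm or the OME data model.
import csv
import numpy as np
from scipy import ndimage

def find_spots(image, threshold):
    """Label connected pixels above an intensity threshold and measure each object."""
    mask = image > threshold
    labels, n_objects = ndimage.label(mask)            # connected-component labelling
    index = np.arange(1, n_objects + 1)
    centroids = ndimage.center_of_mass(image, labels, index)
    areas = ndimage.sum(mask, labels, index)            # pixel counts per object
    mean_intensity = ndimage.mean(image, labels, index)
    return [
        {"id": int(i), "y": cy, "x": cx, "area": int(a), "mean_intensity": m}
        for i, (cy, cx), a, m in zip(index, centroids, areas, mean_intensity)
    ]

if __name__ == "__main__":
    # Stand-in for a single fluorescence frame; a real pipeline would load image data instead.
    image = np.random.poisson(5, size=(256, 256)).astype(float)
    spots = find_spots(image, threshold=12.0)
    with open("spots.csv", "w", newline="") as fh:       # spreadsheet-friendly output
        writer = csv.DictWriter(fh, fieldnames=["id", "y", "x", "area", "mean_intensity"])
        writer.writeheader()
        writer.writerows(spots)
```

The resulting CSV can be opened directly in a spreadsheet for graphing or further numerical analysis, which mirrors the display-and-process step described in the abstract.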
As the number and complexity of workflow definition languages grows, ensuring interoperability between them becomes more difficult, and most effort to date has concentrated on achieving run-time interoperability. The contribution of this paper is to argue for an alternative: achieving interoperation at build time through what we term an Intermediate Workflow Representation (IWR) format, which acts as a mediator for language-to-language conversions. In this paper we identify the requirements of such an approach; its main benefit is that legacy workflows defined in different workflow definition languages can be used in conjunction, in a fashion that is seamless to the user.
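The mediation idea can be sketched as an importer into, and an exporter out of, a shared intermediate form, so that N languages need 2N converters rather than N*(N-1) direct translators. The `IWRTask`/`IWRWorkflow` structures and the two converter functions below are hypothetical illustrations, not the paper's actual IWR format.

```python
# Hedged sketch of build-time mediation via an intermediate workflow
# representation. Data structures and field names are assumptions.
from dataclasses import dataclass, field

@dataclass
class IWRTask:
    name: str
    inputs: list = field(default_factory=list)
    outputs: list = field(default_factory=list)

@dataclass
class IWRWorkflow:
    tasks: list = field(default_factory=list)
    links: list = field(default_factory=list)   # (producer_task, consumer_task) pairs

def import_from_language_a(source: dict) -> IWRWorkflow:
    """Hypothetical importer: map one language's constructs onto the IWR."""
    wf = IWRWorkflow()
    for step in source["steps"]:
        wf.tasks.append(IWRTask(step["id"], step.get("in", []), step.get("out", [])))
    wf.links = [tuple(edge) for edge in source.get("edges", [])]
    return wf

def export_to_language_b(wf: IWRWorkflow) -> dict:
    """Hypothetical exporter: emit another language's structure from the IWR."""
    return {
        "activities": [{"name": t.name, "reads": t.inputs, "writes": t.outputs} for t in wf.tasks],
        "dependencies": [{"from": a, "to": b} for a, b in wf.links],
    }
```

Because conversion happens at build time, the target system only ever sees workflows in its native language, which is what allows legacy workflows from different languages to be combined transparently.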
No abstract
This extended abstract describes a proposed approach for obtaining software assurance by applying formal verification techniques to routines written as dataflow-based workflows, following the approach of RedJack's new Fathom analytic framework. In the workflow model, a complex processing routine is implemented as a collection of "black-box" computations connected by dataflow links. This simple model provides an intuitive framework for programmers to create processing routines, and also lends itself to efficient execution. We claim that workflow analyses also lend themselves well to formal analysis and verification, by adding formal specifications and code verification for critical or potentially vulnerable components. Formally verifying the security and functional properties of a workflow provides at least two benefits: 1) it gives assurance that workflow applications work correctly when executing on machines not under our control; and 2) it allows us to apply optimizations that are safe only when particular functional properties are guaranteed to hold. Our approach, called "divide-and-conquer" verification, is to apply existing verification techniques to the small set of prebuilt computation implementations, and then to use simpler composition techniques to prove properties about particular workflows. We believe this compositional approach will be more tractable on large-scale systems than existing techniques, which must consider the complex system as a single entity. The proposed approach is widely applicable to applications that must operate in unattended and possibly hostile environments, including infrastructure controllers (SCADA), end-user smart grid components, and the increasingly ubiquitous Internet appliances.

OVERVIEW

While formal specification and verification have a long history in trusted system development, they are often criticized for being fragile and labor-intensive. Automated tools should allow us to prove the security and behavioral properties of our software; however, real-world systems tend to be too large and complex to permit automated verification within reasonable time and space requirements. Moreover, because of the high level of coupling that often exists between the components of a large system, small changes to the software can require reproving all of our assurance arguments. This fragility and lack of flexibility have delayed the use of formal verification in modern software development. One approach to increasing the flexibility of formal verification is to use workflows, where large computations are broken down into small, black-box components connected by dataflow links. Workflow-based designs enforce a loose coupling of the modular components, since workflow computations can only communicate with each other via the stream of data passed from one to another. Because of this decoupled design, we can prove low-level properties of each workflow component using automated tools and techniques. Since each component will be much smaller and less compl...
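One way to picture the compositional step is as a contract check along each dataflow link: every black-box component is verified once in isolation against its own assumptions and guarantees, and a whole workflow is then accepted only if each producer's guarantees cover its consumer's assumptions. The `Component` structure, the property names, and `check_workflow` below are illustrative assumptions, not Fathom's actual verification interface.

```python
# Hedged sketch of "divide-and-conquer" contract composition over a workflow.
# Each component's contract is assumed to have been verified separately;
# here we only check that contracts compose along the dataflow links.
from dataclasses import dataclass

@dataclass
class Component:
    name: str
    assumes: frozenset      # properties required of the incoming data stream
    guarantees: frozenset   # properties established on the outgoing stream

def check_workflow(components, links):
    """Return the links whose producer does not discharge the consumer's assumptions."""
    by_name = {c.name: c for c in components}
    violations = []
    for producer, consumer in links:
        missing = by_name[consumer].assumes - by_name[producer].guarantees
        if missing:
            violations.append((producer, consumer, missing))
    return violations

if __name__ == "__main__":
    parse = Component("parse", frozenset(), frozenset({"well_formed"}))
    sanitize = Component("sanitize", frozenset({"well_formed"}), frozenset({"well_formed", "bounded_size"}))
    aggregate = Component("aggregate", frozenset({"bounded_size"}), frozenset({"summary"}))
    print(check_workflow([parse, sanitize, aggregate],
                         [("parse", "sanitize"), ("sanitize", "aggregate")]))
```

The per-link check is far cheaper than verifying the composed system as a single entity, which is the tractability argument the abstract makes for large-scale workflows.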