Samples containing low copy numbers of DNA are routinely encountered in casework. The signal acquired from these samples can be difficult to interpret because it does not always contain all of the genotypic information from each contributor; this loss of genetic information is associated with sampling and detection effects. The present work focuses on developing a validation scheme that helps mitigate the latter. By applying a combined simulation- and experiment-based approach, we establish a scheme designed to simultaneously improve signal resolution and detection rates without costly, large-scale experimental validation studies. Specifically, we parameterize an in silico DNA pipeline with experimental data acquired from the laboratory and use it to evaluate a wide range of scenarios in a cost-effective manner. Metrics such as signal-to-noise resolution and false positive and false negative detection rates are used to select tenable laboratory parameters that yield high-fidelity signal in the single-copy regime. We demonstrate that the metrics acquired from simulation are consistent with experimental data obtained from two capillary electrophoresis platforms and various injection parameters. Once good resolution is obtained, analytical thresholds can be determined, if necessary, using detection error tradeoff analysis. Decreasing the limit of detection of the forensic process to one copy of DNA is a powerful mechanism for increasing the information content on the minor components of a mixture, which is particularly important for probabilistic system inference. If the forensic pipeline is engineered such that high-fidelity electropherogram signal is obtained, then the likelihood ratio (LR) of a true contributor increases and the probability that the LR of a randomly chosen person is greater than one decreases. This is, potentially, the first step toward standardization of the analytical pipeline across operational laboratories.
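To make the detection error tradeoff step concrete, the following is a minimal Python sketch of how an analytical threshold (AT) might be selected from simulated peak-height distributions. The lognormal parameters, sample sizes, and equal-error selection rule are illustrative assumptions, not the authors' calibrated in silico pipeline.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical simulated peak heights in RFU. The lognormal parameters are
# illustrative assumptions, not values from the authors' calibrated pipeline.
noise = rng.lognormal(mean=3.0, sigma=0.5, size=100_000)   # baseline/noise peaks
allele = rng.lognormal(mean=5.5, sigma=0.6, size=100_000)  # single-copy allelic peaks

# Sweep candidate analytical thresholds and record the detection error
# tradeoff: a false positive is a noise peak at or above the AT, and a
# false negative is a true allelic peak that falls below it.
thresholds = np.linspace(noise.min(), allele.max(), 500)
fpr = np.array([(noise >= t).mean() for t in thresholds])
fnr = np.array([(allele < t).mean() for t in thresholds])

# One simple selection rule: take the threshold nearest the equal-error point.
idx = np.argmin(np.abs(fpr - fnr))
print(f"AT ~ {thresholds[idx]:.0f} RFU, FPR = {fpr[idx]:.4f}, FNR = {fnr[idx]:.4f}")
```

In practice, the noise and allelic distributions would come from the parameterized in silico pipeline described above, and the selection rule would reflect the laboratory's tolerance for false detections rather than a simple equal-error criterion.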
We are facing an uncertain future in special collections, one that will most likely continue to require us to make tough decisions. With cutbacks and limitations on resources plaguing us, the expense and time required for item-level treatment make it necessary to set preservation priorities within our collections.¹ At the same time, digital initiatives continue to expand in scope and resource allocation, creating new opportunities and challenges in preservation management. Digitization has a valuable place in special collections as a supplement to physical preservation, but the danger arises when the digital begins to supplant the physical. So we need to ask ourselves: in rare book collections, what exactly are we trying to preserve? Do we maintain the physical object, or do we risk losing the information retained in its inherent characteristics? When the "form" and "substance" of a given object are indistinguishable,² we are challenged to evaluate collection materials in terms of their inherent value, which includes both the text and the intangible information the materials provide. The literature addresses the need for preservation priorities³ and offers some subjective criteria that could be used for making such decisions,⁴ but it is time to take the next step in establishing guidelines for setting preservation priorities. A need exists for a standardized and objective decision-making framework to guide item-level preservation and conservation activities in rare book collections. The presence of a standard could assist in justifying the use of limited resources for executing preservation decisions. I am presenting the following framework as one such template.
The Broadview Introduction to Book History is designed as a companion to The Broadview Reader in Book History, published in 2014. In this new publication, Levy and Mole serve as authors, rather than editors. They have chosen to organize the volume under the same chapter headings as the previous work (“Materiality,” “Textuality,” “Printing and Reading,” “Intermediality,” and “Remediating”), and they state in their Introduction that the two books are intended to function in tandem.