Executive Summary

The External Flowsheet Review Team (EFRT) expressed concern about the potential for Waste Treatment and Immobilization Plant (WTP) pipe plugging. Per the review's executive summary, "Piping that transports slurries will plug unless it is properly designed to minimize this risk. This design approach has not been followed consistently, which will lead to frequent shutdowns due to line plugging." To evaluate the potential for plugging, critical-velocity tests were performed on several physical simulants to determine whether the design approach is conservative. Critical velocity is defined as the point at which particles begin to deposit and form a moving bed on the bottom of a straight horizontal pipe during slurry transport operations. The critical velocity depends on the physical properties of the particles, the fluid, and the system geometry.

This report gives the results of the critical-velocity testing and provides an indication of slurry stability as a function of fluid rheological properties and transport conditions typical of those the plant will encounter. The experimental results are compared with the WTP design guide on slurry-transport velocity in an effort to confirm the minimum waste-velocity and flushing-velocity requirements established by calculations and critical-velocity correlations in the design guide. The major findings of this testing are as follows:

- Experimental results indicate that for Newtonian fluids, the design guide is conservative. The design guide is based on the Oroskar and Turian (1980) correlation, a traditional industry-derived equation that focuses on particles larger than 100 μm. Slurry viscosity has a greater effect on particles with a larger surface-area-to-mass ratio, i.e., smaller particles, and the increased viscous forces on small particles result in smaller critical velocities. Since Hanford slurry particles generally have large surface-area-to-mass ratios, the reliance on such equations in the 24590-WTP-GPG-M-0058, Rev 0 design guide (Hall 2006) is conservative. Additionally, the use of the 95th percentile particle size (d95) as an input to this equation is conservative.

- The design guide specifies the use of the d95 density; this term is ambiguous and needs clarification in the design guide. Nonetheless, the value is interpreted to mean the density of the d95 particle, a value that is irrelevant for critical-velocity calculations. Often this value is unknown, and Equation 1 of the 24590-WTP-GPG-M-0058, Rev 0 design guide (Hall 2006), which calculates an average or composite density of all solids in the slurry, will be used for design purposes. However, test results indicate that the use of an average particle density as an input to the equation is not conservative, because particle density has a large influence on the critical-velocity result returned by the correlation.

- The viscosity correlation used in the WTP design guide has been shown to be inaccurate for Hanford waste feed materials.

- Additionally, the recommendation of a 30% minimum margi...
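For context, a minimal Python sketch of the correlation that the design guide draws on is given below. It implements the commonly cited published form of the Oroskar and Turian (1980) critical-velocity equation; the coefficients are quoted from the open literature rather than from 24590-WTP-GPG-M-0058 itself, and the function name, the eddy fraction x, and the example inputs are illustrative assumptions, so this should not be read as the design guide's Equation 1.

```python
import math

def oroskar_turian_critical_velocity(d, D, rho_s, rho_l, mu, C, x=0.95):
    """Estimate the critical (deposition) velocity of a settling slurry, m/s.

    Commonly cited form of the Oroskar and Turian (1980) correlation;
    coefficients are quoted from the open literature and should be verified
    against the original paper and the design guide before any design use:

        v_c = 1.85 * sqrt(g*d*(s-1)) * C**0.1536 * (1-C)**0.3564
              * (D/d)**0.378 * Re**0.09 * x**0.30

    d      particle diameter, m (the design guide uses the d95 size)
    D      pipe inner diameter, m
    rho_s  solids density, kg/m^3
    rho_l  carrier-liquid density, kg/m^3
    mu     carrier-liquid viscosity, Pa*s
    C      solids volume fraction (-)
    x      fraction of turbulent eddies exceeding the settling velocity (-)
    """
    g = 9.81
    s = rho_s / rho_l                      # solid-to-liquid density ratio
    v_ref = math.sqrt(g * d * (s - 1.0))   # characteristic settling-scale velocity
    Re = D * rho_l * v_ref / mu            # correlation Reynolds number
    return (1.85 * v_ref * C**0.1536 * (1.0 - C)**0.3564
            * (D / d)**0.378 * Re**0.09 * x**0.30)

# Illustrative (hypothetical) inputs: 100-um particles of density 2900 kg/m^3
# at 10 vol% in water, flowing through a 0.0762 m (3-inch) pipe.
print(round(oroskar_turian_critical_velocity(
    d=100e-6, D=0.0762, rho_s=2900.0, rho_l=1000.0, mu=1.0e-3, C=0.10), 2), "m/s")
```

With these illustrative inputs the sketch predicts a critical velocity of roughly 1.4 m/s, and raising the assumed solids density from 2900 to 5000 kg/m^3 increases the prediction by about 50%, which illustrates why the density supplied to the correlation (individual-particle versus composite) matters for conservatism.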
human factors and social/behavioral science challenges through modeling and advanced engineering/computing approaches. This research focuses on the intelligence domain, including human behavior modeling with application to identifying/predicting malicious insider cyber activities, modeling socio-cultural factors as predictors of terrorist activities, and human information interaction concepts for enhancing intelligence analysis decision making. Dr. Greitzer's research interests also include evaluation methods and metrics for assessing the effectiveness of decision aids, analysis methods, and displays. Ryan Hohimer is a Senior Research Scientist at PNNL. His research interests include knowledge representation and reasoning, probabilistic reasoning, semantic computing, cognitive modeling, image analysis, data management, and data acquisition and analysis. He is currently serving as Principal Investigator of a Laboratory Directed Research and Development project that has designed and developed the CHAMPION reasoner.

Abstract

The insider threat ranks among the most pressing cyber-security challenges that threaten government and industry information infrastructures. To date, no systematic methods have been developed that provide a complete and effective approach to preventing data leakage, espionage, and sabotage. Current practice is forensic in nature, relegating to the analyst the bulk of the responsibility to monitor, analyze, and correlate an overwhelming amount of data. We describe a predictive modeling framework that integrates a diverse set of data sources from the cyber domain, as well as inferred psychological/motivational factors that may underlie malicious insider exploits. This comprehensive threat-assessment approach provides automated support for the detection of high-risk behavioral "triggers" to help focus the analyst's attention and inform the analysis. Designed to be domain-independent, the system may be applied to many different threat and warning analysis/sense-making problems.
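The abstract stays at the framework level, so the following is only a minimal, hypothetical sketch of what rule-based detection of behavioral "triggers" over a stream of cyber-domain events might look like. The event fields, rule names, and thresholds are invented for illustration and are not drawn from the CHAMPION reasoner.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Iterable

@dataclass
class CyberEvent:
    user: str
    kind: str           # e.g. "login", "file_copy", "email"
    timestamp: datetime
    bytes_moved: int = 0

# Hypothetical trigger rules: each returns True when an event looks anomalous.
def after_hours_login(e: CyberEvent) -> bool:
    return e.kind == "login" and not (7 <= e.timestamp.hour < 19)

def bulk_data_transfer(e: CyberEvent, threshold: int = 500_000_000) -> bool:
    return e.kind == "file_copy" and e.bytes_moved > threshold

TRIGGERS = {"after_hours_login": after_hours_login,
            "bulk_data_transfer": bulk_data_transfer}

def detect_triggers(events: Iterable[CyberEvent]):
    """Yield (user, trigger_name, event) tuples for an analyst's review queue."""
    for e in events:
        for name, rule in TRIGGERS.items():
            if rule(e):
                yield e.user, name, e

# Example: one after-hours login and one large file copy are flagged for review.
events = [
    CyberEvent("jdoe", "login", datetime(2011, 5, 3, 23, 40)),
    CyberEvent("jdoe", "file_copy", datetime(2011, 5, 4, 0, 5), bytes_moved=2_000_000_000),
    CyberEvent("asmith", "login", datetime(2011, 5, 4, 9, 0)),
]
for user, trigger, _ in detect_triggers(events):
    print(user, trigger)
```

In the described framework such cyber-domain triggers would be combined with inferred psychological/motivational factors rather than acted on in isolation; the sketch shows only the event-filtering step.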
In many insider crimes, managers and other coworkers observed that the offenders had exhibited signs of stress, disgruntlement, or other issues, but no alarms were raised. Barriers to using such psychosocial indicators include the inability to recognize the signs and the failure to record the behaviors so that they can be assessed. A psychosocial model was developed to assess an employee's behavior associated with an increased risk of insider abuse. The model is based on case studies and research literature on factors/correlates associated with precursor behavioral manifestations of individuals committing insider crimes. To test the model's agreement with human resources and management professionals, we conducted an experiment with positive results. If implemented in an operational setting, the model would be part of a set of management tools for employee assessment to identify employees who pose a greater insider threat.
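The abstract does not describe how recorded behaviors are combined into an assessment; the snippet below is a purely hypothetical sketch of one way a weighted-indicator risk score could be computed. The indicator names and weights are invented for this illustration and are not taken from the published model.

```python
# Hypothetical psychosocial indicators with illustrative expert weights
# (names and weights are invented for this sketch, not the published model).
INDICATOR_WEIGHTS = {
    "disgruntlement": 0.25,
    "anger_management_issues": 0.20,
    "disregard_for_policy": 0.20,
    "performance_problems": 0.15,
    "stress": 0.10,
    "absenteeism": 0.10,
}

def psychosocial_risk_score(observed: set) -> float:
    """Return a 0..1 risk score from the set of indicators recorded for an employee."""
    return sum(w for name, w in INDICATOR_WEIGHTS.items() if name in observed)

# An employee showing disgruntlement and repeated policy violations scores 0.45,
# which a management tool might compare against a review threshold.
print(psychosocial_risk_score({"disgruntlement", "disregard_for_policy"}))
```

A score like this only has value if the underlying behaviors are actually recognized and recorded, which is exactly the barrier the abstract identifies.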
With the increasing understanding and availability of nuclear technologies, and the increasing pursuit of nuclear technologies by several new countries, it is increasingly important to monitor nuclear proliferation activities. There is a great need to develop technologies that automatically or semi-automatically detect nuclear proliferation activities using remote sensing. Images acquired by earth observation satellites are an important source of information for detecting proliferation activities. High-resolution remote sensing images are highly useful in verifying the correctness, as well as the completeness, of any nuclear program. DOE national laboratories are interested in detecting nuclear proliferation by developing advanced geospatial image mining algorithms. In this paper we describe the current understanding of geospatial image mining techniques, enumerate key gaps, and identify future research needs in the context of nuclear proliferation.

Index Terms: Nuclear proliferation, low-level features, semantic classification, geospatial ontology

*Contact: bhaduribl@ornl.gov

GEOSPATIAL IMAGE MINING

The increasing resolution, volume, and availability of remote sensing imagery have made it possible to accurately identify key geospatial features and their changes over time. Recent studies have shown the usefulness of remote sensing imagery for monitoring nuclear safeguards and proliferation activities [1]. Classification is one of the most widely used techniques for extracting thematic information. Classification is often performed on a per-pixel basis; however, proliferation detection requires the identification of complex objects, patterns, and their spatial relationships. One key distinguishing feature compared to traditional thematic classification is that the objects and patterns that constitute a nuclear facility have interesting spatial relationships (metric, topological, etc.) among themselves. These limitations are clearly evident in Figure 1. Classification technology is mature for extracting thematic classes such as buildings, forest, and crops. However, such thematic labels are not enough to capture the fact that a given image contains a nuclear power plant. What is missing is that the constituent objects (e.g., switch yard, containment building, turbine building, cooling towers) and their spatial relationships (arrangements or configurations) are not captured in traditional thematic classification. In addition, traditional image analysis approaches mainly exploit low-level image features (such as color and texture and, to some extent, size and shape) and are oblivious to higher-level descriptors and important spatial (topological) relationships, without which these complex objects and higher-level semantic concepts cannot be accurately discovered. One stumbling block in exploiting such relationships is the description of compound objects and the spatial relationships among their constituents. Therefore, for effective utilization of remote sensing imagery, it is first important to identify key concepts that des...
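To make the gap concrete, the sketch below shows one hypothetical way to move from per-object thematic labels to a higher-level semantic label by testing spatial relationships among detected objects. The object classes, distance threshold, and compound-object rule are illustrative assumptions, not the ontology-based method the paper calls for.

```python
import math
from dataclasses import dataclass

@dataclass
class DetectedObject:
    label: str                      # thematic class from a conventional classifier
    centroid: tuple                 # (x, y) map coordinates in metres

def distance(a: DetectedObject, b: DetectedObject) -> float:
    return math.dist(a.centroid, b.centroid)

def looks_like_nuclear_power_plant(objects, max_separation=1500.0) -> bool:
    """Hypothetical compound-object rule: all required constituent classes are
    present and lie within a plausible distance of the containment building."""
    required = {"containment_building", "turbine_building", "switch_yard", "cooling_tower"}
    by_label = {o.label: o for o in objects}
    if not required <= by_label.keys():
        return False
    anchor = by_label["containment_building"]
    return all(distance(anchor, by_label[lbl]) <= max_separation
               for lbl in required - {"containment_building"})

# Thematic labels alone ("building", "tower") would not support this inference;
# the semantic label emerges only from the objects plus their spatial arrangement.
scene = [
    DetectedObject("containment_building", (0.0, 0.0)),
    DetectedObject("turbine_building", (220.0, 80.0)),
    DetectedObject("switch_yard", (400.0, -150.0)),
    DetectedObject("cooling_tower", (650.0, 300.0)),
]
print(looks_like_nuclear_power_plant(scene))  # True for this illustrative layout
```

A geospatial ontology generalizes this hard-coded rule: the required constituents and the metric/topological relationships among them become declarative knowledge that a reasoner can apply across facility types.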