In response to the accident at the Fukushima Daiichi nuclear power station in Japan, the U.S. Nuclear Regulatory Commission (NRC) and the Department of Energy agreed to jointly sponsor an accident reconstruction study as a means of assessing the severe accident modeling capability of the MELCOR code. MELCOR is the state-of-the-art system-level severe accident analysis code used by the NRC to provide information for its decision-making process in this area. The objectives of the project were to: (1) collect, verify, and document data on the accidents by developing an information portal system; (2) reconstruct the accident progressions using computer models and accident data; and (3) validate the MELCOR code and the Fukushima models, and suggest potential future data needs. Idaho National Laboratory (INL) developed an information portal for the Fukushima Daiichi accident information. Sandia National Laboratories (SNL) developed MELCOR 2.1 models of the Fukushima Daiichi Units 1, 2, and 3 reactors and the Unit 4 spent fuel pool. Oak Ridge National Laboratory (ORNL) developed a MELCOR 1.8.5 model of the Unit 3 reactor and a TRACE model of the Unit 4 spent fuel pool. The good correlation of the results from the SNL models with the data from the plants and with the ORNL model results provides additional confidence in the MELCOR code. The modeling effort has also provided insights into future data needs for both model development and validation.
We would like to thank our industry partners who attended this workshop. Without their participation, the workshop would not have been possible. To this end, we are particularly grateful to Dustin Greenwood of NuScale, Patrick Kopfle of Dominion Energy, Jim Hill of Xcel Energy, and Asgeir Drøivoldsmo of the Institute for Energy Technology for their effort and support in developing presentations to share at the workshop. These presentations were integral to the engagement and success of the workshop.
The nuclear industry, like the business world in general, faces a rapidly increasing amount of data to deal with on a daily basis. Over the last two decades, steady improvements in data storage devices and in the means to create and collect data have changed the way we handle information: most data is still stored without filtering or refinement for later use. Many functions at a nuclear power plant generate vast amounts of data, with scheduled and unscheduled outages a prime example, producing some of the most complex data sets at the plant. To make matters worse, modern information and communications technology makes it possible to collect and store data faster than our ability to use it for making decisions. In most applications, and especially in outages, raw data has no value in itself; managers, engineers, and other specialists want to extract the information contained in it. The complexity and sheer volume of data can lead to information overload, with users lost in data that is irrelevant to the task at hand, processed in an inappropriate way, or presented ineffectively. To avoid this overload, many data sources are simply ignored, and production opportunities are lost because utilities lack the ability to handle the enormous data volumes properly. Decision-makers are often confronted with large amounts of disparate, conflicting, and dynamic information drawn from multiple heterogeneous sources. Information and communication technologies alone will not solve this problem. Utilities need effective methods to exploit the hidden opportunities and knowledge residing in unexplored data resources. Superior performance before, during, and after outages depends on the right information being available at the right time to the right people.
Acquisition of raw data is the easy part; the challenge lies in using advanced analytical, data-processing, and data-visualization methods to turn that data into reliable, comprehensible, and actionable information. Techniques like data mining, filtering, and analysis only work reliably for well-defined and well-understood problems; the path from data to decision is more complex. The ability to communicate knowledge during outages and emergent issues is crucial. This paper presents an approach to turn unused data into an opportunity: applying principles from semiotics, human factors, and visual analytics to transform the traditional way of processing outage data into media that will improve the collective situation awareness, knowledge, decisions, actions, and overall performance of the entire outage team, and that will also support the reliability, quality, and overall effectiveness of maintenance work. The proposed visualization methods become the medium of a semi-automated analytical process in which humans and machines cooperate, each using their distinct capabilities, for the most effective results.
The workforce cost of operations and maintenance (O&M) in the United States nuclear power industry is mostly attributable to manual activities that supply information to a human decision-making process. Several manual, labor-intensive processes generate information that is rarely used beyond the original purpose for which it was collected. The information is therefore expensive to collect, yet of limited use. This especially applies to surveillance activities and preventive maintenance, which represent most of the plant workforce activities. The industry has recognized the benefits of both reducing labor-intensive tasks by automating them and increasing the fidelity and uses of the data collected to enable advanced remote monitoring using data-driven decision making for O&M activities. These data-driven methods could range from performance trending to machine learning and advanced forms of artificial intelligence. This shift in O&M strategy yields significant cost savings because it reduces labor requirements by automating the data-collection process and reduces the frequency of activities by using an on-need model. The frequency reduction produces additional cost savings by lowering labor and materials demand. This specific effort focuses on automating monitoring data-collection processes. It is one in a series of efforts planned by the Department of Energy (DOE) Light-Water Reactor Sustainability (LWRS) program to target multiple elements in migrating current O&M activities to a data-driven approach. These elements are data collection, data analytics, data management, visualization, value analysis, and change enablement. This effort focuses exclusively on data collection; the other five elements are explicitly researched in multiple ongoing efforts or planned for future efforts. An out-of-the-box approach was followed in this effort, which assumed no constraints from the other five elements.