As scientific communities grow and evolve, there is high demand for better methods of finding relevant papers, comparing papers on similar topics and studying trends in the research community. All of these tasks involve the common problem of extracting structured information from scientific articles. In this paper, we propose a novel, scalable, semi-supervised method for extracting relevant structured information from the vast body of available raw scientific literature. We extract the fundamental concepts of aim, method and result from scientific articles and use them to construct a knowledge graph. Our algorithm makes use of domain-based word embeddings and the bootstrap framework. Our experiments show that our algorithm is domain independent and that our system achieves precision and recall comparable to the state of the art. We also present the research trends of two distinct communities: computational linguistics and computer vision.
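The abstract does not spell out the scoring or stopping rules of the bootstrap procedure, so the sketch below is only an illustrative reading of "domain-based word embeddings plus bootstrapping": seed phrases for each concept (aim, method, result) are grown iteratively by admitting candidate phrases whose embeddings lie close to a concept centroid. The function names, the `embed` callable and the similarity threshold are assumptions, not the authors' implementation.

```python
# Illustrative bootstrap sketch (assumed details: centroid scoring, cosine
# similarity, fixed threshold). Requires only numpy.
import numpy as np

def cosine(u, v):
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-9))

def bootstrap_extract(candidates, seeds, embed, threshold=0.75, max_iters=5):
    """Iteratively grow labeled phrase sets for each concept.

    candidates : list of candidate phrases mined from articles
    seeds      : dict mapping concept ("aim", "method", "result") -> set of seed phrases
    embed      : callable mapping a phrase to a vector from domain-trained embeddings
    """
    labeled = {concept: set(phrases) for concept, phrases in seeds.items()}
    for _ in range(max_iters):
        grew = False
        # Centroid of each concept's currently labeled phrases.
        centroids = {c: np.mean([embed(p) for p in ps], axis=0)
                     for c, ps in labeled.items()}
        for phrase in candidates:
            if any(phrase in ps for ps in labeled.values()):
                continue  # already labeled in an earlier pass
            vec = embed(phrase)
            scores = {c: cosine(vec, cen) for c, cen in centroids.items()}
            best = max(scores, key=scores.get)
            if scores[best] >= threshold:  # admit only confident matches
                labeled[best].add(phrase)
                grew = True
        if not grew:  # converged: no new phrases were added this iteration
            break
    return labeled
```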
Pigging is a process used by operators to run oil and gas pipelines efficiently. It involves PIGs (short for Pipeline Inspection Gauges), which perform various maintenance operations. The PIG tool is pushed through the pipeline by the pressure inside the pipeline; this is known as the pigging process. It is used for cleaning and inspection of pipelines, among other functions. PIGs are launched and received using special-purpose sections, called launching and receiving stations, installed at various locations along the pipeline. In this paper we discuss the use of smart, or intelligent, PIGs for pipeline inspection. An intelligent PIG carries many sensors along its circumference to capture information about the health of the pipeline. Some of the most commonly used inspection sensors are Magnetic Flux Leakage (MFL), ultrasonic and eddy current sensors. The PIG also contains other sensors, such as a gyro and an odometer, used to estimate its location, along with diagnostic parameters for the sensors and the corresponding electronics. In this paper we cover PIGs with MFL sensors used for inspection. The tool contains a data storage device that stores the data captured by the sensors for post-processing and analysis. Once an inspection run is completed, the data are downloaded from the storage device to a computer system for analysis. Pigging companies generally use proprietary software tools to analyze the inspection data, which contains signatures from the inspected pipeline. With the help of the software tool, a skilled analyst is able to classify these signatures. The classification distinguishes known features of the pipeline, such as welds, magnetic markers, sleeves and flanges, from defects such as metal loss and dents. Efficient detection and classification of defects allows operators to take proactive action, increasing operational efficiency and pipeline life while reducing the chances of faults and unplanned shutdowns. Some of the critical parameters operators use to analyze corrosion in the pipeline are the wall loss percentage due to metal loss, the depth profile, the repair factor and metal loss clusters. The major part of this paper covers the use of software to detect and classify different metal loss features in the pipeline.
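The abstract names wall loss percentage and repair factor as key corrosion parameters but does not state how the proprietary software computes them. As a hedged illustration only, the sketch below uses the original ASME B31G Level-1 formulation, a common industry baseline; actual analysis tools may use modified B31G, RSTRENG or vendor-specific models, and all function names and defaults here are assumptions.

```python
# Simplified metal-loss assessment sketch (assumed method: original ASME B31G
# Level 1; not the proprietary analysis described in the paper).
import math

def wall_loss_percent(depth_mm, nominal_wt_mm):
    """Peak wall loss of a metal-loss anomaly as a percentage of nominal wall."""
    return 100.0 * depth_mm / nominal_wt_mm

def b31g_safe_pressure(depth_mm, length_mm, nominal_wt_mm, diameter_mm,
                       smys_mpa, design_factor=0.72):
    """Estimated safe operating pressure (MPa) using original B31G Level 1."""
    d_t = depth_mm / nominal_wt_mm
    z = length_mm ** 2 / (diameter_mm * nominal_wt_mm)
    if z <= 20.0:
        m = math.sqrt(1.0 + 0.8 * z)  # Folias bulging factor for short defects
        flow = 1.1 * smys_mpa * (1 - (2.0 / 3.0) * d_t) / (1 - (2.0 / 3.0) * d_t / m)
    else:
        flow = 1.1 * smys_mpa * (1 - d_t)  # long-defect limit
    failure_pressure = 2.0 * flow * nominal_wt_mm / diameter_mm
    return design_factor * failure_pressure

def estimated_repair_factor(maop_mpa, safe_pressure_mpa):
    """ERF = MAOP / safe pressure; values >= 1 typically flag the anomaly for repair."""
    return maop_mpa / safe_pressure_mpa
```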