The process setup of manufacturing processes is generally knowledge-based and carried out once per material batch. Industry experts observe fluctuations in product quality and tool life even though the process setup remains unchanged. These fluctuations are mainly attributed to variations in material parameters. In-situ detection of changes in material parameters would enable manufacturers to adapt process parameters such as forces or lubrication before disturbances such as unexpectedly high tool wear or degraded product quality occur. This contribution demonstrates the applicability of a deep learning time series classification architecture that does not rely on handcrafted feature engineering to the classification of hardness fluctuations in a sheet-metal coil using magnetic Barkhausen noise emission. The methodology is not limited to detecting hardness fluctuations in sheet-metal coils and can potentially be applied to in-situ material property classification in other manufacturing processes and for other material parameters.
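The core idea of feature-free time series classification can be illustrated with a minimal forward pass: learned convolutional filters replace handcrafted features, and a pooled representation feeds a classifier head. The following NumPy sketch is purely illustrative; the layer sizes, filter counts, and two-class setup (nominal vs. deviating hardness) are assumptions for demonstration, not the architecture used in the contribution, and the parameters are untrained.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv1d(x, kernels):
    """Valid cross-correlation of a 1-D signal with a bank of kernels."""
    k = kernels.shape[1]
    windows = np.lib.stride_tricks.sliding_window_view(x, k)  # (T-k+1, k)
    return windows @ kernels.T                                # (T-k+1, n_kernels)

def classify(signal, kernels, weights):
    """Conv -> ReLU -> global average pooling -> linear head -> softmax."""
    feat = np.maximum(conv1d(signal, kernels), 0.0)  # learned filters, no handcrafted features
    pooled = feat.mean(axis=0)                       # global average pooling over time
    logits = pooled @ weights                        # linear classifier head
    p = np.exp(logits - logits.max())                # numerically stable softmax
    return p / p.sum()

# Untrained random parameters, just to show the data flow.
kernels = rng.standard_normal((8, 16))  # 8 filters of length 16 (assumed sizes)
weights = rng.standard_normal((8, 2))   # 2 classes: nominal / deviating hardness
signal = rng.standard_normal(512)       # synthetic stand-in for a Barkhausen noise burst

probs = classify(signal, kernels, weights)
print(probs)
```

In practice such filters would be trained end-to-end on labeled Barkhausen noise measurements, so the feature extraction itself is learned rather than engineered.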
Networking and digitization in manufacturing enable novel methods of data-driven analysis and optimization of processes through cross-process data availability. The creation of digital twins plays an important role here. However, not all data relevant to a digital twin can be measured directly in the process. Methods are therefore needed to model quantities that are difficult or impossible to measure directly, such as the finite element method. In many companies, however, neither the know-how nor the IT infrastructure required for finite element simulations is available. Commissioning simulations externally is also incompatible with the goals of higher productivity and agility pursued through the digitization and networking of manufacturing processes. In this contribution, an architecture is presented that enables the fully automated use of finite element simulation as a service. The architecture is developed using the case study of fine blanking. First, the requirements for the architecture are determined: it should be scalable and offer interfaces and payment mechanisms suitable for machine-to-machine communication. In addition, ensuring data integrity is an important requirement when creating the digital twin. An architecture is then presented that meets these requirements by combining cloud computing, distributed ledger technologies, and interfaces that can directly process measurement signals from the process and communicate with the architecture. Finally, the capability of the architecture is evaluated, possible applications and limitations are discussed, and future extensions are considered.
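The data-integrity requirement can be sketched as a simple request/verify flow: the machine client fingerprints its raw measurement signal before submitting a simulation job, so a returned result can later be checked against that fingerprint (which a distributed ledger could anchor immutably). All field names, the JSON job format, and the helper functions below are illustrative assumptions, not the paper's actual interface.

```python
import hashlib
import json

def fingerprint(measurements):
    """Deterministic SHA-256 fingerprint of a measurement series (data integrity)."""
    payload = json.dumps(measurements, sort_keys=True).encode("utf-8")
    return hashlib.sha256(payload).hexdigest()

def build_job_request(process, measurements, client_id):
    """Assemble a machine-readable FE simulation job for the cloud service (hypothetical format)."""
    return {
        "process": process,                      # e.g. "fine_blanking"
        "client": client_id,
        "data_hash": fingerprint(measurements),  # value a ledger entry could anchor
        "measurements": measurements,
    }

def verify_result(result, original_measurements):
    """Check that a returned result refers to the unaltered input data."""
    return result["data_hash"] == fingerprint(original_measurements)

forces = [12.1, 12.4, 12.3, 12.9]  # synthetic punch-force samples from the press
job = build_job_request("fine_blanking", forces, client_id="press-07")
# Mock service reply echoing the hash alongside a simulated quantity:
result = {"data_hash": job["data_hash"], "max_stress_mpa": 480.0}
print(verify_result(result, forces))  # → True
```

Because the hash is computed client-side before submission, tampering with the measurements anywhere along the chain invalidates the verification, which is the property the ledger-backed integrity check relies on.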