Based on research into the applications of artificial intelligence (AI) technology in the manufacturing industry in recent years, we analyze the rapid development of core technologies in the new era of 'Internet plus AI', which is triggering profound changes in the models, means, and ecosystems of the manufacturing industry, as well as in the development of AI itself. We then propose new models, means, and forms of intelligent manufacturing, an intelligent manufacturing system architecture, and an intelligent manufacturing technology system, based on the integration of AI technology with information communication technology, manufacturing technology, and related product technology. Moreover, we discuss the current status of intelligent manufacturing from the perspectives of application technology, industry, and application demonstrations. Finally, we present suggestions for the application of AI in intelligent manufacturing in China.
Event extraction is an important research direction in natural language processing (NLP), with applications including information retrieval (IR). Traditional event extraction follows one of two approaches: the pipeline method or the joint extraction method. The pipeline method first recognizes trigger words to determine the event and then extracts its arguments, and is therefore prone to cascading errors. The joint extraction method applies deep learning to perform trigger word identification and argument role classification together. Most joint extraction studies adopt a CNN or RNN architecture; however, event extraction requires a deeper understanding of complex contexts, and existing studies do not make full use of syntactic relations. This paper proposes a novel event extraction model built upon a Tree-LSTM network and a Bi-GRU network that incorporates syntactic information. The model uses Tree-LSTM and Bi-GRU simultaneously to obtain a representation of the candidate event sentence and identify the event type, which yields better performance than models using a chain-structured LSTM, a CNN, or Tree-LSTM alone. Finally, the hidden state of each Tree-LSTM node is used to predict a label for candidate arguments and to identify and classify all arguments of an event. Experimental results show that the proposed event extraction model achieves competitive results compared with previous work.
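As a rough illustration of the architecture this abstract describes, the PyTorch sketch below (not the authors' code; the dimensions, label counts, traversal helper, and pooling choice are all assumptions) encodes a sentence with a Child-Sum Tree-LSTM over its dependency tree and a Bi-GRU over the token sequence, combines the root tree state with a pooled Bi-GRU summary to classify the event type, and labels each tree node with an argument role.

```python
import torch
import torch.nn as nn


def post_order(children, root):
    """Yield node indices children-before-parent (reversed pre-order DFS)."""
    order, stack = [], [root]
    while stack:
        node = stack.pop()
        order.append(node)
        stack.extend(children[node])
    return reversed(order)


class ChildSumTreeLSTMCell(nn.Module):
    """One Child-Sum Tree-LSTM update combining the (h, c) states of a node's children."""
    def __init__(self, in_dim, hid_dim):
        super().__init__()
        self.iou = nn.Linear(in_dim + hid_dim, 3 * hid_dim)  # input, output, update gates
        self.f_x = nn.Linear(in_dim, hid_dim)                 # forget gate, input part
        self.f_h = nn.Linear(hid_dim, hid_dim)                # forget gate, per-child part

    def forward(self, x, child_h, child_c):
        # x: (in_dim,); child_h, child_c: (num_children, hid_dim)
        h_sum = child_h.sum(dim=0)
        i, o, u = torch.chunk(self.iou(torch.cat([x, h_sum])), 3)
        i, o, u = torch.sigmoid(i), torch.sigmoid(o), torch.tanh(u)
        f = torch.sigmoid(self.f_x(x).unsqueeze(0) + self.f_h(child_h))  # one forget gate per child
        c = i * u + (f * child_c).sum(dim=0)
        return o * torch.tanh(c), c


class EventExtractor(nn.Module):
    """Tree-LSTM over the dependency tree + Bi-GRU over the token sequence (hypothetical sizes)."""
    def __init__(self, vocab, emb_dim=100, hid=128, n_event_types=34, n_roles=36):
        super().__init__()
        self.emb = nn.Embedding(vocab, emb_dim)
        self.tree = ChildSumTreeLSTMCell(emb_dim, hid)
        self.bigru = nn.GRU(emb_dim, hid, bidirectional=True, batch_first=True)
        self.event_clf = nn.Linear(hid + 2 * hid, n_event_types)  # root tree state + Bi-GRU summary
        self.role_clf = nn.Linear(hid, n_roles)                   # per-node argument-role labels

    def encode_tree(self, x_emb, children, root):
        hid = self.tree.f_h.out_features
        h, c = [None] * len(children), [None] * len(children)
        for i in post_order(children, root):              # every token assumed reachable from root
            if children[i]:
                ch_h = torch.stack([h[j] for j in children[i]])
                ch_c = torch.stack([c[j] for j in children[i]])
            else:
                ch_h = ch_c = torch.zeros(1, hid)
            h[i], c[i] = self.tree(x_emb[i], ch_h, ch_c)
        return torch.stack(h)                             # (seq_len, hid)

    def forward(self, tokens, children, root):
        x = self.emb(tokens)                              # (seq_len, emb_dim)
        node_h = self.encode_tree(x, children, root)      # syntactic (tree) representation
        seq_out, _ = self.bigru(x.unsqueeze(0))           # sequential (chain) representation
        summary = seq_out.squeeze(0).mean(dim=0)          # simple mean pooling over tokens
        event_logits = self.event_clf(torch.cat([node_h[root], summary]))
        role_logits = self.role_clf(node_h)               # one argument-role label per node
        return event_logits, role_logits


# Hypothetical usage on a 5-token toy sentence whose dependency root is token 1.
model = EventExtractor(vocab=5000)
tokens = torch.tensor([4, 17, 8, 2, 9])
children = [[], [0, 3], [], [2, 4], []]   # children[i] = dependents of token i
event_logits, role_logits = model(tokens, children, root=1)
```

The split mirrors the abstract's two prediction tasks: the combined tree-plus-sequence representation drives event-type identification, while the per-node Tree-LSTM states drive argument identification and role classification.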
Connectionist temporal classification (CTC) based supervised sequence training of recurrent neural networks (RNNs) has shown great success in many machine learning areas, including end-to-end speech and handwritten character recognition. For CTC training, however, the RNN must be unrolled (or unfolded) by the length of the input sequence. This unrolling requires a large amount of memory and hinders small-footprint implementations of online learning or adaptation. Furthermore, the length of training sequences is usually not uniform, which makes parallel training with multiple sequences inefficient on shared-memory models such as graphics processing units (GPUs). In this work, we introduce an expectation-maximization (EM) based online CTC algorithm that enables unidirectional RNNs to learn sequences that are longer than the amount of unrolling. The RNNs can also be trained to process an infinitely long input sequence without pre-segmentation or external reset. Moreover, the proposed approach allows efficient parallel training on GPUs. For evaluation, phoneme recognition and end-to-end speech recognition examples are presented on the TIMIT and Wall Street Journal (WSJ) corpora, respectively. Our online model achieves a 20.7% phoneme error rate (PER) on the very long input sequence generated by concatenating all 192 utterances in the TIMIT core test set. On WSJ, a network can be trained with an unrolling length of only 64 while sacrificing 4.5% relative word error rate (WER).
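For context, the PyTorch sketch below (toy dimensions and random data; it is not the paper's algorithm) shows the conventional fully unrolled CTC setup that this abstract contrasts against: nn.CTCLoss needs the network output for every frame of the utterance before its forward-backward pass can run, so memory grows with the sequence length T. The paper's EM-based online variant avoids that by training on fixed-size chunks, which is not reproduced here.

```python
import torch
import torch.nn as nn

num_feats, hidden, num_labels = 40, 256, 62       # e.g. 61 phone labels + CTC blank (hypothetical sizes)

rnn = nn.LSTM(num_feats, hidden, batch_first=False)
proj = nn.Linear(hidden, num_labels)
ctc = nn.CTCLoss(blank=0)
opt = torch.optim.Adam(list(rnn.parameters()) + list(proj.parameters()), lr=1e-3)

# One toy batch: a single utterance of T frames and its label sequence of length U.
T, U = 1000, 80                                    # memory scales with T under full unrolling
x = torch.randn(T, 1, num_feats)                   # (time, batch, features)
y = torch.randint(1, num_labels, (1, U))           # targets; label 0 is reserved for blank

h, _ = rnn(x)                                      # unrolled over all T frames at once
log_probs = proj(h).log_softmax(dim=-1)            # (T, batch, num_labels)
loss = ctc(log_probs, y,
           input_lengths=torch.tensor([T]),
           target_lengths=torch.tensor([U]))
opt.zero_grad()
loss.backward()                                    # backpropagation through all T time steps
opt.step()
```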