Aspect-level sentiment classification (ASC) has received much attention in recent years. With the successful application of attention networks in many fields, attention-based ASC has aroused great interest. However, most previous methods neither analyzed the contribution of individual words well nor implemented the context-aspect term interaction effectively, which largely limits the efficacy of these models. In this paper, we propose a novel, efficient method that mainly adopts Multi-head Attention (MHA) networks. First, the word embeddings and aspect term embeddings are pre-trained with Bidirectional Encoder Representations from Transformers (BERT). Second, we make full use of MHA and convolutional operations to obtain hidden states, which is superior to traditional neural networks. Then, the interaction between context and aspect term is further implemented through average pooling and MHA. We conduct extensive experiments on three benchmark datasets, and the results show that the Interactive Multi-head Attention Networks (IMAN) model consistently outperforms state-of-the-art methods on the ASC task.
INDEX TERMS Natural language processing, aspect-level, sentiment classification, attention mechanism.
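The interaction step described above (context attending to the aspect term via multi-head attention) can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the projection matrices are random stand-ins for learned parameters, and the shapes and head count are arbitrary assumptions chosen for the example.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(query, key, value, num_heads, rng):
    """Scaled dot-product attention split across several heads.

    Shapes: query (Lq, d), key/value (Lk, d); num_heads must divide d.
    Weights are random placeholders for learned projections (illustration only).
    """
    L_q, d = query.shape
    d_head = d // num_heads
    Wq, Wk, Wv, Wo = (rng.standard_normal((d, d)) / np.sqrt(d) for _ in range(4))
    # Project, then split the feature dimension into heads: (heads, L, d_head).
    q = (query @ Wq).reshape(L_q, num_heads, d_head).transpose(1, 0, 2)
    k = (key @ Wk).reshape(key.shape[0], num_heads, d_head).transpose(1, 0, 2)
    v = (value @ Wv).reshape(value.shape[0], num_heads, d_head).transpose(1, 0, 2)
    scores = softmax(q @ k.transpose(0, 2, 1) / np.sqrt(d_head))  # (heads, Lq, Lk)
    heads = scores @ v                                            # (heads, Lq, d_head)
    # Re-merge the heads and apply the output projection.
    return heads.transpose(1, 0, 2).reshape(L_q, d) @ Wo

rng = np.random.default_rng(0)
context = rng.standard_normal((10, 8))  # 10 context tokens, hidden size 8
aspect = rng.standard_normal((2, 8))    # a 2-token aspect term
# Context queries attend over the aspect term, as in the interaction step.
out = multi_head_attention(context, aspect, aspect, num_heads=2, rng=rng)
print(out.shape)  # (10, 8): one aspect-aware vector per context token
```

In the paper's pipeline this interaction would operate on BERT-derived hidden states rather than random vectors, and a pooling step would follow to produce the classification feature.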
Aspect-level sentiment classification has been widely studied as a fine-grained sentiment classification task that predicts the sentiment polarity of specific aspect words in a given sentence. Previous studies have reported relatively good results using graph convolutional networks, so a growing number of approaches exploit sentence structure information for this task. However, these methods do not link aspect words and context well. To address this problem, we propose a method that combines a hierarchical multi-head attention mechanism with a graph convolutional network (MHAGCN). It fully considers syntactic dependencies and incorporates semantic information to achieve interaction between aspect words and context. To validate the effectiveness of the proposed method, we conduct extensive experiments on three benchmark datasets; the results show that it outperforms current methods.
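The syntactic component described above aggregates information along dependency edges. A minimal NumPy sketch of one graph-convolution layer over a dependency adjacency matrix follows; the toy graph, feature size, and weights are assumptions for illustration, not the MHAGCN model itself.

```python
import numpy as np

def gcn_layer(H, A, W):
    """One graph-convolution layer: add self-loops, row-normalize by degree,
    aggregate neighbor features, project, and apply ReLU."""
    A_hat = A + np.eye(A.shape[0])            # self-loops keep each token's own features
    D_inv = np.diag(1.0 / A_hat.sum(axis=1))  # degree normalization
    return np.maximum(0.0, D_inv @ A_hat @ H @ W)

# Toy 4-token sentence; edges stand in for a dependency parse (assumed).
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 1],
              [0, 1, 0, 0],
              [0, 1, 0, 0]], dtype=float)
rng = np.random.default_rng(1)
H = rng.standard_normal((4, 6))          # token features, hidden size 6
W = rng.standard_normal((6, 6)) / np.sqrt(6)
H1 = gcn_layer(H, A, W)
print(H1.shape)  # (4, 6): updated, syntax-aware token features
```

In the paper's setting, H would be attention-enriched BERT hidden states and A would come from an actual dependency parser, with the attention and graph modules stacked hierarchically.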