Aspect-level sentiment classification (ASC) has received much attention in recent years. With the successful application of attention networks in many fields, attention-based ASC has aroused great interest. However, most previous methods neither analyzed the contribution of individual words well nor fully implemented the context-aspect term interaction, which largely limits model efficacy. In this paper, we propose an efficient novel method built mainly on Multi-head Attention (MHA) networks. First, the word embeddings and aspect term embeddings are pre-trained by Bidirectional Encoder Representations from Transformers (BERT). Second, we make full use of MHA and convolutional operations to obtain hidden states, which is superior to traditional neural networks. Then, the interaction between context and aspect term is further implemented through average pooling and MHA. We conduct extensive experiments on three benchmark datasets, and the results show that the proposed Interactive Multi-head Attention Networks (IMAN) model consistently outperforms state-of-the-art methods on the ASC task.

INDEX TERMS Natural language processing, aspect-level, sentiment classification, attention mechanism.
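As a rough illustration of the interaction step summarized above, the following NumPy sketch implements generic scaled dot-product multi-head attention and uses an average-pooled aspect representation as the query over context hidden states. All dimensions, weight initializations, and variable names are illustrative assumptions for exposition, not the paper's actual IMAN architecture or trained parameters.

```python
import numpy as np

def multi_head_attention(query, key, value, num_heads, rng=None):
    """Scaled dot-product multi-head attention (illustrative sketch).

    query: (len_q, d_model); key, value: (len_k, d_model).
    The projection matrices are random stand-ins for learned weights.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    d_model = query.shape[-1]
    assert d_model % num_heads == 0
    d_k = d_model // num_heads

    # Random projections stand in for learned parameters.
    w_q = rng.standard_normal((d_model, d_model)) / np.sqrt(d_model)
    w_k = rng.standard_normal((d_model, d_model)) / np.sqrt(d_model)
    w_v = rng.standard_normal((d_model, d_model)) / np.sqrt(d_model)
    w_o = rng.standard_normal((d_model, d_model)) / np.sqrt(d_model)

    def split_heads(x):
        # (seq, d_model) -> (heads, seq, d_k)
        return x.reshape(x.shape[0], num_heads, d_k).transpose(1, 0, 2)

    q = split_heads(query @ w_q)
    k = split_heads(key @ w_k)
    v = split_heads(value @ w_v)

    # Scaled dot-product attention, computed per head.
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d_k)
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    heads = weights @ v                           # (heads, len_q, d_k)

    # Concatenate heads, then apply the output projection.
    concat = heads.transpose(1, 0, 2).reshape(query.shape[0], d_model)
    return concat @ w_o

# Context-aspect interaction sketch: the aspect representation,
# obtained by average pooling over a (hypothetical) aspect span,
# attends over the context hidden states.
context = np.random.default_rng(1).standard_normal((10, 64))  # 10 context tokens
aspect = context[:3].mean(axis=0, keepdims=True)              # average pooling
out = multi_head_attention(aspect, context, context, num_heads=8)
print(out.shape)  # (1, 64)
```

The aspect-as-query, context-as-key/value arrangement is one common way to realize a context-aspect interaction; the reverse direction (context attending to the aspect) can be sketched symmetrically.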