Statistical sentiment analysis has developed rapidly with the rise of deep learning. Bilateral attention neural networks (BANNs), in particular Bidirectional Encoder Representations from Transformers (BERT), have achieved high accuracy. However, as network depth and corpus size grow, the computational overhead of BANNs increases geometrically, so reducing the scale of the training corpus has become an important research focus. This paper proposes a corpus-scale-reduction method, Concept-BERT, which consists of the following steps: first, using Formal Concept Analysis (FCA), Concept-BERT mines association rules in the corpus and reduces the corpus attributes, thereby reducing the corpus scale; second, the reduced corpus is fed into BERT to obtain the result; finally, the attention of Concept-BERT is analyzed. We evaluate Concept-BERT on sentiment analysis with CoLA, SST-2, Dianping, and Blogsenti, where it reaches accuracies of 81.1, 92.9, 77.9, and 86.7, respectively. The experimental results show that the proposed method matches the accuracy of BERT while using a smaller corpus and lower overhead, and that the reduced corpus does not affect the model's attention.
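
To make the FCA-based reduction step concrete, the following is a minimal sketch of how a corpus can be cast as a formal context (documents as objects, tokens as attributes) and shrunk by standard FCA context clarification and attribute reduction before tokenization for BERT. The toy corpus and the helper names (build_context, extent, reduce_attributes) are illustrative assumptions, not the authors' released implementation.

```python
# Sketch: FCA-style corpus attribute reduction, as a pre-processing step
# before feeding the reduced corpus to a standard BERT classifier.
# Helper names and the toy corpus are hypothetical, for illustration only.
from itertools import chain

def build_context(corpus):
    """Binary formal context: objects = documents, attributes = tokens."""
    attrs = sorted(set(chain.from_iterable(doc.split() for doc in corpus)))
    incidence = [set(doc.split()) for doc in corpus]
    return attrs, incidence

def extent(attr, incidence):
    """Extent of an attribute: the set of documents that contain it."""
    return frozenset(i for i, row in enumerate(incidence) if attr in row)

def reduce_attributes(attrs, incidence):
    extents = {a: extent(a, incidence) for a in attrs}
    # Clarification: keep one representative per distinct extent.
    seen, clarified = set(), []
    for a in attrs:
        if extents[a] not in seen:
            seen.add(extents[a])
            clarified.append(a)
    # Reduction: drop attributes whose extent equals the intersection
    # (meet) of strictly larger attribute extents, i.e. reducible ones.
    universe = frozenset(range(len(incidence)))
    kept = []
    for a in clarified:
        larger = [extents[b] for b in clarified if extents[a] < extents[b]]
        meet = frozenset.intersection(*larger) if larger else universe
        if meet != extents[a]:  # irreducible attribute: keep it
            kept.append(a)
    return kept

corpus = ["great food great service", "bad food", "great service"]
attrs, incidence = build_context(corpus)
kept = reduce_attributes(attrs, incidence)
reduced = [" ".join(t for t in doc.split() if t in kept) for doc in corpus]
print(reduced)  # ['great food great', 'bad food', 'great']
```

The reduced documents would then be tokenized and classified exactly as in a plain BERT pipeline; only the input corpus is smaller.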