Modeling the sequential dynamics of user behaviour is central to modern sequential recommender systems, as it allows them to capture the 'context' of a user's activities. Markov-chain approaches assume that a user's next action depends only on the current action and is independent of all earlier actions in the sequence; RNNs, by contrast, can capture longer-range dependencies in user sequences. In recent years, advances in deep learning, especially attention mechanisms and the architectures built on them, have made it possible to learn user behaviour sequences more effectively. Despite this progress, Markov chains, RNNs, and attention-based models all encode the behaviour sequence strictly in its original order. This unidirectional structure limits the representation of user behaviour sequences and does not reflect how preferences form in practice. These models also share a further drawback: they idealize the problem by assuming that a user's global preference can be learned from the behaviour sequence alone, whereas in fact even attention mechanisms mainly extract the user's short-term interests.

To address these issues, we propose S-BERT, a model based on BERT and further adapted to the recommendation task. S-BERT applies deep bidirectional self-attention to the user's behaviour sequence to learn short-term preferences, and adds the user's own embedding to capture long-term preferences.

On three real-world datasets, S-BERT outperforms BERT4Rec, a comparable BERT-based model, by about 2% in NDCG@10.
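The core design can be illustrated with a minimal sketch: bidirectional self-attention over the item sequence (every position attends to all positions, unlike a left-to-right model) produces a short-term representation, and a learned user embedding is added on top to inject long-term preference before scoring candidate items. All dimensions, the single-head attention, and the additive combination below are illustrative assumptions, not the paper's exact architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes for illustration only.
n_items, d = 100, 16
item_emb = rng.normal(size=(n_items, d))   # shared item embedding table
user_emb = rng.normal(size=(d,))           # this user's own embedding (long-term preference)

seq = np.array([3, 17, 42, 8])             # the user's behaviour sequence (item ids)
X = item_emb[seq]                          # (L, d) sequence representation

def bidirectional_self_attention(X):
    """Single-head self-attention with NO causal mask:
    each position attends to the whole sequence, left and right."""
    scores = X @ X.T / np.sqrt(X.shape[1])
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)   # softmax over positions
    return weights @ X

H = bidirectional_self_attention(X)        # contextualized sequence states
h_short = H[-1]                            # short-term preference at the last position
h_user = h_short + user_emb                # add user embedding for long-term preference

logits = item_emb @ h_user                 # score every candidate next item
top10 = np.argsort(-logits)[:10]           # ranked candidates, e.g. for NDCG@10
print(top10.shape)                         # (10,)
```

In a real implementation the attention block would be multi-head, multi-layer, and trained with a masked-item objective as in BERT; the sketch only shows why the bidirectional encoding and the user embedding contribute complementary (short-term vs. long-term) signals.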