Estimating the similarity of biomedical sentence pairs is a crucial task. Siamese neural networks (SNNs) achieve good performance on non-biomedical corpora. However, an SNN alone cannot obtain satisfactory results for biomedical text similarity evaluation due to syntactic complexity and long sentences. In this paper, cross self-attention (CSA) is proposed to design a new attention mechanism, namely self2self-attention (S2SA). The S2SA is then introduced into an SNN to construct a novel self2self-attentive Siamese neural network, namely S2SA-SNN. In the S2SA-SNN, self-attention is used to learn the different weights of words and complex syntactic features within a single sentence, while CSA learns the inherent interactive semantic information between sentences by employing self-attention instead of global attention to perform cross-attention between the two sentences. Finally, three biomedical benchmark datasets are used to evaluate the S2SA-SNN, on which it achieves Pearson correlations of 0.66 on DBMI and 0.72/0.66 on CDD-ful/-ref. The experimental results show that the S2SA-SNN achieves better performance with pre-trained word embeddings and better generalization ability than the compared methods.
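To make the mechanism described above concrete, the following is a minimal NumPy sketch of the idea of combining intra-sentence self-attention with cross self-attention between two sentences. The function names (`attention`, `s2sa`) and the exact composition are illustrative assumptions, not the authors' implementation; the paper's model additionally involves the full Siamese encoder and learned projections, which are omitted here.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(q, k, v):
    # scaled dot-product attention: each row of q attends over the rows of k/v
    scores = q @ k.T / np.sqrt(q.shape[-1])
    return softmax(scores) @ v

def s2sa(sent_a, sent_b):
    """Sketch of self2self-attention for two sentences of word embeddings.

    sent_a: (len_a, dim) embedding matrix of sentence A
    sent_b: (len_b, dim) embedding matrix of sentence B
    """
    # self-attention within each sentence learns per-word weights
    a_self = attention(sent_a, sent_a, sent_a)
    b_self = attention(sent_b, sent_b, sent_b)
    # cross self-attention: each sentence attends over the other's
    # self-attended representation (self-attention reused for cross-attention)
    a_cross = attention(a_self, b_self, b_self)
    b_cross = attention(b_self, a_self, a_self)
    return a_cross, b_cross
```

The resulting pair of representations could then be pooled and compared (e.g., by cosine similarity) in the Siamese branches; that comparison step is not shown here.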