Unsupervised neural machine translation is an important machine translation approach that can translate in the absence of a parallel corpus, opening new avenues for intercultural communication. However, existing unsupervised neural machine translation models still struggle with intricate grammatical relationships and linguistic structures, which leads to less-than-ideal translation quality. This study combines the Transformer architecture with syntactic knowledge to build a new unsupervised neural machine translation model that improves on existing models. The study first constructs a neural machine translation model based on the Transformer architecture, and then introduces sentence syntactic structure and several syntactic fusion techniques, yielding a Transformer combined with grammatical knowledge. The results show that the Transformer combined with grammatical knowledge, paired with a Bi-directional Long Short-Term Memory network as proposed in this research, performs best: the combined model reaches an accuracy and F1 score of 0.97 on the training dataset. In addition, the model translates real sentences within 2 s, with a translation accuracy above 0.9. In conclusion, the unsupervised neural machine translation model proposed in this study performs better than existing models, and applying it to practical translation can achieve better translation results.
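The abstract does not specify how the syntactic features are fused with the Transformer encoder; a minimal sketch of one plausible fusion scheme, assuming syntactic tags are embedded separately, encoded by a Bi-LSTM, and concatenated with the Transformer states (all names and dimensions are illustrative, not the authors' implementation), could look like this:

```python
# Hypothetical sketch of a syntax-fused encoder: a Transformer over token
# embeddings plus a Bi-LSTM over syntactic feature embeddings, with the
# two representations concatenated and projected back to d_model.
import torch
import torch.nn as nn

class SyntaxFusedEncoder(nn.Module):
    def __init__(self, d_model=64, n_heads=4, n_layers=2, syn_dim=16):
        super().__init__()
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.transformer = nn.TransformerEncoder(layer, n_layers)
        # Bi-LSTM over embedded syntactic annotations (e.g. dependency labels);
        # hidden size d_model // 2 so the bidirectional output matches d_model.
        self.syn_lstm = nn.LSTM(syn_dim, d_model // 2,
                                batch_first=True, bidirectional=True)
        self.fuse = nn.Linear(2 * d_model, d_model)

    def forward(self, tok_emb, syn_emb):
        h_tok = self.transformer(tok_emb)   # (batch, seq, d_model)
        h_syn, _ = self.syn_lstm(syn_emb)   # (batch, seq, d_model)
        # Concatenate token and syntactic views, project to d_model.
        return self.fuse(torch.cat([h_tok, h_syn], dim=-1))

enc = SyntaxFusedEncoder()
out = enc(torch.randn(2, 5, 64), torch.randn(2, 5, 16))
print(tuple(out.shape))  # (2, 5, 64)
```

The concatenation-then-projection step is only one of the "various syntactic fusion techniques" the abstract mentions; alternatives such as summing the two representations or attending from token states to syntactic states would fit the same interface.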