Oral communication is a key skill for language learners and an important measure of a person's overall competence. As a widely used language, English has become an important communicative medium worldwide, and oral English teaching is an important part of the English learning process and an emerging topic in education research. However, the traditional oral English teaching model has shown no significant effect on improving oral English performance. This paper therefore studies a blended oral English teaching model guided by core competencies. The connotation of the core literacy of oral English is analyzed, and a general framework for the oral English core literacy system is constructed. The relationships among the four elements of English subject core literacy are then examined based on a classification of spoken English teaching resources. Next, spoken English teaching resources are integrated, and an oral English knowledge ontology and an analysis of Chinese-Western cultural differences are explored. Finally, a task-based blended oral English teaching model is developed. Experimental results show a significant difference in students' oral English levels before and after the experiment, indicating that the model can improve the overall quality and performance of English learning.
Neural machine translation (NMT) has brought exciting progress to machine translation since its emergence. However, because NMT relies on a single neural network to convert between natural languages, it has two drawbacks: it is more sensitive to sentence length than statistical machine translation, and its end-to-end design fails to make explicit use of linguistic knowledge to improve translation quality. In this study, network models for several deep learning machine translation tasks were constructed and compared in the English-Chinese bilingual direction, and the defects of each network were addressed with an attention mechanism. Recurrent neural networks are prone to vanishing and exploding gradients over long-distance sequences, and long short-term memory networks cannot reflect the relative importance of information in such sequences. Through a comparison of examples, this study concludes that introducing an attention mechanism improves the model's attention to contextual information when generating the target-language sequence, yielding translations with higher fidelity and fluency. The study further proposes an NMT method based on a divide-and-conquer strategy: it identifies and extracts the longest noun phrase in a sentence, replacing it with a special identifier (or retaining its core word) so that the remainder forms a sentence frame. Translating the longest noun phrase and the sentence frame separately with the NMT system and then recombining the translations alleviates the poor performance of NMT on long sentences. Experimental results show that the BLEU score of translations obtained by the proposed method improves by 0.89 over the baseline.
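The divide-and-conquer pipeline described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the noun-phrase candidates are assumed to come from an external parser (here they are passed in directly), `translate` stands in for any NMT system, and the `<NP1>` placeholder token is a hypothetical choice of special identifier.

```python
# Sketch of divide-and-conquer translation for long sentences:
# extract the longest noun phrase, translate phrase and sentence
# frame separately, then substitute the phrase translation back.

PLACEHOLDER = "<NP1>"  # hypothetical special identifier kept in the frame

def extract_longest_np(sentence, noun_phrases):
    """Pick the longest candidate noun phrase present in the sentence.

    In a real system the candidates would come from a syntactic parser;
    here they are supplied by the caller.
    """
    candidates = [np for np in noun_phrases if np in sentence]
    if not candidates:
        return None, sentence
    longest = max(candidates, key=len)
    # Replace the phrase with a placeholder to form the sentence frame.
    frame = sentence.replace(longest, PLACEHOLDER, 1)
    return longest, frame

def divide_and_conquer_translate(sentence, noun_phrases, translate):
    """Translate phrase and frame independently, then recombine."""
    phrase, frame = extract_longest_np(sentence, noun_phrases)
    if phrase is None:
        return translate(sentence)
    # The placeholder survives translation unchanged in this sketch;
    # a real NMT system would need to be trained to copy it through.
    return translate(frame).replace(PLACEHOLDER, translate(phrase), 1)

if __name__ == "__main__":
    sentence = "the old man who lives next door bought a car"
    noun_phrases = ["the old man who lives next door", "a car"]
    # Toy stand-in for an NMT system: uppercasing, which leaves <NP1> intact.
    print(divide_and_conquer_translate(sentence, noun_phrases, str.upper))
```

The split shortens what the NMT system sees at once, which is the mechanism the abstract credits for the BLEU gain on long sentences; the recombination step assumes the placeholder is copied through translation unchanged.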