Virtual experimentation is an important field of human-computer interaction. As more and more virtual laboratories emerge, several problems have become apparent. First, human-computer interaction during virtual experiments is inefficient: the computer cannot understand the user's intention, which leads to incorrect operations. Second, erroneous behavior during experiments is rarely detected. Third, virtual laboratories offer only a weak sense of operation and realism. To address these problems, this paper designs and implements the multimodal sensing navigation virtual and real fusion laboratory (MSNVRFL). We design a new set of experimental equipment with cognitive functions and study a multimodal fusion model and algorithm for chemical experiments, both of which are verified and applied in MSNVRFL. The multimodal fusion perception algorithm enables the system to understand the user's true intentions, improving the efficiency of human-computer interaction. Conducting virtual experiments in a virtual-real fusion mode avoids the waste of resources and the hazards that can arise during real experiments, while improving the user's sense of operation and realism. In addition, teaching navigation and reminders about incorrect operations are provided to users. The experimental results show that our method improves the efficiency of human-computer interaction, reduces the user's cognitive load, strengthens the user's sense of realism and operation, and stimulates students' interest in learning.

INDEX TERMS Multimodal fusion, virtual experiments, intelligent teaching, human-computer interaction.