In realistic steady-state visual evoked potential (SSVEP) based brain-computer interface (BCI) applications, such as driving a car or controlling a quadrotor, the user must observe the surrounding environment while simultaneously gazing at the stimulus. Such applications inevitably involve head movements and shifts of the accompanying gaze fixation point, which may affect the SSVEP response and the BCI's performance. However, few studies have examined the effects of head movements and gaze-fixation switching on the SSVEP response and the corresponding BCI performance. This study explored these effects using a new ball-tracking paradigm in a virtual reality (VR) environment with two moving tasks (following and free moving) and three moving patterns (pitch, yaw, and static). Sixteen subjects were recruited for a BCI VR experiment. Offline data analysis showed that head-moving patterns produced significantly different BCI decoding performance [F(2, 30) = 9.369, p = 0.001, effect size = 0.384], whereas the moving tasks had no significant effect [F(1, 15) = 3.484, p = 0.082, effect size = 0.188]. In addition, the canonical correlation analysis (CCA) and filter bank canonical correlation analysis (FBCCA) methods achieved higher accuracy than the power spectral density analysis (PSDA) and minimum energy combination (MEC) methods under all conditions. These results imply that head movement can significantly affect SSVEP performance, but that switching gaze fixation to interact with the surroundings remains feasible in a realistic BCI application.
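For readers unfamiliar with the decoding methods compared above, the following is a minimal sketch of standard CCA-based SSVEP frequency detection, not the authors' exact pipeline: each candidate stimulus frequency is represented by sine/cosine reference signals (including harmonics), and the frequency whose references correlate most strongly with the multichannel EEG is selected. All function names, the sampling rate, and the synthetic data below are illustrative assumptions.

```python
import numpy as np

def cca_max_corr(X, Y):
    """Largest canonical correlation between two sample-by-feature
    matrices, computed via QR decomposition plus SVD."""
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    Qx, _ = np.linalg.qr(X)
    Qy, _ = np.linalg.qr(Y)
    # Singular values of Qx^T Qy are the canonical correlations.
    return np.linalg.svd(Qx.T @ Qy, compute_uv=False)[0]

def ssvep_references(freq, fs, n_samples, n_harmonics=3):
    """Sine/cosine reference matrix for one stimulus frequency."""
    t = np.arange(n_samples) / fs
    cols = []
    for h in range(1, n_harmonics + 1):
        cols.append(np.sin(2 * np.pi * h * freq * t))
        cols.append(np.cos(2 * np.pi * h * freq * t))
    return np.column_stack(cols)

def classify_ssvep(eeg, fs, stim_freqs, n_harmonics=3):
    """Return the stimulus frequency whose reference signals have the
    highest canonical correlation with the EEG (samples x channels)."""
    scores = [cca_max_corr(eeg, ssvep_references(f, fs, len(eeg), n_harmonics))
              for f in stim_freqs]
    return stim_freqs[int(np.argmax(scores))]

# Demo: a synthetic 12 Hz SSVEP response in 8-channel noisy EEG.
fs, n = 250, 500  # 250 Hz sampling, 2-second epoch (assumed values)
t = np.arange(n) / fs
rng = np.random.default_rng(0)
eeg = (np.outer(np.sin(2 * np.pi * 12 * t), np.ones(8))
       + 0.5 * rng.standard_normal((n, 8)))
print(classify_ssvep(eeg, fs, [8.0, 10.0, 12.0, 15.0]))  # → 12.0
```

FBCCA extends this idea by band-pass filtering the EEG into several sub-bands, computing CCA in each, and combining the weighted correlations, which typically improves accuracy by exploiting the harmonic structure of the SSVEP.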