Virtual reality (VR) interaction safety is a prerequisite for all user activities in a virtual environment. While seeking a deep sense of immersion with little concern for surrounding obstacles, users may have limited ability to perceive the real-world space, resulting in possible collisions with real-world objects. Existing rendering techniques such as the Chaperone can provide safety boundaries to users, but they confine users to a small static space and lack immediacy. To solve this problem, we propose a dynamic approach based on user motion prediction, named SCARF, which uses Spearman's correlation analysis, rule learning, and few-shot learning to predict user movements in specific VR tasks. Specifically, we study the relationship between user characteristics, human motion, and categories of VR tasks, and provide an approach that uses biomechanical analysis to define the interaction space in VR dynamically. We report on a user study with 58 volunteers and establish a three-dimensional kinematic dataset from a VR game. The experiments validate that our few-shot learning model is effective and improves the performance of motion prediction. Finally, we implement SCARF in a VR environment for dynamic safety-boundary adjustment.
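
As a rough illustration of the correlation-screening step mentioned above (not the paper's actual pipeline), the minimal sketch below ranks how strongly hypothetical user characteristics relate to observed motion extent using Spearman's correlation; the file name and column names are assumptions for illustration only.

```python
# Minimal sketch, assuming a per-user summary table exists.
# Column names and the CSV file are hypothetical placeholders,
# not part of the SCARF dataset described in the paper.
import pandas as pd
from scipy.stats import spearmanr

# Hypothetical per-user table: characteristics plus the maximum reach
# (in metres) observed for a given VR task category.
df = pd.read_csv("user_motion_summary.csv")

traits = ["height_cm", "arm_span_cm", "age", "vr_experience_hours"]
for trait in traits:
    # Spearman's rank correlation between a user trait and motion extent.
    rho, p = spearmanr(df[trait], df["max_reach_m"])
    print(f"{trait:>20s}  rho={rho:+.2f}  p={p:.3f}")
```

Traits with strong, significant correlations would be candidate inputs for the kind of rule learning and few-shot motion prediction the abstract describes.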