Our research investigates early face-body perception in socially anxious individuals during social interaction with a virtual agent (VA) in VR, using EEG. The VAs' expressiveness is manipulated through facial animations portraying realistic positive, negative, or neutral expressions; these animations are recorded via real-time facial performance capture and validated in advance. Wearing a head-mounted display (HMD), VR controllers with tactile feedback, and a mobile EEG system, participants will interact with individual VAs in a virtual office setting (an employee meeting their employer with a handshake), a scenario known to elicit anxiety. Behavioural, physiological, and EEG data will be analysed to reveal the effect of emotional valence on early face-body perception during social interaction. The study also provides a framework for synchronised multimodal data recording and analysis.
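As one illustration of the kind of synchronised analysis such a framework supports, the sketch below cuts fixed-length EEG epochs around event-marker timestamps (e.g. the onset of a VA's expression) and baseline-corrects them. All function names, parameters, and the synthetic data are hypothetical, not taken from the study's actual pipeline:

```python
import numpy as np

def epoch_eeg(eeg, srate, event_times, tmin=-0.2, tmax=0.6):
    """Cut fixed-length epochs from continuous EEG around event markers.

    eeg: (n_channels, n_samples) continuous signal
    srate: sampling rate in Hz
    event_times: marker onsets in seconds, on the EEG clock
    Returns an array of shape (n_events, n_channels, n_epoch_samples).
    """
    start = int(round(tmin * srate))
    stop = int(round(tmax * srate))
    epochs = []
    for t in event_times:
        onset = int(round(t * srate))
        a, b = onset + start, onset + stop
        if a < 0 or b > eeg.shape[1]:
            continue  # skip events too close to the recording edges
        seg = eeg[:, a:b]
        # baseline-correct each epoch using the pre-stimulus interval
        baseline = seg[:, :-start].mean(axis=1, keepdims=True) if start < 0 else 0.0
        epochs.append(seg - baseline)
    return np.stack(epochs)

# Synthetic demo: 4 channels, 10 s at 250 Hz, three hypothetical markers.
srate = 250
eeg = np.random.default_rng(0).standard_normal((4, 10 * srate))
epochs = epoch_eeg(eeg, srate, event_times=[2.0, 5.0, 8.0])
print(epochs.shape)  # (3, 4, 200): 3 events, 4 channels, 800 ms at 250 Hz
```

In practice the marker timestamps would come from the VR application via a shared clock (hardware triggers or a streaming layer), which is precisely the synchronisation problem the proposed framework addresses.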