Human sensory processing is sensitive to the proximity of stimuli to the body. It is therefore plausible that these perceptual mechanisms also modulate the detectability of content in VR, depending on its location. We evaluate this in a user study and further explore the impact of the user's representation during interaction, as well as how embodiment and motor performance are influenced by these factors. In a dual-task paradigm, participants executed a motor task through virtual hands, virtual controllers, or a keyboard while simultaneously detecting visual stimuli appearing at different locations. We found that, while actively performing a motor task in the virtual environment, participants detected additional visual stimuli more reliably when these were presented near their body. This effect is independent of how the user is represented and occurs only when the user is also engaged in a secondary task. We further found improved motor performance and increased embodiment when participants interacted through virtual controllers and hands in VR, compared to interacting with a keyboard. This study contributes to a better understanding of the detectability of visual content in VR as a function of its location in the virtual environment, as well as of the impact of different user representations on information processing, embodiment, and motor performance.