Figure 1: The equivalent source technique accurately models realistic acoustic effects, such as diffraction, scattering, focusing, and echoes, in large, open scenes. It reduces the runtime memory usage by orders of magnitude compared to state-of-the-art wave solvers, enabling real-time, wave-based sound propagation in scenes spanning hundreds of meters: a) reservoir scene (Half-Life 2), b) Christmas scene, and c) desert scene.
Abstract

Realistic sound effects are extremely important in VR to improve the sense of realism and immersion. Sound augments the user's visual sense and can help reduce simulation fatigue. It can provide 3D spatial cues outside the field of view and help create high-fidelity VR training simulations. Current sound propagation techniques are based on heuristic approaches or simple line-of-sight geometric techniques. These techniques cannot capture important sound effects such as diffraction, interference, and focusing. VR applications therefore require high-fidelity, accurate sound propagation, and modeling it accurately requires interactive wave-based propagation techniques. We present a set of efficient approaches to model wave-based sound propagation for VR applications that can handle large scenes and directional sound sources and generate spatial sound for a moving listener. Our technique has been integrated into Valve's Source game engine, and we use it to demonstrate realistic acoustic effects such as diffraction, high-order reflections, interference, directivity, and spatial sound in complex scenarios.