Extended reality is an area of immersive storytelling undergoing rapid evolution. This emergent mode of experience employs spatial mapping, together with plane and object detection, to superimpose computer-generated imagery within the volumetric context of a physical space via a head-mounted display. This in turn produces a unique set of challenges and opportunities for the associated audio implementation and aesthetics. Creative development of this audio is often a function of evolving toolsets, and the associated workflow is far from standardized. This paper establishes a context for such audio workflows, drawing from precursor technologies such as audio for games and virtual reality, and develops this into an outline taxonomy that is both representative of the state of the art and forward-facing towards the evolution of the technology stack. The context is framed through a series of case studies. Between 2019 and 2021, the BBC and Oculus TV commissioned Alchemy Immersive and Atlantic Productions to produce virtual reality and mixed reality experiences of several classic documentary series by Sir David Attenborough: Museum Alive, Micro Monsters, First Life VR, Museum Alive AR and Kingdom of Plants. This portfolio received numerous award nominations and prizes, including recognition at the Raindance Festival and a double Emmy. The sound design, audio post-production and spatial audio for these experiences were implemented by the company 1.618 Digital; drawing from this first-hand creator involvement, the workflows are deconstructed and explored with reference to tools, technologies, techniques and perception. This exposition forms the basis for an analysis of both the case-study practice and broader creative practice in the field of audio for extended reality, which is subsequently used to present a speculative vision of audio in the future of immersive storytelling.