Revealing the functioning of compound eyes is of interest to biologists and engineers alike, who wish to understand how visually complex behaviours (e.g. detection, tracking, navigation) arise in nature and to abstract concepts for the development of novel artificial sensory systems. A key investigative method is to replicate the sensory apparatus with artificial systems, allowing investigation of the visual information that drives animal behaviour when an animal is exposed to environmental cues. To date, 'Compound Eye Models' (CEMs) have largely explored features such as field of view and angular resolution, but the role of eye shape and overall structure has been largely overlooked due to modelling complexity. Modern real-time ray-tracing technologies are enabling the construction of a new generation of computationally fast, high-fidelity CEMs. This work introduces a new open-source CEM software package (CompoundRay) that is capable of accurately rendering the visual perspective of bees (6,000 individual ommatidia arranged on two realistic eye surfaces) at over 3,000 frames per second. We show how the speed and accuracy afforded by this software can be used to investigate pressing research questions (e.g. how low-resolution compound eyes can localise small objects) using modern methods (e.g. machine learning-based information exploration).