In this study, we propose a new hyper-realistic holographic display system that simultaneously presents holographic three-dimensional (3D) images and augmented reality (AR) images. The proposed display combines 360-degree computer-generated hologram (CGH) 3D content reconstructed from a spatial light modulator (SLM) with AR content (a 2D image) spatially projected from a microdisplay, allowing users to watch clearly blended content without crosstalk. To validate the hyper-realistic 3D characteristics of the proposed display at various depths, a ruler-based spatial depth measurement method was used to show that images appearing in real 3D space are clearly reconstructed at different depths. Furthermore, we demonstrate through numerical and optical reconstruction experiments that CGH content synthesized with a deep learning model, which extracts high-precision depth maps from RGB color images, can be successfully applied to the proposed display system. The new hybrid display thus provides both free depth expression and clear hyper-realistic 3D expression, maximizing 3D realism and natural immersion for users. Moreover, the accommodation effect and free depth-control characteristics demonstrated in the proposed system allow viewers to enjoy hyper-realistic 3D content comfortably, implying that eye fatigue may be alleviated even when watching metaverse content for extended periods.
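As a concrete illustration of the RGB-to-depth-to-CGH pipeline summarized above, the minimal sketch below pairs a publicly available monocular depth estimator (MiDaS, used here only as a stand-in for the paper's deep learning model, which is not specified in this section) with a layer-based angular spectrum method that back-propagates each depth layer to the SLM plane to form a phase-only CGH. The input file name, wavelength, pixel pitch, depth range, and layer count are illustrative assumptions, not values from this work.

```python
# Sketch only: MiDaS stands in for the paper's depth network; the layer-based
# angular spectrum CGH synthesis is one common approach, not necessarily the
# authors' exact method. All physical parameters below are hypothetical.
import cv2
import numpy as np
import torch

def angular_spectrum(field, wavelength, pitch, z):
    """Propagate a complex field by distance z using the angular spectrum method."""
    ny, nx = field.shape
    fx = np.fft.fftfreq(nx, d=pitch)
    fy = np.fft.fftfreq(ny, d=pitch)
    FX, FY = np.meshgrid(fx, fy)
    arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    H = np.where(arg > 0,
                 np.exp(1j * 2 * np.pi / wavelength * z * np.sqrt(np.maximum(arg, 0.0))),
                 0)  # evanescent components suppressed
    return np.fft.ifft2(np.fft.fft2(field) * H)

# 1. Depth estimation from a single RGB image (MiDaS small model via torch.hub).
device = "cuda" if torch.cuda.is_available() else "cpu"
midas = torch.hub.load("intel-isl/MiDaS", "MiDaS_small").to(device).eval()
transform = torch.hub.load("intel-isl/MiDaS", "transforms").small_transform

img = cv2.cvtColor(cv2.imread("scene.png"), cv2.COLOR_BGR2RGB)  # hypothetical input file
with torch.no_grad():
    pred = midas(transform(img).to(device))
    depth = torch.nn.functional.interpolate(
        pred.unsqueeze(1), size=img.shape[:2], mode="bicubic", align_corners=False
    ).squeeze().cpu().numpy()
depth = (depth - depth.min()) / (depth.max() - depth.min() + 1e-9)  # normalise to [0, 1]

# 2. Layer-based CGH synthesis (illustrative SLM and depth parameters).
wavelength, pitch = 532e-9, 3.6e-6          # green laser, SLM pixel pitch [m] (assumed)
z_near, z_far, n_layers = 0.10, 0.30, 8     # reconstruction depth range [m] (assumed)

amplitude = cv2.cvtColor(img, cv2.COLOR_RGB2GRAY).astype(np.float64) / 255.0
layer_idx = np.minimum((depth * n_layers).astype(int), n_layers - 1)

hologram = np.zeros(amplitude.shape, dtype=complex)
rng = np.random.default_rng(0)
for k in range(n_layers):
    mask = layer_idx == k
    if not mask.any():
        continue
    # Random initial phase spreads each object point's light over the SLM aperture.
    layer = amplitude * mask * np.exp(1j * 2 * np.pi * rng.random(amplitude.shape))
    z_k = z_near + (z_far - z_near) * k / (n_layers - 1)
    hologram += angular_spectrum(layer, wavelength, pitch, -z_k)  # back-propagate to SLM plane

phase_cgh = (np.angle(hologram) + np.pi) / (2 * np.pi)  # phase-only CGH mapped to [0, 1]
cv2.imwrite("cgh_phase.png", (phase_cgh * 255).astype(np.uint8))
```

Numerically reconstructing the result amounts to forward-propagating `np.exp(1j * 2 * np.pi * phase_cgh)` back to each layer distance with the same `angular_spectrum` routine and inspecting which regions come into focus, mirroring the depth-dependent reconstruction check described above.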