This paper proposes a method that both detects out-of-distribution (OOD) data and generates in-distribution samples. To this end, a variational autoencoder (VAE) is augmented with a memory module: the VAE is trained on normal data, and the memory stores prototypical patterns of the normal data distribution. At test time, the encoder maps the input to a latent encoding, which serves as a query to retrieve related memory items; the retrieved items are combined with the encoding and passed to the decoder for reconstruction. Normal inputs reconstruct well and yield low reconstruction error, whereas OOD inputs yield high reconstruction error because their encodings are effectively replaced by retrieved normal patterns. While prior work pairs memory modules with plain autoencoders for OOD detection, adopting a VAE additionally provides generative capability. Experiments on CIFAR-10 and MNIST show that the memory-augmented VAE consistently outperforms the baseline, especially when the OOD data resembles normal patterns; this improvement is attributed to the richer latent-space representation provided by the VAE. Overall, the memory-equipped VAE framework is effective at both identifying OOD inputs and generating new in-distribution samples.
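The retrieval step described above can be sketched as follows. This is a minimal, illustrative sketch, not the paper's implementation: it assumes soft addressing via a softmax over cosine similarities (a common choice for memory modules), a toy two-slot memory, and a temperature parameter; all names are hypothetical.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def retrieve(memory, query, temperature=1.0):
    # Cosine similarity between the query encoding and each memory slot.
    mem_norm = memory / np.linalg.norm(memory, axis=1, keepdims=True)
    q_norm = query / np.linalg.norm(query)
    weights = softmax(mem_norm @ q_norm / temperature)
    # Soft-addressed readout: a convex combination of stored normal patterns.
    return weights @ memory

# Toy "memory" holding two prototypical normal encodings.
memory = np.array([[1.0, 0.0],
                   [0.0, 1.0]])

# A query near a stored pattern is retrieved almost unchanged, so the
# decoder sees a faithful encoding and reconstruction error stays low.
normal_q = np.array([0.9, 0.1])
z_normal = retrieve(memory, normal_q, temperature=0.1)

# An off-manifold (OOD) query is pulled back toward the normal prototypes,
# so decoding it produces a poor reconstruction and a high error score.
ood_q = np.array([-1.0, -1.0])
z_ood = retrieve(memory, ood_q, temperature=0.1)
```

Here the distance between the query and its retrieved encoding is small for the normal input and large for the OOD input, which is the mechanism that drives the reconstruction-error gap used for detection.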