Deep learning-based question answering (QA) systems have transformed natural language processing (NLP) by automating the extraction of answers from textual data. This survey provides an overview of these systems, covering methodologies, techniques, and architectures such as recurrent neural networks (RNNs) and transformer models, including BERT. Extractive and generative approaches are examined, alongside the challenges of handling complex questions, managing noisy input, and addressing rare or unseen words. The survey is intended as a reference for researchers and practitioners, supporting further innovation and advancement in question answering systems within NLP.
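To make the extractive paradigm mentioned above concrete, the sketch below shows its input/output contract: the answer is selected directly from the source passage rather than generated. Real extractive systems (e.g., BERT-based readers) score start and end token positions with a neural model; this toy version substitutes a simple word-overlap heuristic, and all names in it are hypothetical.

```python
# Toy sketch of extractive QA: the answer is a span taken from the
# context itself. A neural reader would score token positions; here a
# word-overlap heuristic picks the best-matching sentence, purely to
# illustrate the paradigm.

def extract_answer(question: str, context: str) -> str:
    """Return the context sentence with the most word overlap with the question."""
    q_words = set(question.lower().split())
    best_sentence, best_score = "", -1
    for sentence in context.split("."):
        overlap = len(q_words & set(sentence.lower().split()))
        if overlap > best_score:
            best_score, best_sentence = overlap, sentence.strip()
    return best_sentence

context = ("BERT was introduced by Google in 2018. "
           "It is pretrained on large text corpora. "
           "Fine-tuning adapts it to tasks such as question answering.")
print(extract_answer("When was BERT introduced?", context))
# → BERT was introduced by Google in 2018
```

A generative system, by contrast, would synthesize an answer string token by token, so its output need not appear verbatim in the context.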