Deep neural networks have achieved many breakthroughs in recent years. However, traditional gradient-based networks require large amounts of data to train, and when new data is encountered they must inefficiently relearn their parameters to incorporate the new information without major interference. Neural Turing Machines (NTMs), which have augmented memory capacities, offer the ability to quickly encode and retrieve new information, and hence can potentially overcome these drawbacks of conventional models. Here, we demonstrate the ability of a memory-augmented neural network (MANN) to rapidly assimilate new data and to use that data to make accurate predictions after only a few samples (one-shot learning). We also introduce a new method for accessing an external memory that focuses on memory content (content-based addressing), unlike previous methods that additionally use location-based focusing mechanisms. Finally, we compare the accuracy and training loss of the MANN with those of an LSTM (Long Short-Term Memory) network to show which model is the better choice.
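The content-based addressing mentioned above can be sketched as follows. This is a minimal NumPy illustration, not the implementation used here: the read weights are a softmax over cosine similarities between a query key and each memory row, and the sharpening parameter `beta`, the toy memory, and the key are assumptions for the example.

```python
import numpy as np

def cosine_similarity(key, memory):
    # Similarity between a query key of shape (d,) and each memory row (N, d).
    key_norm = key / (np.linalg.norm(key) + 1e-8)
    mem_norm = memory / (np.linalg.norm(memory, axis=1, keepdims=True) + 1e-8)
    return mem_norm @ key_norm

def content_based_read(key, memory, beta=10.0):
    # Softmax over scaled similarities gives the read weights;
    # the read vector is the weighted sum of the memory rows.
    scores = beta * cosine_similarity(key, memory)
    scores -= scores.max()  # subtract max for numerical stability
    weights = np.exp(scores) / np.exp(scores).sum()
    return weights, weights @ memory

# Hypothetical memory of 4 slots holding 3-dimensional contents.
memory = np.array([[1.0, 0.0, 0.0],
                   [0.0, 1.0, 0.0],
                   [0.0, 0.0, 1.0],
                   [0.7, 0.7, 0.0]])
key = np.array([1.0, 0.1, 0.0])
weights, read_vector = content_based_read(key, memory)
```

Because the key is closest in direction to the first memory row, most of the read weight falls on that slot; location-based mechanisms would instead shift or interpolate these weights by memory index.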