With the ever-increasing potential of AI to perform personalised tasks, it is becoming essential to develop machine learning techniques that are data-efficient and do not require hundreds or thousands of training examples. In this paper, we explore an Inductive Logic Programming approach to one-shot text classification. In particular, we explore the framework of Meta-Interpretive Learning (MIL), combined with common-sense background knowledge extracted from ConceptNet. Results indicate that MIL can learn text classification rules from a small number of training examples, even a single one. Moreover, the higher the complexity of the example chosen for one-shot learning, the higher the accuracy of the resulting classifier. Finally, we apply two evaluation approaches, Background Knowledge Splitting and Average One-Shot Learning, to assess our model on the public News Category dataset. The results show that MIL outperforms a Siamese network for one-shot learning from text.