The capability to incrementally learn new classes and to learn from only a few examples is a hallmark of human intelligence, and endowing practical recognition systems with this ability is crucial. In this paper, we therefore conduct pioneering work on a challenging yet practical problem, Semi-Supervised Few-Shot Class-Incremental Learning (SSFSCIL), which requires CNN models to incrementally learn new classes from very few labeled samples and a large number of unlabeled samples, without forgetting the previously learned ones. To address this problem, we propose a simple and efficient solution that learns novel categories in a semi-supervised manner via a self-training strategy and avoids catastrophic forgetting through distillation-based methods. Extensive experiments on the CIFAR100, miniImageNet, and CUB200 datasets demonstrate the promising performance of the proposed method and establish baselines for this new research direction.
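To make the two ingredients concrete, the sketch below illustrates one incremental session combining self-training with distillation; it is a minimal illustration, not the paper's exact method, and every name (`model`, `old_model`, the loaders, `conf_thresh`, `tau`, `lam`) is an assumed placeholder.

```python
# Minimal sketch of one SSFSCIL incremental session (illustrative, not the
# paper's implementation): self-training pseudo-labels unlabeled samples,
# and a distillation term on the old classes' logits mitigates forgetting.
import copy
import torch
import torch.nn.functional as F

def incremental_session(model, labeled_loader, unlabeled_loader,
                        optimizer, num_old_classes,
                        conf_thresh=0.9, tau=2.0, lam=1.0, epochs=10):
    # Freeze a copy of the previous-session model to serve as the teacher.
    old_model = copy.deepcopy(model).eval()
    for p in old_model.parameters():
        p.requires_grad_(False)

    for _ in range(epochs):
        for (x_l, y_l), (x_u, _) in zip(labeled_loader, unlabeled_loader):
            # --- Self-training: pseudo-label confident unlabeled samples ---
            with torch.no_grad():
                probs = F.softmax(model(x_u), dim=1)
                conf, pseudo_y = probs.max(dim=1)
                mask = conf >= conf_thresh  # keep high-confidence ones only

            logits_l = model(x_l)
            loss = F.cross_entropy(logits_l, y_l)
            if mask.any():
                loss = loss + F.cross_entropy(model(x_u[mask]), pseudo_y[mask])

            # --- Distillation: match the teacher on the old classes ---
            with torch.no_grad():
                t_logits = old_model(x_l)[:, :num_old_classes]
            s_logits = logits_l[:, :num_old_classes]
            kd = F.kl_div(F.log_softmax(s_logits / tau, dim=1),
                          F.softmax(t_logits / tau, dim=1),
                          reduction="batchmean") * tau * tau
            loss = loss + lam * kd

            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
    return model
```

In a sketch of this form, `conf_thresh` trades pseudo-label coverage against noise, while `lam` balances plasticity on the novel classes against stability on the previously learned ones.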