Prerequisite relations among concepts are crucial for educational applications. However, it is difficult to automatically extract domain-specific concepts and learn the prerequisite relations among them without labeled data. In this paper, we first extract high-quality phrases from a set of educational data and identify domain-specific concepts with a graph-based ranking method. We then propose an iterative prerequisite relation learning framework, called iPRL, which combines a learning-based model and a recovery-based model to leverage both concept-pair features and the dependencies among learning materials. In experiments, we evaluate our approach on two real-world datasets, a Textbook Dataset and a MOOC Dataset, and show that it achieves better performance than existing methods. Finally, we illustrate our approach with several examples.
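To make the graph-based ranking step more concrete, the sketch below scores candidate concept phrases with PageRank over a document co-occurrence graph. This is only an illustrative assumption of one such ranking scheme; the graph construction, edge weighting, and the `rank_concepts` helper are not taken from the paper and the actual method may differ.

```python
# Illustrative sketch (assumption): rank candidate concept phrases by PageRank
# on a phrase co-occurrence graph. Not the paper's exact ranking method.
from itertools import combinations

import networkx as nx


def rank_concepts(documents, candidate_phrases, top_k=10):
    """Score candidate phrases by PageRank on a co-occurrence graph."""
    graph = nx.Graph()
    graph.add_nodes_from(candidate_phrases)
    for doc in documents:
        present = [p for p in candidate_phrases if p in doc]
        # Connect phrases that co-occur in the same document; accumulate weights.
        for a, b in combinations(present, 2):
            weight = graph[a][b]["weight"] + 1 if graph.has_edge(a, b) else 1
            graph.add_edge(a, b, weight=weight)
    scores = nx.pagerank(graph, weight="weight")
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)[:top_k]


if __name__ == "__main__":
    docs = [
        "a graph is a set of vertices and edges",
        "depth first search traverses a graph using a stack",
        "a stack is a last-in first-out data structure",
    ]
    candidates = ["graph", "vertices", "edges", "depth first search", "stack"]
    for phrase, score in rank_concepts(docs, candidates):
        print(f"{phrase}: {score:.3f}")
```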