Techniques that map the entities and relations of a knowledge graph (KG) into a low-dimensional continuous space are called KG embedding or knowledge representation learning. However, most existing techniques learn the embeddings from the facts in the KG alone and thus suffer from the incompleteness and sparseness of KGs. Recently, research on incorporating textual information into KG embedding has attracted much attention because of the rich semantic information that texts supply. In this paper, we therefore present a survey of techniques for KG embedding with textual information. Firstly, we introduce techniques for encoding textual information to represent entities and relations, from the perspectives of encoding models and scoring functions, respectively. Secondly, we summarize methods for incorporating textual information into existing embedding techniques. Thirdly, we discuss the training procedures of KG embedding techniques that use textual information. Finally, we explore applications of KG embedding with textual information in specific tasks such as KG completion in the zero-shot scenario, multilingual entity alignment, relation extraction, and recommender systems. We hope that this survey will provide researchers with insights into textual information based KG embedding.

INDEX TERMS Knowledge graph embedding, textual information, text-based embedding, text-improved embedding, embedding-based applications.