Multi-hop knowledge graph question answering (KGQA) is a challenging task because it requires reasoning over multiple edges of the knowledge graph (KG) to arrive at the correct answer. Moreover, KGs are often incomplete, with many missing links, which poses additional challenges for multi-hop KGQA. Recent research on multi-hop KGQA has attempted to address KG sparsity by incorporating relevant external text. In this work, we propose a multi-hop KGQA model based on relation knowledge enhancement (RKE-KGQA), which fuses label relations and text relations through global attention to augment relation knowledge. The relation between two entities can be represented either by a label in the knowledge graph or by text in a corpus, and multi-hop KGQA must hop across entities through such relations. Our model first assigns an activation probability to each entity, then computes a score for each enhanced relation, transfers the scores through the activated relations, and finally obtains the answer. We carry out extensive experiments on three datasets and demonstrate that RKE-KGQA achieves superior results.
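The pipeline described above — fuse label and text relation scores by attention, then propagate entity activation across scored relations hop by hop — can be sketched as follows. This is a minimal illustrative sketch, not the paper's actual model: the fusion rule, the propagation update, and all function and variable names are assumptions introduced here for exposition.

```python
import numpy as np

def fuse_relation_scores(label_score, text_score, attn_logits):
    """Fuse a label-relation score and a text-relation score with a
    softmax attention over the two sources (hypothetical fusion rule)."""
    w = np.exp(attn_logits) / np.sum(np.exp(attn_logits))
    return w[0] * label_score + w[1] * text_score

def propagate(activation, edges, relation_scores, hops):
    """Transfer entity activation across scored relations for a fixed
    number of hops. edges: (head, tail, relation) triples over indices."""
    for _ in range(hops):
        nxt = np.zeros_like(activation)
        for h, t, r in edges:
            nxt[t] += activation[h] * relation_scores[r]
        # renormalize so activation stays a probability-like distribution
        total = nxt.sum()
        activation = nxt / total if total > 0 else nxt
    return activation

# Toy KG: 4 entities, 2 relations; entity 0 is the question entity.
edges = [(0, 1, 0), (1, 2, 1), (0, 3, 1)]
relation_scores = np.array([
    fuse_relation_scores(0.9, 0.4, np.array([1.0, 0.0])),  # relation 0
    fuse_relation_scores(0.2, 0.7, np.array([0.0, 1.0])),  # relation 1
])
activation = np.array([1.0, 0.0, 0.0, 0.0])
answer = int(np.argmax(propagate(activation, edges, relation_scores, hops=2)))
print(answer)
```

After two hops, all remaining activation sits on the entity reachable by two activated relations from the question entity, and the answer is read off with an argmax, mirroring the score-transfer step in the abstract.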