Traditional methods identify cascaded implicit causalities with low accuracy because they lack a causal inference mechanism. To address this problem, we propose a GCN-based causality extraction model that infers the causality expressed in text. The model analyzes the cause-effect relations present in the text and performs deep extraction under semantic enhancement. First, the data are preprocessed and BERT is used for pre-training; during pre-training, candidate entities are selected through entity linking, and the text is encoded with contextual semantics and positional embeddings. Then, a semantic dependency graph is constructed to obtain the relationships between entities. Next, the resulting nodes and edges are fed into the first-stage GCN to extract the overall causality. Finally, the entity relation graph produced by the first-stage GCN is passed to the second-stage GCN for cascade inference, so that cascaded causality is inferred and extracted by the two-stage GCN. Experiments show that the model identifies implicit causality more accurately and generates new implicit causal entities from the original causal relations.
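To make the two-stage design concrete, the following is a minimal PyTorch sketch of how such a pipeline could be wired together. The class names (GCNLayer, TwoStageGCN), the hidden dimensions, and the way the stage-1 relation graph is supplied are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class GCNLayer(nn.Module):
    """Single graph convolution: H' = ReLU(A_hat @ H @ W)."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, h, adj):
        # adj is assumed to be a normalized adjacency matrix (A_hat).
        return F.relu(adj @ self.linear(h))


class TwoStageGCN(nn.Module):
    """Sketch of the two-stage idea: stage 1 propagates BERT embeddings over
    the semantic dependency graph to extract overall causality; stage 2
    propagates over the entity relation graph to infer cascaded causality."""
    def __init__(self, bert_dim=768, hidden_dim=256, num_labels=2):
        super().__init__()
        self.stage1 = GCNLayer(bert_dim, hidden_dim)
        self.stage2 = GCNLayer(hidden_dim, hidden_dim)
        self.classifier = nn.Linear(2 * hidden_dim, num_labels)

    def forward(self, bert_embeddings, dep_adj, relation_adj):
        # Stage 1: message passing along the semantic dependency graph.
        h1 = self.stage1(bert_embeddings, dep_adj)
        # Stage 2: message passing along the entity relation graph
        # (here passed in directly for simplicity).
        h2 = self.stage2(h1, relation_adj)
        return h1, h2

    def score_pair(self, h2, i, j):
        # Classify whether entity i causes entity j.
        pair = torch.cat([h2[i], h2[j]], dim=-1)
        return self.classifier(pair)


# Toy usage with random inputs: 5 entity nodes with BERT-sized embeddings
# and identity matrices standing in for the two adjacency graphs.
model = TwoStageGCN()
x = torch.randn(5, 768)
dep_adj = torch.eye(5)
rel_adj = torch.eye(5)
_, h2 = model(x, dep_adj, rel_adj)
logits = model.score_pair(h2, 0, 3)  # does entity 0 cause entity 3?
```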