Tactical knowledge discovery aims to extract tactical information from battlefield signal data, which is vital in information warfare. Learning and reasoning over battlefield signal data can help commanders make effective decisions. However, traditional methods struggle to capture sequential and global representations because they rely on prior knowledge or hand-crafted feature engineering. Current deep learning models focus on extracting implicit behavioral characteristics from combat process data, overlooking the martial knowledge embedded in the recognition of combat intentions. In this work, we address this challenge by proposing TBGCN, a dual-fusion pipeline that introduces graph representation learning into sequence learning to construct tactical behavior sequence graphs expressing implicit martial knowledge. Specifically, TBGCN uses graph representation learning to encode prior knowledge by building a graph that guides the deep learning paradigm, while sequence learning extracts hidden representations from the target's serialized data. A fusion module then merges the two representations. The significance of integrating graphs with deep learning lies in letting human expertise, encoded in the implicit graph structure, guide adaptive learning, which improves representation ability and model generalization. Extensive experimental results demonstrate that the proposed TBGCN effectively discovers tactical knowledge and significantly outperforms both traditional and deep learning methods.
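The dual-branch fusion described above can be illustrated with a minimal sketch: a GCN-style layer encodes the tactical behavior graph, a simple sequence encoder pools the target's serialized data, and a fusion step concatenates the two representations before classification. All names, dimensions, and the mean-pooled sequence encoder here are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def gcn_layer(adj, feats, weight):
    """One GCN-style propagation step: normalized adjacency x features x weight, with ReLU."""
    adj_hat = adj + np.eye(adj.shape[0])           # add self-loops
    d_inv_sqrt = np.diag(adj_hat.sum(axis=1) ** -0.5)  # symmetric degree normalization
    return np.maximum(d_inv_sqrt @ adj_hat @ d_inv_sqrt @ feats @ weight, 0.0)

def sequence_branch(seq, weight):
    """Toy sequence encoder: project each time step, then mean-pool over time."""
    return np.maximum(seq @ weight, 0.0).mean(axis=0)

# Tactical behavior sequence graph: 4 behavior nodes with 8-dim features (illustrative sizes).
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
node_feats = rng.normal(size=(4, 8))
graph_repr = gcn_layer(adj, node_feats, rng.normal(size=(8, 16))).mean(axis=0)  # (16,)

# Serialized target data: 10 time steps of 8-dim signal features.
seq = rng.normal(size=(10, 8))
seq_repr = sequence_branch(seq, rng.normal(size=(8, 16)))                       # (16,)

# Fusion module: concatenate the two representations, then classify.
fused = np.concatenate([graph_repr, seq_repr])                                  # (32,)
logits = fused @ rng.normal(size=(32, 5))                                       # 5 hypothetical tactic classes
```

The concatenation-based fusion is one common choice; attention-weighted or gated fusion would slot in at the same point without changing either branch.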