As telecommunication technology evolves rapidly, next-generation networks have become integral components of modern society. However, given the vast infrastructure requirements for their deployment and the intricacy of their protocols, they inevitably present considerable security challenges. Recently, fuzz testing has emerged as a prominent method of system assurance, gaining substantial traction in 5G vulnerability detection and performance assessment. It can test and exploit 5G networks to detect invalid integrity protection and security procedure bypasses, and the fuzzed system behaviors contain information that can indicate the system's health and potential vulnerabilities. This paper proposes a novel modeling framework, coined DEFT, that performs causation analysis on the system under test (SUT) by detecting the fuzzed location; DEFT can therefore be further applied to identify the type of attack or abnormal input from partial system profiling of the impacted behaviors. In particular, we show, for the first time, that DEFT can precisely detect the fuzzed layer in a log file, which can then be used to identify the root cause of vulnerabilities with high accuracy in real time, using only a small segment of the log file. To evaluate DEFT under real-world application scenarios, we generate an unbiased dataset by performing fuzz testing on our 5G test bed to trigger potential vulnerabilities and unintended behaviors, recording the resulting system behaviors via regular logging. We then analyze a random segment of the log files to detect the root cause of the vulnerabilities by processing word embedding vectors produced by two architectures, Skip-gram and continuous bag-of-words (CBOW), which learn underlying word representations using neural networks. DEFT effectively identifies the fuzzed location, and thereby the occurrence of unknown attacks, and thus constitutes a crucial step toward timely attack response and root cause analysis. We assess multiple machine learning models integrated with the two embedding architectures in terms of input log-segment length, computational complexity, and accuracy metrics. With Skip-gram, the Fully Connected Network (FCN) model achieves an area under the curve (AUC) of 0.938 while considering only 50% of the original log file, significantly reducing the computational complexity. By processing a 50% segment of the log file rather than the complete file, we reduce testing duration by 8%. We further observe that, compared with CBOW, Skip-gram performs better when only a small fraction of the log file is available for vulnerability detection, and can therefore readily be applied to real-time 5G applications. Our results provide a valuable direction for vulnerability detection and enable timely attack response.
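To make the embedding-and-classification pipeline concrete, the following is a minimal sketch of the idea described above: train Skip-gram (or CBOW) embeddings on tokenized log lines, average the vectors over a partial log segment, and classify the fuzzed layer with a small neural network. It assumes gensim's Word2Vec and scikit-learn's MLPClassifier as stand-ins; the paper's actual FCN architecture, hyperparameters, tokenization, and log format are not specified here, so all names, labels, and dimensions below are illustrative assumptions rather than the authors' implementation.

```python
# Sketch only: gensim Word2Vec as the Skip-gram/CBOW implementation,
# whitespace tokenization, and an MLPClassifier standing in for the FCN.
import numpy as np
from gensim.models import Word2Vec
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

def tokenize(log_text: str) -> list[list[str]]:
    """Split a raw log into per-line lists of whitespace tokens (illustrative)."""
    return [line.split() for line in log_text.splitlines() if line.strip()]

def embed_segment(model: Word2Vec, tokens: list[str], fraction: float) -> np.ndarray:
    """Average word vectors over the first `fraction` of a token sequence,
    mirroring classification from only a partial log segment."""
    cut = max(1, int(len(tokens) * fraction))
    vecs = [model.wv[t] for t in tokens[:cut] if t in model.wv]
    return np.mean(vecs, axis=0) if vecs else np.zeros(model.vector_size)

# Placeholder data: replace with real fuzzed-test-bed logs and labels.
# Here the label marks the fuzzed layer (e.g., 0 = RRC, 1 = NAS) -- assumed.
logs = ["rrc connection setup request failed", "nas attach accept mac invalid"] * 50
labels = [0, 1] * 50

sentences = [line for log in logs for line in tokenize(log)]
# sg=1 selects Skip-gram; sg=0 would select CBOW.
w2v = Word2Vec(sentences, vector_size=64, window=5, sg=1, min_count=1, seed=0)

# Embed only 50% of each log's tokens, as in the reported experiment.
X = np.stack([
    embed_segment(w2v, [t for line in tokenize(log) for t in line], 0.5)
    for log in logs
])
y = np.array(labels)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
fcn = MLPClassifier(hidden_layer_sizes=(128, 64), max_iter=500, random_state=0)
fcn.fit(X_tr, y_tr)
print("AUC:", roc_auc_score(y_te, fcn.predict_proba(X_te)[:, 1]))
```

Swapping `sg=1` for `sg=0` switches the embedding architecture from Skip-gram to CBOW, and varying the `fraction` argument reproduces the partial-log-segment comparison sketched in the evaluation.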