2021
DOI: 10.1109/access.2021.3049378

Semantic Similarity Computing Model Based on Multi Model Fine-Grained Nonlinear Fusion

Cited by 25 publications (4 citation statements)
References 35 publications
“…Within the semantic similarity approaches, as for instance the one recently presented in [47] for Neural Networks, or hybrid similarity measures combining the shortest path lengths and the depths of subsumers [30], below we restrict our attention to the methods based on the information content [14], [33], Formal Concept Analysis [13], [43], IFCA (Formal Concept Analysis with Interval Type-2 Fuzzy sets) [15], Geographical Information Systems [16], [40], and different application domains, such as health [1], [24], and network security [44], just to mention a few examples. However the IC approach, although recognized as "the state of the art on semantic similarity" [3], [8], has shown some limitations, as discussed below.…”
Section: Related Work
Mentioning, confidence: 99%
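The Related Work excerpt above singles out information-content (IC) based measures as "the state of the art on semantic similarity". As a rough illustration of that family, the sketch below implements Resnik's IC similarity over a toy is-a taxonomy with made-up concept counts; the taxonomy, counts, and function names are illustrative stand-ins for a real ontology such as WordNet and are not taken from the cited works.

```python
import math

# Toy is-a taxonomy (child -> parent); a stand-in for a real ontology such as WordNet.
PARENT = {
    "dog": "canine", "wolf": "canine",
    "canine": "mammal", "cat": "mammal",
    "mammal": "animal", "bird": "animal",
    "animal": None,
}

# Hypothetical corpus counts per concept; in practice these come from a tagged corpus.
COUNTS = {"dog": 40, "wolf": 5, "cat": 30, "bird": 25,
          "canine": 0, "mammal": 0, "animal": 0}

def ancestors(c):
    """Return c together with all of its subsumers up to the root."""
    out = []
    while c is not None:
        out.append(c)
        c = PARENT[c]
    return out

def p(c):
    """Probability of concept c: counts of c plus all of its descendants."""
    total = sum(COUNTS.values())
    mass = sum(n for w, n in COUNTS.items() if c in ancestors(w))
    return mass / total

def ic(c):
    """Information content: IC(c) = -log p(c)."""
    return -math.log(p(c))

def resnik_similarity(c1, c2):
    """sim(c1, c2) = IC of the most informative common subsumer."""
    common = set(ancestors(c1)) & set(ancestors(c2))
    return max(ic(c) for c in common)

print(resnik_similarity("dog", "wolf"))  # share 'canine' -> higher similarity
print(resnik_similarity("dog", "bird"))  # only share the root 'animal' -> 0.0
```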
“…Ranasinghe et al. used various combinations of GRU, Bi-LSTM, etc., in Siamese network structures to compare the representational power of various variants of the structure for text semantics [12]. Zhang et al. combined TF-IDF and Jaccard coefficients with CNN to improve the representation of vectors for sentence features, but lacked semantic links between different words [13]. Guo et al. analyzed the multiple semantic compositions of texts in terms of their frame structure and combined with the self-attention mechanism to enhance the representation of vectors for multiple semantic sentences, but lacked the representation of semantic relations between different texts [14].…”
Section: Introduction
Mentioning, confidence: 99%
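The excerpt above mentions combining TF-IDF with the Jaccard coefficient before a CNN (the approach cited there as [13]). Below is a minimal sketch of just the statistical part, blending a TF-IDF cosine score with a word-level Jaccard overlap; the equal blend weight and the scikit-learn-based implementation are assumptions for illustration and do not reproduce the cited model.

```python
# Blend a TF-IDF cosine score with a word-level Jaccard coefficient.
# The alpha=0.5 weight is an illustrative assumption, not the cited model's value.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def jaccard(s1, s2):
    a, b = set(s1.lower().split()), set(s2.lower().split())
    return len(a & b) / len(a | b) if a | b else 0.0

def blended_similarity(s1, s2, alpha=0.5):
    tfidf = TfidfVectorizer().fit_transform([s1, s2])
    cos = cosine_similarity(tfidf[0], tfidf[1])[0, 0]
    return alpha * cos + (1 - alpha) * jaccard(s1, s2)

print(blended_similarity("the cat sat on the mat", "a cat is sitting on a mat"))
```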
“…Traditional sentence matching methods are mainly based on statistical characteristics of sentences (Zhang et al. (2021)) or on word embeddings (Shen et al. (2018)) to directly calculate the similarity between sentences. However, they often ignore the semantic features of sentences and so are not effective in complex situations.…”
Section: Introduction
Mentioning, confidence: 99%
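The excerpt above contrasts statistical sentence features with word-embedding approaches (Shen et al., 2018). A minimal sketch of the embedding route, assuming mean pooling of word vectors followed by cosine similarity, is shown below; the random toy vectors are a stand-in for real pretrained embeddings such as GloVe or word2vec.

```python
import numpy as np

# Toy embedding table standing in for pretrained word vectors; values are random
# and purely illustrative.
rng = np.random.default_rng(0)
VOCAB = ["the", "cat", "dog", "sat", "ran", "on", "mat", "grass"]
EMB = {w: rng.standard_normal(50) for w in VOCAB}

def sentence_vector(sentence):
    """Mean-pool the embeddings of in-vocabulary tokens."""
    vecs = [EMB[w] for w in sentence.lower().split() if w in EMB]
    return np.mean(vecs, axis=0) if vecs else np.zeros(50)

def cosine(u, v):
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-12))

print(cosine(sentence_vector("the cat sat on the mat"),
             sentence_vector("the dog ran on the grass")))
```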