2018
DOI: 10.1007/978-3-319-93417-4_29

A Tri-Partite Neural Document Language Model for Semantic Information Retrieval

Abstract: Previous work in information retrieval has shown that using evidence, such as concepts and relations, from external knowledge resources can enhance retrieval performance. Recently, deep neural approaches have emerged as state-of-the-art models for capturing word semantics that can also be efficiently injected into IR models. This paper presents a new tri-partite neural document language framework that leverages explicit knowledge to jointly constrain word, concept, and document learning representations…
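The abstract describes the joint constraint only at a high level. Purely as a hedged illustration of what a tri-partite objective over words, concepts, and documents can look like, the PyTorch sketch below ties three embedding tables together with a doc2vec-style negative-sampling term plus a cosine constraint pulling a word towards its mapped concept; the class name, loss terms, and dimensions are assumptions for illustration, not the architecture reported in the paper.

```python
# Hypothetical sketch of a joint word/concept/document embedding objective.
# Names and loss choices are illustrative assumptions, not the paper's model.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TriPartiteEmbeddings(nn.Module):
    def __init__(self, n_words, n_concepts, n_docs, dim=128):
        super().__init__()
        self.word = nn.Embedding(n_words, dim)
        self.concept = nn.Embedding(n_concepts, dim)
        self.doc = nn.Embedding(n_docs, dim)

    def forward(self, doc_ids, word_ids, concept_ids, neg_word_ids):
        d = self.doc(doc_ids)          # (B, dim) document vectors
        w = self.word(word_ids)        # (B, dim) observed words
        c = self.concept(concept_ids)  # (B, dim) concepts mapped to those words
        neg = self.word(neg_word_ids)  # (B, K, dim) negative word samples

        # Document language-model term: a document should predict its words
        # (doc2vec-style objective with negative sampling).
        pos = F.logsigmoid((d * w).sum(-1))
        neg_term = F.logsigmoid(-(neg * d.unsqueeze(1)).sum(-1)).sum(-1)
        lm_loss = -(pos + neg_term).mean()

        # Knowledge constraint: pull each word towards its mapped concept.
        kb_loss = (1.0 - F.cosine_similarity(w, c, dim=-1)).mean()
        return lm_loss + kb_loss

# Toy usage with random ids, just to show the shapes involved.
model = TriPartiteEmbeddings(n_words=5000, n_concepts=300, n_docs=100)
B, K = 32, 5
loss = model(torch.randint(0, 100, (B,)),
             torch.randint(0, 5000, (B,)),
             torch.randint(0, 300, (B,)),
             torch.randint(0, 5000, (B, K)))
loss.backward()
```

In this toy objective the first term plays the role of the document language model and the second injects the external-knowledge signal; the paper's actual formulation and training details should be taken from the full text.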

Cited by 7 publications (14 citation statements)
References: 28 publications
“…The learned representations are injected in a text-to-text matching process according to a query expansion technique. Nguyen et al [48] propose a tri-partite neural language model that leverages explicit knowledge to jointly constrain word, concept, and document representations. The authors employ the model in two retrieval strategies: document re-ranking and query expansion.…”
Section: Knowledge-enhanced Representation Models
confidence: 99%
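The two retrieval strategies named in this excerpt, document re-ranking and query expansion, can be illustrated with generic embedding operations. The NumPy sketch below is a minimal, hypothetical illustration assuming pre-trained term and document vectors: expand_query appends each query term's nearest-neighbour terms, and rerank re-orders candidate documents by cosine similarity to a query vector. The function names and scoring choices are assumptions, not the implementation used in the cited work.

```python
# Generic embedding-based query expansion and re-ranking (illustrative only).
import numpy as np

def expand_query(query_terms, vocab, emb, k=3):
    """Append the k nearest-neighbour terms of each query term."""
    id2term = {i: t for t, i in vocab.items()}
    E = emb / np.linalg.norm(emb, axis=1, keepdims=True)
    expanded = list(query_terms)
    for term in query_terms:
        sims = E @ E[vocab[term]]
        for idx in np.argsort(-sims)[1:k + 1]:   # index 0 is the term itself
            expanded.append(id2term[int(idx)])
    return expanded

def rerank(query_vec, doc_vecs):
    """Re-order candidate documents by cosine similarity to the query vector."""
    q = query_vec / np.linalg.norm(query_vec)
    D = doc_vecs / np.linalg.norm(doc_vecs, axis=1, keepdims=True)
    return np.argsort(-(D @ q))

# Example with a tiny toy vocabulary and random embeddings.
rng = np.random.default_rng(0)
vocab = {"neural": 0, "retrieval": 1, "ranking": 2, "semantic": 3}
emb = rng.normal(size=(len(vocab), 8))
print(expand_query(["retrieval"], vocab, emb, k=2))
print(rerank(emb[1], rng.normal(size=(5, 8))))
```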
“…The authors employ the model in two retrieval strategies: document re-ranking and query expansion. Tamine et al [66] extend [47,48] to investigate the combined use of corpus-based information and external knowledge resources in different NLP and IR tasks. The authors compare the impact of the different learning approaches on the quality of the learned representations.…”
Section: Knowledge-enhanced Representation Models
confidence: 99%
“…However, these models suffer from two main limitations: they fail to discriminate polysemous words, as the different meanings of a word are conflated into a single representation; and they fail to learn close representations for synonyms occurring in different contexts, as they lack the relational knowledge required to identify synonymy relationships between words. To overcome these limitations, recent works that integrate external knowledge into the learning process of neural models have been proposed in the Natural Language Processing (NLP) community, but only a few have been applied in IR to reduce the effect of the semantic gap between queries and documents [147,169,168,170].…”
confidence: 99%