Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining 2019
DOI: 10.1145/3292500.3330759

Semantic Product Search

Abstract: We study the problem of semantic matching in product search, that is, given a customer query, retrieve all semantically related products from the catalog. Pure lexical matching via an inverted index falls short in this respect due to several factors: a) lack of understanding of hypernyms, synonyms, and antonyms, b) fragility to morphological variants (e.g. "woman" vs. "women"), and c) sensitivity to spelling errors. To address these issues, we train a deep learning model for semantic matching using customer be…
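The abstract's first failure mode can be made concrete: a minimal sketch (with a hypothetical toy catalog and index) showing how exact lexical matching via an inverted index misses a morphological variant such as "woman" vs. "women".

```python
from collections import defaultdict

# Hypothetical toy catalog: product id -> title.
catalog = {
    1: "women running shoes",
    2: "woman leather handbag",
    3: "men running shoes",
}

# Build a toy inverted index: token -> set of product ids containing it.
index = defaultdict(set)
for pid, title in catalog.items():
    for token in title.split():
        index[token].add(pid)

def lexical_search(query):
    """Return products whose titles contain every query token exactly."""
    sets = [index.get(tok, set()) for tok in query.split()]
    return set.intersection(*sets) if sets else set()

lexical_search("women shoes")  # matches product 1
lexical_search("woman shoes")  # empty: "woman" != "women", product 1 is missed
```

The semantic model described in the abstract is meant to close exactly this gap, retrieving product 1 for either surface form of the query.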

Cited by 74 publications (68 citation statements)
References 25 publications
“…In addition, CNN and LSTM are more difficult to optimize, as they require more computational resources and training time, and tend to overfit. Our results are in accordance with the findings of Nigam et al. [2019].…”
Section: Average Pooling and Other Neural Architectures (supporting)
confidence: 94%
“…The benefits of this model are twofold: on one hand, despite the non-linearity, the model is smooth and Lipschitz continuous with respect to the embedding parameters, and therefore benefits from convergence of the generalization error during training; on the other hand, average pooling avoids extra parameters in the model, which simplifies the model space and reduces tuning effort during training, in accordance with the findings of Nigam et al. [2019].…”
Section: Neural Model Architecture (supporting)
confidence: 58%
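The parameter-free pooling this excerpt describes can be sketched in a few lines: average the token embeddings of a text to get its vector, with no parameters beyond the embedding table itself. This is a hedged illustration (vocabulary, dimensions, and random weights are all hypothetical), not the cited paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
vocab = {"women": 0, "running": 1, "shoes": 2}
emb = rng.normal(size=(len(vocab), 4))  # toy token embedding table, dim 4

def encode(text):
    """Average the embeddings of known tokens; average pooling adds no
    parameters of its own, only the embedding table is learned."""
    ids = [vocab[t] for t in text.split() if t in vocab]
    return emb[ids].mean(axis=0) if ids else np.zeros(emb.shape[1])

q = encode("running shoes")          # query embedding
p = encode("women running shoes")    # product embedding
# Cosine similarity between query and product vectors.
sim = float(q @ p / (np.linalg.norm(q) * np.linalg.norm(p)))
```

Because the pooled vector is a plain mean of embedding rows, the model stays smooth in its parameters, which is the property the excerpt highlights.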
“…Modern e-commerce search engines (Nigam et al., 2019) typically consist of a retrieval stage and a ranking stage. The retrieval stage is responsible for collecting a set of relevant products with minimal computational resources.…”
Section: Introduction (mentioning)
confidence: 99%
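The retrieve-then-rank split this excerpt mentions can be sketched as a pipeline where a cheap first pass narrows the catalog and a costlier second pass orders the survivors. All names, scores, and data here are hypothetical stand-ins, not any particular engine's design.

```python
# Toy catalog of product titles.
catalog = ["women running shoes", "woman leather handbag",
           "men running shoes", "trail running shoes", "kids rain boots"]

def retrieve(query, k=3):
    """Cheap stage: score by raw token overlap, keep the top-k candidates."""
    q = set(query.split())
    scored = sorted(catalog,
                    key=lambda t: len(q & set(t.split())), reverse=True)
    return scored[:k]

def rank(query, candidates):
    """Expensive stage stand-in: re-order candidates by a finer score
    (overlap ratio here; a real system would use a learned model)."""
    q = set(query.split())
    return sorted(candidates,
                  key=lambda t: len(q & set(t.split())) / len(set(t.split())),
                  reverse=True)

results = rank("running shoes", retrieve("running shoes"))
```

The split matters because the ranker's cost is paid only on the small candidate set, while retrieval must scan (an index over) the full catalog cheaply.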