2017
DOI: 10.1007/978-3-319-71273-4_13

MRNet-Product2Vec: A Multi-task Recurrent Neural Network for Product Embeddings

Abstract: E-commerce websites such as Amazon, Alibaba, Flipkart, and Walmart sell billions of products. Machine learning (ML) algorithms involving products are often used to improve the customer experience and increase revenue, e.g., product similarity, recommendation, and price estimation. The products are required to be represented as features before training an ML algorithm. In this paper, we propose an approach called MRNet-Product2Vec for creating generic embeddings of products within an e-commerce ecosys…
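The abstract describes a multi-task recurrent encoder whose shared hidden state serves as a generic, reusable product embedding. Below is a minimal, hypothetical PyTorch sketch of that general idea; the vocabulary size, dimensions, and auxiliary task heads are illustrative assumptions, not the configuration used in the paper.

import torch
import torch.nn as nn

# Hypothetical illustration only: a shared GRU encoder over a product's token
# sequence, trained jointly with several small task heads so that the shared
# hidden state can serve as a generic product embedding. The tasks, vocabulary
# size, and dimensions below are assumed for the sketch, not taken from the paper.
class MultiTaskProductEncoder(nn.Module):
    def __init__(self, vocab_size=10000, embed_dim=64, hidden_dim=128,
                 task_output_dims=(10, 5, 1)):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.gru = nn.GRU(embed_dim, hidden_dim, batch_first=True)
        # One lightweight linear head per auxiliary prediction task.
        self.heads = nn.ModuleList([nn.Linear(hidden_dim, d) for d in task_output_dims])

    def forward(self, token_ids):
        _, last_hidden = self.gru(self.embed(token_ids))  # (1, batch, hidden_dim)
        product_embedding = last_hidden.squeeze(0)        # reusable representation
        task_outputs = [head(product_embedding) for head in self.heads]
        return product_embedding, task_outputs

# Example: embed a batch of 4 products, each described by 20 token ids.
encoder = MultiTaskProductEncoder()
tokens = torch.randint(1, 10000, (4, 20))
embeddings, _ = encoder(tokens)
print(embeddings.shape)  # torch.Size([4, 128])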

Cited by 11 publications (9 citation statements: 1 supporting, 8 mentioning, 0 contrasting). References 7 publications. Citing publications were published between 2018 and 2024.
“…In terms of type level, comparing to JOIE with the item-type view, P-Companion improves by 9.9% on the Electronics dataset and by 4.5% on the Grocery dataset. We observe similar phenomena on the larger "All-Group" dataset, with a relative 3.8% increase in item-level Hit score on average and an increase at the type level against JOIE. We believe this is due to the following reasons: (i) P-Companion infers complementary products by targeting the complementary type first rather than only modeling product relationships, as in Sceptre and PMSC.…”
Section: Evaluation on Co-purchase Data (supporting)
confidence: 68%
“…Current methods [17,23] often fail on such low-resource products, which widely exist in e-commerce. While most existing methods in recommender systems [21] focus on modeling user-item relationships by frequent pattern mining [8], matrix factorization [15], collaborative filtering [14,20], or other neural-network-based recommenders [2], only a few [3,8,10,16] explicitly model relationships between items. Among them, complementary relationship modeling has been scarcely investigated compared to the efforts made for modeling substitutes with similarity-based approaches.…”
Section: Good (mentioning)
confidence: 99%
“…Reusing a shared set of highly informative pre-trained item embeddings significantly reduces computational costs, and allows us to focus our efforts on experimentally measuring the effectiveness of downstream collaborative filtering models. While a number of item embedding algorithms have been proposed in the literature [Grbovic and Cheng, 2018; Grbovic et al., 2015; Biswas et al., 2017; Wu et al., 2017; Vasile et al., 2016], to the best of our knowledge, methods of leveraging such pre-trained item embeddings for user preference modeling have not been extensively studied.…”
Section: Introduction (mentioning)
confidence: 99%
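The statement above describes reusing pre-trained item embeddings (such as MRNet-Product2Vec vectors) as fixed features in a downstream preference model. The following is a minimal, hypothetical PyTorch sketch of that reuse pattern; the embedding matrix, user count, and dimensions are illustrative assumptions, not taken from any of the cited works.

import torch
import torch.nn as nn

# Hypothetical illustration only: treat a matrix of pre-trained item embeddings
# as frozen features and learn just a small user model on top of it. The user
# count, dimensions, and random stand-in embeddings are assumptions of the sketch.
class FrozenItemScorer(nn.Module):
    def __init__(self, pretrained_item_embeddings, num_users=1000, user_dim=32):
        super().__init__()
        item_dim = pretrained_item_embeddings.shape[1]
        # Reuse the shared embeddings as fixed features (freeze=True); only the
        # user table and the projection layer are trained.
        self.item_emb = nn.Embedding.from_pretrained(pretrained_item_embeddings, freeze=True)
        self.user_emb = nn.Embedding(num_users, user_dim)
        self.project = nn.Linear(user_dim, item_dim)

    def forward(self, user_ids, item_ids):
        user_vec = self.project(self.user_emb(user_ids))
        item_vec = self.item_emb(item_ids)
        return (user_vec * item_vec).sum(-1)  # dot-product preference score

# Example: 500 items with 128-dimensional stand-in embeddings.
item_vectors = torch.randn(500, 128)
model = FrozenItemScorer(item_vectors)
scores = model(torch.tensor([0, 1]), torch.tensor([10, 42]))
print(scores.shape)  # torch.Size([2])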