2019
DOI: 10.48550/arxiv.1909.11810
Preprint

Mixed Dimension Embeddings with Application to Memory-Efficient Recommendation Systems

Cited by 20 publications (32 citation statements) · References 44 publications

“…Embedding Usage. When embedding vectors are assigned different dimensions, all of the existing methods [4, 8, 12, 18, 19] have to design an additional layer to unify these vectors to the same length to fit the uniform MLP layers that follow in DLRMs. Unlike these methods, our method needs no additional layers, since the masked embedding vectors ê𝑖 already have the same length via zero padding and can be fed directly into the following layers.…”
Section: Methods, 2.1 Basic Idea
confidence: 99%
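The zero-padding scheme quoted above is easy to picture with a short sketch. This is an illustrative reconstruction, not the citing paper's code; the class name, dimensions, and PyTorch wiring are assumptions.

```python
# Illustrative sketch (hypothetical names/dims): each feature stores a
# small dim-sized embedding but exposes a uniform max_dim vector by
# right-padding with zeros, so a shared MLP can consume all features.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ZeroPaddedEmbedding(nn.Module):
    def __init__(self, num_rows: int, dim: int, max_dim: int):
        super().__init__()
        assert dim <= max_dim
        self.table = nn.Embedding(num_rows, dim)
        self.pad = max_dim - dim

    def forward(self, idx: torch.Tensor) -> torch.Tensor:
        e = self.table(idx)              # (batch, dim)
        return F.pad(e, (0, self.pad))   # (batch, max_dim), zero-padded

# Two features with native dims 4 and 16, both unified to length 16.
emb_a = ZeroPaddedEmbedding(num_rows=1000, dim=4, max_dim=16)
emb_b = ZeroPaddedEmbedding(num_rows=50, dim=16, max_dim=16)
idx = torch.tensor([3, 7, 42])
x = torch.cat([emb_a(idx), emb_b(idx)], dim=-1)  # (3, 32) -> shared MLP
```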
“…They can be primarily divided into two categories. (1) Rule-based methods adopt human-defined rules, typically based on feature frequencies, to give different embedding dimensions to different feature values [4] (see Fig. 1(b) for an example). The problem with this category of methods is that they rely heavily on human knowledge and human labor.…”
confidence: 99%
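As a concrete instance of such a frequency-based rule, the mixed-dimension approach scales dimensions with popularity. The sketch below uses a power-law rule d_i ∝ p_i^α in that spirit; the base_dim, α, and counts are illustrative assumptions, not values from the cited papers.

```python
# Hypothetical frequency-based rule in the spirit of mixed-dimension
# embeddings: popular feature values receive larger dimensions.
def assign_dims(counts, base_dim=64, alpha=0.5):
    """d_i = base_dim * (p_i / p_max) ** alpha, rounded, at least 1."""
    p_max = max(counts)
    return [max(1, round(base_dim * (c / p_max) ** alpha)) for c in counts]

counts = [1_000_000, 50_000, 1_200, 30]   # per-value access frequencies
print(assign_dims(counts))                # [64, 14, 2, 1]
```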
“…In most approaches, the embedding module allocates variable-length embeddings or multiple embeddings to different features. Such methods include NIS [12], Mixed-Dimension [6], and AutoEmb [27]. Other models focus on modeling interactions between features, implicitly or explicitly, on top of the basic embedding algorithms.…”
Section: Related Work
confidence: 99%
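The "additional layer" that the first excerpt says such variable-length methods need can be sketched as a per-feature linear projection lifting each embedding to the shared width expected by the MLP. This is one plausible realization under stated assumptions, not the specific layer used by NIS, Mixed-Dimension, or AutoEmb.

```python
# Sketch of the projection-based alternative to zero padding: a small
# per-feature linear layer maps a dim-sized embedding to unified_dim.
import torch
import torch.nn as nn

class ProjectedEmbedding(nn.Module):
    def __init__(self, num_rows: int, dim: int, unified_dim: int):
        super().__init__()
        self.table = nn.Embedding(num_rows, dim)
        self.proj = nn.Linear(dim, unified_dim, bias=False)

    def forward(self, idx: torch.Tensor) -> torch.Tensor:
        return self.proj(self.table(idx))  # (batch, unified_dim)
```

Compared with zero padding, this costs dim × unified_dim extra parameters per feature but lets the model learn how each low-dimensional embedding maps into the shared space.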
“…In general, there are two directions for reducing the embedding table size: reducing the size of each embedding vector, and reducing the number (i.e., 𝑁) of embedding vectors in an embedding table. The embedding table size of the former methods (e.g., product quantization [5, 9], the K-D method [2, 11, 13, 15, 20], and AutoDim [6, 10, 16, 17, 28, 29]) still grows linearly with |𝐹|, failing to tackle the memory problem caused by a large vocabulary size in web-scale applications [19]. Hence these methods are not considered in our paper.…”
Section: Introduction
confidence: 99%
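A back-of-the-envelope calculation shows why shrinking only the per-vector dimension still scales with the vocabulary size |𝐹|, while cutting the row count (e.g., via hashing) does not. The row counts and dimensions below are illustrative, not figures from the cited works.

```python
# fp32 embedding-table memory: rows * dim * 4 bytes (illustrative sizes).
def table_gb(num_rows, dim, bytes_per_val=4):
    return num_rows * dim * bytes_per_val / 1e9

print(table_gb(100_000_000, 64))  # 25.6 GB: full table
print(table_gb(100_000_000, 8))   # 3.2 GB: smaller dim, still O(|F|)
print(table_gb(1_000_000, 64))    # 0.256 GB: fewer rows (e.g., hashing)
```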