2019
DOI: 10.1109/lsp.2019.2891134

A Novel Approach to Quantized Matrix Completion Using Huber Loss Measure

Abstract: In this paper, we introduce a novel and robust approach to Quantized Matrix Completion (QMC). First, we propose a rank minimization problem with constraints induced by quantization bounds. Next, we form an unconstrained optimization problem by regularizing the rank function with Huber loss. The Huber loss is leveraged to control violations of the quantization bounds due to two properties: 1) it is differentiable, and 2) it is less sensitive to outliers than the quadratic loss. A Smooth Rank Approximation is utilized to…
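To make the two properties concrete, here is a minimal NumPy sketch (not the authors' implementation; the function names, the form of the bound-violation term, and the parameter values are assumptions). The Huber loss is quadratic near zero and linear in the tails, so it is differentiable everywhere yet penalizes large quantization-bound violations far more gently than a quadratic loss would:

```python
import numpy as np

def huber(r, delta=1.0):
    """Elementwise Huber loss: quadratic for |r| <= delta, linear beyond.
    Differentiable everywhere and less outlier-sensitive than 0.5 * r**2."""
    a = np.abs(r)
    return np.where(a <= delta, 0.5 * r**2, delta * (a - 0.5 * delta))

def quantization_penalty(X, lower, upper, delta=1.0):
    """Huber loss applied to how far each entry of X falls outside its
    quantization bin [lower, upper]; zero for entries inside the bin.
    (Hypothetical helper for illustration only.)"""
    violation = np.maximum(lower - X, 0.0) + np.maximum(X - upper, 0.0)
    return huber(violation, delta).sum()
```

Because the penalty grows only linearly once a residual exceeds delta, a few badly violated entries cannot dominate the objective the way they would under a quadratic loss.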


Cited by 30 publications (13 citation statements) · References 22 publications
“…Let us analyse the results in Table 2 in more depth. The dataset contains more drugs (86) than viruses (23). In the CV1 setting, where entries are randomly omitted without skipping whole drugs or viruses, our method performs best.…”
Section: Discussion
confidence: 99%
“…In communications and signal processing this is rarely the case, as those values are almost always quantized. This led to the quantized matrix completion problem [23, 6]. One extreme case is binary matrix completion [17], where each matrix entry is represented by only one bit.…”
Section: Matrix Completion
confidence: 99%
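To make the quantized-observation model concrete, here is an illustrative sketch under assumptions (the uniform quantizer, variable names, and matrix sizes are chosen here, not taken from either paper). Only the bin index of each entry is observed, which pins every entry between known lower and upper bounds:

```python
import numpy as np

rng = np.random.default_rng(0)

# Ground-truth low-rank matrix (rank 2), for illustration only.
M = rng.standard_normal((20, 2)) @ rng.standard_normal((2, 30))

# Uniform quantizer with K levels over [-c, c]; only bin indices are observed.
# K = 2 would be the one-bit (binary) extreme case.
K, c = 4, np.abs(M).max()
edges = np.linspace(-c, c, K + 1)
bins = np.clip(np.digitize(M, edges) - 1, 0, K - 1)  # the observed data

# Quantization bounds implied by the observed bins: any completion X
# consistent with the data must satisfy lower <= X <= upper entrywise.
lower, upper = edges[bins], edges[bins + 1]
assert np.all((M >= lower) & (M <= upper))
```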
“…Employing the entire closed-set data during the training procedure leads to the inclusion of untrustworthy samples from the closed set. Regularized or underfitting models (such as low-rank representations [70,71,72]) still suffer from the memorizing effect of such samples, which degrades the separation between open- and closed-set classes by adding ambiguity to the decision boundary between them. To resolve this issue, we utilize our proposed selection method, KSP, which selects the core representatives.…”
Section: Open-set Identification
confidence: 99%
“…Also, λ > 0 is a weighting factor balancing the sparsity and rank terms. There are many suggested approaches to tackling the nuclear norm in optimization problem (3), such as smooth rank approximation [11], [17], [18].…”
Section: Problem Model
confidence: 99%
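As a rough illustration, here is a sketch under assumptions: the Gaussian surrogate below is one common form of smooth rank approximation, not necessarily the exact one used in [11], [17], [18]. Such surrogates replace the nonsmooth rank, a count of nonzero singular values, with a differentiable function that tends to that count as the smoothing parameter shrinks:

```python
import numpy as np

def smooth_rank(X, delta=0.1):
    """Gaussian smooth rank surrogate:
    sum_i (1 - exp(-s_i**2 / (2 * delta**2))) over singular values s_i.
    Each term is near 1 when s_i >> delta and near 0 when s_i << delta,
    so the sum tends to rank(X) as delta -> 0, while staying smooth."""
    s = np.linalg.svd(X, compute_uv=False)
    return np.sum(1.0 - np.exp(-s**2 / (2.0 * delta**2)))

# Sanity check on a rank-2 matrix: the surrogate is close to 2 for small delta.
rng = np.random.default_rng(0)
X = rng.standard_normal((10, 2)) @ rng.standard_normal((2, 12))
print(smooth_rank(X, delta=0.01))  # ~2.0
```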