2014
DOI: 10.1080/00207160.2013.854881
An inexact continuation accelerated proximal gradient algorithm for low n-rank tensor recovery

Abstract: The low n-rank tensor recovery problem is an interesting extension of compressed sensing. It consists of finding a tensor of minimum n-rank subject to linear equality constraints and arises in many areas, such as data mining, machine learning, and computer vision. In this paper, an operator splitting technique and a convex relaxation technique are adapted to transform the low n-rank tensor recovery problem into a convex, unconstrained optimization problem, in which the objective function is t…
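The convex relaxation described in the abstract is commonly realized by replacing each mode-n rank with the nuclear norm of the corresponding mode-n unfolding, and the proximal gradient step then reduces to singular value thresholding on each unfolding. The following is a minimal, hypothetical sketch of one accelerated proximal gradient step under that relaxation; the function names (`apg_step`, `svt`) and the simple averaging of per-mode proximal terms are illustrative assumptions, not the paper's exact algorithm:

```python
import numpy as np

def unfold(T, mode):
    # Mode-n unfolding: bring axis `mode` to the front, flatten the rest.
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def fold(M, mode, shape):
    # Inverse of unfold: reshape and move the mode axis back in place.
    full = [shape[mode]] + [s for i, s in enumerate(shape) if i != mode]
    return np.moveaxis(M.reshape(full), 0, mode)

def svt(M, tau):
    # Singular value thresholding = prox of tau * (nuclear norm).
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def apg_step(X, Y, b, mask, lam, L, t):
    # One accelerated proximal gradient step for
    #   min_X  lam * sum_n ||X_(n)||_*  +  0.5 * ||mask*X - b||^2,
    # where b holds the observed entries (zero elsewhere).
    grad = mask * Y - b                 # gradient of the data term at Y
    G = Y - grad / L                    # gradient step with step size 1/L
    N = X.ndim
    # Averaging the per-mode SVT results is a simple splitting heuristic
    # (an assumption here, not necessarily the paper's splitting scheme).
    Z = sum(fold(svt(unfold(G, n), lam / L), n, X.shape)
            for n in range(N)) / N
    t_new = (1.0 + np.sqrt(1.0 + 4.0 * t ** 2)) / 2.0   # Nesterov momentum
    Y_new = Z + ((t - 1.0) / t_new) * (Z - X)
    return Z, Y_new, t_new
```

Iterating `apg_step` from `X = Y = 0` with `t = 1` yields a basic tensor-completion loop; an inexact variant would additionally truncate the SVDs or solve the per-mode subproblems approximately.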

Cited by 1 publication (1 citation statement)
References 18 publications
“…It has been reported that the inexact scheme can achieve faster computational speeds and better local minima than can the exact scheme [22,24]. Moreover, the inexact scheme has been employed with APG for many problems in broad areas, such as quadratic semidefinite programming [25], maximal entropy [26], and tensor recovery [27]. Therefore, the inexact scheme and APG can naturally be extended to the NCP problem.…”
Section: Introduction
confidence: 99%