2006
DOI: 10.1007/0-387-28395-1
Duality for Nonconvex Approximation and Optimization

Cited by 31 publications (11 citation statements)
References 127 publications (245 reference statements)
“…(11) becomes smaller, which implies that the rate function can only become larger. Thus, since (r, J) was arbitrary, (15) implies R(J) ≥ M(J).…”
Section: A Protocol for the Quantum Information Bottleneck Method (mentioning)
confidence: 99%
“…(11) is not of convex type due to the sign of the inequality in the constraint (≥ rather than ≤). Just as in the classical case, it is a so-called "reverse convex problem" (see, e.g., [15]), for which many of the standard results, such as strong duality or convexity of the resulting function in J, are not known to hold or apply only in a weaker form. Nevertheless, we present numerical evidence for convexity of the RHS in Appendix B.…”
Section: A Protocol for the Quantum Information Bottleneck Method (mentioning)
confidence: 99%
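The non-convexity a reverse convex constraint introduces can be seen in a toy example (the function and threshold below are illustrative, not taken from the cited text): with g convex, requiring g(x) ≥ c instead of g(x) ≤ c can make the feasible set disconnected.

```python
# Toy reverse convex problem (illustrative): minimize x^2 subject to
# |x| >= 1. The constraint function g(x) = |x| is convex, but the
# ">= 1" direction makes the feasible set {x : |x| >= 1} non-convex:
# it is two disjoint rays.

def feasible(x, c=1.0):
    """Reverse convex constraint: g(x) = |x| must be at least c."""
    return abs(x) >= c

# Two feasible points whose midpoint is infeasible -> non-convex set.
assert feasible(1.0) and feasible(-1.0)
assert not feasible(0.5 * (1.0 + (-1.0)))

# On this feasible set, x^2 attains its global minimum at both x = -1
# and x = 1, i.e. the minimizer is not unique.
print(min(x * x for x in (-1.0, 1.0)))  # 1.0
```

This is why standard convex duality results cannot be applied directly to such problems.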
“…Some of our results are inspired by, and bear some analogy with, those known from the theory of best approximation in normed linear spaces by elements of linear subspaces (see e.g. [21]), reformulated in terms of the "semi-scalar product" (see e.g. [18]).…”
Section: Introduction (mentioning)
confidence: 92%
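A minimal sketch of best approximation by a linear subspace, in the Euclidean (Hilbert-space) special case only — the cited works treat general normed spaces via semi-scalar products, and the matrix and vector below are made-up data:

```python
import numpy as np

# Best approximation of y by elements of the linear subspace span(A),
# in the Euclidean norm. In this inner-product setting the nearest
# point is the orthogonal projection, computed here via least squares.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])        # columns span a 2-D subspace of R^3
y = np.array([1.0, 2.0, 0.0])

coef, *_ = np.linalg.lstsq(A, y, rcond=None)
best = A @ coef                   # nearest point to y in span(A)
residual = y - best

# Characterization of the best approximation in an inner-product space:
# the residual is orthogonal to every element of the subspace.
print(np.allclose(A.T @ residual, 0.0))  # True
```

In a general normed space the orthogonality condition is replaced by a semi-scalar-product characterization, which is the reformulation the quoted statement refers to.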
“…The key idea of this framework is to globally approximate J using a sequence of quadratic functions [45]. Taking the subproblem over x as an example, after having found x_k, we can construct a quadratic function ψ(x; x_k) to upper bound J(x) such that the following conditions hold:…”
Section: Convergence Analysis (mentioning)
confidence: 99%
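The quadratic-upper-bound (majorize-minimize) idea in the quoted statement can be sketched as follows; the particular objective J and Lipschitz constant L are illustrative assumptions, not the setup of [45]:

```python
import math

# Majorize-minimize sketch (illustrative objective, not from [45]):
# J(x) = log(1 + x^2) is non-convex, and its second derivative is
# bounded above by L = 2, so
#     psi(x; xk) = J(xk) + J'(xk) * (x - xk) + (L / 2) * (x - xk)**2
# is a quadratic surrogate with psi(xk; xk) = J(xk) and psi >= J.

L = 2.0
def J(x):  return math.log(1.0 + x * x)
def dJ(x): return 2.0 * x / (1.0 + x * x)

def mm_step(xk):
    # Minimizing the quadratic surrogate in closed form reduces to a
    # gradient step with step size 1 / L.
    return xk - dJ(xk) / L

x = 1.0
for _ in range(20):
    x_next = mm_step(x)
    # Because the surrogate touches J at x and upper-bounds it
    # everywhere, each step cannot increase J (monotone descent).
    assert J(x_next) <= J(x) + 1e-12
    x = x_next

print(abs(x) < 1e-6)  # True: iterates approach the minimizer x* = 0
```

The two surrogate conditions (tangency at x_k and global upper bound) are exactly what guarantee the monotone descent asserted inside the loop.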
“…In non-convex optimization [45], a common assumption is that the non-convex function J(·) is "locally convex" around its "local" minimum. Suppose we assume that the initialization x_0 is "good", in that it is situated sufficiently close to a local minimizer x*.…”
(mentioning)
confidence: 99%