2018
DOI: 10.1007/978-3-319-93031-2_39
Algorithms for Sparse k-Monotone Regression

Cited by 1 publication (3 citation statements)
References 38 publications
“…coordinate descent or proximal gradient descent) only possess sublinear convergence in general, or linear convergence under certain conditions [27]. On the other hand, the k-FWA and k-PAVA examined in [17] are not optimal.…”
Section: Discussion (mentioning)
confidence: 99%
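The excerpt above contrasts first-order methods (coordinate descent, proximal gradient) with PAVA-type algorithms. For context, here is a minimal sketch of the classical pool-adjacent-violators algorithm (PAVA) for ordinary isotonic regression under squared loss; this is the standard 1-monotone case, not the k-monotone variants (k-FWA, k-PAVA) examined in [17]:

```python
def pava(y):
    """Return the non-decreasing fit minimizing sum_i (y_i - x_i)^2.

    Classical pool-adjacent-violators sketch: maintain blocks of pooled
    values and merge adjacent blocks whenever their means would violate
    monotonicity.
    """
    blocks = []  # each block is [sum of values, count]
    for v in y:
        blocks.append([v, 1])
        # Merge backwards while the previous block's mean exceeds the
        # last block's mean (compare cross-multiplied to avoid division).
        while len(blocks) > 1 and blocks[-2][0] * blocks[-1][1] > blocks[-1][0] * blocks[-2][1]:
            total, count = blocks.pop()
            blocks[-1][0] += total
            blocks[-1][1] += count
    # Expand each block back into per-point fitted values (block means).
    fit = []
    for total, count in blocks:
        fit.extend([total / count] * count)
    return fit
```

For example, `pava([3, 1, 2])` pools all three points into one block with mean 2, the closest non-decreasing sequence to the input.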
“…k-monotone regression is the extension of monotone regression to the general case of k-monotonicity [17]. Both isotone and k-monotone regression have applications in many fields, including non-parametric mathematical statistics [3,18], empirical data smoothing [19–21], shape-preserving dynamic programming [22], and shape-preserving approximation [23–25].…”
(mentioning)
confidence: 99%
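The excerpt above refers to k-monotonicity without restating its definition. A common discrete formulation, assumed here for illustration (not quoted from [17]), is that all k-th order forward differences of the sequence are non-negative, so k = 1 gives a non-decreasing sequence and k = 2 a convex one:

```python
def is_k_monotone(y, k):
    """Check whether all k-th order forward differences of y are >= 0.

    Illustrative definition of discrete k-monotonicity: k = 1 means
    non-decreasing, k = 2 means convex, and so on.
    """
    d = list(y)
    for _ in range(k):
        # One round of forward differencing: d[i+1] - d[i].
        d = [b - a for a, b in zip(d, d[1:])]
    return all(v >= 0 for v in d)
```

For example, the sequence 1, 2, 4, 8 is both 1-monotone (increasing) and 2-monotone (convex), while 3, 1, 2 is neither.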