Merge and chop in the computation for isotonic regressions
2008
DOI: 10.1016/j.jspi.2007.12.004

Cited by 5 publications (2 citation statements)
References 7 publications
“…Recently, there has been a great deal of interest in developing new algorithms for isotonic regression [19], [20] and in generalizing the basic shape-constrained paradigms for developing statistically more sophisticated formulations that include isotonic regression constraints [21]- [23].…”
Section: Related Work
confidence: 99%
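
The statement above surveys algorithmic work on isotonic regression. As background, here is a minimal sketch of the classic pool-adjacent-violators algorithm for the unweighted L2 isotonic regression problem; the function name and the block-merging representation are illustrative assumptions, not the merge-and-chop procedure of the cited paper.

```python
# Minimal sketch of pool-adjacent-violators (PAVA) for unweighted L2
# isotonic regression; an illustrative assumption, not the cited method.

def isotonic_regression(y):
    """Return the nondecreasing fit minimizing sum (y_i - x_i)^2."""
    # Each block stores [sum of values, count]; adjacent violators are merged.
    blocks = []
    for v in y:
        blocks.append([v, 1])
        # Merge while the previous block's mean exceeds the last block's mean.
        while len(blocks) > 1 and blocks[-2][0] * blocks[-1][1] > blocks[-1][0] * blocks[-2][1]:
            total, count = blocks.pop()
            blocks[-1][0] += total
            blocks[-1][1] += count
    # Expand block means back to the original length.
    fit = []
    for total, count in blocks:
        fit.extend([total / count] * count)
    return fit


print(isotonic_regression([3.0, 1.0, 2.0, 5.0, 4.0]))
# [2.0, 2.0, 2.0, 4.5, 4.5]
```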
“…5). The cases where the breakpoint count exceeds the limit are avoided by not storing their state vectors in the list (lines 17-20). Once each list is fully generated, to ensure local optimality, we need to do a local minimum search within each list on the ce values of the vectors having the same slope and number of breakpoints, and replace them with the minimum value.…”
Section: A. Limiting the Number of Level Sets
confidence: 99%
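
The passage above describes pruning lists of state vectors: vectors whose breakpoint count exceeds the limit are never stored, and among vectors with the same slope and breakpoint count only the one with the smallest ce (cumulative error) value is retained. Below is a hedged sketch of such a filtering step, assuming a simple (slope, breakpoints, ce) record; the names `State`, `prune`, and `max_breakpoints` are hypothetical and not taken from the citing paper.

```python
# Hypothetical illustration of the pruning step described in the quoted
# passage; the record layout and names are assumptions for illustration.

from collections import namedtuple

State = namedtuple("State", ["slope", "breakpoints", "ce"])

def prune(states, max_breakpoints):
    """Keep only locally optimal state vectors within one list."""
    best = {}
    for s in states:
        if s.breakpoints > max_breakpoints:
            continue  # over-limit vectors are simply not stored
        key = (s.slope, s.breakpoints)
        # Local minimum search: keep the smallest ce per (slope, breakpoints).
        if key not in best or s.ce < best[key].ce:
            best[key] = s
    return list(best.values())


states = [State(1.0, 2, 5.3), State(1.0, 2, 4.1), State(0.5, 4, 2.0)]
print(prune(states, max_breakpoints=3))
# [State(slope=1.0, breakpoints=2, ce=4.1)]
```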