2013
DOI: 10.1080/00949655.2011.647317

Knot selection for least-squares and penalized splines

Abstract: Two new stochastic search methods are proposed for optimizing the knot locations and/or smoothing parameters for least-squares or penalized splines. One of the methods is a golden-section-augmented blind search, while the other is a continuous genetic algorithm. Monte Carlo experiments indicate that the algorithms are very successful at producing knot locations and/or smoothing parameters that are near optimal in a squared error sense. Both algorithms are amenable to parallelization and have been implemented i…
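The abstract names a continuous genetic algorithm as one of the two proposed searches. A minimal sketch of that idea, assuming a cubic least-squares spline fitted with scipy's LSQUnivariateSpline and simple real-coded operators (tournament selection, blend crossover, Gaussian mutation); these operators, the steady-state replacement scheme, and all names are illustrative choices, not the paper's exact algorithm:

```python
import numpy as np
from scipy.interpolate import LSQUnivariateSpline

rng = np.random.default_rng(0)

def sse(knots, x, y, k=3):
    """Sum of squared errors of a least-squares spline with the given interior knots."""
    t = np.sort(np.asarray(knots))
    if t[0] <= x.min() or t[-1] >= x.max():
        return np.inf                      # interior knots must lie strictly inside the data range
    try:
        spl = LSQUnivariateSpline(x, y, t, k=k)
    except ValueError:                     # knot sequence violates the Schoenberg-Whitney conditions
        return np.inf
    return float(np.sum((spl(x) - y) ** 2))

def ga_knots(x, y, n_knots=5, pop_size=40, gens=200):
    """Real-coded (continuous) GA over knot locations; operators are illustrative."""
    lo, hi = x.min(), x.max()
    pop = rng.uniform(lo, hi, size=(pop_size, n_knots))
    fit = np.array([sse(ind, x, y) for ind in pop])
    for _ in range(gens):
        # tournament selection of two parents
        a, b = rng.integers(pop_size, size=2)
        pa = pop[a] if fit[a] < fit[b] else pop[b]
        a, b = rng.integers(pop_size, size=2)
        pb = pop[a] if fit[a] < fit[b] else pop[b]
        # blend crossover followed by Gaussian mutation
        child = 0.5 * (pa + pb) + rng.normal(0.0, 0.02 * (hi - lo), n_knots)
        child = np.clip(child, lo + 1e-6, hi - 1e-6)
        f = sse(child, x, y)
        worst = int(np.argmax(fit))        # steady-state replacement of the worst member
        if f < fit[worst]:
            pop[worst], fit[worst] = child, f
    best = int(np.argmin(fit))
    return np.sort(pop[best]), float(fit[best])

# toy usage: place knots for a noisy sine curve on [0, 1]
x = np.linspace(0.0, 1.0, 300)
y = np.sin(2 * np.pi * x) + rng.normal(0.0, 0.1, x.size)
knots, best_sse = ga_knots(x, y)
print("knots:", knots, "SSE:", best_sse)
```

The abstract also covers smoothing parameters for penalized splines; the same search loop would apply if the fitness function were swapped for a penalized criterion, though that variant is not shown here.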

Cited by 52 publications (42 citation statements)
References 40 publications
“…The approach used in this application was to approximate the step function via a spline of degree 0. Genetic algorithm was used to find the optimal knot locations. The optimal locations are those which minimize the residual sums of squares between the approximation and the data collected.…”
Section: Regularization Approaches for Individual Missions (mentioning)
confidence: 94%
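A degree-0 spline is a piecewise-constant function whose pieces are delimited by the knots, and for fixed knots its least-squares fit is simply the per-segment mean. A minimal sketch under that reading (the function and variable names are illustrative, not taken from the cited work):

```python
import numpy as np

def step_fit(x, y, knots):
    """Least-squares degree-0 spline: the fitted value on each segment is the segment mean."""
    edges = np.sort(np.asarray(knots))
    seg = np.digitize(x, edges)                      # segment index 0..len(knots) for each point
    means = np.array([y[seg == s].mean() if np.any(seg == s) else 0.0
                      for s in range(len(edges) + 1)])
    fitted = means[seg]
    rss = float(np.sum((y - fitted) ** 2))           # the criterion a knot search would minimize
    return fitted, rss
```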
“…Genetic algorithm was used to find the optimal knot locations. 5 The optimal locations are those which minimize the residual sums of squares between the approximation and the data collected. Denoting the covariate to be approximated by X_{ki}, the residual sum of squares criterion for r knots for the kth factor on the ith mission is given by the following:…”
Section: Change Point Identification (mentioning)
confidence: 99%
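The excerpt ends before the formula itself, so the exact expression is not reproduced here. Purely as an illustration, a residual-sum-of-squares criterion for a piecewise-constant approximation with r knots could be written as follows; the segment sets, means, and indexing are assumed notation, not necessarily the citing paper's:

```latex
% Illustrative RSS criterion for a degree-0 spline with interior knots kappa_1 < ... < kappa_r
\[
  \mathrm{RSS}_{ki}(\kappa_1,\dots,\kappa_r)
    \;=\; \sum_{j=0}^{r} \sum_{t \in S_j} \bigl( X_{kit} - \bar{X}_{kij} \bigr)^2,
  \qquad
  \bar{X}_{kij} \;=\; \frac{1}{|S_j|} \sum_{t \in S_j} X_{kit},
\]
% where S_j = { t : kappa_j <= t < kappa_{j+1} } and kappa_0, kappa_{r+1} denote the endpoints
% of the observation window for mission i.
```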
“…In addition, the knots need to be placed in optimal positions for the fit to be accurate. Common knot selection approaches are based on either fit statistics or cross-validation. To find this set via fit statistics, one can place many equally spaced knots and then find the knot sequence that minimizes a goodness of fit statistic, such as AIC or BIC.…”
Section: Methods (mentioning)
confidence: 99%
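A minimal sketch of the fit-statistic route just described: place candidate sets of equally spaced interior knots, fit a least-squares cubic spline for each, and keep the sequence with the lowest AIC. The Gaussian AIC formula, the use of scipy's LSQUnivariateSpline, and the function name are illustrative assumptions:

```python
import numpy as np
from scipy.interpolate import LSQUnivariateSpline

def aic_knot_search(x, y, max_knots=20, k=3):
    """Pick the number of equally spaced interior knots that minimizes a Gaussian AIC."""
    n = len(x)
    best_aic, best_knots = np.inf, None
    for m in range(1, max_knots + 1):
        t = np.linspace(x.min(), x.max(), m + 2)[1:-1]   # m equally spaced interior knots
        try:
            spl = LSQUnivariateSpline(x, y, t, k=k)
        except ValueError:                               # invalid knot sequence for these data
            continue
        rss = float(np.sum((spl(x) - y) ** 2))
        p = m + k + 1                                    # number of B-spline coefficients
        aic = n * np.log(rss / n) + 2 * p                # Gaussian AIC up to an additive constant
        if aic < best_aic:
            best_aic, best_knots = aic, t
    return best_knots, best_aic
```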
“…Common knot selection approaches are based on either fit statistics or cross-validation. 9 To find this set via fit statistics, one can place many equally spaced knots and then find the knot sequence that minimizes a goodness of fit statistic, such as AIC or BIC. For cross-validation, a subset of data is fit to each specified knot sequence, and the knot sequence that minimizes some sort of fit criterion, such as the mean squared error, is selected.…”
Section: Model (mentioning)
confidence: 99%
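And a matching sketch of the cross-validation route: each candidate knot sequence is fit on a training subset and scored by mean squared error on a held-out subset, and the sequence with the lowest held-out error is kept. The single random split (rather than k-fold) and all names are illustrative choices:

```python
import numpy as np
from scipy.interpolate import LSQUnivariateSpline

def cv_knot_search(x, y, candidate_knots, k=3, val_frac=0.25, seed=0):
    """Choose among candidate knot sequences by held-out mean squared error."""
    rng = np.random.default_rng(seed)
    val = rng.random(len(x)) < val_frac                  # random validation mask
    xtr, ytr = x[~val], y[~val]
    xva, yva = x[val], y[val]
    order = np.argsort(xtr)                              # LSQUnivariateSpline needs increasing x
    xtr, ytr = xtr[order], ytr[order]
    best_mse, best_knots = np.inf, None
    for t in candidate_knots:
        try:
            spl = LSQUnivariateSpline(xtr, ytr, np.sort(t), k=k)
        except ValueError:                               # invalid knot sequence for the training data
            continue
        mse = float(np.mean((spl(xva) - yva) ** 2))      # held-out fit criterion
        if mse < best_mse:
            best_mse, best_knots = mse, np.sort(t)
    return best_knots, best_mse
```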
“…For example, [19], [20], [21]. A stochastic search method was proposed in [22] for optimizing the knot locations by using a continuous genetic algorithm. Bayesian methods for fitting free-knot splines have been considered in some literatures [23], [24], [25], and [26].…”
Section: Introduction (mentioning)
confidence: 99%