2023
DOI: 10.1111/anzs.12387

On the selection of predictors by using greedy algorithms and information theoretic criteria

Abstract: We discuss the use of the following greedy algorithms in the prediction of multivariate time series: Matching Pursuit Algorithm (MPA), Orthogonal Matching Pursuit (OMP), Relaxed Matching Pursuit (RMP), Frank-Wolfe Algorithm (FWA) and Constrained Matching Pursuit (CMP). The last two are known to be solvers for the lasso problem. Some of the algorithms are well-known (e.g. OMP), while others are less popular (e.g. RMP). We provide a unified presentation of all the algorithms, and evaluate their computati…
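The algorithms named in the abstract share the same greedy template, and OMP is the most familiar instance. As a rough illustration only (a textbook sketch in Python, not the paper's implementation; all names are hypothetical), each iteration adds the predictor most correlated with the current residual and then refits all active coefficients by least squares:

```python
# Minimal sketch of Orthogonal Matching Pursuit (OMP); illustrative only,
# not the authors' code. Variable names are hypothetical.
import numpy as np

def omp(X, y, max_terms):
    """Greedily add the column of X most correlated with the residual,
    then refit ALL selected coefficients by least squares (the full refit
    is what distinguishes OMP from the plain Matching Pursuit Algorithm)."""
    support = []                     # indices of selected predictors
    residual = y.copy()
    for _ in range(max_terms):
        scores = np.abs(X.T @ residual)
        scores[support] = -np.inf    # never reselect a column
        support.append(int(np.argmax(scores)))
        # least-squares refit on the whole active set
        beta, *_ = np.linalg.lstsq(X[:, support], y, rcond=None)
        residual = y - X[:, support] @ beta
    return support, beta

# Toy usage: 100 observations, 20 candidate predictors, 3 truly active.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 20))
y = X[:, [2, 7, 11]] @ np.array([1.5, -2.0, 1.0]) + 0.1 * rng.standard_normal(100)
print(omp(X, y, max_terms=3)[0])    # typically recovers [2, 7, 11]
```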

Cited by 1 publication (12 citation statements: 0 supporting, 12 mentioning, 0 contrasting).
References 37 publications (128 reference statements).
“…The main difference is that all the entries of β that correspond to the columns of X selected at the current and past iterations are updated by least squares, hence the IT criteria for Gaussian linear regression can be used. For example, in [8], 12 different IT criteria have been employed for selecting the best model from the candidates yielded by OMP.…”
Section: Background and Related Work, 1.2.1 Greedy Algorithms (mentioning)
Confidence: 99%
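The passage above describes ranking the nested candidate models produced by OMP with information-theoretic (IT) criteria. The 12 criteria used in [8] are not reproduced here; as a hedged stand-in, the sketch below scores each candidate support with BIC for Gaussian linear regression, BIC = n ln(RSS/n) + k ln(n), and keeps the minimiser:

```python
# Hedged sketch: choosing among nested greedy candidates with BIC, one
# example of an IT criterion for Gaussian linear regression. The actual
# criteria employed in [8] are not reproduced here.
import numpy as np

def bic_gaussian(rss, n, k):
    # BIC for a linear-Gaussian model with k regressors:
    #   n * ln(RSS / n) + k * ln(n)
    return n * np.log(rss / n) + k * np.log(n)

def select_by_bic(X, y, supports):
    """`supports` is the nested sequence of active sets recorded along a
    greedy run (support after 1 step, after 2 steps, ...)."""
    n = X.shape[0]
    best = None
    for S in supports:
        beta, *_ = np.linalg.lstsq(X[:, S], y, rcond=None)
        rss = float(np.sum((y - X[:, S] @ beta) ** 2))
        crit = bic_gaussian(rss, n, len(S))
        if best is None or crit < best[0]:
            best = (crit, list(S))
    return best   # (BIC value, selected support)
```

The `supports` argument could, for instance, be the successive active sets of the OMP sketch above; because every candidate is refitted by least squares, the Gaussian-regression form of the criterion applies directly, which is the point the quoted passage makes.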
“…Relying on the idea of the complementary pairs from [15], we consider the subset Ō = O \ O′ [see (7) and (8) for the definitions of O and O′]. It is straightforward to obtain another estimator, S_{|Ō|}, for the subset S of 'signal' variables by replacing O with Ō in (9) and then applying Lasso for solving the optimization problem.…”
Section: Stability-based Selection (mentioning)
Confidence: 99%
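The specific sets O, Ō and the estimator in (7)-(9) belong to the citing paper and are not reconstructed here. As a generic illustration of the underlying idea only (subsample-based stability selection with the lasso, in the Meinshausen-Bühlmann spirit; all parameters are assumptions), consider:

```python
# Hedged sketch of stability-based selection with the lasso; a generic
# stand-in, not the citing paper's construction of O, O' and S_{|O|}.
import numpy as np
from sklearn.linear_model import Lasso

def stability_selection(X, y, alpha=0.1, n_subsamples=50, threshold=0.6, seed=0):
    """Fit the lasso on many half-subsamples of the rows and keep the
    predictors whose selection frequency exceeds `threshold`."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    counts = np.zeros(p)
    for _ in range(n_subsamples):
        idx = rng.choice(n, size=n // 2, replace=False)
        model = Lasso(alpha=alpha, max_iter=10_000).fit(X[idx], y[idx])
        counts += (np.abs(model.coef_) > 1e-10)   # which coefficients survived
    freq = counts / n_subsamples
    return np.flatnonzero(freq >= threshold), freq
```

The threshold trades false positives against misses: variables selected in most subsamples are declared 'signal', which is the same stability principle the quoted passage exploits via complementary pairs.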