2019
DOI: 10.1088/1361-6420/ab129e

Nyström subsampling method for coefficient-based regularized regression

Abstract: Kernel methods are attractive in data analysis because they can model nonlinear similarities between observations and provide rich representations, both of which are useful for regression problems in general domains. Despite their popularity, they suffer from two primary inherent drawbacks. One is the positive-definiteness requirement on the kernel function, which greatly restricts their application in some real data analyses. The other is their poor scalability in massive data scen…
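For orientation, coefficient-based regularized regression learns a function of the form f(x) = sum_j a_j k(x, x_j) by penalizing the coefficient vector a directly rather than the RKHS norm, which is why the similarity function k need not be positive definite. Below is a minimal sketch in Python, assuming a squared loss with an l2 penalty on the coefficients; the function name and the indefinite tanh similarity are illustrative assumptions, not the paper's exact setup.

import numpy as np

def coefficient_based_ridge(K, y, lam):
    """Coefficient-based regularized least squares.

    Minimizes (1/n) * ||y - K a||^2 + lam * ||a||^2 over the coefficient
    vector a, where K[i, j] = k(x_i, x_j). Because the penalty acts on a
    directly (not on the RKHS norm), k need not be positive definite.
    Normal equations: (K.T K / n + lam * I) a = K.T y / n.
    """
    n = len(y)
    A = K.T @ K / n + lam * np.eye(n)
    return np.linalg.solve(A, K.T @ y / n)

# Toy data with an indefinite similarity (a tanh "kernel"), which a
# standard RKHS method could not use directly.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 1))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.normal(size=200)
K = np.tanh(X @ X.T + 1.0)   # not guaranteed positive definite
alpha = coefficient_based_ridge(K, y, lam=1e-2)
y_hat = K @ alpha            # in-sample predictions
print("training RMSE:", np.sqrt(np.mean((y - y_hat) ** 2)))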

Cited by 5 publications (2 citation statements) · References 37 publications
“…Note that initial theoretical work on the Nyström method focused on bounding the discrepancy between a given Gram matrix and its sub-sampled version [12,17], but these results do not provide direct information on the statistical behavior of downstream learning tasks. Recent research has shifted toward studying the impact of Nyström approximations on specific learning tasks [23,26,28,32,38]. For example, Rudi et al [32] performed the first analysis for Nyström kernel ridge regression, demonstrating that the Nyström method can maintain guaranteed learning rates.…”
Section: Related Work
confidence: 99%
“…For example, Rudi et al [32] performed the first analysis for Nyström kernel ridge regression, demonstrating that the Nyström method can maintain guaranteed learning rates. Lu et al [26] expanded this work by considering the misspecified case, while Ma et al [28] explored coefficient-based kernel regression. However, most of these results have been obtained within the context of supervised learning, leaving the understanding of the unsupervised setting relatively limited.…”
Section: Related Work
confidence: 99%
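The discrepancy bounds mentioned in these statements measure how well a small set of sampled landmark columns reconstructs the full Gram matrix. A minimal sketch of that construction, assuming uniform subsampling and an RBF kernel (illustrative only; none of the names below come from the cited papers):

import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """Gaussian (RBF) kernel matrix between rows of X and rows of Y."""
    sq = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def nystrom_blocks(X, m, gamma=1.0, seed=0):
    """Uniformly subsample m landmarks and return the Nystrom blocks.

    The n x n Gram matrix K is approximated by
    K_nm @ pinv(K_mm) @ K_nm.T without ever forming K in full.
    """
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(X), size=m, replace=False)
    K_nm = rbf_kernel(X, X[idx], gamma)       # n x m cross-kernel block
    K_mm = rbf_kernel(X[idx], X[idx], gamma)  # m x m landmark block
    return K_nm, K_mm

# Example: compare the full Gram matrix with its rank-m approximation.
n, m = 500, 50
X = np.random.default_rng(1).normal(size=(n, 2))
K = rbf_kernel(X, X)
K_nm, K_mm = nystrom_blocks(X, m)
K_approx = K_nm @ np.linalg.pinv(K_mm) @ K_nm.T
print("relative error:", np.linalg.norm(K - K_approx) / np.linalg.norm(K))

In Nyström kernel ridge regression as analyzed in [32], the solver works with K_nm and K_mm alone rather than the full n x n matrix, which is the source of the scalability gains the abstract alludes to.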