2018
DOI: 10.1016/j.amc.2017.10.033

The convergence rate of semi-supervised regression with quadratic loss

Cited by 9 publications (5 citation statements) · References 23 publications
“…where we assume that the K-functional satisfies certain decay; for example, we assume [3][4][5][6]25,35…”

Section: The Main Results
confidence: 99%
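For context, the decay condition mentioned in these quotes is typically stated as follows; this is a standard formulation in learning theory, and the exact constants and exponent used in the cited works may differ:

```latex
% K-functional between L^2_\rho and the RKHS \mathcal{H}_K (standard definition)
\[
  \mathcal{K}(f, t) \;=\; \inf_{g \in \mathcal{H}_K}
     \bigl\{ \|f - g\|_{\rho} + t\,\|g\|_{K} \bigr\}, \qquad t > 0,
\]
% with a typical polynomial-decay assumption, for some 0 < \beta \le 1:
\[
  \mathcal{K}(f, t) \;\le\; c_0\, t^{\beta}.
\]
```

Under such a decay assumption, approximation-error bounds (and hence learning rates) can be expressed in terms of the exponent β.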
“…In learning theory, we often assume that the K-functionals satisfy certain decay; for example, we assume (see e.g. [53,55,56,62]).…”

Section: By (23) We Have
confidence: 99%
“…On the other hand, we find that the function √(1 + t²), t ∈ R, plays an important role in constructing shape-preserving quasi-interpolation and in solving partial differential equations with meshfree methods, owing to its strong nonlinearity and convexity (see [14,25,66,67]); it has been used in [56] as a loss function, where it shows some advantages in forming semi-supervised learning algorithms. Encouraged by this research, we investigate how these parameters influence the learning rates of online pairwise learning.…”

confidence: 99%
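The convexity claimed for √(1 + t²) in the quote above is easy to verify: its second derivative is (1 + t²)^(−3/2), which is strictly positive for all real t. A minimal numerical sketch (illustrative only, not code from any of the cited works):

```python
import math

def phi(t):
    # sqrt(1 + t^2): a smooth, strictly convex surrogate for |t|
    return math.sqrt(1.0 + t * t)

def phi_dd(t):
    # exact second derivative: (1 + t^2)^(-3/2), positive for all real t
    return (1.0 + t * t) ** -1.5

# Sample a grid on [-10, 10] and check both criteria.
ts = [i / 100.0 for i in range(-1000, 1001)]
assert all(phi_dd(t) > 0 for t in ts)
# Midpoint convexity: phi((a+b)/2) <= (phi(a) + phi(b)) / 2
assert all(phi((a + b) / 2) <= (phi(a) + phi(b)) / 2 + 1e-12
           for a, b in zip(ts, ts[1:]))
print("sqrt(1+t^2) passes convexity checks on [-10, 10]")
```

Strict convexity is what makes this function attractive as a loss: the resulting empirical risk has a unique minimizer while remaining smooth everywhere, unlike |t|.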
“…Many mathematicians have paid attention to this field (see e.g. [18][19][21][25][26][27][28]).…”

confidence: 99%