In the (special) smoothing spline problem one considers a variational problem with a quadratic data fidelity penalty and Laplacian regularization. Higher-order regularity can be obtained by replacing the Laplacian regularizer with a poly-Laplacian regularizer. The methodology is readily adapted to graphs, and here we consider graph poly-Laplacian regularization in a fully supervised, non-parametric, noise-corrupted regression problem. In particular, given a dataset $$\{x_i\}_{i=1}^n$$
and a set of noisy labels $$\{y_i\}_{i=1}^n\subset \mathbb {R}$$,
we let $$u_n{:}\{x_i\}_{i=1}^n\rightarrow \mathbb {R}$$
be the minimizer of an energy which consists of a data fidelity term and an appropriately scaled graph poly-Laplacian term. When $$y_i = g(x_i)+\xi _i$$, for iid noise $$\xi _i$$, and using the geometric random graph, we identify (with high probability) the rate of convergence of $$u_n$$ to $$g$$ in the large data limit $$n\rightarrow \infty $$. Furthermore, our rate is close to the known rate of convergence in the usual smoothing spline model.
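To make the energy concrete, a minimal sketch of the kind of functional involved, written with an unnormalized graph Laplacian $$\Delta _n$$ built on the geometric random graph, a regularity order $$s\ge 1$$ and a tuning parameter $$\tau >0$$ (the notation and scaling here are illustrative assumptions, not taken from the abstract), is
$$u_n \in \arg \min _{u:\{x_i\}_{i=1}^n\rightarrow \mathbb {R}} \; \frac{1}{n}\sum _{i=1}^n \bigl (u(x_i)-y_i\bigr )^2 + \tau \langle u,\Delta _n^s u\rangle ,$$
where the first term is the data fidelity and the second is the graph poly-Laplacian regularizer; $$s=1$$ recovers ordinary graph Laplacian regularization. In this sketch the energy is quadratic in $$u$$, so the minimizer solves the linear system $$(I + n\tau \Delta _n^s)u = y$$, where $$y=(y_1,\dots ,y_n)$$.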