Recent investigations into the error analysis of kernel regularized pairwise learning have initiated theoretical research on pairwise reproducing kernel Hilbert spaces (PRKHSs). In the present paper, we provide a method of constructing PRKHSs with classical Jacobi orthogonal polynomials. The performance of kernel regularized online pairwise regression learning algorithms based on a quadratic loss function is investigated. Applying convex analysis and Rademacher complexity techniques, we provide explicit bounds for the generalization error. It is shown that the convergence rate can be greatly improved by adjusting the scale parameters in the loss function.

1. Introduction. In recent years, online learning algorithms have attracted the attention of many researchers in statistical learning theory (see, e.g., [21, 58, 71, 73, 74] and the references therein). In the present paper, we investigate the convergence of the kernel-based regularized online pairwise learning algorithm associated with the quadratic loss.

1.1. Online learning algorithms. Let $X$ be a given compact set in the $d$-dimensional Euclidean space $\mathbb{R}^d$ and let $Y \subset \mathbb{R}$. Let $\rho$ be a fixed but unknown probability distribution on $Z = X \times Y$, which yields a marginal distribution $\rho_X$ on $X$ and a conditional distribution $\rho(\cdot \mid x)$ at each $x \in X$. We denote by $\{z_t = (x_t, y_t)\}_{t=1}^{T}$ a sample drawn i.i.d. (independently and identically distributed) according to the distribution $\rho$. The aim of regression learning is to find a predictor $f : X \to \mathbb{R}$ from a hypothesis space such that $f(x)$ is a "good" approximation of $y$. Let $V(r) : \mathbb{R} \to \mathbb{R}_+$