We consider the problem of estimating the regression function $f$ in the regression model $y_i = f(x_i) + \varepsilon_i$, where $f$ is assumed to lie in a reproducing kernel Hilbert space (RKHS) and the errors are multivariate normal. This model has wide-ranging applications, from regression with a functional covariate to (naive) classification. The main contribution of this paper is a proposal for an objective prior for the regression function $f$, defined as the distribution maximizing entropy subject to a suitable constraint based on the Fisher information on the regression function. The prior, which we call the I-prior, is Gaussian with covariance kernel proportional to the Fisher information and mean chosen a priori (e.g., zero). The I-prior has the intuitively appealing property that the more information is available about a linear functional of the regression function, the larger its prior variance and, broadly speaking, the less influential the prior is on the posterior. The I-prior methodology can be used as a principled alternative to Tikhonov regularization, which suffers from well-known theoretical problems that we briefly review. We describe in some detail the case in which the regression function lies in the multidimensional fractional Brownian motion RKHS, for which the I-prior methodology has particular appeal. Analyses of several real data sets and a small-scale simulation study show competitive performance of the I-prior methodology, which is implemented in the R package iprior (Jamil, 2017).
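The construction described above can be sketched numerically. The following is a minimal illustration, not the iprior package's API: it assumes a one-dimensional covariate, a fractional Brownian motion kernel with Hurst coefficient $\gamma$, i.i.d. errors with precision $\psi$, and prior mean zero. Under these assumptions the Fisher information on the vector of function values is proportional to the squared Gram matrix, so the I-prior covariance is taken as $\psi K K$ and the posterior mean of $f$ has a closed form. The helper names (`fbm_kernel`, `iprior_fit`) and the fixed choices $\psi = 1$, $\gamma = 0.5$ are hypothetical; in the paper these hyperparameters would be estimated rather than fixed.

```python
import numpy as np

def fbm_kernel(s, t, gamma=0.5):
    # Fractional Brownian motion covariance with Hurst coefficient gamma:
    # h(s, t) = 0.5 * (|s|^{2 gamma} + |t|^{2 gamma} - |s - t|^{2 gamma})
    return 0.5 * (np.abs(s) ** (2 * gamma) + np.abs(t) ** (2 * gamma)
                  - np.abs(s - t) ** (2 * gamma))

def iprior_fit(x, y, psi=1.0, gamma=0.5):
    # Hypothetical sketch of an I-prior fit (NOT the iprior package API).
    n = len(x)
    K = fbm_kernel(x[:, None], x[None, :], gamma)  # Gram matrix of the fBm RKHS
    # I-prior on (f(x_1), ..., f(x_n)): N(0, psi * K K), i.e. covariance
    # proportional to the Fisher information; errors are N(0, 1/psi).
    prior_cov = psi * K @ K
    Vy = prior_cov + np.eye(n) / psi               # marginal covariance of y
    return prior_cov @ np.linalg.solve(Vy, y)      # posterior mean of f at x

# Toy example: noisy observations of a smooth function on [0, 1].
x = np.linspace(0.0, 1.0, 50)
rng = np.random.default_rng(0)
y = np.sin(2 * np.pi * x) + 0.1 * rng.normal(size=50)
fhat = iprior_fit(x, y)
```

With $\gamma = 0.5$ the fBm kernel reduces to the Brownian motion covariance $\min(s, t)$ on the positive half-line, which is why the toy covariate is placed in $[0, 1]$.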