In this paper we study the variational problem associated with support vector regression in Banach function spaces. Using Fenchel-Rockafellar duality theory, we give an explicit formulation of the dual problem, as well as of the related optimality conditions. Moreover, we provide a new computational framework for solving the problem which relies on a tensor-kernel representation. This analysis overcomes the typical difficulties connected to learning in Banach spaces. Finally, we present a large class of tensor kernels to which our theory fully applies: power series tensor kernels. Kernels of this type describe Banach spaces of analytic functions and include generalizations of the exponential and polynomial kernels, as well as, in the complex case, generalizations of the Szegő and Bergman kernels.

In the classical Hilbert-space setting, the reproducing kernel makes it possible to solve the dual problem as well as to compute the solution of the primal (infinite-dimensional) problem. This is what is known as the kernel trick, and it is what makes support vector regression effective and so popular in applications [21].

Learning in Banach spaces of functions is an emerging area of research which, in principle, makes it possible to consider learning problems with more general types of norms than Hilbert norms [5,10,27]. The main motivation for this generalization comes from the need to find more effective sparse representations of data or to perform feature selection. To that purpose, several alternative regularization schemes have been proposed in the literature; we mention, among others, ℓ1 regularization (the lasso), the elastic net, and bridge regression [8,11]. Moreover, the statistical consistency of such more general regularization schemes has been addressed in [5,6,8,15]. However, moving to Banach spaces of functions and Banach norms poses serious difficulties from the computational point of view [22]. Indeed, even though, in this more general setting, it is still possible to introduce appropriate reproducing kernels [27], they fail to properly represent the solutions of the dual and primal problems, so the dual approach becomes cumbersome. For this reason, the above-mentioned estimation techniques are often implemented by directly tackling the primal problem and therefore, in effect, reduce to finite-dimensional estimation methods (that is, to parametric models).

In this work we address support vector regression in Banach function spaces and provide a new computational framework for solving the associated optimization problem, overcoming the difficulties discussed above. Our model is described in the primal by means of an appropriate feature map into a Banach space of features and a general regularizer. We first study, in great generality, the interplay between the primal and the dual problem through Fenchel-Rockafellar duality. We obtain an explicit formulation of the dual problem, as well as of the related optimality conditions, in terms of the feature map and the subdifferentials of the loss function and of the regularizer. As a byproduct we also provide a general representer theorem. Next, we ...
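To fix ideas about the kernels mentioned above, it may help to recall the classical scalar power series kernels of the Hilbert-space setting; the notation below (the coefficients aₙ and the inner product ⟨x, y⟩) is generic and is meant only as an illustrative point of comparison, not as the tensor construction developed in this paper:

  k(x, y) = Σ_{n≥0} aₙ ⟨x, y⟩ⁿ,   aₙ ≥ 0.

Choosing aₙ = 1/n! recovers the exponential kernel exp⟨x, y⟩; the finitely supported choice aₙ = C(d, n) for n ≤ d (and aₙ = 0 otherwise) recovers the polynomial kernel (1 + ⟨x, y⟩)^d; and, on the complex unit disc, aₙ ≡ 1 gives the geometric series 1/(1 − x ȳ), which is, up to a normalization constant, the Szegő kernel of the disc.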