In some applications, the mean or median response is linearly related to some variables, but the relation to additional variables is not easily parameterized. Partly linear models arise naturally in such circumstances. Suppose that a random sample $\{(T_i, X_i, Y_i),\ i = 1, \ldots, n\}$ is modeled by $Y_i = X_i^{T}\beta_0 + g_0(T_i) + \varepsilon_i$, where $Y_i$ is a real-valued response, $X_i \in \mathbb{R}^p$, $T_i$ ranges over a unit square, and $g_0$ is an unknown function with a certain degree of smoothness. We make use of bivariate tensor-product B-splines as an approximation of the function $g_0$ and consider M-type regression splines obtained by minimizing $\sum_{i=1}^{n} \rho\bigl(Y_i - X_i^{T}\beta - g_n(T_i)\bigr)$ for some convex function $\rho$. Mean, median, and quantile regressions are included in this class. We show under appropriate conditions that the parameter estimate of $\beta$ achieves its information bound asymptotically and that the function estimate of $g_0$ attains the optimal rate of convergence in mean squared error. Our asymptotic results generalize directly to higher dimensions (for the variable $T$) provided that the function $g_0$ is sufficiently smooth. Such smoothness conditions have often been assumed in the literature, but they impose practical limitations on the application of multivariate tensor-product splines in function estimation. We also discuss the implementation of the B-spline approximation based on commonly used knot selection criteria, together with a simulation study of both mean and median regressions for partly linear models.
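To make the estimation procedure concrete, the following is a minimal sketch, not the paper's implementation: it builds a bivariate tensor-product B-spline basis for $g_0$ on the unit square, stacks it with the linear covariates, and fits both the mean case ($\rho(u) = u^2$, via least squares) and the median case ($\rho(u) = |u|$, here via quantile regression at $q = 0.5$). The simulated data, knot placement, spline degree, and the use of numpy/scipy/statsmodels are illustrative assumptions, not choices made in the paper.

```python
# Sketch of M-type regression splines for a partly linear model
# Y_i = X_i' beta + g0(T_i) + e_i, with g0 approximated by a
# bivariate tensor-product B-spline.  All tuning choices are assumptions.

import numpy as np
from scipy.interpolate import BSpline
import statsmodels.api as sm

def bspline_basis(t, interior_knots, degree=3):
    """All univariate B-spline basis functions on [0, 1] evaluated at t."""
    kv = np.r_[[0.0] * (degree + 1), interior_knots, [1.0] * (degree + 1)]
    n_basis = len(kv) - degree - 1
    B = np.empty((len(t), n_basis))
    for j in range(n_basis):
        c = np.zeros(n_basis)
        c[j] = 1.0
        B[:, j] = BSpline(kv, c, degree, extrapolate=False)(t)
    return np.nan_to_num(B)

def tensor_basis(T, interior_knots, degree=3):
    """Bivariate tensor-product basis for T ranging over the unit square."""
    B1 = bspline_basis(T[:, 0], interior_knots, degree)
    B2 = bspline_basis(T[:, 1], interior_knots, degree)
    # Row-wise outer product of the two univariate bases.
    return np.einsum('ij,ik->ijk', B1, B2).reshape(len(T), -1)

# Simulated data from a partly linear model (illustrative only).
rng = np.random.default_rng(0)
n, beta0 = 500, np.array([1.0, -0.5])
X = rng.normal(size=(n, 2))
T = rng.uniform(size=(n, 2))
g0 = np.sin(2 * np.pi * T[:, 0]) * T[:, 1]
y = X @ beta0 + g0 + rng.normal(scale=0.3, size=n)

# Combined design: linear part X plus tensor-product spline columns for g0.
Z = tensor_basis(T, interior_knots=np.linspace(0.25, 0.75, 3))
D = np.hstack([X, Z])

# Mean regression: minimize sum of squared residuals, rho(u) = u^2.
theta_mean = np.linalg.lstsq(D, y, rcond=None)[0]

# Median regression: minimize sum of absolute residuals, rho(u) = |u|,
# implemented here via quantile regression at q = 0.5.
theta_med = sm.QuantReg(y, D).fit(q=0.5).params

print("mean-regression beta estimate:  ", theta_mean[:2])
print("median-regression beta estimate:", theta_med[:2])
```

The first two entries of each coefficient vector estimate $\beta$; the remaining entries are the tensor-product spline coefficients defining $g_n$. Because each univariate B-spline basis forms a partition of unity, the spline part absorbs the intercept, so no separate constant column is added to the design matrix.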