A great deal of effort has been devoted to the inference of additive models in the last decade. Among existing procedures, the kernel-type estimators are computationally too costly for high dimensions or large sample sizes, while the spline-type estimators lack asymptotic distribution theory and uniform convergence results. We propose a one-step backfitting estimator of the component functions in an additive regression model, using spline estimators in the first stage followed by kernel/local linear estimators in the second. Under weak conditions, the proposed estimator's pointwise distribution is asymptotically equivalent to that of a univariate kernel/local linear estimator; hence the dimension is effectively reduced to one at any point. This dimension reduction holds uniformly over an interval under the assumption of normal errors. Monte Carlo evidence supports the asymptotic results for dimensions ranging from low to very high and sample sizes ranging from moderate to large. The proposed confidence band is applied to the Boston housing data for linearity diagnosis.
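To make the two-stage idea concrete, the following is a minimal sketch, not the authors' implementation: stage one fits all additive components by spline least squares, and stage two re-estimates a single component by univariate local linear regression on spline-adjusted pseudo-responses. The function names, the piecewise-linear truncated-power spline basis, the Gaussian kernel, and the knot and bandwidth choices are illustrative assumptions, not the procedure's exact specification.

```python
import numpy as np

def linear_spline_basis(x, n_knots=5):
    """Truncated-power basis of a piecewise-linear spline with interior knots at quantiles of x."""
    knots = np.quantile(x, np.linspace(0, 1, n_knots + 2)[1:-1])
    cols = [x] + [np.maximum(x - t, 0.0) for t in knots]
    return np.column_stack(cols)

def spline_pilot_fit(X, y, n_knots=5):
    """Stage 1: additive spline least squares; returns centered fitted values for each component."""
    n, d = X.shape
    blocks = [linear_spline_basis(X[:, j], n_knots) for j in range(d)]
    design = np.column_stack([np.ones(n)] + blocks)
    coef, *_ = np.linalg.lstsq(design, y, rcond=None)
    comp_fits, pos = [], 1
    for B in blocks:
        k = B.shape[1]
        fit = B @ coef[pos:pos + k]
        comp_fits.append(fit - fit.mean())  # center each additive component
        pos += k
    return comp_fits

def local_linear(x, y, x0, h):
    """Univariate local linear estimator with a Gaussian kernel, evaluated at points x0."""
    est = np.empty(len(x0))
    for i, u in enumerate(x0):
        w = np.exp(-0.5 * ((x - u) / h) ** 2)
        Xd = np.column_stack([np.ones_like(x), x - u])
        W = Xd * w[:, None]
        beta = np.linalg.solve(Xd.T @ W, W.T @ y)
        est[i] = beta[0]  # intercept = local linear fit at u
    return est

def sbll_component(X, y, alpha, x0, h, n_knots=5):
    """Stage 2: spline-backfitted local linear estimate of component alpha at grid x0."""
    comp_fits = spline_pilot_fit(X, y, n_knots)
    # Subtract the pilot fits of all other components to form univariate pseudo-responses.
    pseudo = y - y.mean() - sum(f for j, f in enumerate(comp_fits) if j != alpha)
    return local_linear(X[:, alpha], pseudo, x0, h)
```

The point of the construction is that, once the spline pilot fits of the other components are removed, the pseudo-responses behave asymptotically like univariate data, so the second-stage kernel/local linear step inherits the pointwise distribution of a one-dimensional smoother, as stated in the abstract.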