This paper provides estimation and inference methods for a structural function, such as the Conditional Average Treatment Effect (CATE), based on modern machine learning (ML) tools. We assume that such a function can be represented as a conditional expectation $g(x) = \mathbb{E}[Y_{\eta_0} \mid X = x]$ of a signal $Y_{\eta_0}$, where $\eta_0$ is an unknown nuisance function. In addition to CATE, examples of such functions include the regression function with a Partially Missing Outcome and the Conditional Average Partial Derivative. We approximate $g(x)$ by a linear form $p(x)'\beta_0$, where $p(x)$ is a vector of approximating functions and $\beta_0$ is the Best Linear Predictor. Plugging the first-stage estimate $\widehat{\eta}$ into the signal $Y_{\widehat{\eta}}$, we estimate $\beta_0$ via ordinary least squares of $Y_{\widehat{\eta}}$ on $p(X)$. We deliver a high-quality estimate $p(x)'\widehat{\beta}$ of the pseudo-target function $p(x)'\beta_0$ that features (a) a pointwise Gaussian approximation of $p(x_0)'\widehat{\beta}$ at a point $x_0$, (b) a simultaneous Gaussian approximation of $p(x)'\widehat{\beta}$ uniformly over $x$, and (c) an optimal rate of convergence of $p(x)'\widehat{\beta}$ to $p(x)'\beta_0$ uniformly over $x$. In the case that the misspecification error of the linear form decays sufficiently fast, these approximations automatically hold for the target function $g(x)$ instead of the pseudo-target $p(x)'\beta_0$. The first-stage nuisance parameter $\eta_0$ is allowed to be high-dimensional and is estimated by modern ML tools, such as neural networks, $\ell_1$-shrinkage estimators, and random forests. Using our method, we estimate the average price elasticity conditional on income using the Yatchew and No (2001) data and provide uniform confidence bands for the target regression function.
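
To make the two-stage procedure concrete, the following is a minimal sketch for the CATE example, assuming a binary treatment $D$, a doubly robust (AIPW) construction of the signal $Y_{\eta}$, cross-fitted random-forest estimates of the nuisances, a cubic polynomial basis $p(x)$, and simulated data. All variable names, learners, and tuning choices here are illustrative assumptions rather than the paper's implementation.

```python
# Sketch of the two-stage procedure: cross-fitted ML nuisances, AIPW signal,
# OLS on a basis p(X), and a pointwise confidence interval at a point x0.
import numpy as np
from sklearn.ensemble import RandomForestRegressor, RandomForestClassifier
from sklearn.model_selection import KFold

rng = np.random.default_rng(0)
n = 2000
X = rng.uniform(-1, 1, n)                     # conditioning variable for the CATE
W = rng.normal(size=(n, 5))                   # additional controls (illustrative)
e = 1 / (1 + np.exp(-(0.5 * X + W[:, 0])))    # true propensity score
D = rng.binomial(1, e)                        # binary treatment
tau = 1 + X                                   # true CATE g(x) = 1 + x
Y = tau * D + W[:, 0] + rng.normal(size=n)    # outcome

# Stage 1: cross-fitted ML estimates of the nuisances eta0 = (mu0, mu1, e).
mu0_hat = np.zeros(n); mu1_hat = np.zeros(n); e_hat = np.zeros(n)
Z = np.column_stack([X, W])
for train, test in KFold(n_splits=5, shuffle=True, random_state=0).split(Z):
    m0 = RandomForestRegressor(n_estimators=200, random_state=0)
    m1 = RandomForestRegressor(n_estimators=200, random_state=0)
    ps = RandomForestClassifier(n_estimators=200, random_state=0)
    m0.fit(Z[train][D[train] == 0], Y[train][D[train] == 0])
    m1.fit(Z[train][D[train] == 1], Y[train][D[train] == 1])
    ps.fit(Z[train], D[train])
    mu0_hat[test] = m0.predict(Z[test])
    mu1_hat[test] = m1.predict(Z[test])
    e_hat[test] = np.clip(ps.predict_proba(Z[test])[:, 1], 0.01, 0.99)

# Doubly robust signal Y_eta with E[Y_eta | X = x] equal to the CATE at x.
Y_eta = (mu1_hat - mu0_hat
         + D * (Y - mu1_hat) / e_hat
         - (1 - D) * (Y - mu0_hat) / (1 - e_hat))

# Stage 2: OLS of the signal on the basis p(X) (here a cubic polynomial).
p = np.column_stack([X**k for k in range(4)])
beta_hat, *_ = np.linalg.lstsq(p, Y_eta, rcond=None)

# Pointwise standard error at x0 from the usual sandwich formula.
resid = Y_eta - p @ beta_hat
Q_inv = np.linalg.inv(p.T @ p / n)
Omega = (p * resid[:, None]).T @ (p * resid[:, None]) / n
V = Q_inv @ Omega @ Q_inv / n
x0 = 0.0
p0 = np.array([x0**k for k in range(4)])
print("CATE estimate at x0 = 0:", p0 @ beta_hat,
      "95% CI half-width:", 1.96 * np.sqrt(p0 @ V @ p0))
```

Uniform (simultaneous) bands would replace the pointwise critical value 1.96 with a critical value obtained, for example, by a multiplier bootstrap over the grid of evaluation points.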