We give an algorithm to compute a one-dimensional shape-constrained function that best fits given data in weighted L∞ norm. We give a single algorithm that works for a variety of commonly studied shape constraints, including monotonicity, Lipschitz continuity, and convexity, and more generally, any shape constraint expressible by bounds on first- and/or second-order differences. Our algorithm computes an approximation with additive error ε in O(n log(U/ε)) time, where U captures the range of input values. We also give a simple greedy algorithm that runs in O(n) time for the special case of unweighted L∞ convex regression. These are the first (near-)linear-time algorithms for second-order-constrained function fitting. To achieve these results, we use a novel geometric interpretation of the underlying dynamic programming problem. We further show that a generalization of the corresponding problems to directed acyclic graphs (DAGs) is as difficult as linear programming.

Efficient Second-Order Shape-Constrained Function Fitting

first and second derivatives of f; their discretized equivalents are hence amenable to our new method. Shape restrictions that we cannot directly handle are studied in [28] (f is piecewise constant and the number of breakpoints is to be minimized) and [26] (unimodal f). For a more comprehensive survey of shape-constrained function-fitting problems and their applications, see [14, §1]. Motivated by these applications, these problems have been studied in statistics (as a form of nonparametric regression), investigating, e.g., their consistency as estimators and their rate of convergence [13, 14, 4]. While fast algorithms for isotonic-regression variants have been designed [27], both [22] and [3] list shape constraints beyond monotonicity as important challenges. For example, fitting (multidimensional) convex functions is mostly done via quadratic or linear programming solvers [24].
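To make the difference-based view of shape constraints concrete, the following illustrative sketch (not from the paper; the function name and interface are our own) checks whether a discretized function, sampled at strictly increasing points, obeys given bounds on its first- and second-order difference quotients. Monotonicity, Lipschitz continuity, and convexity all arise as special cases of the bounds.

```python
def satisfies(ys, xs, d1=(None, None), d2=(None, None)):
    """Check bounds on first- and second-order differences.

    d1 = (lo, hi) bounds the slopes (ys[i+1]-ys[i]) / (xs[i+1]-xs[i]);
    d2 = (lo, hi) bounds the change in slope between consecutive
    intervals.  None means unbounded.  E.g. monotonicity is
    d1=(0, None), an L-Lipschitz fit is d1=(-L, L), and convexity
    is d2=(0, None).  Assumes xs is strictly increasing.
    """
    slopes = [(ys[i + 1] - ys[i]) / (xs[i + 1] - xs[i])
              for i in range(len(ys) - 1)]
    lo1, hi1 = d1
    if any((lo1 is not None and s < lo1) or (hi1 is not None and s > hi1)
           for s in slopes):
        return False
    lo2, hi2 = d2
    diffs = [slopes[i + 1] - slopes[i] for i in range(len(slopes) - 1)]
    return not any((lo2 is not None and d < lo2) or
                   (hi2 is not None and d > hi2) for d in diffs)
```

A fitting algorithm then searches, among all sequences satisfying such constraints, for one minimizing the (weighted) L∞ distance to the data.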
In his PhD thesis, Balázs writes that current "methods are computationally too expensive for practical use, [so] their analysis is used for the design of a heuristic training algorithm which is empirically evaluated" [4, p. 1]. This lack of efficient algorithms motivated the present work. Despite a few limitations discussed below (implying that we do not yet solve Balázs' problem), we give the first near-linear-time algorithms for any function-fitting problem with second-order shape constraints (such as convexity). We use dynamic programming (DP) with a novel geometric encoding of the "states". Simpler versions of such geometric DP variants were used for isotonic regression [25] and are well-known in the competitive programming community; incorporating second-order constraints efficiently is our main innovation.
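As a concrete illustration of the kind of problem addressed, the unweighted L∞ convex-regression special case admits a simple linear-time solution. The sketch below (our own illustration, not necessarily the paper's greedy algorithm) uses the observation that an optimal fit is the lower convex hull of the data shifted up by half the largest vertical gap between a point and the hull; it runs in O(n) on x-sorted input.

```python
def linf_convex_fit(xs, ys):
    """Best unweighted L-infinity convex fit at strictly increasing xs.

    Returns (fitted values at xs, optimal error).  Sketch: take the
    lower convex hull of the points and shift it up by half the largest
    vertical gap between a data point and the hull.
    """
    n = len(xs)
    # Lower convex hull (Andrew's monotone chain); O(n) on sorted xs.
    hull = []  # indices of hull vertices, left to right
    for i in range(n):
        while len(hull) >= 2:
            a, b = hull[-2], hull[-1]
            # Pop b if a->b->i is a clockwise or straight turn
            # (cross product <= 0), i.e. b is not a lower-hull vertex.
            if (xs[b] - xs[a]) * (ys[i] - ys[a]) <= \
               (ys[b] - ys[a]) * (xs[i] - xs[a]):
                hull.pop()
            else:
                break
        hull.append(i)
    # Evaluate the hull at every x_i by linear interpolation; O(n)
    # overall since k only moves forward.
    lower = [0.0] * n
    k = 0
    for i in range(n):
        while k + 1 < len(hull) and xs[hull[k + 1]] <= xs[i]:
            k += 1
        a = hull[k]
        if k + 1 < len(hull):
            b = hull[k + 1]
            t = (xs[i] - xs[a]) / (xs[b] - xs[a])
            lower[i] = ys[a] + t * (ys[b] - ys[a])
        else:
            lower[i] = ys[a]
    eps = max(ys[i] - lower[i] for i in range(n)) / 2.0
    return [v + eps for v in lower], eps
```

Optimality follows from a convexity argument: if the largest gap 2ε occurs at a point lying between hull vertices a and b, any convex function within error e of all three points must satisfy 2ε ≤ 2e. The general weighted and second-order-constrained problems treated in the paper do not reduce to a hull computation and require the geometric DP instead.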