We consider the problem of nonparametric regression under shape constraints. The main examples include isotonic regression (with respect to any partial order), unimodal/convex regression, additive shape-restricted regression, and the constrained single index model. We review some of the theoretical properties of the least squares estimator (LSE) in these problems, emphasizing the adaptive nature of the LSE. In particular, we study the behavior of the risk of the LSE and its pointwise limiting distribution theory, with special emphasis on isotonic regression. We survey various methods for constructing pointwise confidence intervals around these shape-restricted functions. We also briefly discuss the computation of the LSE and indicate some open research problems and future directions.

Observe that when $d = 1$, convexity is characterized by nondecreasing derivatives (subgradients). This observation can be used to generalize convexity to $k$-monotonicity ($k \ge 1$): a real-valued function $f$ is said to be $k$-monotone if its $(k-1)$'th derivative is monotone; see e.g., [94,28]. For equi-spaced design points in $\mathbb{R}$, this restriction constrains $\theta^*$ to lie in the set
\[
\bigl\{\theta \in \mathbb{R}^n : \nabla^k(\theta) \ge 0 \text{ componentwise}\bigr\},
\]
where $\nabla : \mathbb{R}^n \to \mathbb{R}^n$ is given by $\nabla(\theta) := (\theta_2 - \theta_1, \theta_3 - \theta_2, \ldots, \theta_n - \theta_{n-1}, 0)$ and $\nabla^k$ represents the $k$-times composition of $\nabla$. Note that the cases $k = 1$ and $k = 2$ correspond to isotonic and convex regression, respectively (a numerical sketch of this cone is given after Example 1.4 below).

Example 1.4 (Unimodal regression). In many applications the underlying regression function $f$ is known to be unimodal; see e.g., [49,27] and the references therein. Let $\mathcal{I}_m$, $1 \le m \le n$, denote the convex set of all unimodal vectors (first decreasing and then increasing) with mode at position $m$, i.e.,
\[
\mathcal{I}_m := \{\theta \in \mathbb{R}^n : \theta_1 \ge \cdots \ge \theta_m \le \theta_{m+1} \le \cdots \le \theta_n\}.
\]
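To make the difference operator concrete, the following minimal Python sketch (our own illustration, not from the paper; the names `grad` and `is_k_monotone` are hypothetical) implements $\nabla$ and a membership check for the $k$-monotone cone above. We test only the first $n - k$ coordinates of $\nabla^k(\theta)$, treating the trailing entries as artifacts of the zero-padding; under this convention $k = 1$ recovers the isotonic constraint and $k = 2$ the convex one.

```python
import numpy as np

def grad(theta):
    """The padded difference operator (theta_2 - theta_1, ..., theta_n - theta_{n-1}, 0)."""
    theta = np.asarray(theta, dtype=float)
    return np.append(np.diff(theta), 0.0)

def is_k_monotone(theta, k, tol=1e-12):
    """Check theta against the cone {theta : grad^k(theta) >= 0}.

    Only the first n - k coordinates of grad^k(theta) are tested; the trailing
    entries are artifacts of the zero-padding (an illustrative convention).
    """
    d = np.asarray(theta, dtype=float)
    for _ in range(k):
        d = grad(d)
    return bool(np.all(d[: max(len(d) - k, 0)] >= -tol))

# k = 1: isotonic (nondecreasing); k = 2: convex (nondecreasing differences)
assert is_k_monotone([1, 2, 4, 4, 5], k=1)
assert is_k_monotone([4, 1, 0, 1, 4], k=2)
assert not is_k_monotone([0, 2, 1], k=1)
```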
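Since any vector that is nonincreasing on its first $m$ coordinates and nondecreasing on the remaining ones lies in $\mathcal{I}_m$ or $\mathcal{I}_{m+1}$ (according to the sign of $\theta_m - \theta_{m+1}$), the LSE over $\cup_m \mathcal{I}_m$ can be found by scanning over the split point and solving two monotone least squares problems per split. The sketch below (again our own illustration; `pava` and `unimodal_lse` are hypothetical names) does this in $O(n^2)$ time via the pool adjacent violators algorithm; faster implementations based on prefix isotonic fits exist.

```python
import numpy as np

def pava(y):
    """Pool adjacent violators: L2 projection onto the nondecreasing cone."""
    vals, counts = [], []
    for v in np.asarray(y, dtype=float):
        vals.append(v)
        counts.append(1)
        # merge adjacent blocks while monotonicity is violated
        while len(vals) > 1 and vals[-2] > vals[-1]:
            n2, n1 = counts[-2], counts[-1]
            merged = (n2 * vals[-2] + n1 * vals[-1]) / (n2 + n1)
            vals[-2:] = [merged]
            counts[-2:] = [n2 + n1]
    return np.repeat(vals, counts)

def unimodal_lse(y):
    """LSE over all 'decreasing then increasing' vectors (the union of the I_m):
    for every split, fit a nonincreasing piece then a nondecreasing piece."""
    y = np.asarray(y, dtype=float)
    best_sse, best_fit = np.inf, None
    for m in range(len(y) + 1):
        left = -pava(-y[:m])   # nonincreasing fit on y_1, ..., y_m
        right = pava(y[m:])    # nondecreasing fit on the rest
        fit = np.concatenate([left, right])
        sse = float(np.sum((y - fit) ** 2))
        if sse < best_sse:
            best_sse, best_fit = sse, fit
    return best_fit

y = np.array([3.1, 1.9, 0.7, 1.2, 2.8, 4.0])
print(unimodal_lse(y))  # y is already V-shaped, so the fit reproduces it
```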