Abstract. Consider approximating functions from a finite number of their samples. We show that adaptive algorithms are much more powerful than nonadaptive ones when dealing with piecewise smooth functions. More specifically, let $F_r^1$ be the class of scalar functions $f : [0, T] \to \mathbb{R}$ whose derivatives of order up to $r$ are continuous at every point except for one unknown singular point. We provide an adaptive algorithm $A_n^{\mathrm{ad}}$ that uses at most $n$ samples of $f$ and whose worst case $L_p$ error ($1 \le p < \infty$) with respect to 'reasonable' function classes $\mathcal{F} \subset F_r^1$ is proportional to $n^{-r}$. On the other hand, the worst case error of any nonadaptive algorithm that uses $n$ samples is at best proportional to $n^{-1/p}$.

The restriction to only one singularity is necessary for the superiority of adaption in the worst case setting. Fortunately, adaption regains its power in the asymptotic setting, even for the very general class $F_r^\infty$ consisting of piecewise $C^r$-smooth functions, each having a finite number of singular points. For any $f \in F_r^\infty$, our adaptive algorithm approximates $f$ with error converging to zero at least as fast as $n^{-r}$. We also prove that the rate of convergence for nonadaptive methods cannot be better than $n^{-1/p}$, i.e., is much slower.

The results mentioned above do not hold if the errors are measured in the $L_\infty$ norm, since no algorithm produces small $L_\infty$ errors for functions with unknown discontinuities. However, we strongly believe that the $L_\infty$ norm is inappropriate when dealing with singular functions and that the Skorohod metric should be used instead. We show that our adaptive algorithm retains its positive properties when the approximation error is measured in the Skorohod metric. That is, the worst case error with respect to $F_r^1$ equals $\Theta(n^{-r})$, and the convergence rate in the asymptotic setting for $F_r^\infty$ is $n^{-r}$. Numerical results confirm the theoretical properties of our algorithms.
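To make the role of adaption concrete, here is a minimal Python sketch (not the algorithm $A_n^{\mathrm{ad}}$ from the paper, and `locate_jump` is a hypothetical helper): a bisection-style search localizes a single jump to within $\varepsilon$ using only $O(\log(1/\varepsilon))$ samples, after which standard piecewise polynomial interpolation on each smooth piece can recover the $n^{-r}$ rate. The selection criterion assumes the jump dominates the smooth variation on the interval, which holds once the interval is small.

```python
def locate_jump(f, a, b, tol=1e-10):
    """Bisection-style localization of a single jump of f in (a, b).

    At each step, keep the half whose endpoint values differ the most;
    for a function with one dominant jump, that half contains the jump.
    Uses O(log((b - a) / tol)) samples of f.  Illustrative sketch only.
    """
    while b - a > tol:
        m = 0.5 * (a + b)
        # The half containing the jump has the larger endpoint gap:
        if abs(f(m) - f(a)) >= abs(f(b) - f(m)):
            b = m
        else:
            a = m
    return 0.5 * (a + b)
```

A nonadaptive method must fix its sample points in advance and therefore cannot refine around the unknown singularity in this way.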
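The $n^{-1/p}$ barrier for nonadaptive methods can also be illustrated numerically. A uniform grid of $n$ samples cannot locate a jump more precisely than the grid spacing, so a unit jump falling between two samples is mislocated on an interval of length on the order of $1/n$, incurring $L_p$ error on the order of $n^{-1/p}$. The sketch below (with hypothetical helper `uniform_grid_lp_error`) uses nearest-sample reconstruction for concreteness; the paper's lower bound covers all nonadaptive methods.

```python
import numpy as np

def uniform_grid_lp_error(n, xi, p=1, m=20000):
    """L_p error of reconstructing a unit step at xi from n uniform
    samples on [0, 1] via nearest-sample (piecewise constant) recovery.

    Illustration of the nonadaptive localization limit, not a bound proof.
    """
    grid = np.linspace(0.0, 1.0, n)
    step = lambda x: (x >= xi).astype(float)
    samples = step(grid)
    x = np.linspace(0.0, 1.0, m)          # fine evaluation mesh
    nearest = np.abs(x[:, None] - grid[None, :]).argmin(axis=1)
    err = np.abs(samples[nearest] - step(x)) ** p
    return err.mean() ** (1.0 / p)        # Riemann estimate of the L_p norm
```

For a jump that falls strictly between two grid points, the error region has length comparable to the grid spacing, so the computed error scales like $n^{-1/p}$ as $n$ grows.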