We analyze the convergence rate of the alternating direction method of multipliers (ADMM) for minimizing the sum of two or more nonsmooth convex separable functions subject to linear constraints. Previous analyses of the ADMM typically assume that the objective function is the sum of only two convex functions defined on two separable blocks of variables, even though the algorithm works well in numerical experiments for three or more blocks. Moreover, there has been no rate-of-convergence analysis for the ADMM without strong convexity in the objective function. In this paper we establish the global linear convergence of the ADMM for minimizing the sum of any number of convex separable functions. This result settles a key question regarding the convergence of the ADMM when the number of blocks is more than two or when strong convexity is absent. It also implies the linear convergence of the ADMM for several contemporary applications, including LASSO, Group LASSO, and Sparse Group LASSO, without any strong convexity assumption. Our proof is based on estimating the distance from a dual feasible solution to the optimal dual solution set by the norm of a certain proximal residual, and on requiring the dual stepsize to be sufficiently small.

KEY WORDS: Linear convergence, alternating direction method of multipliers, error bound, dual ascent.
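For orientation, the following is a sketch of the multi-block problem and iteration the abstract refers to, written in a standard ADMM form; the symbols $f_k$, $E_k$, $q$, the penalty $\rho$, the block count $K$, and the iteration index $r$ are illustrative notation not fixed by the abstract, with $\alpha$ playing the role of the "sufficiently small" dual stepsize mentioned above.

% Multi-block separable problem (standard form; notation illustrative):
\begin{align*}
  \min_{x_1,\dots,x_K}\ \sum_{k=1}^{K} f_k(x_k)
  \quad \text{s.t.} \quad \sum_{k=1}^{K} E_k x_k = q.
\end{align*}
% Augmented Lagrangian with penalty parameter \rho > 0:
\begin{align*}
  L_\rho(x; y) = \sum_{k=1}^{K} f_k(x_k)
  + \Big\langle y,\ q - \sum_{k=1}^{K} E_k x_k \Big\rangle
  + \frac{\rho}{2}\Big\| q - \sum_{k=1}^{K} E_k x_k \Big\|^2.
\end{align*}
% One ADMM round: cyclic (Gauss-Seidel) block minimization, followed by
% a dual ascent step with stepsize \alpha > 0:
\begin{align*}
  x_k^{r+1} &\in \arg\min_{x_k}\
    L_\rho\big(x_1^{r+1},\dots,x_{k-1}^{r+1},\,x_k,\,x_{k+1}^{r},\dots,x_K^{r};\ y^r\big),
    \qquad k = 1,\dots,K,\\
  y^{r+1} &= y^r + \alpha\Big(q - \sum_{k=1}^{K} E_k x_k^{r+1}\Big).
\end{align*}

With $K = 2$ this reduces to the classical two-block ADMM; the abstract's contribution concerns the case $K \geq 2$ without strong convexity, where $\alpha$ must be taken sufficiently small.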