The l1-regularized least squares problem, or the lasso, is a non-smooth convex minimization that is widely used in diverse fields. However, solving such a minimization is not straightforward since the objective is not differentiable. In this paper, a smooth minimization with box constraints is derived and proved to be equivalent to the lasso problem. Accordingly, an efficient recurrent neural network is developed that is guaranteed to converge globally to the solution of the lasso. Furthermore, it is shown that the property "the dual of the dual is the primal" holds for the l1-regularized least squares problem. Experiments on image and signal recovery illustrate the reasonable performance of the proposed neural network.

Index Terms: sparse, l1 regularization, smooth, neural network.
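For reference, the lasso problem described above is min_x 0.5*||Ax - b||^2 + lambda*||x||_1. The sketch below is not the paper's neural-network method; it is a minimal baseline solver using the standard iterative soft-thresholding algorithm (ISTA), included only to make the problem being solved concrete. The function names and step-size choice are illustrative assumptions.

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal operator of t * ||.||_1 (elementwise shrinkage).
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def ista(A, b, lam, steps=500):
    """Baseline lasso solver: min_x 0.5*||Ax - b||^2 + lam*||x||_1
    via proximal gradient descent (ISTA). Illustrative only."""
    # Step size 1/L, where L = ||A||_2^2 bounds the gradient's Lipschitz constant.
    L = np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(steps):
        grad = A.T @ (A @ x - b)          # gradient of the smooth term
        x = soft_threshold(x - grad / L, lam / L)
    return x
```

For A equal to the identity, the lasso solution is simply the soft-thresholded data, which gives a quick sanity check of the implementation.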