In the late 1960s, N. Shor and B. Polyak independently proposed optimal first-order methods for solving non-smooth convex optimization problems. In 1982, A. Nemirovski proposed optimal first-order methods for solving smooth convex optimization problems, which utilized an auxiliary line search. In 1985, A. Nemirovski and Yu. Nesterov proposed a parametric family of optimal first-order methods for solving convex optimization problems of intermediate smoothness. In 2013, Yu. Nesterov proposed a universal gradient method that combines all the good properties of the previous methods, except the possibility of using an auxiliary line search. In practice, however, an auxiliary line search often improves performance on many problems. In this paper, we propose what appears to be the first method for non-smooth convex optimization that allows the use of an auxiliary line search procedure. Moreover, the method is based on the universal gradient method and therefore does not require any a priori information about the actual degree of smoothness of the problem. Numerical experiments demonstrate that the proposed method is, in some cases, considerably faster than Nesterov's universal gradient method.