The primal-dual hybrid gradient method (PDHG, a.k.a. the Chambolle–Pock method [9]) is a well-studied algorithm for structured minimax optimization problems with wide applications, initially in image processing and more recently in large-scale linear programming. In this paper, we propose a new progress metric for analyzing PDHG, which we dub the infimal sub-differential size (IDS). IDS is a natural extension of the gradient norm for smooth problems to non-smooth problems. Compared with the traditional progress metrics for PDHG, i.e., the primal-dual gap and the distance to the optimal solution set, IDS always has a finite value, is easy to compute, and, more importantly, decays monotonically. We show that IDS converges at an O(1/k) sublinear rate for convex-concave primal-dual problems, and at a linear rate if the problem further satisfies a regularity condition that holds in applications such as linear programming, quadratic programming, the TV-denoising model, etc. Furthermore, we present examples showing that the obtained convergence rates are tight for PDHG. The simplicity of our analysis suggests that IDS is the right progress metric for analyzing PDHG. Our analysis and results for PDHG extend directly to other primal-dual algorithms, for example, the proximal point method (PPM).
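To make the ideas above concrete, the following is a minimal sketch (not the paper's experiments) of PDHG on a hypothetical smooth bilinear saddle problem min_x max_y f(x) + <Kx, y> - g(y), with f(x) = 0.5||x - b||^2 and g(y) = 0.5||y||^2 chosen so both prox steps have closed forms. Because f and g are smooth here, the IDS surrogate reduces to the squared gradient norm of the saddle function; the strong convexity on both sides plays the role of the regularity condition, so one expects the fast decay mentioned in the abstract. The instance (K, b), step sizes, and the unweighted Euclidean norm in the metric are illustrative assumptions, not the paper's exact setup.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 5, 8
K = rng.standard_normal((m, n))
b = rng.standard_normal(n)

# Standard PDHG step-size condition: tau * sigma * ||K||^2 < 1.
tau = sigma = 0.9 / np.linalg.norm(K, 2)

def ids(x, y):
    # For smooth f and g the subdifferential is a singleton, so the
    # infimal sub-differential size is just ||grad L(x, y)||^2 here.
    gx = x - b + K.T @ y      # partial derivative of L in x
    gy = K @ x - y            # partial derivative of L in y
    return gx @ gx + gy @ gy

x, y = np.zeros(n), np.zeros(m)
history = []
for _ in range(300):
    history.append(ids(x, y))
    # prox of tau*f: argmin_v 0.5||v - u||^2 + tau*0.5||v - b||^2
    x_new = (x - tau * (K.T @ y) + tau * b) / (1 + tau)
    # dual step uses the extrapolated primal point 2*x_new - x
    y_new = (y + sigma * (K @ (2 * x_new - x))) / (1 + sigma)
    x, y = x_new, y_new

print(history[0], history[-1])
```

Running the loop, the recorded IDS values shrink by many orders of magnitude over the 300 iterations, consistent with the linear-rate behavior the abstract describes for regular problems.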