This paper is concerned with a class of nonmonotone descent methods for minimizing a proper lower semicontinuous Kurdyka–Łojasiewicz (KL) function Φ, which generate a sequence satisfying a nonmonotone decrease condition and a relative error tolerance. Under mild assumptions, we prove that the whole sequence converges to a limiting critical point of Φ and that, when Φ is a KL function of exponent θ ∈ [0, 1), the convergence has a linear rate if θ ∈ [0, 1/2] and a sublinear rate depending on θ if θ ∈ (1/2, 1). The required assumptions are shown to be necessary when Φ is weakly convex but redundant when Φ is convex. Our convergence results resolve the question of convergence of the iterate sequence generated by nonmonotone line search algorithms for nonconvex and nonsmooth problems, and also extend the convergence results of monotone descent methods for KL optimization problems. As applications, we establish convergence of the iterate sequences generated by the nonmonotone line search proximal gradient method with extrapolation and by the nonmonotone line search proximal alternating minimization method with extrapolation. Numerical experiments on zero-norm and column ℓ_{2,0}-norm regularized problems validate their efficiency.
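For orientation, conditions of the kind referred to above are typically of the following form; this is a minimal sketch under assumed notation (constants a, b > 0, nonmonotone memory length m ≥ 0, limiting subdifferential ∂Φ), not the paper's exact statements:

\[
\Phi(x^{k+1}) \;\le\; \max_{\max\{k-m,\,0\}\le j\le k}\Phi(x^{j}) \;-\; a\,\|x^{k+1}-x^{k}\|^{2},
\qquad
\operatorname{dist}\bigl(0,\,\partial\Phi(x^{k+1})\bigr) \;\le\; b\,\|x^{k+1}-x^{k}\|.
\]

The first inequality relaxes the usual sufficient decrease requirement by comparing the new objective value with the worst of the last m + 1 values; the second bounds some subgradient at the new iterate by the length of the step just taken.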