This paper studies quasi-Newton methods for solving strongly-convex-strongly-concave saddle point problems (SPP). We propose greedy and random Broyden family updates for SPP, which have an explicit local superlinear convergence rate of $\mathcal{O}\big(\big(1 - \tfrac{1}{n\kappa^2}\big)^{k(k-1)/2}\big)$, where $n$ is the dimension of the problem, $\kappa$ is the condition number, and $k$ is the number of iterations. The design and analysis of the proposed algorithms are based on estimating the square of the indefinite Hessian matrix, which differs from classical quasi-Newton methods in convex optimization. We also present two specific Broyden family algorithms with BFGS-type and SR1-type updates, which enjoy the faster local convergence rate of $\mathcal{O}\big(\big(1 - \tfrac{1}{n}\big)^{k(k-1)/2}\big)$. Additionally, we extend our algorithms to solve general nonlinear equations and prove that they enjoy a similar convergence rate.
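
To make the idea of "estimating the square of the indefinite Hessian" concrete, the following is a minimal sketch, not the paper's full algorithm: it applies a greedy SR1-type Broyden family update to a fixed symmetric positive semidefinite matrix $A = H^2$ (standing in for the squared Hessian), assuming an initial estimate $G \succeq A$. All names (`greedy_sr1_step`, `G`, `A`) are illustrative choices of ours.

```python
import numpy as np

def greedy_sr1_step(G, A, tol=1e-12):
    """One greedy SR1-type update of an estimate G with G >= A (in the PSD order)."""
    R = G - A                       # residual matrix; PSD whenever G >= A
    i = int(np.argmax(np.diag(R)))  # greedy choice: basis direction with the largest residual
    if R[i, i] <= tol:              # residual already (numerically) zero
        return G
    r = R[:, i]                     # R e_i
    return G - np.outer(r, r) / R[i, i]  # rank-one correction along the greedy direction

# Toy usage: H plays the role of an indefinite Hessian; we approximate A = H^2.
rng = np.random.default_rng(0)
H = rng.standard_normal((5, 5))
H = (H + H.T) / 2                   # symmetric, generally indefinite
A = H @ H                           # squared Hessian, positive semidefinite
G = np.trace(A) * np.eye(5)         # crude initial estimate with G >= A
for _ in range(5):                  # greedy SR1 recovers a fixed A in at most n steps
    G = greedy_sr1_step(G, A)
print(np.linalg.norm(G - A))        # ~0 after n greedy updates
```

In the actual method the target matrix changes across iterations as the iterate moves, so the estimate is only refined rather than recovered exactly; the sketch only illustrates the update rule and the greedy direction choice.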