Abstract. We study block-diagonal preconditioners and an efficient variant of constraint preconditioners for general two-by-two block linear systems with zero (2,2)-block. We derive block-diagonal preconditioners from a splitting of the (1,1)-block of the matrix. From the resulting preconditioned system we derive a smaller, so-called related system that yields the solution of the original problem. Solving the related system corresponds to an efficient implementation of constraint preconditioning. We analyze the properties of both classes of preconditioned matrices, in particular their spectra. Using these analytical results, we show that the related system matrix has the more favorable spectrum, which in many applications translates into faster convergence for Krylov subspace methods. We show that fast convergence depends mainly on the quality of the splitting, a topic for which a substantial body of theory exists. Our analysis also provides a number of new relations between block-diagonal preconditioners and constraint preconditioners. For constrained problems, solving the related system produces iterates that satisfy the constraints exactly, just as for systems with a constraint preconditioner. Finally, for the Lagrange multiplier formulation of a constrained optimization problem, we show how scaling nonlinear constraints can dramatically improve the convergence of the linear systems arising in a Newton iteration. Our theoretical results are confirmed by numerical experiments on a constrained optimization problem. We consider the general, nonsymmetric, nonsingular case; our only additional requirement is the nonsingularity of the Schur-complement-type matrix derived from the splitting that defines the preconditioners. In particular, the (1,2)-block need not equal the transposed (2,1)-block, and the (1,1)-block may be indefinite or even singular. This is the first paper in a two-part sequence. In the second paper we will study the use of our preconditioners in a variety of applications.
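For concreteness, the setting described above admits the following sketch; the symbols $\mathcal{A}$, $A$, $B$, $C$, $D$, and $E$ are illustrative notation introduced here, not taken from the abstract itself, and the preconditioner shown is one plausible instantiation of a block-diagonal preconditioner built from a splitting of the (1,1)-block:
\[
\mathcal{A} \;=\; \begin{bmatrix} A & B \\ C & 0 \end{bmatrix},
\qquad
A \;=\; D - E,
\qquad
\mathcal{P} \;=\; \begin{bmatrix} D & 0 \\ 0 & C D^{-1} B \end{bmatrix}.
\]
Here $C D^{-1} B$ plays the role of the Schur-complement-type matrix derived from the splitting, whose nonsingularity is the only additional requirement; as stated above, $B$ need not equal $C^{T}$, and $A$ may be indefinite or even singular.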