Abstract. One of the most efficient interior-point methods for some classes of block-angular structured problems solves the normal equations by a combination of Cholesky factorizations and preconditioned conjugate gradient for, respectively, the block and linking constraints. In this work we show that the choice of a good preconditioner depends on geometrical properties of the constraints structure. In particular, it is seen that the principal angles between the subspaces generated by the diagonal blocks and the linking constraints can be used to estimate ex-ante the efficiency of the preconditioner. Numerical validation is provided with some generated optimization problems. An application to the solution of multicommodity network flow problems with nodal capacities and equal flows of up to 64 million variables and up to 7.9 million constraints is also presented.

Key words. interior-point methods, structured problems, preconditioned conjugate gradient, principal angles, large-scale optimization

AMS subject classifications. 90C06, 90C08, 90C51

1. Introduction. Many real-world optimization problems exhibit a primal block-angular structure, where decision variables and constraints can be grouped into different blocks which are coupled by some linking constraints. Applications of this class of problems can be found, for instance, in multicommodity telecommunications networks, statistical data protection, multistage control and planning, and complex networks.

Solution strategies to deal with this class of problems can be broadly classified into simplex-based methods [14, 23], decomposition methods [17, 1, 2, 27], approximation methods [5], and interior-point methods [10, 21]. One of the most efficient interior-point methods (IPMs) for some classes of block-angular problems solves the normal equations by a combination of Cholesky factorizations for the block constraints and preconditioned conjugate gradient (PCG) iterations for the linking constraints [10, 11]. The spectral radius of a certain matrix in the preconditioner, which is always in [0, 1), plays an important role in the efficiency of this approach. It was observed that for separable convex problems with nonzero Hessians this spectral radius is reduced, and the PCG becomes more efficient [13]. When the spectral radius approaches 1, switching to a more suitable preconditioner may be an efficient alternative [7]. It is worth noting that computing approximate Newton directions by PCG does not destroy the good convergence properties of IPMs, as shown in [20]. There is an extensive literature on the use of PCG within IPMs for other types of problems (e.g., [3, 4, 9, 18, 25, 28], to mention a few).

However, it is not yet clear why for some classes of block-angular problems the above approach may be very efficient (see, for instance, the results of [13, 12]), while it may need a large number of PCG iterations in others. The spectral radius may be used to monitor the good or bad behaviour, but an ex-ante explanation has to be found in the structural information of the block-angular constr...
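As a purely illustrative aside, principal angles between two subspaces can be computed numerically from a singular value decomposition, for instance with SciPy's subspace_angles routine. The minimal sketch below compares, for randomly generated data, the row space of a diagonal block N_i with the row space of its linking block L_i; the block names, dimensions, random data, and the particular choice of subspaces are assumptions made only for illustration, since the precise subspaces used in the analysis are defined later in the paper.

```python
# Minimal sketch (not the paper's procedure): principal angles between the
# row spaces of a diagonal block N_i and its linking block L_i.
# Names, dimensions, and random data are illustrative assumptions.
import numpy as np
from scipy.linalg import subspace_angles

rng = np.random.default_rng(0)

k, m_i, n_i, l = 3, 10, 30, 5    # number of blocks, block rows/cols, linking rows
N = [rng.standard_normal((m_i, n_i)) for _ in range(k)]   # diagonal blocks
L = [rng.standard_normal((l, n_i)) for _ in range(k)]     # linking constraint blocks

for i, (Ni, Li) in enumerate(zip(N, L)):
    # subspace_angles compares column spaces, so transpose to work with
    # row spaces; angles are returned in radians, in descending order.
    angles = subspace_angles(Ni.T, Li.T)
    print(f"block {i}: smallest principal angle = {angles.min():.3f} rad, "
          f"largest cosine = {np.cos(angles.min()):.3f}")
```

How these angles relate to the spectral radius of the preconditioner, and hence to the behaviour of the PCG, is the subject of the following sections.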