The variational data assimilation problem can be solved in either its primal (3D/4D-Var) or dual (3D/4D-PSAS) form. The two methods are equivalent at convergence, but the dual method exhibits a spurious behaviour at the beginning of the minimization, producing states that are less probable than the background state. This is a serious concern for operational implementations of the dual method, where only a finite number of iterations can be afforded. Two classes of minimization algorithms are examined in this article: the conjugate gradient (CG) and the minimum residual (MINRES) methods. While CG algorithms ensure a monotonic reduction of the cost function, those based on MINRES instead enforce a monotonic decrease of the norm of the gradient. It is shown that, when applied to the minimization of the dual problem, MINRES algorithms also produce iterates whose 'image' in physical space yields a monotonic decrease of the primal cost function. A relationship is established showing that the value of the primal objective function can be expressed in terms of the value of the dual cost function and the norm of its gradient. This holds for the incremental forms of both the three- and four-dimensional cases. A new convergence criterion, based on the error norm in model space, is introduced to ensure that, for the dual problem, the same accuracy is obtained in the analysis when only a finite number of iterations is completed.
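The primal-dual relationship mentioned above can be written explicitly under one standard sign convention for the dual functional; the symbols used here (the innovation $d$, the dual variable $\lambda$, and the functionals $J$ and $F$) are notational assumptions, not necessarily those of the full paper. Taking the incremental primal cost and the dual cost as

```latex
J(\delta x) = \tfrac{1}{2}\,\delta x^{\mathsf T} B^{-1}\,\delta x
  + \tfrac{1}{2}\,(H\,\delta x - d)^{\mathsf T} R^{-1}\,(H\,\delta x - d),
\qquad
F(\lambda) = \tfrac{1}{2}\,\lambda^{\mathsf T}\!\left(H B H^{\mathsf T} + R\right)\lambda
  - \lambda^{\mathsf T} d,
```

with $d = y - H(x_b)$, and mapping a dual iterate to model space through $\delta x(\lambda) = B H^{\mathsf T} \lambda$, direct substitution together with $\nabla F(\lambda) = (H B H^{\mathsf T} + R)\lambda - d$ gives

```latex
J\big(\delta x(\lambda)\big)
  = -\,F(\lambda)
  + \tfrac{1}{2}\,\nabla F(\lambda)^{\mathsf T} R^{-1}\,\nabla F(\lambda),
```

i.e. the primal cost at the image of a dual iterate is determined by the value of the dual cost and the $R^{-1}$-weighted norm of its gradient.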
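The algebraic identity linking the primal and dual costs can be checked numerically on a toy problem. The sketch below is illustrative only: the dimensions, the randomly generated covariances `B` and `R`, the observation operator `H`, and the sign convention chosen for the dual functional are all assumptions made for this example.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 6, 4                             # toy state and observation dimensions

L = rng.standard_normal((n, n))
B = L @ L.T + n * np.eye(n)             # SPD background-error covariance
R = np.diag(rng.uniform(0.5, 2.0, m))   # diagonal observation-error covariance
H = rng.standard_normal((m, n))         # linearized observation operator
d = rng.standard_normal(m)              # innovation vector y - H(x_b)

A = H @ B @ H.T + R                     # dual-space Hessian

def primal_cost(dx):
    # Incremental primal cost J(dx)
    r = H @ dx - d
    return 0.5 * dx @ np.linalg.solve(B, dx) + 0.5 * r @ np.linalg.solve(R, r)

def dual_cost(lam):
    # Dual cost F(lam) under the sign convention used in this sketch
    return 0.5 * lam @ (A @ lam) - lam @ d

lam = rng.standard_normal(m)            # an arbitrary dual iterate
g = A @ lam - d                         # gradient of the dual cost
dx = B @ H.T @ lam                      # image of the dual iterate in model space

lhs = primal_cost(dx)
rhs = -dual_cost(lam) + 0.5 * g @ np.linalg.solve(R, g)
print(abs(lhs - rhs))                   # difference at round-off level
```

The identity holds exactly in exact arithmetic, so the printed difference is only floating-point round-off regardless of the random seed.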
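The contrasting monotonicity guarantees of CG and MINRES can also be demonstrated on a synthetic symmetric positive-definite system standing in for the dual-space Hessian. This is a minimal sketch using SciPy's generic `cg` and `minres` solvers; the matrix, right-hand side, and iteration count are arbitrary choices for illustration.

```python
import numpy as np
from scipy.sparse.linalg import cg, minres

rng = np.random.default_rng(1)
m = 30
M = rng.standard_normal((m, m))
A = M @ M.T + m * np.eye(m)   # SPD stand-in for H B H^T + R
b = rng.standard_normal(m)    # stand-in for the innovation vector

def quad_cost(x):
    # Quadratic cost 1/2 x^T A x - b^T x, minimized by the solution of A x = b
    return 0.5 * x @ (A @ x) - b @ x

cg_costs, mr_grads = [], []
cg(A, b, maxiter=15, callback=lambda xk: cg_costs.append(quad_cost(xk)))
minres(A, b, maxiter=15,
       callback=lambda xk: mr_grads.append(np.linalg.norm(A @ xk - b)))

# CG decreases the cost monotonically; MINRES decreases the gradient
# (residual) norm monotonically, up to round-off.
assert all(c1 <= c0 + 1e-9 for c0, c1 in zip(cg_costs, cg_costs[1:]))
assert all(g1 <= g0 + 1e-9 for g0, g1 in zip(mr_grads, mr_grads[1:]))
```

Neither property implies the other: CG's gradient norms can oscillate, and MINRES's cost values can increase, which is why the article's result that MINRES on the dual problem still yields a monotone primal cost is not automatic.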