In this paper we analyze the minimization of seminorms ‖L · ‖ on R^n under the constraint of a bounded I-divergence D(b, H · ) for rather general linear operators H and L. The I-divergence, also known as the Kullback–Leibler divergence, appears in many models in imaging science, in particular when dealing with Poisson data. Typically, H represents a linear blur operator and L a discrete derivative or frame analysis operator. We prove relations between the parameters of the I-divergence constrained and penalized problems without assuming uniqueness of their minimizers. To solve the I-divergence constrained problem we apply first-order primal-dual algorithms, which reduce the problem to the solution of certain proximal minimization problems in each iteration step. One of these proximal problems is an I-divergence constrained least squares problem, which can be solved by a Newton method based on Morozov's discrepancy principle. Interestingly, the algorithm produces not only a sequence of vectors converging to a minimizer of the constrained problem, but also a sequence of parameters converging to a regularization parameter for which the corresponding penalized problem has the same solution as the constrained one. We demonstrate the performance of various algorithms on different image restoration tasks, both for images corrupted by Poisson noise and by multiplicative Gamma noise.
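For concreteness, the I-divergence referred to above is usually the generalized Kullback–Leibler divergence D(b, v) = Σᵢ (bᵢ log(bᵢ/vᵢ) − bᵢ + vᵢ), with the convention 0 log 0 = 0, which is the negative log-likelihood term (up to constants) for Poisson data. The following is a minimal illustrative sketch of this quantity; the function name and the eps safeguard are our own choices, not part of the paper:

```python
import numpy as np

def i_divergence(b, v, eps=1e-12):
    """Generalized Kullback-Leibler (I-)divergence D(b, v).

    D(b, v) = sum_i ( b_i * log(b_i / v_i) - b_i + v_i ),
    using the convention 0 * log(0) = 0. It is nonnegative and
    vanishes exactly when v = b (on the support of b).
    """
    b = np.asarray(b, dtype=float)
    v = np.asarray(v, dtype=float)
    # eps guards the logarithm numerically; entries with b_i = 0
    # contribute only the -b_i + v_i = v_i term.
    log_term = np.where(b > 0, b * np.log((b + eps) / (v + eps)), 0.0)
    return float(np.sum(log_term - b + v))

b = np.array([2.0, 3.0, 0.0, 5.0])
print(i_divergence(b, b))        # ~0: divergence of b from itself
print(i_divergence(b, b + 1.0))  # strictly positive
```

In the constrained problems of the paper, v plays the role of H·x, and the constraint bounds D(b, Hx) by a prescribed level, e.g. one tied to the noise statistics via the discrepancy principle.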