Sparse coding techniques for image processing traditionally rely on processing small overlapping patches separately, followed by averaging. This has the disadvantage that the reconstructed image no longer obeys the sparsity prior used in the processing. To address this, convolutional sparse coding has been introduced, in which a shift-invariant dictionary is used and the sparsity of the recovered image is maintained. Most such strategies target the $\ell_0$ "norm" or the $\ell_1$ norm of the whole image, which may create an imbalanced sparsity across various regions of the image. To face this challenge, the $\ell_{0,\infty}$ "norm" has been proposed as an alternative that "operates locally while thinking globally". The approaches taken to tackle the non-convexity of these optimization problems have been either convex relaxation or local pursuit algorithms. In this paper, we present an efficient greedy method for sparse coding and dictionary learning, which is specifically tailored to the $\ell_{0,\infty}$ "norm" and is based on matching pursuit. We demonstrate the usage of our approach in salt-and-pepper noise removal and image inpainting.

1. Introduction. Sparse coding can be described as solving the following minimization problem, known as the $P_0$ problem [9]:

$$(P_0):\quad \min_{\alpha} \|\alpha\|_0 \quad \text{subject to} \quad x = D\alpha,$$

where $\alpha \in \mathbb{R}^p$ is a sparse representation of a signal $x \in \mathbb{R}^N$ in the dictionary $D \in \mathbb{R}^{N \times p}$. The columns of $D$, which are referred to as atoms, are a full and overcomplete set, and we will assume without loss of generality that they are normalized to unit $\ell_2$ norm. The $\ell_0$ "norm" returns the number of nonzero elements in a vector, also called the sparsity.

When modeling natural images, we allow some deviation from the model rather than seeking a perfect reconstruction:

$$(P_0^{\epsilon}):\quad \min_{\alpha} \|\alpha\|_0 \quad \text{subject to} \quad \|x - D\alpha\|_2 \le \epsilon.$$

An alternative form, in which the sparsity $k$ is known, is:

$$(P_0^{k}):\quad \min_{\alpha} \|x - D\alpha\|_2^2 \quad \text{subject to} \quad \|\alpha\|_0 \le k.$$

As the $P_0$ and $P_0^{k}$ problems are NP-hard, several approximation techniques have been proposed.
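As a minimal illustration of this model (not the paper's method, and with arbitrarily chosen sizes $N$, $p$, and $k$), the sketch below builds a random overcomplete dictionary with unit-norm atoms, synthesizes a $k$-sparse representation, and checks the quantities the formulations above refer to:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: signal dimension N, number of atoms p (overcomplete: p > N),
# and a known sparsity level k, as in the P_0^k formulation.
N, p, k = 64, 128, 5

# Random dictionary whose columns (atoms) are normalized to unit l2 norm.
D = rng.standard_normal((N, p))
D /= np.linalg.norm(D, axis=0)

# A k-sparse representation alpha: exactly k nonzero coefficients.
alpha = np.zeros(p)
support = rng.choice(p, size=k, replace=False)
alpha[support] = rng.standard_normal(k)

# A signal that obeys the model x = D @ alpha exactly (the P_0 constraint).
x = D @ alpha

# The l0 "norm" of alpha is its number of nonzero entries.
sparsity = np.count_nonzero(alpha)
```

Here `sparsity` equals `k` and the reconstruction residual `np.linalg.norm(x - D @ alpha)` is zero, since the signal was generated from the model; the $P_0^{\epsilon}$ variant relaxes exactly this residual constraint.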
* Accepted for publication in SIAM Journal on Imaging Sciences (SIIMS).

Matching Pursuit (MP) [28] is a greedy algorithm that in each iteration updates