Edit propagation is an appearance-editing method that uses sparse edit strokes provided by users. Although edit propagation has a wide variety of applications, it is computationally expensive because it requires solving large linear systems. To reduce the computational cost, interpolation-based approaches have been studied intensively. This study is inspired by an interpolation-based edit-propagation method that uses a clustering algorithm to determine samples. The method uses an interpolant that approximates edit parameters as convex combinations of the samples. However, because the clustering algorithm generates samples that lie inside the set of pixels in a feature space, an interpolant based on convex combinations cannot exactly reconstruct pixels outside the convex hull of the samples. To address this issue, this paper proposes a novel approximation model that interpolates image colors as well as edit parameters using affine combinations. In addition, this paper introduces sparse pixel sampling, which simultaneously determines the number and positions of samples and the weight coefficients of the affine combinations. Sparse pixel sampling is performed by iteratively updating candidate pixels: unnecessary pixels are discarded via compressive sensing, and new candidate pixels are greedily resampled according to their approximation errors. This paper demonstrates that the proposed model achieves better approximations of both image colors and edit parameters, and discusses its properties through various experiments.
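As a brief illustration of the distinction underlying the proposed model (using illustrative notation not taken from the paper: a pixel feature $\mathbf{f}_i$, samples $\mathbf{s}_1,\dots,\mathbf{s}_m$, and weights $w_{ij}$), the two interpolation schemes differ only in the constraints placed on the weights:
\begin{align*}
\text{convex combination:} \quad & \mathbf{f}_i \approx \sum_{j=1}^{m} w_{ij}\,\mathbf{s}_j, & \sum_{j=1}^{m} w_{ij} = 1,\ \ w_{ij} \ge 0,\\
\text{affine combination:} \quad & \mathbf{f}_i \approx \sum_{j=1}^{m} w_{ij}\,\mathbf{s}_j, & \sum_{j=1}^{m} w_{ij} = 1.
\end{align*}
Dropping the nonnegativity constraint is what allows pixels lying outside the convex hull of the samples to be reconstructed exactly.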