We propose an approach for Non-negative Matrix Factorization (NMF) with sparseness constraints on feature vectors. It is widely believed that the non-negativity constraint in NMF encourages the learned features to be sparse, and some approaches have incorporated additional sparseness constraints. However, previous approaches have not modeled the sparsity of features explicitly. Our approach explicitly incorporates a notion of feature sparsity expressed through the independence and correlation of features. This notion of sparsity is formalized as regularization terms within the NMF framework, and learning algorithms with multiplicative update rules are presented. The proposed approach is evaluated on document clustering over well-known benchmark datasets. The results are encouraging: the proposed approach improves clustering performance while maintaining relatively good quality of data approximation.
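To make the general setting concrete, the following is a minimal sketch of sparsity-regularized NMF with multiplicative updates. It uses a simple L1 penalty on the feature matrix as an illustrative stand-in for the correlation-based regularizers described above; the function name, penalty weight `lam`, and all parameter choices are assumptions for illustration, not the paper's actual formulation.

```python
import numpy as np

def sparse_nmf(X, k, n_iter=200, lam=0.1, eps=1e-9, seed=0):
    """Illustrative sparsity-regularized NMF via multiplicative updates.

    Minimizes 0.5*||X - W H||_F^2 + lam*||H||_1 subject to W, H >= 0.
    The L1 term is a generic stand-in for a sparseness regularizer.
    """
    rng = np.random.default_rng(seed)
    m, n = X.shape
    W = rng.random((m, k)) + eps
    H = rng.random((k, n)) + eps
    for _ in range(n_iter):
        # Update H: the L1 penalty contributes lam to the denominator.
        H *= (W.T @ X) / (W.T @ W @ H + lam + eps)
        # Update W: standard Lee-Seung multiplicative rule (Frobenius loss).
        W *= (X @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Usage: factorize a small random non-negative matrix.
X = np.abs(np.random.default_rng(1).random((100, 40)))
W, H = sparse_nmf(X, k=5)
print(np.linalg.norm(X - W @ H) / np.linalg.norm(X))  # relative reconstruction error
```

The key property of multiplicative updates is that non-negativity is preserved automatically, since each factor is only ever multiplied by non-negative ratios; the additive regularization term simply enters the denominator of the corresponding update.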