New approaches have been proposed to detect and exploit sparsity in adaptive systems. However, the sparsity is not always explicit among the system coefficients, thus requiring tools to reveal it. By means of the so-called feature function, we propose the low-complexity feature stochastic gradient (LF-SG) algorithm to exploit hidden sparsity. The proposed algorithm aims to reduce the computational load of the learning process compared with the least-mean-square (LMS) algorithm. We focus on block-lowpass systems, but the proposed approach can easily be adapted to exploit other kinds of features of the unknown system, e.g., highpass and bandpass characteristics. We then analyze some properties of the LF-SG algorithm, namely its steady-state mean squared error (MSE), its bias, and the choice of the step-size parameter. Simulation results show that the LF-SG achieves MSE performance competitive with that of the LMS while requiring far fewer multiplications to identify lowpass systems. For instance, to identify a measured room impulse response, the LF-SG algorithm performed less than half the multiplications required by the LMS algorithm.
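As a point of reference for the complexity comparison above, the sketch below implements the standard LMS system-identification update in Python, which is the baseline the LF-SG is measured against; it is not the proposed LF-SG, whose feature-function update is defined in the body of the paper. Each LMS iteration costs roughly 2N multiplications for a length-N filter, which is the budget the LF-SG is reported to cut by more than half. All variable names and the lowpass test system are illustrative assumptions.

```python
import numpy as np

def lms_identify(x, d, N, mu=0.01):
    """Standard LMS adaptive filter for system identification.

    x  : input signal (1-D array)
    d  : desired signal, d[k] = w_o^T x_k + noise
    N  : adaptive-filter length
    mu : step size
    Returns the final coefficient estimate and the error signal.
    """
    w = np.zeros(N)                     # adaptive coefficients
    e = np.zeros(len(x))                # a-priori error
    for k in range(N, len(x)):
        xk = x[k - N + 1:k + 1][::-1]   # regressor (most recent sample first)
        y = w @ xk                      # filter output: N multiplications
        e[k] = d[k] - y                 # estimation error
        w = w + mu * e[k] * xk          # coefficient update: ~N + 1 multiplications
    return w, e

# Illustrative usage: identify a smooth (lowpass-like) FIR system from noisy data.
rng = np.random.default_rng(0)
N = 64
w_o = np.convolve(np.ones(8) / 8, rng.standard_normal(N - 7))  # smoothed coefficients
x = rng.standard_normal(20000)
d = np.convolve(x, w_o)[:len(x)] + 1e-3 * rng.standard_normal(len(x))
w_hat, e = lms_identify(x, d, N, mu=0.005)
```

A lowpass (smooth) unknown system is used here because, as the abstract notes, its coefficients exhibit hidden sparsity that a feature function can reveal; the per-iteration multiplication count marked in the comments is the quantity the LF-SG reduces.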