This paper presents an efficient approach to applying the recursive least squares (RLS) learning algorithm in Takagi-Sugeno-Kang (TSK) neural fuzzy systems. When RLS is used, a reduced covariance matrix, in which the off-diagonal blocks defining the correlations between rules are set to zero, may be employed to lessen the computational burden. However, as reported in the literature, the performance of such an approach is slightly worse than that obtained with the full covariance matrix. In this paper, we propose an enhanced local learning concept in which a threshold is used to stop learning for insufficiently fired rules. Our experiments show that the proposed approach can achieve better performance than the use of the full covariance matrix. The enhanced local learning method is also more active in the structure learning phase: it not only stops updates for insufficiently fired rules, thereby reducing disturbances in the self-constructing neural fuzzy inference network, but also raises the learning speed in the structure learning phase by permitting a large backpropagation learning constant.
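To make the idea concrete, the following is a minimal sketch (not the paper's implementation) of a block-diagonal RLS update for TSK consequent parameters with the thresholded, enhanced local learning rule. All names here are illustrative assumptions: the regressor `x`, one covariance block `P[i]` per rule, a forgetting factor `lam`, and a firing-strength threshold `phi_min` below which a rule's consequent is left untouched.

```python
import numpy as np

class LocalRLSTSK:
    """Sketch of per-rule (block-diagonal) RLS for TSK consequents with an
    enhanced local-learning threshold. Assumed, not taken from the paper:
    regressor layout, locally weighted RLS form, and all parameter names."""

    def __init__(self, n_rules, dim, lam=0.99, phi_min=1e-3, p0=1e3):
        self.w = np.zeros((n_rules, dim))                 # consequent parameters, one row per rule
        self.P = np.stack([p0 * np.eye(dim)] * n_rules)   # one covariance block per rule
        self.lam = lam                                    # forgetting factor
        self.phi_min = phi_min                            # firing-strength threshold

    def update(self, x, phi, y_d):
        """One training sample: x is the consequent regressor (e.g. [1, x1, ..., xn]),
        phi the normalized firing strengths, y_d the desired output."""
        y = phi @ (self.w @ x)                            # TSK output: firing-weighted rule outputs
        for i, phi_i in enumerate(phi):
            if phi_i < self.phi_min:
                continue                                  # enhanced local learning: skip weakly fired rules
            P = self.P[i]
            # locally weighted RLS gain for rule i (weight = firing strength)
            k = P @ x / (self.lam / phi_i + x @ P @ x)
            self.w[i] += k * (y_d - self.w[i] @ x)        # update with rule-local error
            self.P[i] = (P - np.outer(k, x @ P)) / self.lam
        return y
```

Because each rule keeps only its own `dim x dim` covariance block, the per-sample cost scales linearly in the number of rules rather than quadratically, and the threshold test adds only a comparison per rule while sparing weakly fired rules from disturbance.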