This paper presents a novel method that uses neural networks to improve the performance of recursive least squares (RLS) algorithms in estimating the parameters of nonlinear processes. Its application in a process gain-adaptive generalized predictive control (GPC) algorithm is also described. Conventionally, a delayed sequence of a process's input and output signals is used as the input pattern to a neural network in order to model the process dynamics. The proposed method instead uses neural networks to learn the parameter-updating process of standard RLS algorithms and to relate the estimated parameters to the operating conditions. Experimental results from modelling and control of the heat-transfer process of an air-handling plant are reported and demonstrate the method's potential.
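The sketch below illustrates the general idea under stated assumptions; it is not the authors' formulation. A standard RLS estimator identifies the parameters of a hypothetical first-order ARX model at several operating points, and a small neural network (scikit-learn's MLPRegressor here) is trained to map the operating condition directly to the RLS parameter estimates. The process model, the operating-condition variable `c`, and all function names are illustrative assumptions.

```python
# Minimal sketch (assumed model and names): relate RLS parameter estimates
# to operating conditions with a small neural network.
import numpy as np
from sklearn.neural_network import MLPRegressor

def rls_identify(u, y, lam=0.98):
    """Recursive least squares for theta = [a, b] in y(k) = a*y(k-1) + b*u(k-1)."""
    theta = np.zeros(2)
    P = 1e3 * np.eye(2)
    for k in range(1, len(y)):
        phi = np.array([y[k - 1], u[k - 1]])      # regressor vector
        K = P @ phi / (lam + phi @ P @ phi)       # estimator gain
        theta = theta + K * (y[k] - phi @ theta)  # parameter update
        P = (P - np.outer(K, phi @ P)) / lam      # covariance update
    return theta

# Simulated data at several operating points; the "true" gain b varies with
# the operating condition c (hypothetical nonlinear process, for illustration).
rng = np.random.default_rng(0)
conditions, thetas = [], []
for c in np.linspace(0.1, 1.0, 10):
    a_true, b_true = 0.8, 0.5 + 0.4 * c
    u = rng.uniform(-1, 1, 500)
    y = np.zeros(500)
    for k in range(1, 500):
        y[k] = a_true * y[k - 1] + b_true * u[k - 1] + 0.01 * rng.standard_normal()
    conditions.append([c])
    thetas.append(rls_identify(u, y))

# Train the network to reproduce the RLS estimates from the operating condition,
# so parameters can be recalled at a new operating point without waiting for
# the RLS estimator to re-converge.
net = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0)
net.fit(conditions, thetas)
print(net.predict([[0.55]]))  # predicted [a, b] at an unseen operating point
```

In an adaptive GPC setting, such network-supplied parameter estimates could seed or replace the online RLS estimates when the operating point changes, which is the kind of use the paper describes.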