This paper presents a new power consumption emulation model covering all operating scenarios of the RF subsystem when transmitting an LTE signal. The model takes as inputs the parameters of the logical interface between the baseband and the RF subsystem: Tx power, carrier frequency, and bandwidth. Several modeling approaches were analyzed, and the approach with the lowest sum of squared errors was selected to build the emulation model. A neural network optimized with the pseudo-Gauss-Newton algorithm yielded the lowest sum of squared errors. The resulting model was validated against a real-world scenario, with a relative error of 5.77%.
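The paper's neural-network model is not reproduced here. As an illustrative sketch only, the following shows the Gauss-Newton family of least-squares optimizers at work on a hypothetical nonlinear power-draw curve (the exponential model, the synthetic data, and all parameter values are assumptions for demonstration, not the paper's model):

```python
import numpy as np

# Synthetic stand-in data: Tx power sweep (dBm) vs. "measured" RF power draw (W).
tx = np.linspace(0.0, 20.0, 50)
true_theta = np.array([0.5, 0.1, 1.0])           # hypothetical ground truth

def model(theta, x):
    a, b, c = theta
    return a * np.exp(b * x) + c

rng = np.random.default_rng(0)
y = model(true_theta, tx) + rng.normal(0.0, 0.05, tx.size)

def jacobian(theta, x):
    # Partial derivatives of the model w.r.t. a, b, c.
    a, b, c = theta
    e = np.exp(b * x)
    return np.column_stack([e, a * x * e, np.ones_like(x)])

theta = np.array([1.0, 0.05, 0.0])               # rough initial guess
for _ in range(20):
    r = y - model(theta, tx)                     # residuals
    J = jacobian(theta, tx)
    # Gauss-Newton step: solve (J^T J) delta = J^T r
    delta = np.linalg.solve(J.T @ J, J.T @ r)
    theta = theta + delta

sse = float(np.sum((y - model(theta, tx)) ** 2))          # sum of squared errors
rel_err = 100.0 * float(np.mean(np.abs((y - model(theta, tx)) / y)))
```

The same sum-of-squared-errors criterion and relative-error metric computed at the end are the quantities the abstract uses to compare modeling approaches and to report validation accuracy.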