A number of machine learning (ML) based small-signal modeling techniques for Gallium Nitride (GaN) High Electron Mobility Transistors (HEMTs) have been reported in the literature. However, these techniques rarely indicate their suitability for modeling GaN HEMTs under varied operating conditions. In this context, this paper thoroughly investigates various ML-based techniques and identifies their suitability for specific application scenarios. First, an array of commonly employed modeling techniques based on Artificial Neural Network (ANN), RANdom SAmple Consensus (RANSAC), Support Vector Regression (SVR), Gaussian Process Regression (GPR), Decision Tree, and Genetic-Algorithm-assisted ANN is used to develop a modeling framework that captures the dependence of the S-parameter outputs on bias, frequency, and device geometry. Subsequently, ensemble techniques, namely Bootstrap Aggregating (Bagging), Random Forests, Extremely Randomized Trees, AdaBoost, Gradient Tree Boosting, Histogram-based Gradient Boosting, and Extreme Gradient Boosting, are explored to assess their capability in developing GaN HEMT small-signal models. Thereafter, an exhaustive bias-variance analysis is carried out to identify the most appropriate algorithms for specific applications. Modeling discrepancies are reduced by tuning the hyperparameters of each model using random search optimization with 5-fold cross-validation. Post tuning, the models are evaluated in terms of generalization capability, Advanced Design System (ADS) compatibility, computational efficiency, training and simulation time, model capacity, and hyperparameter tuning time.
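
As a minimal sketch of the hyperparameter tuning step described above (random search with 5-fold cross-validation), the following Python snippet shows how one of the named ensemble regressors could be tuned with scikit-learn. The feature layout, target choice, search ranges, and synthetic data are illustrative assumptions, not taken from the paper.

import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import RandomizedSearchCV

# Hypothetical training set: features = [V_gs, V_ds, frequency, gate width],
# target = one real-valued S-parameter component (e.g., |S21|).
# Synthetic placeholder data; the paper's measured data would be used instead.
rng = np.random.default_rng(0)
X = rng.uniform(size=(500, 4))
y = rng.uniform(size=500)

# Assumed search ranges for illustration only.
param_distributions = {
    "n_estimators": [100, 200, 400, 800],
    "learning_rate": [0.01, 0.05, 0.1, 0.2],
    "max_depth": [2, 3, 4, 5],
    "subsample": [0.6, 0.8, 1.0],
}

search = RandomizedSearchCV(
    GradientBoostingRegressor(random_state=0),
    param_distributions=param_distributions,
    n_iter=25,                          # number of randomly sampled settings
    cv=5,                               # 5-fold cross-validation, as in the paper
    scoring="neg_mean_squared_error",
    random_state=0,
)
search.fit(X, y)
print(search.best_params_, -search.best_score_)

One such search would be run per model family (ANN, SVR, GPR, the tree ensembles, and so on), with the best cross-validated configuration retained for the subsequent evaluation of generalization, timing, and ADS compatibility.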