Nonparametric supervised learning algorithms form a flexible class of supervised learning algorithms whose learning parameters are not fixed in advance but depend directly on the size of the training data. In this paper, we comparatively study the properties of four nonparametric algorithms: k-nearest neighbours (k-NN), support vector machines (SVMs), decision trees, and random forests. The supervised learning task is a regression estimate of the time lapse in medical insurance reimbursement. Our study is concerned specifically with how well each of the nonparametric regression models fits the training data. We quantify the goodness of fit using the R-squared metric. The results are presented with a focus on the effect of the size of the training data, the dimension of the feature space, and hyperparameter optimization. The findings suggest that k-NN and SVM are better models for predicting well-defined output labels (i.e., time lapse in days). Overall, however, the decision tree model performs best because it makes better predictions on new data points than the ballpark estimates produced by the SVM and k-NN models.
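To make the comparative setup concrete, the following is a minimal sketch of the kind of evaluation described above, assuming scikit-learn and a synthetic dataset as a stand-in for the medical insurance reimbursement data; the model settings and data generation are illustrative assumptions, not the paper's actual configuration.

```python
# Sketch: fit the four nonparametric regressors and compare R-squared on the
# training split (goodness of fit) and a held-out split (new data points).
# The synthetic data below is a hypothetical stand-in for the insurance data:
# X = claim features, y = time lapse in days.
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsRegressor
from sklearn.svm import SVR
from sklearn.tree import DecisionTreeRegressor
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score

X, y = make_regression(n_samples=1000, n_features=10, noise=10.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

models = {
    "k-NN": KNeighborsRegressor(n_neighbors=5),
    "SVM": SVR(kernel="rbf", C=1.0),
    "Decision tree": DecisionTreeRegressor(random_state=0),
    "Random forest": RandomForestRegressor(n_estimators=100, random_state=0),
}

for name, model in models.items():
    model.fit(X_train, y_train)
    # Training R-squared quantifies how well the model fits the training data;
    # test R-squared indicates how well it generalizes to new data points.
    r2_train = r2_score(y_train, model.predict(X_train))
    r2_test = r2_score(y_test, model.predict(X_test))
    print(f"{name}: train R^2 = {r2_train:.3f}, test R^2 = {r2_test:.3f}")
```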