The aim of this study is to discuss the success of the Genetic Algorithm (GA) approach compared with the conventional Newton-Raphson (NR) and Nelder-Mead (NM) algorithms in estimating the Binary Logit Model (BLM) parameters on a simulated data set and a real data set on Alopecia disease. The NR algorithm requires restrictive assumptions, such as the continuity of the objective function and the determination of starting points for the model parameters in the iterative process. The NM algorithm does not require a differentiable objective function, but it still suffers from the starting-point problem. In this study, the set of parameters that maximizes the likelihood function in the BLM is found using both the NR and NM algorithms. Then, considering the limitations of these conventional methods, the success of GA is investigated under the condition that all the assumptions of the NR and NM methods are satisfied. The results show that when the assumptions of the classical techniques are valid, the GA approach can obtain results very close to those of NR and NM. This also implies that GA is a good alternative to the NR and NM methods when the requirements of the classical methods cannot be satisfied. The model results of NR, NM, and GA are compared in terms of the estimated parameter values and the maximum likelihood function value. Copyright, IJAR, 2016. All rights reserved.
Introduction:-
When working with a categorical dependent variable, regression model parameters can be estimated using the Maximum Likelihood Estimator (MLE). The aim is to find the estimate that maximizes the likelihood or log-likelihood function. Because the likelihood equations, obtained by setting the first-order derivatives of the likelihood function with respect to the parameters to zero, are not linear, iterative procedures such as Gauss-Newton, Newton-Raphson, Levenberg-Marquardt, Direct Search, and Steepest Descent have been developed within the category of classical optimization techniques for solving unconstrained nonlinear optimization problems. These algorithms require restrictive assumptions, such as the continuity of the likelihood function. They also risk missing the best solution because they can get stuck at local optima depending on the selected starting points. The Newton-Raphson (NR) algorithm is the most popular among the classical techniques. The Nelder-Mead (NM) algorithm can be seen as a good alternative to NR: its most important advantage is that it eliminates the requirement that the objective function (in our case, the likelihood function) be differentiable.
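To make the estimation setup concrete, the following sketch fits a binary logit model by maximum likelihood with both a hand-coded Newton-Raphson iteration and the Nelder-Mead method from SciPy. The data are simulated with illustrative coefficient values chosen here for the example; they are not the paper's actual simulated or Alopecia data sets.

```python
import numpy as np
from scipy.optimize import minimize

# Simulate binary logit data: P(y=1|x) = 1 / (1 + exp(-(b0 + b1*x))).
# The coefficients below are illustrative, not the paper's values.
rng = np.random.default_rng(0)
n = 500
x = rng.normal(size=n)
X = np.column_stack([np.ones(n), x])            # design matrix with intercept
true_beta = np.array([-0.5, 1.2])
p = 1.0 / (1.0 + np.exp(-X @ true_beta))
y = (rng.random(n) < p).astype(float)

def log_likelihood(beta):
    # log L(beta) = sum_i [ y_i * eta_i - log(1 + exp(eta_i)) ], eta = X beta
    eta = X @ beta
    return np.sum(y * eta - np.logaddexp(0.0, eta))

def newton_raphson(beta0, tol=1e-8, max_iter=50):
    """Maximize the logit log-likelihood using Newton-Raphson steps."""
    beta = beta0.astype(float)
    for _ in range(max_iter):
        mu = 1.0 / (1.0 + np.exp(-(X @ beta)))  # fitted probabilities
        grad = X.T @ (y - mu)                    # score vector
        w = mu * (1.0 - mu)
        hess = -(X.T * w) @ X                    # Hessian of the log-likelihood
        step = np.linalg.solve(hess, grad)       # Newton step: H^{-1} g
        beta = beta - step
        if np.max(np.abs(step)) < tol:
            break
    return beta

# Newton-Raphson needs derivatives and a starting point (zeros here).
beta_nr = newton_raphson(np.zeros(2))

# Nelder-Mead is derivative-free but still needs a starting point;
# SciPy minimizes, so we pass the negative log-likelihood.
res = minimize(lambda b: -log_likelihood(b), np.zeros(2), method="Nelder-Mead")

print("NR estimate:", beta_nr, "logL:", log_likelihood(beta_nr))
print("NM estimate:", res.x, "logL:", -res.fun)
```

With a well-behaved likelihood such as this one, both methods converge to essentially the same maximizer, which mirrors the comparison the study carries out before bringing in the GA.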