Recent developments in deep learning have significantly improved the accuracy of acoustic impedance inversion. However, the conventional gradient-based optimizers used in deep learning frameworks, such as root mean square propagation (RMSProp), momentum, and adaptive moment estimation (ADAM), inherently tend to converge to the nearest local optimum, thereby compromising the solution by not attaining the global minimum. We apply a hybrid global optimizer, genetic-evolutionary ADAM (GADAM), to address the issue of convergence to a local optimum in a semi-supervised deep sequential convolutional network-based learning framework for solving the non-convex seismic impedance inversion problem. GADAM combines the advantages of the adaptive learning of ADAM and the genetic evolution of the genetic algorithm (GA), which facilitates faster convergence and avoids sinking into local minima. The efficacy of GADAM is tested on synthetic benchmark data and field examples. The results are compared with those obtained from the widely used ADAM optimizer and the conventional least-squares method. In addition, uncertainty analysis is performed to assess the implications of the choice of optimizer for obtaining efficient and accurate seismic impedance estimates. The results show that the level of uncertainty and the minimum of the loss function attained using the GADAM optimizer are lower than those for ADAM. Thus, the present study demonstrates that the hybrid GADAM optimizer is more efficient than the extensively used ADAM optimizer for impedance estimation from seismic data in a deep learning framework.
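To give a rough sense of how a GADAM-style hybrid works, the sketch below alternates a few ADAM refinement steps on each member of a population with a genetic step (selection, uniform crossover, Gaussian mutation) so the search can escape local minima of a non-convex objective. This is a conceptual illustration only, not the implementation used in the study: the toy Rastrigin-like objective stands in for the seismic inversion misfit, and all function names, population sizes, and hyperparameters are illustrative assumptions.

```python
# Minimal, self-contained GADAM-style sketch (NumPy only); illustrative, not the paper's code.
import numpy as np

rng = np.random.default_rng(0)

def loss(x):
    # Non-convex toy objective (Rastrigin-like); stands in for the inversion misfit.
    return np.sum(x**2 - 10.0 * np.cos(2.0 * np.pi * x)) + 10.0 * x.size

def grad(x):
    # Analytic gradient of the toy objective.
    return 2.0 * x + 20.0 * np.pi * np.sin(2.0 * np.pi * x)

def adam_step(x, g, m, v, t, lr=0.05, b1=0.9, b2=0.999, eps=1e-8):
    # Standard ADAM update with bias-corrected first and second moments.
    m = b1 * m + (1 - b1) * g
    v = b2 * v + (1 - b2) * g**2
    m_hat = m / (1 - b1**t)
    v_hat = v / (1 - b2**t)
    return x - lr * m_hat / (np.sqrt(v_hat) + eps), m, v

def gadam(dim=4, pop_size=20, generations=30, adam_iters=25, mut_sigma=0.3):
    pop = rng.uniform(-5.0, 5.0, size=(pop_size, dim))
    for _ in range(generations):
        # Local refinement: a few ADAM iterations per individual.
        for i in range(pop_size):
            x, m, v = pop[i].copy(), np.zeros(dim), np.zeros(dim)
            for t in range(1, adam_iters + 1):
                x, m, v = adam_step(x, grad(x), m, v, t)
            pop[i] = x
        # Genetic step: keep the fittest half, refill by crossover + mutation.
        fitness = np.array([loss(x) for x in pop])
        parents = pop[np.argsort(fitness)[: pop_size // 2]]
        children = []
        while len(children) < pop_size - len(parents):
            pa, pb = parents[rng.integers(len(parents), size=2)]
            mask = rng.random(dim) < 0.5               # uniform crossover
            child = np.where(mask, pa, pb)
            child += rng.normal(0.0, mut_sigma, dim)   # Gaussian mutation
            children.append(child)
        pop = np.vstack([parents, children])
    best = pop[np.argmin([loss(x) for x in pop])]
    return best, loss(best)

if __name__ == "__main__":
    x_best, f_best = gadam()
    print("best solution:", np.round(x_best, 3), "loss:", round(f_best, 4))
```

In the actual inversion setting, the individuals would be network weights (or impedance models), the gradient would come from backpropagation through the sequential convolutional network, and the ADAM and GA hyperparameters would be tuned to the problem.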