Many engineering problems require solving statistical optimization problems. When the global solution is hard to attain, engineers and statisticians usually adopt the better of the available solutions, because we intuitively believe a principle, called the better solution principle (BSP) in this paper: a better solution to a statistical optimization problem also has better statistical properties of interest. This principle reflects a concordance between optimization and statistics and is expected to hold widely. Since theoretical study of BSP appears to have been neglected by statisticians, this paper presents a preliminary discussion of BSP within a relatively general framework. We establish two comparison theorems as the key results of this paper and present their applications to maximum likelihood estimation. It is shown that BSP holds for this problem under reasonable conditions; that is, an estimator with greater likelihood is better in a certain statistical sense.
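To convey the flavour of BSP in the maximum likelihood setting, the following minimal simulation sketch (not taken from the paper; the Cauchy location model, sample size, and the two candidate estimators are illustrative assumptions) compares two candidate solutions of the likelihood maximization problem and records which one lies closer to the true parameter. Under BSP one expects the higher-likelihood candidate to incur the smaller error on average.

```python
# Illustrative sketch only: among two candidate estimates of a Cauchy location
# parameter, compare the one with the larger log-likelihood against the other
# in terms of squared error. All modelling choices here are assumptions made
# for illustration, not the paper's setup.
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(0)
theta0, n, reps = 0.0, 50, 2000  # true parameter, sample size, replications

def loglik(theta, x):
    # Cauchy location log-likelihood of the sample x at theta
    return stats.cauchy.logpdf(x, loc=theta).sum()

err_better, err_worse = [], []
for _ in range(reps):
    x = stats.cauchy.rvs(loc=theta0, size=n, random_state=rng)
    # Candidate A: a crude but sensible estimate (the sample median).
    a = np.median(x)
    # Candidate B: a local optimizer started far from the truth; it may or
    # may not reach a point of higher likelihood than candidate A.
    b = optimize.minimize(lambda t: -loglik(t[0], x), x0=[10.0],
                          method="Nelder-Mead").x[0]
    # Rank the two candidates by their likelihood values.
    better, worse = (a, b) if loglik(a, x) >= loglik(b, x) else (b, a)
    err_better.append((better - theta0) ** 2)
    err_worse.append((worse - theta0) ** 2)

print("MSE of higher-likelihood candidate:", np.mean(err_better))
print("MSE of lower-likelihood candidate :", np.mean(err_worse))
```

In this toy comparison, the candidate with the greater likelihood typically exhibits the smaller mean squared error, which is the kind of concordance between optimization quality and statistical quality that the comparison theorems formalize.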