Metaheuristic algorithms are designed to solve optimization problems, but no single algorithm can obtain the best solutions for all problems. This work proposes a novel self-adaptive metaheuristic optimization algorithm, named the Optimal Stochastic Process Optimizer (OSPO), which can solve different kinds of optimization problems with promising performance. Specifically, OSPO regards the optimization procedure as a realization of a stochastic process, and with the help of the Subjective Probability Distribution Function (SPDF) and the Receding Sampling Strategy proposed in this paper, OSPO can control its exploration-exploitation behavior online by adaptively modifying the parameters of the SPDF. This adaptive exploration-exploitation property helps OSPO deal with different kinds of problems and thus gives it the potential to solve a vast majority of optimization problems. The proposed algorithm is first benchmarked on uni-modal, multi-modal, and composite test functions in both low and high dimensions. The results are verified by comparative studies with seven well-performing metaheuristic algorithms. Then, 21 real-world optimization problems are used to further investigate the effectiveness of OSPO. The winners of the CEC2020 Competition on Real-World Single Objective Constrained Optimization, namely the SASS, sCMAgES, EnMODE, and COLSHADE algorithms, are used as four comparative algorithms on the real-world optimization problems. The analysis of the simulations demonstrates that OSPO provides highly competitive performance compared to these metaheuristics on both the benchmark functions and the real-world optimization problems, verifying its potential to solve a vast majority of optimization problems. A corresponding MATLAB APP demo is available at https://github.com/JiahongXu123/OSPO-algorithm.git.
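To make the idea of online exploration-exploitation control concrete, the following is a minimal sketch, not the actual OSPO method: it uses a Gaussian sampling distribution as a stand-in for the SPDF, and a simple shrink/grow rule as a stand-in for the paper's adaptive parameter modification and Receding Sampling Strategy. All names (`adaptive_sampler`, `scale`, the 0.95/1.02 factors) are illustrative assumptions, not quantities defined in this work.

```python
import numpy as np

def sphere(x):
    """Uni-modal test function, used only for illustration."""
    return float(np.sum(x ** 2))

def adaptive_sampler(obj, dim, bounds, iters=500, pop=30, seed=0):
    """Illustrative adaptive sampler (NOT the OSPO update rules).

    Candidates are drawn around the incumbent from a Gaussian whose
    scale acts as an exploration-exploitation knob: the scale is
    narrowed after an improvement (exploit) and widened after
    stagnation (explore), mimicking online self-adaptation.
    """
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    best = rng.uniform(lo, hi, dim)
    best_f = obj(best)
    scale = 0.5 * (hi - lo)              # start exploratory
    for _ in range(iters):
        cand = best + rng.normal(0.0, scale, size=(pop, dim))
        cand = np.clip(cand, lo, hi)
        f = np.array([obj(c) for c in cand])
        i = int(np.argmin(f))
        if f[i] < best_f:                # improvement -> exploit (shrink)
            best, best_f = cand[i], f[i]
            scale *= 0.95
        else:                            # stagnation -> explore (grow)
            scale = min(scale * 1.02, hi - lo)
    return best, best_f

if __name__ == "__main__":
    x_star, f_star = adaptive_sampler(sphere, dim=10, bounds=(-5.0, 5.0))
    print(f"best objective ~ {f_star:.3e}")
```

The sketch only illustrates the general principle of treating the search as repeated sampling with an adaptively tuned distribution; the specific SPDF form, its parameter update laws, and the receding-sampling mechanism are defined in the paper itself.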