In this paper, we study an optimal control problem whose cost function is interval-valued and whose state is governed by a stochastic differential equation. We introduce a generalized version of Bellman's optimality principle for stochastic systems with interval-valued cost functions, and we derive the corresponding Hamilton-Jacobi-Bellman equations together with the associated optimal control decisions. Two numerical examples from finance, each with an interval-valued cost function, illustrate the efficiency of the proposed results. The obtained results provide significantly more reliable decisions than those obtained with a conventional (real-valued) cost function.