Recurrent neural networks have proven effective in various domains owing to their ability to retain key information during the solution process. The continuous Hopfield network is a recurrent neural network capable of solving several complicated problems, including combinatorial optimization problems. To solve a combinatorial optimization problem with a continuous Hopfield network, the user must construct an energy function that combines the problem's objective with its constraints; this combination requires penalty hyper-parameters that directly affect both the quality and the feasibility of the solution. To ensure the convergence of the continuous Hopfield network, and thereby the feasibility of the solutions to the combinatorial problems, we introduce a linear optimization model in which the objective function is the energy function of the continuous Hopfield network, which controls solution quality, and the constraints are conditions on the penalty parameters of the continuous Hopfield network, extracted by the hyperplane procedure, which guarantee solution feasibility. A genetic algorithm is used to solve the proposed model on well-known NP-complete problems: the task assignment problem, the traveling salesman problem, the weighted constraint satisfaction problem, the max-stable problem, the graph coloring problem, the shortest path problem, and the portfolio selection problem. The proposed method outperforms random choices of the continuous Hopfield network hyper-parameters: all solutions it produces are feasible, and the difference in accuracy between our method and the random methods is 48.8%.
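The penalized energy construction described above can be sketched as follows. This is a minimal illustration under stated assumptions, not the paper's exact formulation: the function name `penalized_energy`, the toy matrices, and the quadratic form E(x) = -1/2 xᵀWx - bᵀx with penalty terms weighted by hyper-parameters α are all illustrative choices.

```python
import numpy as np

def penalized_energy(x, W_obj, b_obj, penalties, alphas):
    """Hopfield-style energy: objective part plus alpha-weighted penalty parts.

    Each penalty is a (W_p, b_p) pair in the same quadratic form as the
    objective; the hyper-parameters `alphas` trade off solution quality
    against constraint feasibility (illustrative sketch).
    """
    e = -0.5 * x @ W_obj @ x - b_obj @ x
    for alpha, (W_p, b_p) in zip(alphas, penalties):
        e += alpha * (-0.5 * x @ W_p @ x - b_p @ x)
    return e

# Tiny two-variable example (values chosen only for illustration).
x = np.array([1.0, 0.0])
W_obj = np.array([[0.0, 1.0], [1.0, 0.0]])
b_obj = np.array([1.0, 1.0])
penalties = [(np.zeros((2, 2)), np.array([2.0, 0.0]))]
alphas = [3.0]
energy = penalized_energy(x, W_obj, b_obj, penalties, alphas)  # -> -7.0
```

In this sketch, the penalty hyper-parameters `alphas` play the role of the quantities the proposed linear model constrains: too small and the network may converge to infeasible states, too large and the objective term is drowned out.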