Physics-informed neural networks (PINNs) are neural networks (NNs) that directly encode model equations, such as Partial Differential Equations (PDEs), into the network itself. While most PINN algorithms in the literature minimize the local residual of the governing equations, energy-based approaches take a different path and minimize the variational energy of the model. It is shown that, in the case of a steady thermal equation weakly coupled to a magnetic equation, the energy-based approach offers several advantages over the standard residual-based PINN: it is more computationally efficient, it requires lower-order derivatives, and it involves fewer hyperparameters. The analyzed benchmark problems are the single- and multi-objective optimal design of an inductor for the controlled heating of a graphite plate. The optimized device is designed by solving a multi-physics problem: a time-harmonic magnetic problem and a steady thermal problem. For the former, a deep neural network solving the direct problem is trained in a supervised manner on Finite Element Analysis (FEA) data. The solution of the latter, in turn, relies on a hypernetwork that takes the inductor geometry parameters as input and outputs the model weights of an energy-based PINN (ePINN). Finally, the ePINN predicts the temperature field within the graphite plate.
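As a schematic illustration of the difference between the two loss formulations (assuming, purely for the sake of the example, a linear steady heat-conduction problem with thermal conductivity $\lambda$ and volumetric heat source $q$; the concrete material laws used in the paper may differ), a residual-based PINN minimizes the pointwise residual of the strong form, whereas an energy-based PINN minimizes the associated variational energy:
\[
\mathcal{L}_{\mathrm{res}}(\theta) = \frac{1}{N}\sum_{i=1}^{N}\Bigl|\nabla\cdot\bigl(\lambda\,\nabla T_\theta(\mathbf{x}_i)\bigr)+q(\mathbf{x}_i)\Bigr|^{2},
\qquad
\mathcal{L}_{\mathrm{en}}(\theta) = \int_{\Omega}\Bigl(\tfrac{\lambda}{2}\,\bigl|\nabla T_\theta\bigr|^{2}-q\,T_\theta\Bigr)\,\mathrm{d}\Omega \approx \frac{|\Omega|}{N}\sum_{i=1}^{N}\Bigl(\tfrac{\lambda}{2}\,\bigl|\nabla T_\theta(\mathbf{x}_i)\bigr|^{2}-q(\mathbf{x}_i)\,T_\theta(\mathbf{x}_i)\Bigr),
\]
where $T_\theta$ denotes the network-predicted temperature field and $\mathbf{x}_i$ are collocation points in the domain $\Omega$. The residual loss $\mathcal{L}_{\mathrm{res}}$ involves second-order derivatives of $T_\theta$, while the energy loss $\mathcal{L}_{\mathrm{en}}$ only requires first-order gradients, which is the source of the lower derivative order mentioned above.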