Strongly interacting quantum systems described by non-stoquastic
Hamiltonians exhibit rich low-temperature physics. Yet, their study
poses a formidable challenge, even for state-of-the-art numerical
techniques. Here, we systematically investigate the performance of a
class of universal variational wavefunctions based on artificial neural
networks by considering the frustrated spin-$1/2$ $J_1$-$J_2$
Heisenberg model on the square lattice. Focusing on neural network
architectures without physics-informed input, we argue in favor of using
an ansatz consisting of two decoupled real-valued networks, one for the
amplitude and the other for the phase of the variational wavefunction.
By introducing concrete strategies to mitigate the inherent numerical
instabilities of the stochastic reconfiguration algorithm, we obtain a
variational energy comparable to that recently reported with neural
networks that incorporate knowledge about the physical system. Through a
detailed analysis of the individual components of the algorithm, we
conclude that the rugged nature of the energy landscape constitutes the
major obstacle to finding a satisfactory approximation of the
ground-state wavefunction and prevents learning the correct sign structure. In
particular, we show that in the present setup the neural network
expressivity and Monte Carlo sampling are not the primary limiting
factors.
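
The decoupled amplitude-phase ansatz can be illustrated with a minimal sketch, not the authors' implementation: two independent real-valued feed-forward networks produce the log-amplitude and the phase, which are combined as log psi(s) = rho(s) + i phi(s). The helper names (init_mlp, mlp, log_psi), layer widths, and the 6x6 lattice size are illustrative assumptions; the sketch covers only the wavefunction parametrization, not the stochastic reconfiguration update.

    # Minimal sketch of a two-network ansatz: log psi(s) = rho(s) + i * phi(s).
    # Network architectures and sizes are assumptions chosen for illustration.
    import jax
    import jax.numpy as jnp

    def init_mlp(key, sizes):
        """Initialize a real-valued multilayer perceptron."""
        params = []
        keys = jax.random.split(key, len(sizes) - 1)
        for k, (n_in, n_out) in zip(keys, zip(sizes[:-1], sizes[1:])):
            w = jax.random.normal(k, (n_in, n_out)) / jnp.sqrt(n_in)
            params.append((w, jnp.zeros(n_out)))
        return params

    def mlp(params, s):
        """Forward pass; returns a single real number for spin configuration s."""
        x = s
        for w, b in params[:-1]:
            x = jnp.tanh(x @ w + b)
        w, b = params[-1]
        return jnp.sum(x @ w + b)

    def log_psi(params, s):
        """Decoupled ansatz: one network for the log-amplitude, one for the phase."""
        rho = mlp(params["amplitude"], s)   # log |psi(s)|
        phi = mlp(params["phase"], s)       # arg psi(s)
        return rho + 1j * phi

    # Example: one configuration of N = 36 spins (6x6 lattice), encoded as +/-1.
    key = jax.random.PRNGKey(0)
    k_amp, k_phase, k_spin = jax.random.split(key, 3)
    params = {
        "amplitude": init_mlp(k_amp, [36, 64, 1]),
        "phase": init_mlp(k_phase, [36, 64, 1]),
    }
    s = jnp.where(jax.random.bernoulli(k_spin, 0.5, (36,)), 1.0, -1.0)
    print(log_psi(params, s))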