The recently emerged symbol-level precoding (SLP) technique is a promising solution for multi-user wireless communication systems, owing to its ability to transform harmful multi-user interference (MUI) into useful signal power and thereby improve system performance. Conventional symbol-level precoding designs, however, incur a computational complexity that makes their practical implementation difficult. To address this problem, we propose a new deep learning (DL) based approach that enables low-complexity symbol-level precoding designs. This paper focuses on DL-based one-bit precoding for downlink massive multiple-input multiple-output (MIMO) systems, where one-bit digital-to-analog converters (DACs) are employed to reduce hardware cost and power consumption. Unlike previous works, the proposed precoders maintain low computational complexity over a wide range of signal-to-noise ratios (SNRs), making them suitable for practical precoding scenarios. Specifically, we first design an unsupervised DL-based one-bit precoder for multiuser massive MIMO systems (UDL-O1PmMIMO) to address low-SNR scenarios, and then build on it to design a hybrid DL-based precoder (HDL-O1PmMIMO) that covers both low- and high-SNR scenarios. The proposed method employs a novel residual DL network structure, which alleviates the difficulty of training very deep networks. Additionally, a novel customized cost function, tailored to one-bit precoding in massive MIMO systems, is introduced to optimize the system's handling of interference. Experiments on a general test set, implemented in Python and MATLAB, show that the proposed approach outperforms existing methods in three respects: it achieves a lower bit error rate, it generates the precoded vector in less time, and it is more robust to imperfect channel estimation.

INDEX TERMS Massive MIMO, one-bit DAC, precoding, unsupervised deep learning.
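The abstract mentions a residual DL network structure and a customized symbol-level cost function but does not give implementation details; the following is a minimal PyTorch sketch of how an unsupervised one-bit precoder of this kind could be organized. All class names, layer sizes, the tanh relaxation of the one-bit constraint, and the MSE-style symbol-level loss are illustrative assumptions, not the architecture or cost function actually used in the paper.

```python
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """Fully connected block with a skip connection; skip connections
    ease the training of deeper networks (illustrative, not the paper's design)."""
    def __init__(self, dim):
        super().__init__()
        self.fc1 = nn.Linear(dim, dim)
        self.fc2 = nn.Linear(dim, dim)
        self.act = nn.ReLU()

    def forward(self, x):
        return self.act(x + self.fc2(self.act(self.fc1(x))))


class OneBitPrecoderNet(nn.Module):
    """Maps the stacked real-valued channel and data symbols to a relaxed
    one-bit transmit vector; tanh is a smooth surrogate for the sign()
    constraint imposed by one-bit DACs during training (assumption)."""
    def __init__(self, in_dim, num_antennas, num_blocks=4, hidden=512):
        super().__init__()
        self.input = nn.Linear(in_dim, hidden)
        self.blocks = nn.Sequential(*[ResidualBlock(hidden) for _ in range(num_blocks)])
        self.output = nn.Linear(hidden, 2 * num_antennas)  # real + imaginary parts

    def forward(self, x):
        x = torch.relu(self.input(x))
        x = self.blocks(x)
        return torch.tanh(self.output(x))  # relaxed one-bit outputs in (-1, 1)


def symbol_level_loss(x_out, H_real, s_real, noise_var):
    """Hypothetical unsupervised symbol-level cost: penalize the distance
    between the noiseless received signal H x and the intended constellation
    points s, scaled by the noise variance (stacked real-valued model)."""
    y = torch.bmm(H_real, x_out.unsqueeze(-1)).squeeze(-1)   # (batch, 2K)
    return torch.mean(torch.sum((y - s_real) ** 2, dim=-1) / noise_var)
```

At inference time, the relaxed outputs would be quantized with sign() to satisfy the one-bit DAC constraint; this sketch only illustrates the general unsupervised training setup described at a high level in the abstract.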