Deep learning models based on natural language processing (NLP), mainly the Transformer family, have been successfully applied to many chemistry-related problems, but their applications have so far been largely limited to chemical reactions. Solvation, meanwhile, is an important concept in physical and organic chemistry, describing the interaction between a solute and a solvent. This interaction leads to a solvation complex, a molecular complex similar to a reactant-reagent complex. In this study, we introduce SolvBERT, a model that reads the solute and the solvent through the SMILES representation of the solvation complex. SolvBERT is pretrained in an unsupervised fashion on a large database of computational solvation free energies. The pretrained model can then be fine-tuned to predict either experimental solvation free energies or solubilities, depending on the fine-tuning dataset. To the best of our knowledge, this multi-task prediction capability has not been demonstrated by previously developed graph-based models for predicting the properties of molecular complexes. Furthermore, SolvBERT's performance in predicting solvation free energies is comparable to that of the state-of-the-art graph-based model DMPNN, which we attribute mainly to the clustering effect of the pretraining phase, as demonstrated by TMAP visualization.
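To illustrate how a solute-solvent pair might be presented to a SMILES-based Transformer, the minimal sketch below joins the solute and solvent SMILES with the "." separator that SMILES notation uses for disconnected components and applies a simple regex tokenizer. The function names, tokenization pattern, and separator convention are illustrative assumptions for this sketch, not details taken from the SolvBERT implementation.

```python
import re

# Illustrative regex-based SMILES tokenizer (a common choice for Transformer
# models trained on SMILES; the actual SolvBERT tokenizer may differ).
SMILES_TOKEN = re.compile(
    r"(\[[^\]]+\]|Br|Cl|Si|@@|%\d{2}|[B-IK-Zb-ik-z0-9=#+\-\(\)\\/\.])"
)

def solvation_complex_smiles(solute: str, solvent: str) -> str:
    """Join solute and solvent SMILES with the '.' separator used in SMILES
    for disconnected components, giving one input string for the model."""
    return f"{solute}.{solvent}"

def tokenize(smiles: str) -> list[str]:
    """Split a SMILES string into tokens for a BERT-style encoder."""
    return SMILES_TOKEN.findall(smiles)

# Example: acetone (solute) dissolved in water (solvent).
complex_smiles = solvation_complex_smiles("CC(=O)C", "O")
print(complex_smiles)            # CC(=O)C.O
print(tokenize(complex_smiles))  # ['C', 'C', '(', '=', 'O', ')', 'C', '.', 'O']
```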