“…Bepler & Berger (2019) pre-trained LSTMs on protein sequences, adding supervision from contacts to produce embeddings. Since our preprint appeared, related work has built on its exploration of protein sequence modeling, investigating generative models (Riesselman et al., 2019; Madani et al., 2020), the internal representations of Transformers (Vig et al., 2020), and applications of representation learning and generative modeling such as classification (Elnaggar et al., 2019; Strodthoff et al., 2020), mutational effect prediction (Luo et al., 2020), and sequence design (Repecka et al., 2019; Hawkins-Hooker et al., 2020; Amimeur et al., 2020).…”