Model                                            Parameters
BMGF-RoBERTa (Liu et al., 2020)                  2.3M
XLNet (base, cased) (Kim et al., 2020)           110M
XLNet (large, cased) (Kim et al., 2020)          340M
OTMT (XLNet-base) (Jiang et al., 2022a)          110M
OTMT (XLNet-large) (Jiang et al., 2022a)         340M
Fine-Tuning (T5-base) (Raffel et al., 2020)      220M
Fine-Tuning (T5-large) (Raffel et al., 2020)     770M
Prefix-Tuning (T5-base) (Li and Liang, 2021)     0.12M
Prefix-Tuning (T5-large) (Li and Liang, 2021)    0.16M
Prompt-Tuning (T5-base) (Lester et al., 2021)    0.12M
Prompt-Tuning (T5-large) (Lester et al., 2021)   0…