“…With recent advances in neural networks, researchers have built different neural models based on various strategies, e.g. latent variables (Wiseman et al., 2018; Ye et al., 2020), structure awareness (Liu et al., 2018; Colin and Gardent, 2019), copy mechanisms (Gehrmann et al., 2018; Puduppully et al., 2019a,b), and pre-trained language models (PLMs) (Chen et al., 2020a; Kale, 2020; Ribeiro et al., 2020). Subsequent studies adapted the pre-trained GPT-2 model with different architectural designs, e.g. a switch policy (Chen et al., 2020b) and content matching (Gong et al., 2020), to address the few-shot table-to-text generation problem.…”
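
To make the PLM-based strategy concrete, here is a minimal sketch of the common recipe these works build on: linearize a table into a flat token sequence and fine-tune GPT-2 with a standard language-modeling loss so it learns to continue the table prefix with a description. This is an illustrative simplification, not any cited paper's exact method; the `linearize` helper, field names, and toy example are hypothetical.

```python
# Minimal sketch of PLM-based table-to-text fine-tuning (illustrative only).
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

def linearize(table: dict) -> str:
    # Flatten attribute-value pairs into one source string (hypothetical scheme).
    return " ".join(f"{k} : {v} |" for k, v in table.items())

# Toy training pair (hypothetical data).
table = {"name": "Ada Lovelace", "occupation": "mathematician"}
reference = "Ada Lovelace was a mathematician."

# Concatenate the linearized table and the reference text; GPT-2 is
# trained to continue the table prefix with the target description.
text = linearize(table) + " " + reference + tokenizer.eos_token
inputs = tokenizer(text, return_tensors="pt")

# Passing labels = input_ids makes the model return the LM cross-entropy loss.
outputs = model(**inputs, labels=inputs["input_ids"])
outputs.loss.backward()  # an optimizer step would follow in a real loop
```

The few-shot methods cited above differ in what they add on top of this backbone (e.g. a switch policy deciding when to copy table values, or a content-matching objective), but all share the linearize-and-fine-tune core.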