“…Such systems include those of Lebret, Grangier, and Auli (2016), who use a conditional language model with a copy mechanism for generation; Liu et al. (2017), who propose a dual-attention Seq2Seq model; Nema et al. (2018), who combine dual attention with gated orthogonalization; and Bao et al. (2018), who introduce a flexible copying mechanism that selectively replicates content from the table in the output sequence. Other systems revolve around popular data sets such as WEATHERGOV (Liang, Jordan, and Klein 2009; Jain et al. 2018), ROBOCUP (Chen and Mooney 2008), ROTOWIRE and SBNATION (Wiseman, Shieber, and Rush 2017), and WEBNLG (Gardent et al. 2017). Recently, Bao et al. (2018) and Novikova, Dusek, and Rieser (2017) have introduced new data sets for table/tuple-to-text generation, and both supervised and unsupervised systems (Fevry and Phang 2018) have been proposed and evaluated against these data sets.…”