Similarly, the bootstrapping technique developed for traditional context-counting approaches (Peirsman & Padó, 2010; Vulić & Moens, 2013b) is an important predecessor to the recent iterative self-learning techniques used to limit the bilingual dictionary seed supervision needed in mapping-based approaches (Hauer, Nicolai, & Kondrak, 2017; ?). The idea of CCA-based word embedding learning (Faruqui & Dyer, 2014b; Lu, Wang, Bansal, Gimpel, & Livescu, 2015), discussed later in Section 6, was also introduced a decade earlier (Haghighi, Liang, Berg-Kirkpatrick, & Klein, 2008); their work additionally discussed the idea of combining orthographic subword features with distributional signatures for cross-lingual representation learning. This idea has recently re-entered the literature (Heyman, Vulić, & Moens, 2017), now with much better performance.
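
To make the CCA-based idea concrete before Section 6, the following is a minimal sketch, not the procedure of any cited paper: it assumes toy random embedding matrices, a seed dictionary given as aligned rows, and the scikit-learn CCA implementation; all variable names are illustrative assumptions.

```python
import numpy as np
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(0)
dim_src, dim_trg, n_pairs, n_components = 50, 50, 500, 10

# Placeholder monolingual embeddings for the seed translation pairs;
# row i of src_emb and row i of trg_emb are assumed to be translations.
src_emb = rng.normal(size=(n_pairs, dim_src))
trg_emb = rng.normal(size=(n_pairs, dim_trg))

# CCA finds linear projections of both spaces such that the projected
# seed translations are maximally correlated.
cca = CCA(n_components=n_components, max_iter=1000)
cca.fit(src_emb, trg_emb)

# Project both sides into the shared space; cross-lingual nearest
# neighbours in this space can then serve as translation candidates.
src_shared, trg_shared = cca.transform(src_emb, trg_emb)
print(src_shared.shape, trg_shared.shape)  # (500, 10) (500, 10)
```

In practice the fitted projections would be applied to the full monolingual vocabularies rather than only the seed pairs, which is what allows the shared space to be used for bilingual lexicon induction and other cross-lingual tasks.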